Pretty sure C is the way to go here. Throttle with Cloud Armor limits requests per client, which fits if you want to smooth out spikes over a time window. Not 100% sure, though, since throttling isn't as aggressive as an outright ban. Agree, or am I missing something?
Q: 1
Your application is deployed as a highly available cross-region solution behind a global external
HTTP(S) load balancer. You notice significant spikes in traffic from multiple IP addresses but it is
unknown whether the IPs are malicious. You are concerned about your application's availability. You
want to limit traffic from these clients over a specified time interval.
What should you do?
Options
Discussion
C/D? The firewall rule option is a trap, but throttling with Cloud Armor (C) fits rate limiting each IP over a time interval.
Guessing A this time. Cloud Armor's rate_based_ban directly blocks clients for a set interval if they go over the limit, which sounds stronger than just throttling. Not totally sure but that's how I read it.
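If it helps, here's roughly what the two flavours look like with the google-cloud-compute Python client. Just a sketch: the project, policy name, and thresholds are made up, and I'm assuming the Cloud Armor policy already exists and is attached to the load balancer's backend service.

```python
# Hypothetical sketch: add a per-client-IP rate-limit rule to an existing
# Cloud Armor security policy. Names and numbers are placeholders.
from google.cloud import compute_v1

client = compute_v1.SecurityPoliciesClient()

rule = compute_v1.SecurityPolicyRule(
    priority=1000,
    action="throttle",  # use "rate_based_ban" for the harder ban behaviour
    match=compute_v1.SecurityPolicyRuleMatcher(
        versioned_expr="SRC_IPS_V1",
        config=compute_v1.SecurityPolicyRuleMatcherConfig(src_ip_ranges=["*"]),
    ),
    rate_limit_options=compute_v1.SecurityPolicyRuleRateLimitOptions(
        enforce_on_key="IP",  # count requests per client IP
        rate_limit_threshold=compute_v1.SecurityPolicyRuleRateLimitOptionsThreshold(
            count=100, interval_sec=60  # 100 requests per minute per IP
        ),
        conform_action="allow",
        exceed_action="deny(429)",  # response once a client exceeds the threshold
        # ban_duration_sec=600,     # only meaningful with action="rate_based_ban"
    ),
)

op = client.add_rule(
    project="my-project",               # hypothetical project ID
    security_policy="edge-rate-limit",  # hypothetical existing policy
    security_policy_rule_resource=rule,
)
op.result()  # block until the policy change is applied
```

The only real difference between the two readings of the question is `action="throttle"` versus `action="rate_based_ban"` (plus the ban duration for the latter), which is why the C vs A debate comes down to how you interpret "limit traffic over a specified time interval".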
Q: 2
You are asked to recommend a solution to store and retrieve sensitive configuration data from an
application that runs on Compute Engine. Which option should you recommend?
Options
Discussion
Option D fits here. Secret Manager is designed to securely store and manage sensitive config data, unlike custom metadata or guest attributes. I remember similar scenarios in the official guide and Google’s practice tests. Open to corrections if I missed something obvious.
B tbh
D
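For anyone new to option D, reading a secret at runtime is about this much code. A minimal sketch, assuming the project and secret names are placeholders and the VM's service account has the Secret Manager accessor role:

```python
# Sketch: read a secret from Secret Manager on a Compute Engine VM.
from google.cloud import secretmanager

client = secretmanager.SecretManagerServiceClient()
name = "projects/my-project/secrets/db-password/versions/latest"  # placeholder

response = client.access_secret_version(request={"name": name})
db_password = response.payload.data.decode("utf-8")  # use, don't log, the value
```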
Q: 3
An organization adopts Google Cloud Platform (GCP) for application hosting services and needs
guidance on setting up password requirements for their Cloud Identity account. The organization has
a password policy requirement that corporate employee passwords must have a minimum number
of characters.
Which Cloud Identity password guidelines can the organization use to inform their new
requirements?
Options
Discussion
A/B? Google lets you set a minimum of 8 characters, but lots of orgs go for 10+.
I was leaning toward B since I keep seeing 10 as a suggested minimum in security circles, especially for enterprise policies. A is what's technically correct per Google's guidelines, though; B just tempts you with the promise of better security. Anyone else tripped up by that?
A is what shows up in the official docs and Google Admin console, so stick with 8 characters minimum. If you want to double-check, the admin guide or GCP practice sets usually have this info too. Agree?
Q: 4
Your organization uses a microservices architecture based on Google Kubernetes Engine (GKE).
Security reviews recommend tighter controls around deployed container images to reduce potential
vulnerabilities and maintain compliance. You need to implement an automated system by using
managed services to ensure that only approved container images are deployed to the GKE clusters.
What should you do?
Options
Discussion
Probably D; a nice, straightforward container security question.
Q: 5
Applications often require access to “secrets” - small pieces of sensitive data at build or run time. The
administrator managing these secrets on GCP wants to keep track of “who did what, where, and
when?” within their GCP projects.
Which two log streams would provide the information that the administrator is looking for? (Choose
two.)
Options
Discussion
Not sure about B; that one usually trips people up. Wouldn't Admin Activity (A) make more sense for tracking admin changes?
C and B tbh. Data Access logs are for tracking who accessed what, but I thought System Event logs would help with seeing changes at the system level. I've seen similar advice in the official guide and GCP practice questions.
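Assuming the intended pair is Admin Activity plus Data Access audit logs (the two streams Google frames as answering "who did what, where, and when"), here's a rough sketch of pulling both with the Cloud Logging Python client. The project ID is a placeholder, and Data Access logging has to be enabled for the relevant services before that stream shows anything:

```python
# Sketch: list Admin Activity and Data Access audit log entries for a project.
from google.cloud import logging

client = logging.Client(project="my-project")  # placeholder project

audit_filter = (
    'logName=("projects/my-project/logs/cloudaudit.googleapis.com%2Factivity" OR '
    '"projects/my-project/logs/cloudaudit.googleapis.com%2Fdata_access")'
)

for entry in client.list_entries(filter_=audit_filter, max_results=20):
    # entry.payload is the AuditLog proto rendered as a dict:
    # principalEmail (who), methodName (what), resourceName (where), timestamp (when).
    payload = entry.payload or {}
    print(
        entry.timestamp,
        payload.get("authenticationInfo", {}).get("principalEmail"),
        payload.get("methodName"),
        payload.get("resourceName"),
    )
```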
Q: 6
You are in charge of migrating a legacy application from your company datacenters to GCP before the
current maintenance contract expires. You do not know what ports the application is using and no
documentation is available for you to check. You want to complete the migration without putting
your environment at risk.
What should you do?
Options
Discussion
It's B for me. Disabling all traffic in the VPC first and checking the firewall logs sounds like a safe play since you're not sure which ports are needed; that way nothing slips through by accident. Not totally sure though; maybe A is better if you want traffic to flow right away. Thoughts?
Probably A, since a lift-and-shift into an isolated project lets you monitor with VPC Flow Logs before tightening firewall rules. No need to refactor right away. Pretty sure that fits best when the ports are unknown.
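Either way, the "watch first, then lock down" step looks something like this with the Cloud Logging client. A sketch only: it assumes VPC Flow Logs are already enabled on the subnet, and the project name is a placeholder:

```python
# Sketch: read VPC Flow Logs and tally the destination ports the app actually uses,
# to drive the firewall rules you write afterwards.
from collections import Counter
from google.cloud import logging

client = logging.Client(project="my-project")  # placeholder project
flow_filter = 'logName="projects/my-project/logs/compute.googleapis.com%2Fvpc_flows"'

ports = Counter()
for entry in client.list_entries(filter_=flow_filter, max_results=1000):
    conn = (entry.payload or {}).get("connection", {})  # src/dest IPs, ports, protocol
    ports[conn.get("dest_port")] += 1

print(ports.most_common(10))  # candidate ports to allow in the firewall rules
```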
Q: 7
You work for an organization in a regulated industry that has strict data protection requirements. The
organization backs up their data in the cloud. To comply with data privacy regulations, this data can
only be stored for a specific length of time and must be deleted after this specific period.
You want to automate the compliance with this regulation while minimizing storage costs. What
should you do?
Options
Discussion
D. Cloud Storage buckets with Object Lifecycle Management let you set automated rules to delete objects after a certain period, which handles the retention requirement and keeps storage costs down. The other options aren't as flexible or cost-effective for this use case.
C or D, but I don't think C covers object-level retention as well. BigQuery tables do expire but Cloud Storage's Lifecycle rules seem more purpose-built for this. A is definitely a trap here.
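A lifecycle delete rule is basically a one-liner with the google-cloud-storage client. Sketch below with a made-up bucket name and retention period; use whatever age your regulation actually requires:

```python
# Sketch: add a delete-after-365-days lifecycle rule to a bucket.
from google.cloud import storage

client = storage.Client()
bucket = client.get_bucket("regulated-backups")  # hypothetical bucket name

bucket.add_lifecycle_delete_rule(age=365)  # delete objects older than 365 days
bucket.patch()                             # persist the new lifecycle configuration
```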
Q: 8
An organization is migrating from their current on-premises productivity software systems to G Suite.
Some network security controls were in place that were mandated by a regulatory body in their
region for their previous on-premises system. The organization’s risk team wants to ensure that
network security controls are maintained and effective in G Suite. A security architect supporting this
migration has been asked to ensure that network security controls are in place as part of the new
shared responsibility model between the organization and Google Cloud.
What solution would help meet the requirements?
Options
Discussion
Actually, it's C here. For G Suite (now Google Workspace), Google handles network security since it's SaaS, so network controls are mostly Google's job in the shared model. Let me know if you see it differently.
D imo, based on some exam reports and checking Google documentation. Probably worth looking at the official guide and practice test too.
Q: 9
You need to follow Google-recommended practices to leverage envelope encryption and encrypt
data at the application layer.
What should you do?
Options
Discussion
A. Store the encrypted DEK alongside the data; the KEK stays in Cloud KMS. That follows envelope encryption and Google's guidance.
It's A; I've seen a similar one in exam reports. The KMS KEK wraps the locally generated DEK, and you store the wrapped DEK with the encrypted data.
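Rough sketch of the whole flow in Python, in case the wording is confusing. The key ring and key names are placeholders, and I'm using the `cryptography` package for the local AES-GCM part:

```python
# Sketch: envelope encryption at the application layer.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from google.cloud import kms

plaintext = b"sensitive application data"

# 1. Generate a data encryption key (DEK) locally and encrypt the data with it.
dek = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)
ciphertext = AESGCM(dek).encrypt(nonce, plaintext, None)

# 2. Wrap the DEK with the key encryption key (KEK) that lives in Cloud KMS.
kms_client = kms.KeyManagementServiceClient()
kek_name = "projects/my-project/locations/global/keyRings/app-ring/cryptoKeys/app-kek"  # placeholder
wrapped_dek = kms_client.encrypt(request={"name": kek_name, "plaintext": dek}).ciphertext

# 3. Store ciphertext + nonce + wrapped_dek together and discard the plaintext DEK.
#    To decrypt later: KMS-decrypt the wrapped DEK, then AES-GCM-decrypt the data.
```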
Q: 10
A batch job running on Compute Engine needs temporary write access to a Cloud Storage bucket.
You want the batch job to use the minimum permissions necessary to complete the task. What
should you do?
Options
Discussion
Yeah, B lines up with least privilege since storage.objectCreator gives just enough rights for writing. No need for full admin or managing keys. I think it’s pretty standard practice, but let me know if you see a catch I missed.
I don't think D fits since it adds extra steps; B makes more sense for just giving write access to the bucket.
B, seen a similar question in practice sets; minimum privilege with storage.objectCreator fits here.
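If you want to see what the bucket-level binding looks like, here's a sketch with the google-cloud-storage client; the bucket and service account names are made up:

```python
# Sketch: grant write-only object access on a single bucket to the batch job's
# service account by binding roles/storage.objectCreator at the bucket level.
from google.cloud import storage

client = storage.Client()
bucket = client.bucket("batch-output-bucket")  # hypothetical bucket

policy = bucket.get_iam_policy(requested_policy_version=3)
policy.bindings.append({
    "role": "roles/storage.objectCreator",
    "members": {"serviceAccount:batch-job@my-project.iam.gserviceaccount.com"},
})
bucket.set_iam_policy(policy)
```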