Q: 11
A BigQuery table is used for real-time dashboards and requires a high volume of
small updates and deletes to individual rows. Which BigQuery feature or capability
should be leveraged to ensure efficient and performant data manipulation?
Options
Discussion
B imo
B here. Clustered and partitioned tables make those row-level updates way faster in BigQuery, especially with high DML volume.
Probably B. If the table isn't clustered and partitioned, row-level DML in BigQuery gets really slow at high volume.
B - had something like this in a mock exam; clustering plus partitioning made DML on BigQuery way more performant for lots of row-level changes. The other options would work, but nowhere near as efficiently in practice.
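To make the B answer concrete, here is a minimal sketch of the SQL the comments above are describing. The dataset, table, and column names (`dashboards.events`, `event_ts`, `customer_id`, the staging table and its `op` column) are made up for illustration; the statements would be run through the BigQuery console, `bq` CLI, or a client library.

```python
# Sketch only: builds the BigQuery SQL strings discussed above.
# All object names are hypothetical.

# Partitioning limits each DML statement to the date partitions it
# touches; clustering prunes storage blocks within a partition, so a
# row-level UPDATE/DELETE rewrites far less data.
DDL = """
CREATE TABLE dashboards.events (
  event_id STRING,
  customer_id STRING,
  event_ts TIMESTAMP,
  status STRING
)
PARTITION BY DATE(event_ts)
CLUSTER BY customer_id;
"""

# Batching many small changes into one MERGE amortizes DML overhead,
# since BigQuery rewrites affected storage per statement, not per row.
MERGE_DML = """
MERGE dashboards.events AS t
USING dashboards.events_staging AS s
ON t.event_id = s.event_id
   AND DATE(t.event_ts) = DATE(s.event_ts)  -- allows partition pruning
WHEN MATCHED AND s.op = 'DELETE' THEN
  DELETE
WHEN MATCHED THEN
  UPDATE SET status = s.status
WHEN NOT MATCHED THEN
  INSERT (event_id, customer_id, event_ts, status)
  VALUES (s.event_id, s.customer_id, s.event_ts, s.status);
"""

print(DDL.strip())
print(MERGE_DML.strip())
```

The key detail is the partition filter in the `MERGE` join condition: without it, BigQuery has to scan every partition for each DML statement, which is exactly the slowdown the commenters mention.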