Q: 6
You need to train a computer vision model that predicts the type of government ID present in a given
image using a GPU-powered virtual machine on Compute Engine. You use the following parameters:
• Optimizer: SGD
• Image shape: 224x224
• Batch size: 64
• Epochs: 10
• Verbose: 2
During training you encounter the following error: ResourceExhaustedError: OOM when allocating tensor. What should you do?
Options
Discussion
You’re right that batch size is the main factor in GPU memory usage during training. B.
B tbh, since batch size eats up GPU memory fast. D might be tempting, but you'd lose image detail that matters for IDs, so it's not ideal here. Saw a similar question in practice and B was correct. The trap is thinking the optimizer or learning rate helps!
Reducing batch size (B) is usually the first thing to try for a ResourceExhaustedError since it scales down tensor allocations pretty quickly. Lowering image shape (D) works too but risks losing critical features in ID images. Pretty sure B is expected, unless resolution drops are acceptable. Agree?
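To make the "batch size is the main lever" point concrete, here's a rough back-of-the-envelope sketch (my own illustration, not part of the question): activation memory for a single float32 feature map scales linearly with batch size, so halving the batch roughly halves the largest tensors the GPU has to hold.

```python
def activation_bytes(batch, height, width, channels, bytes_per_val=4):
    """Memory for one float32 feature map of shape (batch, H, W, C)."""
    return batch * height * width * channels * bytes_per_val

# Example: first conv layer output on 224x224 inputs with a
# hypothetical 64 filters (filter count is my assumption):
full = activation_bytes(64, 224, 224, 64)   # batch size 64
half = activation_bytes(32, 224, 224, 64)   # batch size 32

print(full / 2**20)  # 784.0 MiB for this single feature map
print(half / 2**20)  # 392.0 MiB: memory scales 1:1 with batch size
```

And that's just one layer's activations; the framework keeps many such tensors alive for backprop, which is why dropping the batch size from 64 to 32 or 16 is usually enough to clear the OOM.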
Why not D? Smaller image shape means less memory per input, might solve OOM too.
A is wrong; the answer is B. Batch size directly affects how much data the GPU has to hold at once, so lowering it helps with OOM errors right away. Changing the optimizer or learning rate won’t really cut memory use. I think B’s the obvious move unless the question limits you.
B. Option D looks tempting, but reducing batch size hits GPU memory use directly. Seen similar in exam reports.
Reduce batch size or image shape if you hit OOM, but B is what I usually see required on the exam.
I don’t think D is right here. B.
B vs D. When I've seen this type on other exams, B is the go-to but D is tempting as a trap since changing the image shape also cuts memory. Pretty sure B is still best for OOM on GPU but open to counterpoints if anyone’s tried both.
Why does Google keep putting these OOM batch size questions everywhere? Every similar practice I've seen, the answer is B since batch size is the main lever for memory use on GPU.
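Since the thread keeps circling back to "lower the batch size," here's a minimal sketch of what that fix looks like in practice. The helper and the stub training function are both hypothetical (my own names, not from the question or any real API): on an OOM, halve the batch size and retry until training fits.

```python
def fit_with_backoff(train_fn, batch_size=64, min_batch=8):
    """Call train_fn(batch_size); on MemoryError, halve the batch and retry."""
    while batch_size >= min_batch:
        try:
            train_fn(batch_size)
            return batch_size          # the batch size that fit in memory
        except MemoryError:
            batch_size //= 2           # the fix the question points at (option B)
    raise MemoryError("OOM even at the minimum batch size")

# Stub standing in for a real training call that OOMs above batch 16:
def fake_train(bs):
    if bs > 16:
        raise MemoryError("ResourceExhaustedError: OOM when allocating tensor")

print(fit_with_backoff(fake_train))  # 16
```

With a real framework you'd pass the new batch size back into the training call (e.g. Keras's `model.fit(..., batch_size=...)`) rather than loop like this, but the logic is the same: batch size is the knob you turn first.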