Q: 20
You are working on a Neural Network-based project. The dataset provided to you has columns with
different ranges. While preparing the data for model training, you discover that gradient
optimization is having difficulty moving weights to a good solution. What should you do?
Options
Discussion
Definitely B. Normalizing gets all your inputs onto similar scales, which makes gradient descent work way better, especially with neural nets. Without normalization, features with larger ranges dominate the gradient updates, so the optimizer zigzags and converges slowly (or not at all). Pretty sure this is what Google's looking for here, but open to debate if anyone disagrees.
Yeah, normalization is the move here. B is correct: getting all features onto a similar scale helps gradient descent optimize smoothly. Seen this in a couple of similar practice sets, pretty sure that's the intended fix.
B here. Neural nets learn better when features have similar scales, so normalization helps the gradients converge. This comes up in practice tests and in the official guide basics. Has anyone seen other resources suggest A for this case?
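To make the point above concrete, here's a minimal sketch of z-score normalization with NumPy (the toy columns, e.g. age vs. income, are made up for illustration). After scaling, every column has mean 0 and standard deviation 1, so no single feature dominates the gradient updates:

```python
import numpy as np

# Toy feature matrix with very different column ranges
# (hypothetical: age in years vs. income in dollars).
X = np.array([
    [25.0,  40_000.0],
    [32.0,  85_000.0],
    [47.0, 120_000.0],
    [51.0,  62_000.0],
])

# Z-score normalization: subtract each column's mean,
# divide by its standard deviation.
mean = X.mean(axis=0)
std = X.std(axis=0)
X_norm = (X - mean) / std

print(X_norm.mean(axis=0))  # each column centered near 0
print(X_norm.std(axis=0))   # each column scaled to std 1
```

In practice you'd compute `mean` and `std` on the training split only and reuse them at serving time, otherwise you leak information from the test set.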
It's A, but if the question said 'best first step,' I'd rethink and go with B.