Tests understanding of training stability techniques, essential knowledge for deep learning roles.
Batch normalization normalizes each layer's inputs across the batch dimension to zero mean and unit variance, then applies a learnable scale (gamma) and shift (beta) so the network can recover any representation it needs.
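A minimal NumPy sketch of the forward pass described above (training mode only; real implementations also track running statistics for inference, which is omitted here):

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Batch norm forward pass for a 2D input of shape (batch, features)."""
    # Per-feature statistics computed across the batch dimension (axis 0)
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    # Normalize to zero mean, unit variance; eps guards against division by zero
    x_hat = (x - mean) / np.sqrt(var + eps)
    # Learnable scale (gamma) and shift (beta) restore representational capacity
    return gamma * x_hat + beta

x = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
out = batch_norm(x, gamma=np.ones(2), beta=np.zeros(2))
```

With gamma = 1 and beta = 0, each output column has mean 0 and (approximately) unit variance; nonzero gamma and beta would then rescale and recenter those columns.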