Tests understanding of training stability techniques – essential knowledge for deep learning roles.
Batch normalization normalizes each layer's inputs to zero mean and unit variance using mini-batch statistics during training (running estimates at inference), then applies a learnable scale and shift so the network can recover any representation it needs.
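A minimal sketch of the forward pass in NumPy (function name and shapes are illustrative, not from any particular framework):

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Batch-norm forward pass for a batch of feature vectors.

    x: (N, D) mini-batch; gamma, beta: (D,) learnable scale and shift.
    eps guards against division by zero for near-constant features.
    """
    mean = x.mean(axis=0)                    # per-feature batch mean
    var = x.var(axis=0)                      # per-feature batch variance
    x_hat = (x - mean) / np.sqrt(var + eps)  # zero mean, unit variance
    return gamma * x_hat + beta              # learnable scale and shift

# Example: with gamma=1, beta=0 the output is just the normalized batch
x = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
out = batch_norm(x, gamma=np.ones(2), beta=np.zeros(2))
```

At inference time, frameworks replace the per-batch `mean` and `var` with running averages accumulated during training, so single examples can be normalized deterministically.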