Neural Network Normalization
Normalization rescales all features to a similar range, so that the gradient computation is not dominated by features with large values. When this is applied to the raw input data, we usually just call it input normalization, and there are several types, such as min-max scaling and z-score standardization.
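As a minimal sketch, here are the two common input normalization schemes applied per feature with NumPy (the toy dataset is a made-up example with two features on very different scales):

```python
import numpy as np

# Hypothetical toy dataset: 4 samples, 2 features on very different scales
X = np.array([[1.0, 200.0],
              [2.0, 400.0],
              [3.0, 600.0],
              [4.0, 800.0]])

# Z-score standardization: zero mean, unit variance per feature (axis=0)
z = (X - X.mean(axis=0)) / X.std(axis=0)

# Min-max scaling: rescale each feature into [0, 1]
mm = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))
```

After either transform, both features live on the same scale, so no single feature dominates the gradient simply because of its units.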
Also, as the data passes through the layers and activation functions, it is transformed into new features, which we can think of as the input to the next layer. These intermediate features benefit from normalization as well. Depending on which axis the normalization statistics are computed over, we get different types of normalization, such as batch normalization and layer normalization.
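The axis choice can be sketched as follows; this is a simplified illustration (no learnable scale/shift parameters, random activations standing in for a real hidden layer), contrasting batch-style statistics (per feature, across the batch) with layer-style statistics (per sample, across features):

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical activations from a hidden layer: batch of 8 samples, 4 features
a = rng.normal(loc=5.0, scale=3.0, size=(8, 4))
eps = 1e-5  # small constant for numerical stability

# Batch-norm style: mean/variance per feature, computed across the batch (axis=0)
bn = (a - a.mean(axis=0)) / np.sqrt(a.var(axis=0) + eps)

# Layer-norm style: mean/variance per sample, computed across features (axis=1)
ln = (a - a.mean(axis=1, keepdims=True)) / np.sqrt(a.var(axis=1, keepdims=True) + eps)
```

In the batch-style case every feature column ends up with roughly zero mean and unit variance; in the layer-style case it is every sample row instead, which is why layer normalization works even with a batch size of one.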