What is the purpose of batch normalization in deep neural networks?

Data Science with Python — Hard

Key points

  • Batch normalization stabilizes training by normalizing each layer's activations across the mini-batch
  • Activations are normalized per feature to zero mean and unit variance within each mini-batch
  • Learnable scale (γ) and shift (β) parameters then restore the layer's representational capacity
  • It permits higher learning rates, reduces sensitivity to weight initialization, and has a mild regularizing effect
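The points above can be sketched as a minimal NumPy forward pass (a simplified illustration, not a training-ready implementation; it omits the running statistics used at inference time):

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Batch-normalization forward pass for a 2-D mini-batch of shape (N, D)."""
    mu = x.mean(axis=0)                     # per-feature mean over the batch
    var = x.var(axis=0)                     # per-feature variance over the batch
    x_hat = (x - mu) / np.sqrt(var + eps)   # zero mean, unit variance
    return gamma * x_hat + beta             # learnable scale and shift

rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=3.0, size=(64, 4))   # activations with arbitrary mean/scale
out = batch_norm(x, gamma=np.ones(4), beta=np.zeros(4))
print(np.allclose(out.mean(axis=0), 0.0, atol=1e-6))  # normalized mean is ~0
print(np.allclose(out.std(axis=0), 1.0, atol=1e-3))   # normalized std is ~1
```

With `gamma=1` and `beta=0` the output is exactly the normalized activations; during training, both are learned per feature alongside the network weights.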
