What is ‘batch normalization’ and why is it used in deep neural networks?

AI Fundamentals — Medium

Key points

  • Batch normalization normalizes a layer's inputs across each mini-batch to zero mean and unit variance, then applies a learnable scale and shift
  • It stabilizes training by reducing sensitivity to weight initialization and learning rate, which speeds up convergence
  • The noise introduced by per-batch statistics acts as a mild form of regularization in deep neural networks
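The normalize-then-rescale step described above can be sketched as follows. This is a minimal NumPy illustration of the training-time forward pass (the function name `batch_norm` and the tensor shapes are illustrative; a real layer would also track running statistics for inference):

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    # Normalize each feature across the mini-batch dimension (axis 0)
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)
    # Learnable scale (gamma) and shift (beta) restore representational power
    return gamma * x_hat + beta

rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=3.0, size=(32, 4))  # mini-batch of 32, 4 features
y = batch_norm(x, gamma=np.ones(4), beta=np.zeros(4))
print(y.mean(axis=0))  # each feature now has mean ~0
print(y.std(axis=0))   # and standard deviation ~1
```

With `gamma=1` and `beta=0` the output is simply the standardized batch; during training these parameters are learned, so the network can undo the normalization where that helps.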
