What is ‘knowledge distillation’ in deep learning?


AI Fundamentals — Medium


Key points

  • Knowledge distillation trains a smaller "student" model to mimic a larger "teacher" model, typically by matching the teacher's softened output probabilities rather than only the hard labels.
  • It reduces the size, latency, and compute cost of the deployed neural network.
  • The goal is to retain most of the teacher's accuracy in a far more efficient model.
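The mimicry described above is usually implemented as a combined loss: a "soft" term that pushes the student's temperature-softened output distribution toward the teacher's, plus a "hard" term against the true label. Below is a minimal NumPy sketch of that idea; the function names and default values (`T=2.0`, `alpha=0.5`) are illustrative choices, not a fixed standard.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T produces a softer distribution.
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, label, T=2.0, alpha=0.5):
    """Weighted sum of a soft loss (KL divergence from the teacher's softened
    distribution to the student's) and a hard cross-entropy loss on the label."""
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    # KL(teacher || student); the T**2 factor keeps gradient magnitudes
    # comparable across temperatures (a common convention).
    soft = float(np.sum(p_teacher * (np.log(p_teacher) - np.log(p_student)))) * T**2
    # Standard cross-entropy against the ground-truth class index.
    hard = float(-np.log(softmax(student_logits)[label]))
    return alpha * soft + (1 - alpha) * hard
```

For example, `distillation_loss([0.1, 0.2, 0.3], [3.0, 0.1, 0.1], label=0)` returns a positive loss that shrinks as the student's logits approach the teacher's; when the logits match exactly, the soft term vanishes and only the hard term remains.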
