AI Fundamentals — Medium
Key points
- Knowledge distillation trains a smaller "student" model to mimic a larger "teacher" model's behavior, typically by matching the teacher's output distributions (soft targets) rather than only the hard labels.
- It reduces the size and compute cost of neural networks, which lowers memory use and speeds up inference.
- The goal is to retain most of the teacher's accuracy in a much more efficient model.
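To make the idea concrete, here is a minimal sketch of the soft-target loss commonly used in distillation: the student is trained to minimize the KL divergence between its temperature-softened outputs and the teacher's. The function names and the temperature value are illustrative, not from the original text.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-softened softmax; higher T gives softer distributions."""
    z = logits / temperature
    z = z - z.max()  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KL divergence from teacher soft targets to student predictions.

    The T**2 factor (from Hinton et al.'s formulation) keeps gradient
    magnitudes comparable across temperature settings.
    """
    p = softmax(teacher_logits, temperature)  # teacher's soft targets
    q = softmax(student_logits, temperature)  # student's soft predictions
    return float(np.sum(p * (np.log(p) - np.log(q)))) * temperature**2
```

In practice this term is combined with the ordinary cross-entropy on the true labels, weighted by a mixing coefficient; a higher temperature exposes more of the teacher's relative confidence across wrong classes, which is the extra signal the student learns from.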
