What is ‘catastrophic forgetting’ in neural networks and how is it addressed in continual learning?

AI Fundamentals — Hard

Key points

  • Catastrophic forgetting: when a network is trained sequentially on new tasks, gradient updates overwrite weights learned earlier, and performance on previous tasks drops abruptly
  • Regularization methods such as elastic weight consolidation (EWC) penalize changes to weights estimated to be important for old tasks
  • Other continual learning strategies include rehearsal/replay of stored or generated past examples and architectural methods that allocate separate parameters per task
  • Addressing forgetting is crucial for adapting deployed models to new tasks without retraining from scratch
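To make the EWC idea from the points above concrete, here is a minimal numpy sketch of its quadratic penalty. The arrays (`theta_old`, `fisher`, `lam`) are hypothetical toy values, not taken from any real model: EWC anchors each weight to its post-old-task value, scaled by a diagonal Fisher information estimate of how important that weight was.

```python
import numpy as np

def ewc_penalty(theta, theta_old, fisher, lam):
    # EWC regularizer: (lam / 2) * sum_i F_i * (theta_i - theta_old_i)^2
    # High-Fisher weights are strongly anchored; low-Fisher weights stay free.
    return 0.5 * lam * np.sum(fisher * (theta - theta_old) ** 2)

def total_loss(new_task_loss, theta, theta_old, fisher, lam):
    # Loss actually minimized on the new task: task loss plus the EWC penalty
    return new_task_loss + ewc_penalty(theta, theta_old, fisher, lam)

# Toy example with two parameters (hypothetical values):
theta_old = np.array([1.0, -2.0])   # weights after training on the old task
fisher    = np.array([5.0, 0.1])    # diagonal Fisher estimates of importance
theta     = np.array([1.5, 0.0])    # candidate weights while learning new task

penalty = ewc_penalty(theta, theta_old, fisher, lam=1.0)
# Moving the high-Fisher weight (index 0) by 0.5 costs far more than
# moving the low-Fisher weight (index 1) by 2.0
```

In practice the Fisher diagonal is estimated from squared gradients of the old task's log-likelihood; this sketch only shows how the penalty trades off plasticity against stability.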
