Misc Papers

Elastic Weight Consolidation

Selectively slows learning on parameters critical to earlier tasks, reducing catastrophic forgetting via a quadratic regulariser.
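
A minimal sketch of the EWC-style quadratic penalty in PyTorch, under the assumption that a diagonal Fisher estimate `fisher` and the old-task optimum `old_params` were saved after the earlier task; `lam` is an illustrative regularisation strength, not a value from the paper.

```python
import torch


def ewc_penalty(model, old_params, fisher, lam=0.4):
    """Quadratic penalty that slows learning on parameters the Fisher
    information marks as important for the earlier task."""
    loss = torch.zeros(())
    for name, param in model.named_parameters():
        # Penalise deviation from the old-task optimum, weighted per parameter
        # by the (diagonal) Fisher information estimated on the old task.
        loss = loss + (fisher[name] * (param - old_params[name]) ** 2).sum()
    return 0.5 * lam * loss


# Usage during new-task training (names assumed for illustration):
#   total_loss = new_task_loss + ewc_penalty(model, old_params, fisher)
#   total_loss.backward()
```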


iCaRL – Incremental Classifier and Representation Learning

Learns new classes incrementally with exemplar rehearsal + a nearest-mean-of-exemplars classifier, mitigating catastrophic forgetting.
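
A minimal sketch of the nearest-mean-of-exemplars classification step, assuming `exemplar_features[c]` holds feature vectors extracted from the stored exemplars of class `c` and `x_feat` is the feature vector of a test input (both produced by the learned representation).

```python
import numpy as np


def nearest_mean_classify(x_feat, exemplar_features):
    # iCaRL L2-normalises features before averaging and comparing.
    x = x_feat / np.linalg.norm(x_feat)
    best_class, best_dist = None, np.inf
    for c, feats in exemplar_features.items():
        feats = feats / np.linalg.norm(feats, axis=1, keepdims=True)
        mean = feats.mean(axis=0)
        mean = mean / np.linalg.norm(mean)
        dist = np.linalg.norm(x - mean)
        if dist < best_dist:
            best_class, best_dist = c, dist
    return best_class
```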


Learning without Forgetting

Uses knowledge distillation to preserve performance on old tasks while training on new tasks — no access to old data.
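
A minimal sketch of an LwF-style objective in PyTorch, written here with a KL-divergence distillation term: `old_logits` are the frozen pre-update network's old-task outputs recorded on the *new-task* inputs, so no old data is touched. The temperature `T=2` follows common distillation practice; the function and argument names are illustrative assumptions.

```python
import torch
import torch.nn.functional as F


def lwf_loss(new_logits, new_labels, old_head_logits, old_logits, T=2.0):
    # Standard cross-entropy on the new task.
    ce = F.cross_entropy(new_logits, new_labels)
    # Distillation term: keep the old-task head's softened predictions close
    # to what the original network produced on the same new-task inputs.
    kd = F.kl_div(
        F.log_softmax(old_head_logits / T, dim=1),
        F.softmax(old_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    return ce + kd
```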

