Misc Papers
8/15/2025
Elastic Weight Consolidation
Selectively slows learning on parameters critical to earlier tasks, reducing catastrophic forgetting via a quadratic regulariser weighted by an estimate of the Fisher information.
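A minimal sketch of what that quadratic penalty could look like in PyTorch, assuming a dict `fisher` of per-parameter diagonal Fisher estimates and a dict `old_params` of parameter values saved after the earlier task (the names and the `lam` weight here are illustrative, not from the paper):

```python
import torch

def ewc_penalty(model, fisher, old_params, lam=1000.0):
    """EWC-style quadratic penalty: pull each parameter toward its value after
    the earlier task, weighted by its estimated (diagonal) Fisher information."""
    penalty = torch.zeros((), device=next(model.parameters()).device)
    for name, p in model.named_parameters():
        penalty = penalty + (fisher[name] * (p - old_params[name]) ** 2).sum()
    return 0.5 * lam * penalty

# Training on the new task would then minimise roughly:
#   loss = new_task_loss + ewc_penalty(model, fisher, old_params)
```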
iCaRL – Incremental Classifier and Representation Learning
Learns new classes incrementally with exemplar rehearsal and a nearest-mean-of-exemplars classifier, mitigating catastrophic forgetting as the class set grows.
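A small sketch of the nearest-mean-of-exemplars prediction rule, assuming feature vectors and per-class exemplar means have already been extracted and L2-normalised (variable and function names are illustrative):

```python
import torch

def nearest_mean_classify(features, class_means):
    """iCaRL-style prediction: assign each sample to the class whose
    exemplar mean is closest in feature space.
    features: (N, D) tensor, class_means: (C, D) tensor, both L2-normalised."""
    dists = torch.cdist(features, class_means)  # (N, C) pairwise Euclidean distances
    return dists.argmin(dim=1)                  # index of the nearest class mean
```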
Learning without Forgetting
Uses knowledge distillation to preserve old-task performance while training on new tasks, without needing access to old-task data.
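A hedged sketch of the combined objective, assuming the frozen pre-update model's old-task logits are recorded for each batch; the function name, temperature `T`, and weight `lam` are illustrative rather than the paper's exact settings:

```python
import torch
import torch.nn.functional as F

def lwf_loss(new_task_logits, new_labels,
             old_task_logits, old_task_logits_ref,
             T=2.0, lam=1.0):
    """LwF-style objective: cross-entropy on the new task plus a distillation
    term keeping the current model's old-task outputs close to the frozen copy's."""
    ce = F.cross_entropy(new_task_logits, new_labels)
    # Soften both sets of old-task logits with temperature T, then match them.
    log_p_new = F.log_softmax(old_task_logits / T, dim=1)
    p_ref = F.softmax(old_task_logits_ref / T, dim=1)
    distill = F.kl_div(log_p_new, p_ref, reduction="batchmean") * (T * T)
    return ce + lam * distill
```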