Feature Expansion and enhanced Compression for Class Incremental Learning - Team Models and AlgoriThms for pRocessIng and eXtracting information
Preprint, working paper. Year: 2024


Abstract

Class incremental learning consists of training discriminative models to classify an increasing number of classes over time. However, doing so using only the newly added class data leads to the well-known problem of catastrophic forgetting of the previous classes. Recently, dynamic deep learning architectures have been shown to exhibit a better stability-plasticity trade-off by dynamically adding new feature extractors to the model in order to learn new classes, followed by a compression step that scales the model back to its original size, thus avoiding a growing number of parameters. In this context, we propose a new algorithm that enhances the compression of previous class knowledge by cutting and mixing patches of previous class samples with the new images during compression using our Rehearsal-CutMix method. We show that this new data augmentation reduces catastrophic forgetting by specifically targeting past class information and improving its compression. Extensive experiments performed on the CIFAR and ImageNet datasets under diverse incremental learning evaluation protocols demonstrate that our approach consistently outperforms the state of the art. The code will be made available upon publication of our work.
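To make the Rehearsal-CutMix idea in the abstract concrete, the following is a minimal sketch of how such an augmentation could look: patches from stored old-class (rehearsal) samples are pasted into new-class images, and the labels are mixed in proportion to the patch area, as in standard CutMix. The function name, array layout `(N, H, W, C)`, one-hot labels, and all parameter names are assumptions for illustration, not the authors' actual implementation.

```python
import numpy as np

def rehearsal_cutmix(new_images, new_labels, memory_images, memory_labels,
                     alpha=1.0, rng=None):
    """CutMix-style augmentation between new-class images and rehearsal
    (old-class) samples. Assumes images of shape (N, H, W, C) and one-hot
    labels of shape (N, K); this is an illustrative sketch only."""
    rng = rng or np.random.default_rng()
    n, h, w, _ = new_images.shape
    # Pair each new-class image with a randomly drawn rehearsal sample.
    idx = rng.integers(0, len(memory_images), size=n)
    # Mixing ratio drawn from a Beta distribution, as in standard CutMix.
    lam = rng.beta(alpha, alpha)
    # Choose a patch whose area covers roughly (1 - lam) of the image.
    cut_h = int(h * np.sqrt(1.0 - lam))
    cut_w = int(w * np.sqrt(1.0 - lam))
    cy, cx = rng.integers(0, h), rng.integers(0, w)
    y1, y2 = np.clip(cy - cut_h // 2, 0, h), np.clip(cy + cut_h // 2, 0, h)
    x1, x2 = np.clip(cx - cut_w // 2, 0, w), np.clip(cx + cut_w // 2, 0, w)
    # Paste the old-class patch into each new-class image.
    mixed = new_images.copy()
    mixed[:, y1:y2, x1:x2, :] = memory_images[idx][:, y1:y2, x1:x2, :]
    # Recompute lambda from the exact patch area, then mix the labels.
    lam_adj = 1.0 - (y2 - y1) * (x2 - x1) / (h * w)
    mixed_labels = lam_adj * new_labels + (1.0 - lam_adj) * memory_labels[idx]
    return mixed, mixed_labels
```

During the compression step, such mixed batches would expose the student model to explicit old-class content rather than only new images, which is the mechanism the paper credits for reduced forgetting.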
Main file: FECIL.pdf (1007.37 KB). Origin: files produced by the author(s).

Dates and versions

hal-04562851 , version 1 (30-04-2024)


Cite

Quentin Ferdinand, Gilles Le Chenadec, Benoit Clement, Panagiotis Papadakis, Quentin Oliveau. Feature Expansion and enhanced Compression for Class Incremental Learning. 2024. ⟨hal-04562851⟩
91 views
52 downloads
