
Expandable Subspace Ensemble For Pre-trained Model-based Class-incremental Learning

Da-Wei Zhou, Hai-Long Sun, Han-Jia Ye, De-Chuan Zhan. 2024 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) – 53 citations


Class-Incremental Learning (CIL) requires a learning system to continually learn new classes without forgetting. Despite the strong performance of Pre-Trained Models (PTMs) in CIL, a critical issue persists: learning new classes often overwrites old ones. Excessive modification of the network causes forgetting, while minimal adjustment leads to an inadequate fit for new classes. Hence, an efficient way to update the model without harming former knowledge is needed. In this paper, we propose ExpAndable Subspace Ensemble (EASE) for PTM-based CIL. To enable model updating without conflict, we train a distinct lightweight adapter module for each new task, aiming to create task-specific subspaces. These adapters span a high-dimensional feature space, enabling joint decision-making across multiple subspaces. As data evolves, the expanding subspaces render the old class classifiers incompatible with new-stage spaces. Correspondingly, we design a semantic-guided prototype complement strategy that synthesizes old classes’ new features without using any old class instance. Extensive experiments on seven benchmark datasets verify EASE’s state-of-the-art performance. Code is available at: https://github.com/sun-hailong/CVPR24-Ease
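The two mechanisms the abstract describes, per-task adapter subspaces whose features are concatenated for joint decisions, and old-class prototypes synthesized in a new subspace from semantic similarity to new classes, can be sketched roughly as follows. This is a hedged NumPy illustration, not the authors' implementation: the function names, the random-linear stand-in for an adapter, and the softmax similarity-reweighting details are assumptions made for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

def extract(features, adapter):
    """Project backbone features into one task-specific subspace
    (stand-in for a lightweight adapter; here a plain linear map)."""
    return features @ adapter

def ensemble_feature(features, adapters):
    """Concatenate the outputs of all task adapters into one
    high-dimensional joint feature for cross-subspace decisions."""
    return np.concatenate([extract(features, a) for a in adapters], axis=-1)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def complete_old_prototype(old_proto_shared, new_protos_shared,
                           new_protos_newspace, tau=1.0):
    """Synthesize an old class's prototype in the newest subspace
    without any old-class instance: reweight new classes' new-space
    prototypes by the old class's similarity to them, measured in a
    subspace where both old and new prototypes are available."""
    a = old_proto_shared / np.linalg.norm(old_proto_shared)
    B = new_protos_shared / np.linalg.norm(new_protos_shared,
                                           axis=1, keepdims=True)
    w = softmax((B @ a) / tau)       # similarity weights, sum to 1
    return w @ new_protos_newspace   # convex combination in new subspace

# Toy usage: 3 tasks, 16-d backbone features, 8-d adapter subspaces.
adapters = [rng.standard_normal((16, 8)) for _ in range(3)]
x = rng.standard_normal(16)
joint = ensemble_feature(x, adapters)
print(joint.shape)  # (24,): one 8-d block per task subspace
```

Because the synthesized prototype is a softmax-weighted (convex) combination, it always lies inside the span of the new classes' prototypes, which is what makes the old-class classifier usable in the new-stage space without replaying old data.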
