
Expandable Subspace Ensemble for Pre-Trained Model-Based Class-Incremental Learning

🎉 The code repository for "Expandable Subspace Ensemble for Pre-Trained Model-Based Class-Incremental Learning" (CVPR 2024) in PyTorch. If you use any content of this repo in your work, please cite the following bib entry:

  @inproceedings{zhou2024expandable,
    title={Expandable Subspace Ensemble for Pre-Trained Model-Based Class-Incremental Learning},
    author={Zhou, Da-Wei and Sun, Hai-Long and Ye, Han-Jia and Zhan, De-Chuan},
    booktitle={CVPR},
    pages={23554--23564},
    year={2024}
  }


Class-Incremental Learning (CIL) requires a learning system to continually learn new classes without forgetting. Despite the strong performance of Pre-Trained Models (PTMs) in CIL, a critical issue persists: learning new classes often overwrites old ones. Excessive modification of the network causes forgetting, while minimal adjustments lead to an inadequate fit for new classes. Hence, we need an efficient way to update the model without harming former knowledge.

In this paper, we propose ExpAndable Subspace Ensemble (EASE) for PTM-based CIL. To enable model updating without conflict, we train a distinct lightweight adapter module for each new task, aiming to create task-specific subspaces. These adapters span a high-dimensional feature space, enabling joint decision-making across multiple subspaces. As data evolves, the expanding subspaces render the old class classifiers incompatible with new-stage spaces. Correspondingly, we design a semantic-guided prototype complement strategy that synthesizes old classes’ new features without using any old class instance. Extensive experiments on seven benchmark datasets verify EASE’s state-of-the-art performance.
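
To make the core mechanism concrete, here is a minimal conceptual sketch in PyTorch. It is not the authors' implementation: the class name, the bottleneck size, and the nn.Identity stand-in for the frozen ViT backbone are all illustrative assumptions.

  import torch
  import torch.nn as nn

  class EASESketch(nn.Module):
      """One lightweight adapter per task; features from all task-specific
      subspaces are concatenated for joint decision-making."""
      def __init__(self, feat_dim=768):
          super().__init__()
          self.backbone = nn.Identity()    # stands in for a frozen ViT-B/16
          self.adapters = nn.ModuleList()  # grows by one adapter per task
          self.feat_dim = feat_dim

      def add_task(self, bottleneck=64):
          # A lightweight bottleneck adapter spanning a new task-specific subspace.
          self.adapters.append(nn.Sequential(
              nn.Linear(self.feat_dim, bottleneck),
              nn.ReLU(),
              nn.Linear(bottleneck, self.feat_dim),
          ))

      def forward(self, x):
          h = self.backbone(x)
          # The concatenated feature space expands as tasks arrive.
          return torch.cat([adapter(h) for adapter in self.adapters], dim=-1)

  model = EASESketch()
  model.add_task()
  model.add_task()
  feats = model(torch.randn(2, 768))  # shape: (2, 2 * 768)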

🎊 Results

We conducted experiments on seven benchmark datasets to verify the competitive performance of EASE.

Requirements

🗂️ Environment

  1. torch 2.0.1
  2. torchvision 0.15.2
  3. timm 0.6.12
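
For example, a matching environment can be installed with pip (assuming a compatible CUDA build is available; this command is a suggestion, not part of the repo):

pip install torch==2.0.1 torchvision==0.15.2 timm==0.6.12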

🔎 Dataset

We provide the processed datasets as follows:

  • CIFAR100: will be automatically downloaded by the code.
  • CUB200: Google Drive: link or Onedrive: link
  • ImageNet-R: Google Drive: link or Onedrive: link
  • ImageNet-A: Google Drive: link or Onedrive: link
  • OmniBenchmark: Google Drive: link or Onedrive: link
  • VTAB: Google Drive: link or Onedrive: link
  • ObjectNet: Onedrive: link. You can also refer to the filelist if the file is too large to download.

You need to modify the dataset paths in ./utils/data.py to match your local directories, as sketched below.
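
The exact class and attribute names inside ./utils/data.py may differ; this standalone Python sketch only illustrates the kind of edit expected:

  # Illustrative sketch -- check ./utils/data.py for the real class names.
  class ImageNetRPaths:
      train_dir = "/your/path/imagenet-r/train/"  # change to your local path
      test_dir = "/your/path/imagenet-r/test/"    # change to your local path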

These datasets are referenced in ADAM.

🔑 Running scripts

Please follow the settings in the exps folder to prepare JSON config files, and then run:

python main.py --config ./exps/[filename].json
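
The config files in exps define each experiment. The snippet below is only a guess at their shape, based on similar PTM-based CIL codebases; the field names and values are assumptions, so consult the provided files in exps for the actual keys.

  {
    "dataset": "cifar224",
    "model_name": "ease",
    "backbone_type": "vit_base_patch16_224",
    "init_cls": 10,
    "increment": 10,
    "device": ["0"],
    "seed": [1993]
  }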

Here are two examples of how to run the code.

If you want to run the CIFAR dataset using ViT-B/16-IN1K, you can use the following script:

python main.py --config ./exps/ease_cifar.json

If you want to run the CIFAR dataset using ViT-B/16-IN21K, you can use the following script:

python main.py --config ./exps/ease_cifar_in21k.json

After running the code, you will get a log file in the logs/ease/cifar224/ folder.

👨‍🏫 Acknowledgment

We would like to express our gratitude to the open-source repositories that offered valuable components and functions used in our work.
