Official implementation of the paper "Learning Prompt with Distribution-Based Feature Replay for Few-Shot Class-Incremental Learning"

Our code is mainly based on CoOp and SHIP. We sincerely thank them for their contributions.
Please refer to CoOp to install the requirements.
Please follow CEC to download mini-ImageNet, CUB-200 and CIFAR-100.
Please download the SUN-397 dataset from SUN397.
Create a ./data folder under this project:
mkdir data
Move or link the unzipped dataset folders into ./data, arranged in the structure below:
./data/
    CUB_200_2011/
        images/
            001.Black_footed_Albatross/
                Black_Footed_Albatross_0001_796111.jpg
                Black_Footed_Albatross_0002_55.jpg
                ...
            002.Laysan_Albatross/
                ...
        images.txt
        image_class_labels.txt
        train_test_split.txt
    miniimagenet/
        images/
            ._n0153282900000005.jpg
            ...
        index_list/
            mini_imagenet/
                session_1.txt
                session_2.txt
                ...
        split/
            train.csv
            test.csv
    SUN397/
        images/
            a/
                abbey/
                    sun_aaalbzqrimafwbiv.jpg
                    sun_aaaulhwrhqgejnyt.jpg
                    ...
                airplane_cabin/
                    ...
            b/
                ...
        split/
            ClassName.txt
            Training_01.txt
            Testing_01.txt
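If you want to verify the layout before training, a small optional helper (not part of this repo; the expected paths below are taken from the structure above) can report which dataset folders are still missing under ./data:

```python
from pathlib import Path

# Hypothetical sanity-check helper (not part of this repo): list which
# expected dataset folders are missing under the data root.
EXPECTED = [
    "CUB_200_2011/images",
    "miniimagenet/images",
    "miniimagenet/index_list/mini_imagenet",
    "miniimagenet/split",
    "SUN397/images",
    "SUN397/split",
]

def missing_dataset_dirs(data_root="data"):
    root = Path(data_root)
    return [p for p in EXPECTED if not (root / p).is_dir()]

if __name__ == "__main__":
    missing = missing_dataset_dirs()
    if missing:
        print("Missing:", ", ".join(missing))
    else:
        print("All dataset folders found.")
```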
Note that the CIFAR-100 dataset is downloaded automatically by torchvision, so there is no need to configure it manually.
The Gaussian distributions of the old classes for each dataset are released on Google Drive.
Download these .pkl files into ./pre_calculate_GD/ in the root of this project:
./pre_calculate_GD/
    cifar100.pkl
    cub200.pkl
    miniImageNet.pkl
    cub200_wo_base.pkl
    sun397.pkl
In addition, you can use ./generate_GD.py to generate the Gaussian distribution for each class. The training image features can be extracted with the image encoder of the CLIP model, and the VAE, which is responsible for generating synthesized features, can be trained using SHIP.
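As a rough sketch of the distribution step (assuming per-class CLIP features are stored as an (N, D) array; the exact format produced by ./generate_GD.py may differ, and the toy feature dimension below is arbitrary), estimating a class Gaussian and sampling synthesized features for replay could look like:

```python
import numpy as np

def estimate_gaussian(features):
    """Estimate a per-class Gaussian (mean, covariance) from an (N, D) feature matrix."""
    mean = features.mean(axis=0)
    cov = np.cov(features, rowvar=False)
    return mean, cov

def sample_features(mean, cov, n_samples, seed=0):
    """Draw synthesized features from the estimated Gaussian for feature replay."""
    rng = np.random.default_rng(seed)
    return rng.multivariate_normal(mean, cov, size=n_samples)

# Toy stand-in features (D=32 here just for illustration; real CLIP features are larger).
feats = np.random.default_rng(0).normal(size=(100, 32))
mean, cov = estimate_gaussian(feats)
replayed = sample_features(mean, cov, n_samples=16)
print(replayed.shape)  # (16, 32)
```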
Simply run the script files in ./scripts/.
For example, to train LP_DiF on the CUB-200 dataset, execute:
bash scripts/script_cub200.sh
To train LP_DiF on the mini-ImageNet dataset, execute:
bash scripts/script_miniImageNet.sh
To train LP_DiF on the CIFAR-100 dataset, execute:
bash scripts/script_cifar100.sh
To train LP_DiF on the SUN-397 dataset, execute:
bash scripts/script_sun397.sh
To train LP_DiF on the CUB-200* (CUB-200 w/o base session) dataset, execute:
bash scripts/script_cub200_wo_base.sh
The prompt parameters fine-tuned on the last session of each dataset can be downloaded from Google Drive.
You can download them and move them into a folder named output that you create. Then launch evaluation by running train.py with the --eval-only flag.