This is a PyTorch implementation of the paper "PTQ-AttnDM: An Enhanced implementation of Post Training Quantisation with Self-attention on Diffusion Models".
- python>=3.8
- pytorch>=1.12.1
- torchvision>=0.13.0
- other packages such as numpy and tqdm (`math` is part of the Python standard library and needs no installation)
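For convenience, the dependencies above can be captured in a `requirements.txt` (a sketch based on the minimum versions listed; exact tested versions are not stated in this README):

```text
torch>=1.12.1
torchvision>=0.13.0
numpy
tqdm
```

Install with `pip install -r requirements.txt`.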
Step-by-step installation instructions:
git clone https://github.com/aqilmarwan/attentionDM.git
cd attentionDM

The following experiments were performed on an NVIDIA A500 GPU with 24GB of memory.
You can run the following command to generate 50,000 32×32 CIFAR-10 images at low bitwidths using differentiable group-wise quantization and active timestep selection.
sh sample_cifar.sh
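The differentiable group-wise quantization itself lives inside the repo; as a rough illustration of the underlying idea only (not the authors' code — the function name, the symmetric max-abs scale rule, and the group size here are assumptions), a group-wise weight quantizer can be sketched as:

```python
import numpy as np

def groupwise_quantize(w, n_bits=4, group_size=8):
    """Quantize w group-by-group with a per-group scale, then dequantize.

    Each group of `group_size` consecutive weights gets its own scale,
    so outliers in one group do not degrade the precision of others.
    """
    flat = w.reshape(-1)
    pad = (-len(flat)) % group_size          # pad so length divides evenly
    groups = np.pad(flat, (0, pad)).reshape(-1, group_size)

    qmax = 2 ** (n_bits - 1) - 1             # e.g. 7 for 4-bit signed
    scale = np.abs(groups).max(axis=1, keepdims=True) / qmax
    scale[scale == 0] = 1.0                  # avoid division by zero

    q = np.clip(np.round(groups / scale), -qmax - 1, qmax)
    deq = (q * scale).reshape(-1)[: len(flat)].reshape(w.shape)
    return deq
```

Per group, the round-to-nearest error of each weight is bounded by half the group's scale, which is what makes small groups attractive at low bitwidths.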
After generation, you can run the following command to evaluate the FID (note that pytorch-fid computes FID only; IS requires a separate tool such as torch-fidelity).
python -m pytorch_fid <dataset path> <image path>
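The FID reported by pytorch-fid is the Fréchet distance between two Gaussians fitted to Inception features of the real and generated image sets. As a minimal numpy sketch of that final formula (the eigendecomposition-based matrix square root used here assumes the product of covariances is symmetric PSD, e.g. commuting covariances; pytorch-fid itself uses `scipy.linalg.sqrtm` on arbitrary products):

```python
import numpy as np

def frechet_distance(mu1, sigma1, mu2, sigma2):
    """FID = ||mu1 - mu2||^2 + Tr(S1 + S2 - 2 (S1 S2)^{1/2})."""
    diff = mu1 - mu2
    prod = sigma1 @ sigma2
    # Matrix square root via eigendecomposition; valid only when `prod`
    # is symmetric PSD, which holds for this simplified sketch.
    vals, vecs = np.linalg.eigh(prod)
    covmean = (vecs * np.sqrt(np.clip(vals, 0.0, None))) @ vecs.T
    return float(diff @ diff + np.trace(sigma1 + sigma2 - 2.0 * covmean))
```

Identical feature statistics give an FID of 0; lower is better.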