
PTQ-AttnDM


Overview

This is a PyTorch implementation of the paper "PTQ-AttnDM: An Enhanced implementation of Post Training Quantisation with Self-attention on Diffusion Models".

Prerequisites

  • python>=3.8
  • pytorch>=1.12.1
  • torchvision>=0.13.0
  • additional Python packages such as numpy and tqdm (a sample install command is given below)
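
A minimal install command along these lines should cover the list above (exact pins are not specified by the repository):

pip install "torch>=1.12.1" "torchvision>=0.13.0" numpy tqdm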

Installation

Step-by-step installation instructions:

Clone the repository

git clone https://github.com/aqilmarwan/attentionDM.git
cd attentionDM

Configure the environment variables required by the LDM (latent-diffusion) codebase, as sketched below.
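
The repository does not name specific variables here, so the following is a hypothetical sketch: it assumes config and checkpoint paths laid out as in the upstream latent-diffusion repository, and the variable names LDM_CONFIG and LDM_CKPT are illustrative, not defined by either codebase.

export PYTHONPATH=$PYTHONPATH:$(pwd)
export LDM_CONFIG=configs/latent-diffusion/cifar10.yaml    # hypothetical path
export LDM_CKPT=models/ldm/cifar10/model.ckpt              # hypothetical path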

Usage

Basic Examples

Training and Testing

The following experiments were performed on an NVIDIA A500 GPU with 24 GB of memory.

Generate CIFAR-10 Images

You can run the following command to generate 50,000 CIFAR-10 32×32 images at low bitwidths with differentiable group-wise quantization and active timestep selection.

sh sample_cifar.sh
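
sample_cifar.sh and its flags are specific to this repository; purely to illustrate the group-wise quantization idea mentioned above (a sketch, not the authors' exact implementation), here is a minimal differentiable uniform quantizer in PyTorch that quantizes weights per group and keeps gradients flowing via a straight-through estimator:

import torch

def groupwise_quantize(w: torch.Tensor, n_bits: int = 4, group_size: int = 64) -> torch.Tensor:
    # Uniform per-group weight quantization with a straight-through estimator
    # (STE), so the rounding step stays differentiable end to end.
    # Assumes w.numel() is divisible by group_size.
    qmax = 2 ** (n_bits - 1) - 1
    groups = w.reshape(-1, group_size)
    scale = groups.abs().max(dim=1, keepdim=True).values / qmax
    scale = scale.clamp(min=1e-8)  # guard against all-zero groups
    q = torch.clamp(torch.round(groups / scale), -qmax - 1, qmax)
    deq = q * scale  # dequantized values at the chosen bitwidth
    # STE: forward pass uses the quantized values, backward pass treats
    # quantization as the identity so gradients reach w unchanged.
    out = groups + (deq - groups).detach()
    return out.reshape(w.shape)

# Usage: quantize a weight matrix to 4 bits in groups of 64 entries.
w = torch.randn(128, 256, requires_grad=True)
w_q = groupwise_quantize(w, n_bits=4, group_size=64)

Because the scale is computed per group rather than per tensor, outlier weights in one group do not blow up the quantization step everywhere else, which is the main motivation for group-wise schemes.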

Calculate FID

After generation, you can run the following command to compute the FID score.

python -m pytorch_fid <dataset path> <image path>
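
For example, assuming the reference images sit under data/cifar10 and the generated samples under samples/cifar10 (both paths are placeholders):

python -m pytorch_fid data/cifar10 samples/cifar10 --device cuda:0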

Acknowledgements
