🤗 [ICLR 2024] Disentangling Time Series Representations via Contrastive Independence-of-Support on l-Variational Inference

Institut-Polytechnique-de-Paris/time-disentanglement-lib


Time-Disentanglement-Lib

A comprehensive PyTorch library for Disentangled Representation Learning in Time Series.

News 📣: Our paper "Disentangling Time Series Representations via Contrastive Independence-of-Support on $l$-Variational Inference" has been accepted at ICLR 2024.

Overview

Time-Disentanglement-Lib is a modular framework designed to facilitate research in disentangled representation learning for sequential data. Unlike standard static disentanglement libraries, this repository focuses on temporal dynamics, offering state-of-the-art (SOTA) models, including our proposed DIoSC framework.

This library provides:

  1. SOTA Baselines: Implementations of leading time-series and static disentanglement models.
  2. Evaluation Metrics: A suite of quantitative metrics to measure disentanglement quality.
  3. Reproducibility: Pre-configured experiments for datasets like UK-DALE, MNIST, dSprites, and more.

πŸ—οΈ Supported Models

We support a wide range of models, categorized into Time-Series specific architectures and General VAE-based frameworks.

Time-Series & Sequential Models

| Model | Type | Paper / Citation |
|-------|------|------------------|
| DIoSC (Ours) | Contrastive / HVAE | Disentangling Time Series Representations via Contrastive Independence-of-Support |
| S3VAE | Sequential VAE | Self-Supervised Sequential VAE (Li et al., 2018) |
| CoST | Contrastive | Contrastive Seasonal-Trend Decomposition (Woo et al., ICLR 2022) |
| D3VAE | Diffusion / VAE | Generative Time Series Forecasting with Diffusion, Denoise, and Disentanglement (Yang et al., 2023) |
| RNN-VAE | Recurrent | A Recurrent Latent Variable Model for Sequential Data (Chung et al., NeurIPS 2015) |
| Autoformer | Transformer | Autoformer: Decomposition Transformers for Long-Term Series Forecasting (Wu et al., NeurIPS 2021) |
| Probabilistic Transformer | Transformer | Deep Transformer Models for Time Series Forecasting (Wen et al., 2020) |
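Several of the time-series models above (CoST, Autoformer) rest on a seasonal-trend decomposition of the input series. The sketch below is a minimal, framework-agnostic NumPy illustration of the moving-average split such models use; it is not the library's implementation, and the kernel size is an arbitrary choice:

```python
import numpy as np

def decompose(x, kernel=25):
    """Split a 1-D series into a trend (moving average) and a seasonal residual."""
    pad = kernel // 2
    # Replicate-pad both ends so the moving average keeps the original length.
    xp = np.concatenate([np.full(pad, x[0]), x, np.full(pad, x[-1])])
    trend = np.convolve(xp, np.ones(kernel) / kernel, mode="valid")
    seasonal = x - trend
    return trend, seasonal

t = np.arange(200)
x = 0.05 * t + np.sin(2 * np.pi * t / 20)  # linear trend plus seasonality
trend, seasonal = decompose(x)
```

Because the seasonal part is defined as the residual, the two components always sum back exactly to the original series.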

General Disentanglement Baselines

| Model | Paper / Citation |
|-------|------------------|
| Standard VAE | Auto-Encoding Variational Bayes (Kingma & Welling, 2013) |
| β-VAE (H) | β-VAE: Learning Basic Visual Concepts (Higgins et al., ICLR 2017) |
| β-VAE (B) | Understanding Disentangling in β-VAE (Burgess et al., 2018) |
| β-TCVAE | Isolating Sources of Disentanglement (Chen et al., NeurIPS 2018) |
| FactorVAE | Disentangling by Factorising (Kim & Mnih, ICML 2018) |
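The β-VAE variants above all start from the same objective, a reconstruction term plus a weighted KL divergence to a standard-normal prior; the variants differ in how that weight is scheduled or decomposed. A minimal, framework-agnostic NumPy sketch of the Higgins et al. objective, for illustration only (the library itself is PyTorch-based):

```python
import numpy as np

def beta_vae_loss(x, x_recon, mu, logvar, beta=4.0):
    """Higgins et al.-style objective: reconstruction + beta * KL(q(z|x) || N(0, I))."""
    # Per-sample squared reconstruction error, averaged over the batch.
    recon = np.mean(np.sum((x - x_recon) ** 2, axis=1))
    # Closed-form KL between a diagonal Gaussian N(mu, exp(logvar)) and N(0, I).
    kl = np.mean(-0.5 * np.sum(1 + logvar - mu**2 - np.exp(logvar), axis=1))
    return recon + beta * kl

x = np.random.randn(8, 32)
mu, logvar = np.zeros((8, 10)), np.zeros((8, 10))
# Perfect reconstruction and a prior-matched posterior give a loss of zero.
loss = beta_vae_loss(x, x, mu, logvar)
```

Setting beta=1 recovers the standard VAE objective; beta>1 trades reconstruction quality for a stronger pressure toward factorized latents.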

ebox πŸ“ Evaluation Metrics

To rigorously evaluate the quality of the learned representations, we implement the following standard disentanglement metrics:

| Metric | Description | Citation |
|--------|-------------|----------|
| Beta-VAE Score | Measures accuracy of a linear classifier predicting the fixed factor of variation. | Higgins et al., 2017 |
| FactorVAE Score | Majority-vote classifier accuracy on the index of the fixed generative factor. | Kim & Mnih, 2018 |
| MIG | Mutual Information Gap: the difference in mutual information between the top two latent variables sharing info with a factor. | Chen et al., 2018 |
| DCI | Disentanglement, Completeness, Informativeness: uses importance weights from a regressor to quantify disentanglement. | Eastwood & Williams, 2018 |
| SAP Score | Separated Attribute Predictability: the difference in prediction error between the top two most predictive latent dimensions. | Kumar et al., 2017 |
| Modularity | Measures whether each latent dimension depends on at most one factor of variation. | Ridgeway & Mozer, 2018 |
| UDR | Unsupervised Disentanglement Ranking: a correlation-based metric for model selection without ground truth. | Duan et al., 2020 |
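To make the MIG definition in the table concrete, the following NumPy sketch (an illustration, not the library's implementation) computes the metric for discretized latent codes and discrete ground-truth factors, normalizing each gap by the factor's entropy:

```python
import numpy as np

def discrete_mi(a, b):
    """Mutual information (in nats) between two discrete label arrays via their joint histogram."""
    joint = np.histogram2d(a, b, bins=(len(set(a)), len(set(b))))[0]
    pxy = joint / joint.sum()
    px, py = pxy.sum(1, keepdims=True), pxy.sum(0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

def entropy(a):
    p = np.bincount(a) / len(a)
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

def mig(codes, factors):
    """codes: (N, D) discretized latents; factors: (N, K) ground-truth factor labels."""
    scores = []
    for k in range(factors.shape[1]):
        mi = sorted(discrete_mi(codes[:, d], factors[:, k]) for d in range(codes.shape[1]))
        # Gap between the two most informative latents, normalized by H(factor).
        scores.append((mi[-1] - mi[-2]) / entropy(factors[:, k]))
    return float(np.mean(scores))
```

In the ideal case, where one latent captures a factor perfectly and every other latent is independent of it, the per-factor score reaches 1.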

💻 Installation

To get started with the codebase, clone the repository and install the dependencies:

```bash
# Clone the repository
git clone https://github.com/Institut-Polytechnique-de-Paris/time-disentanglement-lib.git
cd time-disentanglement-lib

# Install requirements
pip install -r requirements.txt
```

Alternatively, if you are installing as a package (coming soon to PyPI):

```bash
pip install time-disentanglement
```

🚀 Quick Start

You can train and evaluate models using the main.py entry point.

Basic Training

Train the DIoSC model on the UK-DALE dataset:

```bash
python main.py DIoSC_ukdal_mini -d ukdal -l DIoSC --lr 0.001 -b 256 -e 50
```

Running Predefined Experiments

We provide configuration files for reproducibility. Use the -x flag to run specific benchmarks:

```bash
# Run Beta-TCVAE on Temporal CausalIden
python main.py -x btcvae_causalIden

# Run CoST on a time series dataset
python main.py -x cost_ukdal
```

Note: Hyperparameters are stored in hyperparam.ini. Pretrained models will be saved in results/<experiment_name>/.
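Experiment configs in .ini format can be read with Python's standard configparser. The section and key names below are hypothetical, chosen only to illustrate the mechanism, and do not reflect the actual hyperparam.ini schema:

```python
from configparser import ConfigParser

# Illustrative config text; section and key names are hypothetical,
# not the library's actual hyperparam.ini schema.
ini = """
[btcvae_causalIden]
dataset = causalIden
loss = btcvae
lr = 5e-4
batch_size = 256
"""

config = ConfigParser()
config.read_string(ini)
section = config["btcvae_causalIden"]
lr = section.getfloat("lr")            # parsed as a float
batch = section.getint("batch_size")   # parsed as an int
```

The typed accessors (getfloat, getint, getboolean) avoid manual string conversion when mapping config entries onto training hyperparameters.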

Command Line Arguments

```text
usage: main.py [-h] [-d DATASET] [-x EXPERIMENT] [-l LOSS] ...

Time-Disentanglement-Lib: A library for sequential representation learning.

Options:
  -h, --help            Show this help message.
  -d, --dataset         Dataset to use (e.g., mnist, dsprites, ukdal).
  -x, --experiment      Predefined experiment name (loads config from .ini).
  -l, --loss            Loss/Model type (e.g., DIoSC, CoST, betaH, btcvae).
  --lr LR               Learning rate.
  -b, --batch-size      Batch size.
  -e, --epochs          Number of training epochs.
  -s, --seed            Random seed for reproducibility.
  --no-cuda             Force CPU execution.
```
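For reference, an interface like the one above maps naturally onto Python's argparse. This is a reconstruction from the usage text, not the library's actual main.py, and the default values are illustrative assumptions:

```python
import argparse

# Sketch of a CLI mirroring the flags in the usage text; defaults are
# illustrative assumptions, not the library's actual values.
parser = argparse.ArgumentParser(
    description="Time-Disentanglement-Lib: sequential representation learning.")
parser.add_argument("name", nargs="?", help="Name for this run/experiment.")
parser.add_argument("-d", "--dataset", default="mnist")
parser.add_argument("-x", "--experiment", help="Predefined experiment (.ini config).")
parser.add_argument("-l", "--loss", default="DIoSC")
parser.add_argument("--lr", type=float, default=1e-3)
parser.add_argument("-b", "--batch-size", type=int, default=256)
parser.add_argument("-e", "--epochs", type=int, default=50)
parser.add_argument("-s", "--seed", type=int, default=1234)
parser.add_argument("--no-cuda", action="store_true", help="Force CPU execution.")

# Parse the Quick Start command's arguments programmatically.
args = parser.parse_args(["run1", "-d", "ukdal", "--lr", "0.001", "-e", "50"])
```

Note that argparse converts dashed flag names to underscored attributes, so `--batch-size` becomes `args.batch_size`.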

📜 Citation

If you use DIoSC or the time-disentanglement-lib library in your research, please cite our ICLR 2024 paper:

```bibtex
@inproceedings{oublal2024disentangling,
  title={Disentangling time series representations via contrastive independence-of-support on l-variational inference},
  author={Oublal, Khalid and Ladjal, Said and Benhaiem, David and LE BORGNE, Emmanuel and Roueff, Fran{\c{c}}ois},
  booktitle={The Twelfth International Conference on Learning Representations},
  year={2024}
}
```
