This repository contains the implementation of the paper *Training Latent Diffusion Models with Interacting Particle Algorithms* (IPLD).
We use miniforge to manage the environments.
If you only need to run the single-GPU version of IPLD, run the following commands to create a new environment:

```bash
cd IPLD/
conda create -n ipld python=3.10
conda activate ipld
pip install -r requirements.txt
```

Alternatively, we provide a Dockerfile for building a Docker image.
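To confirm the environment resolved correctly, you can check that the key packages are importable. This is a minimal sketch; the exact package names to check should be taken from `requirements.txt`:

```python
import importlib.util

def missing_packages(names):
    """Return the subset of `names` that cannot be imported in this environment."""
    return [n for n in names if importlib.util.find_spec(n) is None]

# Example: adjust this list to the packages pinned in requirements.txt.
missing = missing_packages(["torch", "torchvision", "numpy"])
if missing:
    print("Missing packages:", ", ".join(missing))
else:
    print("Environment looks good.")
```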
If you would like to run the distributed version of IPLD (Section 3.3 of the paper), run the following commands to create a new environment:

```bash
cd IPLD/
conda create -n ipld_dist python=3.10
conda activate ipld_dist
pip install torch --index-url https://download.pytorch.org/whl/cu121
pip install fbgemm-gpu --index-url https://download.pytorch.org/whl/cu121
pip install torchmetrics==1.0.3
pip install torchrec --index-url https://download.pytorch.org/whl/cu121
pip install -r requirements_dist.txt
```

After installing the requirements, you can install `ipld` as a package:
```bash
cd IPLD/
pip install -e .
```

We provide configuration files for training IPLD in `configs/experiments`. An example command to train IPLD on SVHN is:
```bash
python -m scripts.run_ldm --config-name experiments/pldm/pldm_svhn \
    logging.project_name="your_project_name" \
    logging.run_name="your_run_name" \
    logging.use_wandb=true \
    model.num_particles=1
```

To run the distributed version of IPLD for CIFAR-10 and SVHN on 2 GPUs, you can run:
```bash
export CUDA_VISIBLE_DEVICES=1,2
torchrun --nproc_per_node=2 --module scripts.train_ipld_distributed_multi --config-name=experiments/pldm_dist
```

Note that the distributed version only supports the built-in datasets (CIFAR-10 and SVHN).
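Each process launched by `torchrun` receives its rank and world size through environment variables. The helper below is an illustrative sketch of how a worker can read them, not a function from this repository:

```python
import os

def get_dist_info():
    """Read the process-group info that torchrun exports to each worker."""
    return {
        "rank": int(os.environ.get("RANK", 0)),              # global rank of this process
        "local_rank": int(os.environ.get("LOCAL_RANK", 0)),  # GPU index on this node
        "world_size": int(os.environ.get("WORLD_SIZE", 1)),  # total number of processes
    }

info = get_dist_info()
print(f"rank {info['rank']}/{info['world_size']} on local GPU {info['local_rank']}")
```

With `CUDA_VISIBLE_DEVICES=1,2`, local ranks 0 and 1 map onto physical GPUs 1 and 2.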
The configurations for training IPLD on synthetic data are similarly provided in `configs/experiments/`:

```bash
python -m scripts.run_gmm --config-name experiments/pldm/pldm_synth
```

The code is organized into several directories:
- `configs`: configuration files for the different models and datasets.
- `lvm`: implementation of the latent variable models (LVMs).
- `trainer`: trainer classes for the different models.
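The `key=value` arguments passed to the training commands above are Hydra-style dotted overrides: each dotted path addresses one field of the nested config. As a rough illustration of the syntax (not Hydra's actual implementation):

```python
def parse_overrides(pairs):
    """Turn Hydra-style 'a.b.c=value' strings into a nested dict (illustrative only)."""
    config = {}
    for pair in pairs:
        path, _, raw = pair.partition("=")
        # Minimal value handling: booleans and ints, otherwise keep the string.
        if raw in ("true", "false"):
            value = raw == "true"
        elif raw.lstrip("-").isdigit():
            value = int(raw)
        else:
            value = raw.strip('"')
        node = config
        *parents, leaf = path.split(".")
        for key in parents:
            node = node.setdefault(key, {})
        node[leaf] = value
    return config

overrides = parse_overrides(["logging.use_wandb=true", "model.num_particles=1"])
print(overrides)
```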
If you find this work useful, please cite the paper:

```bibtex
@misc{wang2025traininglatentdiffusionmodels,
  title={Training Latent Diffusion Models with Interacting Particle Algorithms},
  author={Tim Y. J. Wang and Juan Kuntz and O. Deniz Akyildiz},
  year={2025},
  eprint={2505.12412},
  archivePrefix={arXiv},
  primaryClass={stat.ML},
  url={https://arxiv.org/abs/2505.12412},
}
```