isomorphicdude/InteractingParticleLDM

Interacting Particle Latent Diffusion (IPLD)

This repository contains the implementation of the paper Training Latent Diffusion Models with Interacting Particle Algorithms.
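As a rough illustration of the general idea behind interacting-particle training of latent-variable models (not the repository's implementation, which targets latent diffusion models), here is a minimal NumPy sketch on a toy one-dimensional model. The model, step sizes, and all variable names here are invented for this example: a cloud of latent particles is moved by a Langevin step while the model parameter follows the gradient averaged over the cloud.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy latent-variable model (NOT the repo's model): z ~ N(0, 1), x | z ~ N(theta + z, 1).
# The marginal maximum-likelihood estimate of theta is the data mean, which the
# particle scheme should approach.
x = rng.normal(loc=2.0, scale=np.sqrt(2.0), size=100)   # data generated with theta = 2
n_particles = 32
z = rng.normal(size=(n_particles, x.size))              # one particle cloud per datum
theta, h = 0.0, 0.05                                    # initial parameter, step size

for _ in range(2000):
    resid = x - theta - z                               # shared term of both gradients
    theta += h * resid.mean()                           # ascent on the particle-averaged log-joint
    z += h * (resid - z) + np.sqrt(2 * h) * rng.normal(size=z.shape)  # Langevin step on particles

print(theta)  # converges toward x.mean()
```

At stationarity the parameter update balances exactly when `theta` equals the data mean, so the sketch recovers the marginal MLE despite never computing the marginal likelihood.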

Installation

We use miniforge to manage the environment.

Single GPU

If you only need the single-GPU version of IPLD, run the following commands to create a new environment:

cd IPLD/
conda create -n ipld python=3.10
conda activate ipld
pip install -r requirements.txt

Alternatively, we provide a Dockerfile for building a Docker image.

Multi GPU

If you would like to run the distributed version of IPLD (Section 3.3 of the paper), run the following commands to create a new environment:

cd IPLD/
conda create -n ipld_dist python=3.10
conda activate ipld_dist

pip install torch --index-url https://download.pytorch.org/whl/cu121
pip install fbgemm-gpu --index-url https://download.pytorch.org/whl/cu121
pip install torchmetrics==1.0.3
pip install torchrec --index-url https://download.pytorch.org/whl/cu121

pip install -r requirements_dist.txt

After installing the requirements, you can install ipld itself as an editable package:

cd IPLD/
pip install -e .

Running the code

Single GPU

We provide configuration files for training IPLD in configs/experiments. An example command to train IPLD on SVHN is:

python -m scripts.run_ldm --config-name experiments/pldm/pldm_svhn \
    logging.project_name="your_project_name" \
    logging.run_name="your_run_name" \
    logging.use_wandb=true \
    model.num_particles=1

Multi GPU

To run the distributed version of IPLD for CIFAR-10 and SVHN on 2 GPUs, you can run:

export CUDA_VISIBLE_DEVICES=1,2
torchrun --nproc_per_node=2 --module scripts.train_ipld_distributed_multi --config-name=experiments/pldm_dist

Note that the distributed version supports only the built-in datasets (CIFAR-10 and SVHN).
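torchrun launches one worker process per GPU and communicates each worker's identity through environment variables; a distributed trainer typically reads them along these lines. This is a generic sketch of the torchrun convention, not the repository's code:

```python
import os

# torchrun exports these variables for every worker it spawns; outside torchrun
# we fall back to single-process defaults so the script also runs standalone.
rank = int(os.environ.get("RANK", "0"))              # global worker index
world_size = int(os.environ.get("WORLD_SIZE", "1"))  # total number of workers
local_rank = int(os.environ.get("LOCAL_RANK", "0"))  # index on this machine, used to pick the GPU

print(f"worker {rank}/{world_size}, local GPU {local_rank}")
```

With `--nproc_per_node=2` as above, the two workers would see ranks 0 and 1, and `CUDA_VISIBLE_DEVICES=1,2` maps their local ranks onto physical GPUs 1 and 2.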

Synthetic Data

The configurations for training IPLD on synthetic data are similarly provided in configs/experiments:

python -m scripts.run_gmm --config-name experiments/pldm/pldm_synth
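The entry point name `run_gmm` suggests a Gaussian-mixture benchmark; for readers unfamiliar with such setups, here is a minimal sketch of generating 2-D Gaussian-mixture toy data. The component means, weights, and scale below are invented for illustration; the repository's actual synthetic benchmark is defined by its config file, not by this function:

```python
import numpy as np

def sample_gmm(n, rng):
    """Draw n points from an equal-weight, 2-component, 2-D Gaussian mixture.
    Illustrative toy data only; not the repo's benchmark definition."""
    means = np.array([[-2.0, 0.0], [2.0, 0.0]])
    comp = rng.integers(0, 2, size=n)               # component label per sample
    return means[comp] + 0.5 * rng.normal(size=(n, 2))

rng = np.random.default_rng(0)
data = sample_gmm(1000, rng)
print(data.shape)  # (1000, 2)
```

A generative model trained on such data can be checked visually, since samples from a well-trained model should reproduce the two well-separated modes.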

Code Structure

The code is organized into several directories:

  • configs: Contains the configuration files for different models and datasets.
  • lvm: Contains the implementation of the latent variable models (LVMs).
  • trainer: Contains the implementation of the trainer classes for different models.
  • scripts: Contains the training entry points used above (run_ldm, run_gmm, train_ipld_distributed_multi).

Citation

@misc{wang2025traininglatentdiffusionmodels,
      title={Training Latent Diffusion Models with Interacting Particle Algorithms},
      author={Tim Y. J. Wang and Juan Kuntz and O. Deniz Akyildiz},
      year={2025},
      eprint={2505.12412},
      archivePrefix={arXiv},
      primaryClass={stat.ML},
      url={https://arxiv.org/abs/2505.12412},
}
