Forward-only Diffusion Probabilistic Models

Uppsala University · Karolinska Institutet

Forward-only Diffusion (FoD)


FoD introduces the mean-reversion term $\mu - x_t$ into both the drift and diffusion functions, enabling the generation of high-quality data samples with a single forward diffusion process.

Summary

This work presents a forward-only diffusion (FoD) approach to generative modelling. In contrast to traditional diffusion models that rely on a coupled forward-backward diffusion scheme, FoD learns data generation directly through a single forward diffusion process, yielding a simple yet efficient generative framework. The core of FoD is a state-dependent linear stochastic differential equation that involves a mean-reverting term in both its drift and diffusion functions. This mean-reversion property guarantees convergence to the clean data, naturally simulating a stochastic interpolation between source and target distributions. More importantly, FoD is analytically tractable and is trained with a simple stochastic flow matching objective, enabling few-step non-Markov chain sampling during inference. Despite its simplicity, the proposed FoD model achieves competitive performance on various image-conditioned (e.g., image restoration) and unconditional generation tasks, demonstrating its effectiveness in generative modelling.

Forward-only Diffusion Process

$\mathrm{d}x_t = \theta_t \, (\mu - x_t) \mathrm{d}t + \sigma_t (x_t - \mu) \mathrm{d}w_t$


where $\mu \sim p_\text{data}$ is the clean data and $x_0 \sim p_\text{prior}$ is the source data. The diffusion volatility increases during the initial steps and then decays to zero as $x_t$ converges to $\mu$.
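To make the dynamics concrete, the snippet below numerically simulates the forward SDE (with $\mu$ known, as during training) using the Euler–Maruyama discretisation. It is a minimal sketch: the constant `theta` and `sigma` values and the step count are illustrative placeholders, not the schedules used in the paper.

```python
import torch

def simulate_fod_forward(x0, mu, num_steps=200, theta=4.0, sigma=0.8):
    """Euler-Maruyama simulation of the FoD forward SDE
        dx_t = theta_t * (mu - x_t) dt + sigma_t * (x_t - mu) dw_t
    over t in [0, 1]. Constant theta/sigma are placeholder schedules.
    """
    dt = 1.0 / num_steps
    x = x0.clone()
    for _ in range(num_steps):
        dw = torch.randn_like(x) * dt ** 0.5  # Brownian increment ~ N(0, dt)
        x = x + theta * (mu - x) * dt + sigma * (x - mu) * dw
    return x  # converges towards mu; the noise term vanishes as x -> mu
```

Because the noise is multiplied by $(x_t - \mu)$, its magnitude shrinks automatically as the state approaches the clean data, which produces the volatility pattern described above.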


Stochastic Flow Matching

The FoD process is analytically tractable and follows a multiplicative stochastic structure. We show that the model can be learned by approximating the vector field from each noisy state to the final clean data, an objective we call stochastic flow matching:


$L_\text{SFM}(\phi)=\mathbb{E}_{\mu,x_t} [ \| (\mu - x_t) - f_\phi(x_t, t) \|^2 ]$
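In code, the objective is a plain regression of the network output onto the vector field $\mu - x_t$. A minimal PyTorch sketch is given below; `f_phi` stands for any network taking $(x_t, t)$, and the sampling of `x_t` from the forward process follows Algorithm 1 (not shown here).

```python
import torch.nn.functional as F

def sfm_loss(f_phi, mu, x_t, t):
    """Stochastic flow matching loss:
        L = E[ || (mu - x_t) - f_phi(x_t, t) ||^2 ].
    `mu` is a batch of clean data; `x_t` is the matching batch of
    forward-process states at times `t` (drawn via the FoD forward SDE).
    """
    target = mu - x_t  # vector field pointing from the state to clean data
    return F.mse_loss(f_phi(x_t, t), target)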

Algorithms

The standard training and sampling procedures (via the Euler–Maruyama method) are provided in Algorithm 1 and Algorithm 2. We also provide fast sampling with Markov and non-Markov chains in Algorithm 3 and Algorithm 4.
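As a rough illustration of Euler–Maruyama sampling, the sketch below substitutes the learned field $f_\phi(x_t, t)$ for the unknown $\mu - x_t$ in the SDE (and, correspondingly, $-f_\phi(x_t, t)$ for $x_t - \mu$ in the noise term). How the learned field enters the discretisation here is an assumption for illustration; the exact update rules are those of Algorithms 2–4 in the paper.

```python
import torch

@torch.no_grad()
def sample_fod(f_phi, x0, num_steps=100, theta=4.0, sigma=0.8):
    """Euler-Maruyama sampling with the learned vector field f_phi(x, t).
    Constant theta/sigma schedules are illustrative placeholders.
    """
    dt = 1.0 / num_steps
    x = x0.clone()
    for i in range(num_steps):
        t = torch.full(x.shape[:1], i * dt, device=x.device)
        field = f_phi(x, t)  # network estimate of (mu - x)
        dw = torch.randn_like(x) * dt ** 0.5
        x = x + theta * field * dt - sigma * field * dw  # -field ~ (x - mu)
    return x
```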

Learning Mean-reverting ODEs for Unconditional Generation

We consider a deterministic version of FoD, obtained by omitting the diffusion term, i.e., setting $\sigma_t = 0$ for all $t$. The result is a mean-reverting ODE that transports the source data to the target data without multiplicative noise injection:


$\mathrm{d}x_t = \theta_t \, (\mu - x_t) \mathrm{d}{t}$


with solution


$x_t = \bigl(x_s - \mu \bigr) \, e^{-\int_{s}^t \theta_z \mathrm{d}{z}} + \mu.$
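For completeness, this closed form can be checked with the standard integrating-factor argument: differentiating $(x_t - \mu)\, e^{\int_s^t \theta_z \mathrm{d}z}$ along the ODE gives

$\frac{\mathrm{d}}{\mathrm{d}t}\Bigl[(x_t - \mu)\, e^{\int_s^t \theta_z \mathrm{d}z}\Bigr] = \bigl(\theta_t (\mu - x_t) + \theta_t (x_t - \mu)\bigr)\, e^{\int_s^t \theta_z \mathrm{d}z} = 0,$

so $(x_t - \mu)\, e^{\int_s^t \theta_z \mathrm{d}z}$ is constant in $t$, and evaluating at $t = s$ recovers the solution above. Whenever the cumulative $\int_s^t \theta_z \mathrm{d}z$ grows without bound, the exponential factor decays and $x_t \to \mu$.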


Note: Our primary FoD model can be regarded as a stochastic extension of flow matching models.

Results

Image Restoration



Unconditional Image Generation

Fast Sampling with Markov and non-Markov Chains

Thanks for your interest!

BibTeX

If our code helps your research or work, please consider citing our paper. The BibTeX reference is:

@article{luo2025forward,
  title={Forward-only Diffusion Probabilistic Models},
  author={Luo, Ziwei and Gustafsson, Fredrik K and Sj{\"o}lund, Jens and Sch{\"o}n, Thomas B},
  journal={arXiv preprint arXiv:xxx},
  year={2025}
}