Rethinking Diffusion Models with Symmetries through Canonicalization with Applications to Molecular Graph Generation

2026-02-16

Machine Learning · Artificial Intelligence
AI summary

The authors explore a new way to generate chemical and scientific data that respects symmetries such as rotations and permutations. Instead of building special models that always follow these symmetries, they first transform each sample into a standard (canonical) form, train simpler unconstrained models on it, and finally apply a random symmetry transformation at generation time to recover the original invariant distribution. They prove this approach is more expressive and trains faster, and show that it works well for generating 3D molecular structures, outperforming existing symmetry-aware methods on a drug-like molecule dataset.
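The three-step recipe in the summary (canonicalize, model the canonical slice, randomize at sampling time) can be illustrated on a toy permutation symmetry. This is a hedged sketch, not the paper's implementation: it uses sorting as the canonical map for sets of numbers and a uniformly random permutation as the generation-time symmetry transform.

```python
import numpy as np

rng = np.random.default_rng(0)

def canonicalize(x):
    # Map each set-valued sample to its orbit representative under S_n:
    # sorting selects one canonical ordering per permutation orbit.
    return np.sort(x)

def random_symmetry(x, rng):
    # Recover the permutation-invariant distribution by applying a
    # uniformly random group element (here: a random permutation).
    return rng.permutation(x)

# Toy "dataset" of unordered triples; the canonical slice is the sorted version.
data = rng.normal(size=(1000, 3))
canonical = np.sort(data, axis=1)  # an unconstrained model would be trained on this

# At sampling time: draw from the canonical model (here, just a canonical
# sample), then randomize the order to restore invariance.
sample = canonical[0]
invariant_sample = random_symmetry(sample, rng)

# The randomized sample lies in the same orbit as the canonical one.
assert np.allclose(np.sort(invariant_sample), sample)
```

The same pattern applies to continuous symmetries: replace sorting with a pose-fixing map and the random permutation with a random rotation.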

group symmetry, equivariance, invariance, diffusion models, canonicalization, generative modeling, molecular graph generation, SE(3) symmetry, optimal transport, GEOM-DRUG dataset
Authors
Cai Zhou, Zijie Chen, Zian Li, Jike Wang, Kaiyi Jiang, Pan Li, Rose Yu, Muhan Zhang, Stephen Bates, Tommi Jaakkola
Abstract
Many generative tasks in chemistry and science involve distributions invariant to group symmetries (e.g., permutation and rotation). A common strategy enforces invariance and equivariance through architectural constraints such as equivariant denoisers and invariant priors. In this paper, we challenge this tradition through the alternative canonicalization perspective: first map each sample to an orbit representative with a canonical pose or order, train an unconstrained (non-equivariant) diffusion or flow model on the canonical slice, and finally recover the invariant distribution by sampling a random symmetry transform at generation time. Building on a formal quotient-space perspective, our work provides a comprehensive theory of canonical diffusion by proving: (i) the correctness, universality, and superior expressivity of canonical generative models over invariant targets; (ii) that canonicalization accelerates training by removing diffusion score complexity induced by group mixtures and reducing conditional variance in flow matching. We then show that aligned priors and optimal transport complement canonicalization and further improve training efficiency. We instantiate the framework for molecular graph generation under $S_n \times SE(3)$ symmetries. By leveraging geometric spectra-based canonicalization and mild positional encodings, canonical diffusion significantly outperforms equivariant baselines in 3D molecule generation tasks, at similar or even lower computational cost. Moreover, with a novel architecture, Canon, CanonFlow achieves state-of-the-art performance on the challenging GEOM-DRUG dataset, and the advantage remains large in few-step generation.
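For the $SE(3)$ part of the symmetry group, one common pose-fixing construction (assumed here for illustration; the paper's geometric spectra-based canonicalization may differ in detail) centers the point cloud to remove translations and rotates it into the eigenbasis of its covariance matrix. At generation time, a random rotation restores rotational invariance.

```python
import numpy as np

def canonical_pose(coords):
    # Center to remove translations, then rotate into the eigenbasis of the
    # covariance (its "geometric spectrum") to fix a canonical orientation.
    centered = coords - coords.mean(axis=0)
    cov = centered.T @ centered
    _, vecs = np.linalg.eigh(cov)    # columns: eigenvectors, ascending eigenvalues
    if np.linalg.det(vecs) < 0:      # keep a proper rotation (det = +1)
        vecs[:, 0] *= -1
    return centered @ vecs

def random_rotation(rng):
    # Random rotation via QR decomposition of a Gaussian matrix
    # (a standard construction; sign fixes keep det = +1).
    q, r = np.linalg.qr(rng.normal(size=(3, 3)))
    q = q * np.sign(np.diag(r))
    if np.linalg.det(q) < 0:
        q[:, 0] *= -1
    return q

rng = np.random.default_rng(0)
mol = rng.normal(size=(12, 3))            # toy 3D conformer
canon = canonical_pose(mol)               # the unconstrained model sees this slice
sample = canon @ random_rotation(rng).T   # restore SE(3)-invariance at generation
```

Note that eigenvector sign ambiguities mean this map fixes orientation only up to axis flips; a complete canonicalization would also resolve those signs, e.g., by a deterministic tie-breaking rule.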