Parallel Scan Recurrent Neural Quantum States for Scalable Variational Monte Carlo
2026-05-13 • Machine Learning
AI summary
The authors show that recurrent neural networks (RNNs), often considered too sequential to scale for simulating quantum systems, can in fact be made efficient and scalable. They introduce parallel scan recurrent neural quantum states (PSR-NQS), a method that runs simulations quickly on quantum spin lattices, reaching two-dimensional systems as large as 52×52. Their results agree with established quantum Monte Carlo data, demonstrating that RNNs are practical for studying complex quantum systems with modest computational resources. This challenges the common belief that only highly parallel models such as transformers are suited to the task.
Neural-network quantum states · Recurrent neural networks · Variational Monte Carlo · Autoregressive models · Parallel scan · Spin lattices · Quantum many-body systems · Variational ansatz · Transformer architectures
Authors
Ejaaz Merali, Mohamed Hibat-Allah, Mohammad Kohandel, Richard T. Scalettar, Ehsan Khatami
Abstract
Neural-network quantum states have emerged as a powerful variational framework for quantum many-body systems, with recent progress often driven by massively parallel architectures such as transformers. Recurrent neural network quantum states, however, are frequently regarded as intrinsically sequential and therefore less scalable. Here we revisit this view by showing that modern recurrent architectures can support fast, accurate, and computationally accessible neural quantum state simulations. Using autoregressive recurrent wave functions together with recent advances in parallelizable recurrence, we develop variational ansätze, called parallel scan recurrent neural quantum states (PSR-NQS), which can be trained efficiently within variational Monte Carlo in one and two spatial dimensions. We demonstrate accurate benchmark results and show that, with iterative retraining, our approach reaches two-dimensional spin lattices as large as $52\times52$ while remaining in agreement with available quantum Monte Carlo data. Our results establish recurrent architectures as a practical and promising route toward scalable neural quantum state simulations with modest computational resources.
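The abstract does not spell out how the parallel scan works, but the general idea behind parallelizable recurrence can be sketched: a linear recurrence $h_t = a_t h_{t-1} + b_t$ is equivalent to composing affine maps $h \mapsto A h + B$, and map composition is associative, so the whole sequence can be combined in $O(\log T)$ parallel rounds instead of $T$ sequential steps. The sketch below is illustrative only, using a scalar recurrence and a serial emulation of the scan; the actual PSR-NQS ansatz operates on vector hidden states, and all function names here are hypothetical.

```python
import numpy as np

def sequential_recurrence(a, b, h0=0.0):
    """h_t = a_t * h_{t-1} + b_t, computed one step at a time (O(T) depth)."""
    h, out = h0, []
    for at, bt in zip(a, b):
        h = at * h + bt
        out.append(h)
    return np.array(out)

def parallel_scan_recurrence(a, b, h0=0.0):
    """Same recurrence via a Hillis–Steele associative scan.

    Each element carries an affine map h -> A*h + B.  Composing two maps,
    (A2, B2) after (A1, B1) = (A2*A1, A2*B1 + B2), is associative, so the
    scan needs only O(log T) rounds; every combine within a round is
    independent and could run in parallel on a GPU.  Here the rounds are
    emulated serially with vectorized NumPy updates.
    """
    A = np.asarray(a, dtype=float).copy()
    B = np.asarray(b, dtype=float).copy()
    T, shift = len(A), 1
    while shift < T:
        A_old, B_old = A.copy(), B.copy()
        # Fold the map at position t-shift into the map at position t.
        B[shift:] = A_old[shift:] * B_old[:-shift] + B_old[shift:]
        A[shift:] = A_old[shift:] * A_old[:-shift]
        shift *= 2
    # Applying the accumulated map at each position to h0 yields h_t.
    return A * h0 + B
```

Both functions return identical trajectories; the payoff of the scan form is that its per-round work parallelizes across the sequence, which is the property that lets recurrent wave functions compete with transformers on throughput.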