Scaling Up Machine Learning For Quantum Field Theory with Equivariant
Continuous Flows
P. de Haan, C. Rainone, M. Cheng, and R. Bondesan (2021). arXiv:2110.02673. Comment: 8 pages, 5 figures. Fourth Workshop on Machine Learning and the Physical Sciences (NeurIPS 2021).
Abstract
We propose a continuous normalizing flow for sampling from the
high-dimensional probability distributions of Quantum Field Theories in
Physics. In contrast to the deep architectures used so far for this task, our
proposal is based on a shallow design and incorporates the symmetries of the
problem. We test our model on the $\phi^4$ theory, showing that it
systematically outperforms a realNVP baseline in sampling efficiency, with the
difference between the two increasing for larger lattices. On the largest
lattice we consider, of size $32\times 32$, we improve a key metric, the
effective sample size, from 1% to 66% w.r.t. the realNVP baseline.
%0 Generic
%1 dehaan2021scaling
%A de Haan, Pim
%A Rainone, Corrado
%A Cheng, Miranda C. N.
%A Bondesan, Roberto
%D 2021
%K hep-th machine_learning phd qft
%T Scaling Up Machine Learning For Quantum Field Theory with Equivariant
Continuous Flows
%U http://arxiv.org/abs/2110.02673
%X We propose a continuous normalizing flow for sampling from the
high-dimensional probability distributions of Quantum Field Theories in
Physics. In contrast to the deep architectures used so far for this task, our
proposal is based on a shallow design and incorporates the symmetries of the
problem. We test our model on the $\phi^4$ theory, showing that it
systematically outperforms a realNVP baseline in sampling efficiency, with the
difference between the two increasing for larger lattices. On the largest
lattice we consider, of size $32\times 32$, we improve a key metric, the
effective sample size, from 1% to 66% w.r.t. the realNVP baseline.
@misc{dehaan2021scaling,
abstract = {We propose a continuous normalizing flow for sampling from the
high-dimensional probability distributions of Quantum Field Theories in
Physics. In contrast to the deep architectures used so far for this task, our
proposal is based on a shallow design and incorporates the symmetries of the
problem. We test our model on the $\phi^4$ theory, showing that it
systematically outperforms a realNVP baseline in sampling efficiency, with the
difference between the two increasing for larger lattices. On the largest
lattice we consider, of size $32\times 32$, we improve a key metric, the
effective sample size, from 1% to 66% w.r.t. the realNVP baseline.},
added-at = {2022-12-31T14:40:52.000+0100},
author = {de Haan, Pim and Rainone, Corrado and Cheng, Miranda C. N. and Bondesan, Roberto},
biburl = {https://www.bibsonomy.org/bibtex/254dbc87005d38ce66032b3ae1b77419c/intfxdx},
description = {Scaling Up Machine Learning For Quantum Field Theory with Equivariant Continuous Flows},
interhash = {727db336f3b1a691f099227946d5c864},
intrahash = {54dbc87005d38ce66032b3ae1b77419c},
keywords = {hep-th machine_learning phd qft},
note = {arXiv:2110.02673. Comment: 8 pages, 5 figures. Fourth Workshop on Machine Learning and the Physical Sciences (NeurIPS 2021)},
timestamp = {2022-12-31T14:40:52.000+0100},
title = {Scaling Up Machine Learning For Quantum Field Theory with Equivariant
Continuous Flows},
url = {http://arxiv.org/abs/2110.02673},
year = 2021
}