
Statistical Predictions in String Theory and Deep Generative Models

Bibliographic Details
Published in: Fortschritte der Physik, 2020-05, Vol. 68 (5), p. n/a
Main Authors: Halverson, James; Long, Cody
Format: Article
Language:English
Summary: Generative models in deep learning allow for sampling probability distributions that approximate data distributions. We propose using generative models for making approximate statistical predictions in the string theory landscape. For vacua admitting a Lagrangian description, this can be thought of as learning random tensor approximations of couplings. As a concrete proof-of-principle, we demonstrate in a large ensemble of Calabi-Yau manifolds that Kähler metrics evaluated at points in Kähler moduli space are well-approximated by ensembles of matrices produced by a deep convolutional Wasserstein GAN. Accurate approximations of the Kähler metric eigenspectra are achieved with far fewer than h11 Gaussian draws. Accurate extrapolation to values of h11 outside the training set is achieved via a conditional GAN. Together, these results implicitly suggest the existence of strong correlations in the data, as might be expected if Reid's fantasy is correct.
ISSN: 0015-8208; 1521-3978
DOI: 10.1002/prop.202000005
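
Note: The summary describes training a Wasserstein GAN to produce matrix ensembles that approximate Kähler metrics. The record contains no code; the sketch below is a minimal, illustrative Wasserstein GAN in PyTorch whose generator emits symmetric h11 x h11 matrices. It uses weight clipping and fully connected layers for brevity (the paper reports a deep convolutional architecture), and the dimensions, hyperparameters, and placeholder "real" data are assumptions, not the authors' setup.

# Minimal sketch (not the authors' code): a Wasserstein GAN whose generator
# emits symmetric h11 x h11 matrices, standing in for Kaehler-metric samples.
# Sizes, architecture, and training data here are illustrative assumptions.
import torch
import torch.nn as nn

H11 = 10      # assumed number of Kaehler moduli for this sketch
LATENT = 16   # assumed latent dimension

class Generator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(LATENT, 128), nn.ReLU(),
            nn.Linear(128, H11 * H11),
        )
    def forward(self, z):
        m = self.net(z).view(-1, H11, H11)
        return 0.5 * (m + m.transpose(1, 2))  # symmetrize, as a metric should be

class Critic(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(H11 * H11, 128), nn.ReLU(),
            nn.Linear(128, 1),
        )
    def forward(self, x):
        return self.net(x.view(x.size(0), -1))

G, D = Generator(), Critic()
opt_g = torch.optim.RMSprop(G.parameters(), lr=5e-5)
opt_d = torch.optim.RMSprop(D.parameters(), lr=5e-5)

def real_batch(n):
    # Placeholder for real Kaehler-metric samples; random SPD matrices here.
    a = torch.randn(n, H11, H11)
    return a @ a.transpose(1, 2) / H11

for step in range(200):
    # Critic updates with weight clipping (original WGAN formulation).
    for _ in range(5):
        x = real_batch(64)
        z = torch.randn(64, LATENT)
        loss_d = D(G(z).detach()).mean() - D(x).mean()
        opt_d.zero_grad(); loss_d.backward(); opt_d.step()
        for p in D.parameters():
            p.data.clamp_(-0.01, 0.01)
    # Generator update: maximize the critic score on generated matrices.
    z = torch.randn(64, LATENT)
    loss_g = -D(G(z)).mean()
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()

# After training, eigenspectra of generated matrices can be compared to the data,
# in the spirit of the eigenspectrum comparison mentioned in the summary.
with torch.no_grad():
    eigs = torch.linalg.eigvalsh(G(torch.randn(256, LATENT)))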