Here are some research projects that I am excited about. For more details, see my publications and my GitHub page.
Artificial Intelligence
Global Optimization and Neural Network Surrogate Models
String theory is the best-understood candidate theory of quantum gravity. However, it has an astronomical number of solutions, each describing a different mathematically consistent universe. How, then, do we find the solution that describes our universe?
Solutions of string theory with desirable properties can be found using global optimization algorithms. These algorithms can be accelerated further with surrogate models: instead of performing a computationally expensive exact calculation at every step, one uses a machine learning model to predict the quantity being optimized. We apply such algorithms to search the vast space of Calabi-Yau manifolds and the corresponding solutions of string theory.
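To make the idea concrete, here is a minimal sketch of such a surrogate-accelerated search loop (a toy illustration, not our actual pipeline; expensive_objective is a hypothetical stand-in for an expensive string-theory computation):

import numpy as np
from sklearn.neural_network import MLPRegressor

# Hypothetical stand-in for a computationally expensive exact calculation.
def expensive_objective(x):
    return np.sum((x - 0.3) ** 2)

rng = np.random.default_rng(0)
dim = 5

# Start with a small set of exactly evaluated points.
X = rng.uniform(-1, 1, size=(20, dim))
y = np.array([expensive_objective(x) for x in X])

for _ in range(10):
    # Fit a neural-network surrogate to all exact evaluations so far.
    surrogate = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000)
    surrogate.fit(X, y)

    # Propose many candidates and rank them with the cheap surrogate...
    candidates = rng.uniform(-1, 1, size=(1000, dim))
    preds = surrogate.predict(candidates)

    # ...then spend the expensive evaluations only on the most promising few.
    best = candidates[np.argsort(preds)[:5]]
    X = np.vstack([X, best])
    y = np.concatenate([y, [expensive_objective(x) for x in best]])

print("Best value found:", y.min())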
Image from: https://www.forbes.com/
Image from: https://scienceexchange.caltech.edu/
Neural Networks and Field Theory
The field of artificial intelligence has seen remarkable progress in the past decade, much of which was powered by deep neural networks. However, a complete understanding of how neural networks work remains elusive. One way to make progress in this direction is to view neural networks as random functions: given an input, they return an output that is a random variable, as both the initialization and training of neural networks involve randomness. One can then determine how this random variable depends on the hyperparameters of the network to gain valuable insights. For example, one can show that the output of a feed-forward neural network becomes more chaotic as the network gets deeper.
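As a toy illustration of this viewpoint (a sketch under simplifying assumptions, not a result from a paper), the following experiment samples many random initializations of a deep tanh network and shows that the outputs for two nearby inputs decorrelate as the network gets deeper:

import numpy as np

rng = np.random.default_rng(0)
width, depth, n_nets = 256, 20, 50

# Two nearby inputs; we track how similar their outputs remain.
x1 = rng.normal(size=width)
x2 = x1 + 0.01 * rng.normal(size=width)

def forward(x, layers):
    for W in layers:
        x = np.tanh(W @ x)
    return x

def cosine(u, v):
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

# Average over random initializations: the output is a random variable.
sims = []
for _ in range(n_nets):
    # Weight std 2/sqrt(width) puts a tanh network in its chaotic phase.
    layers = [rng.normal(scale=2 / np.sqrt(width), size=(width, width))
              for _ in range(depth)]
    sims.append(cosine(forward(x1, layers), forward(x2, layers)))

print(f"input similarity:  {cosine(x1, x2):.4f}")   # essentially 1
print(f"output similarity: {np.mean(sims):.4f}")    # far below 1

Increasing the depth (or the weight variance) drives the output similarity down further, while a smaller weight variance puts the network in an ordered phase where nearby inputs stay correlated.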
Statistical/Quantum field theory is a (wildly successful) framework that physicists use to describe complex systems. At its core, a field theory describes the behavior of random fields, i.e. random functions.
This is a duality! We have two descriptions (neural networks and field theory) of the same complex system (a distribution over a space of random functions). We have started constructing a dictionary between these two descriptions, but there is much work left to be done. One tantalizing question is: can one engineer a neural network that is equivalent to the Standard Model of particle physics, i.e. a neural network that describes our universe?
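One entry in this dictionary is classical (Neal, 1996) and worth stating explicitly. As a sketch, for a single hidden layer of width N with i.i.d. standard-normal weights and biases,

f(x) = \frac{1}{\sqrt{N}} \sum_{i=1}^{N} a_i \, \sigma(w_i \cdot x + b_i)
\;\xrightarrow{\;N \to \infty\;}\;
f \sim \mathcal{GP}\big(0, K\big),
\qquad
K(x, x') = \mathbb{E}\big[\sigma(w \cdot x + b)\, \sigma(w \cdot x' + b)\big],

so the infinite-width network is a Gaussian process: the analogue of a free field theory with propagator K, with finite-width corrections playing the role of interactions.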
String Theory
Calabi-Yau Compactifications
Superstring theory requires 10 spacetime dimensions (M-theory requires 11), while we observe only 4 (3 space + 1 time). One way to reconcile this apparent discrepancy is to consider solutions of the theory in which 6 of the dimensions are curled up into an unobservably small compact space, so that spacetime consists of the 4 large dimensions we observe plus a 6-dimensional compact space. The laws of physics we observe in the 4 large dimensions depend on the topology, shape, and size of this 6-dimensional space.
A class of solutions that is particularly well understood and well suited for describing our universe consists of those where this 6-dimensional space is a "Calabi-Yau manifold". The image depicts a 2D projection of one such space (this is not an artist's rendition!).
Historically, it has only been possible to study Calabi-Yau manifolds (and the associated solutions of string theory) with simple topologies. Over the years, we've improved the computational efficiency of several key calculations by many orders of magnitude and can now handle much more complex topologies.
One general method that has been fruitful is to generate large datasets of Calabi-Yau manifolds, allowing us to extract insights from data, train machine learning algorithms to further accelerate calculations, and find solutions with desirable properties.
Image from: Geoffrey Fatin
CYTools
The computational advances described above culminated in CYTools, an open-source software package created by Andres Rios-Tascon and me. With CYTools, generating a Calabi-Yau manifold is as easy as:
from cytools import Polytope
# vertices of a 4D reflexive polytope (here, the quintic's polytope)
vertices = [[1,0,0,0],[0,1,0,0],[0,0,1,0],[0,0,0,1],[-1,-1,-1,-1]]
poly = Polytope(vertices)
tri = poly.triangulate()
cy = tri.get_cy()
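For larger-scale exploration, such as generating the datasets mentioned above, CYTools can also stream polytopes from the Kreuzer-Skarke database. Here is a small sketch, assuming the fetch_polytopes interface from the CYTools documentation:

from cytools import fetch_polytopes

# Stream 4D reflexive polytopes with h11 = 7 and compute the Hodge
# numbers of the corresponding Calabi-Yau hypersurfaces.
for poly in fetch_polytopes(h11=7, lattice="N", limit=10):
    cy = poly.triangulate().get_cy()
    print(cy.h11(), cy.h21())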