Professor of Mathematics and Statistics
Department of Statistics
Stanford University
390 Jane Stanford Way
Stanford, CA 94305, USA
I am interested in probability theory, statistics, and mathematical physics. The following are some of my favorite works. For the complete list of publications and preprints, click here or visit my page on Google Scholar.
The confinement of quarks is one of the enduring mysteries of modern physics. This article shows that if a pure lattice gauge theory has a mass gap in a strong sense, and the gauge group has a nontrivial center, then the theory confines quarks. This is the first rigorous result connecting mass gap with confinement, and also the first to give a rigorous justification for the role of center symmetry.
Average Gromov hyperbolicity and the Parisi ansatz. To appear in Adv. Math.
(Coauthored with Leila Sloman.)
Gromov hyperbolicity of a metric space measures the distance of the space from a perfect tree-like structure. The measure has a worst-case aspect to it, in the sense that it detects the region of the space that deviates the most from tree-like structure. This article introduces an "average-case" version of Gromov hyperbolicity, which detects whether most of the space, with respect to a given probability measure, looks like a tree. The main result of the paper is that if this average hyperbolicity is small, then the space can be approximately embedded in a tree. The result is applied to construct hierarchically organized pure states in any model of a spin glass that satisfies the Parisi ultrametricity ansatz. Previously, such constructions were available only in specific models.
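For readers who want to experiment, here is a minimal Python sketch of the classical, worst-case notion that the paper averages out: the four-point hyperbolicity constant of a finite metric space (the paper's average-case definition itself is not reproduced here). A tree metric has constant zero, while a cycle does not.

```python
from itertools import combinations

def four_point_delta(d):
    """Worst-case Gromov hyperbolicity of a finite metric space, via the
    four-point condition: for each quadruple of points, take half the gap
    between the two largest of the three pairings of distances, and
    maximize over quadruples. Tree metrics give exactly 0."""
    n = len(d)
    delta = 0.0
    for x, y, z, w in combinations(range(n), 4):
        sums = sorted([d[x][y] + d[z][w],
                       d[x][z] + d[y][w],
                       d[x][w] + d[y][z]])
        delta = max(delta, (sums[2] - sums[1]) / 2)
    return delta

# A star tree (center plus 3 leaves, path metric) is perfectly tree-like ...
star = [[0, 1, 1, 1],
        [1, 0, 2, 2],
        [1, 2, 0, 2],
        [1, 2, 2, 0]]
# ... while the 4-cycle with unit edges is not.
cycle = [[0, 1, 2, 1],
         [1, 0, 1, 2],
         [2, 1, 0, 1],
         [1, 2, 1, 0]]
print(four_point_delta(star), four_point_delta(cycle))  # 0.0 1.0
```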
A new coefficient of correlation. To appear in J. Amer. Statist. Assoc.
Is it possible to define a coefficient of correlation which is (a) as simple as the classical coefficients like Pearson's correlation or Spearman's correlation, and yet (b) consistently estimates some simple and interpretable measure of the degree of dependence between the variables, which is 0 if and only if the variables are independent and 1 if and only if one is a measurable function of the other, and (c) has a simple asymptotic theory under the hypothesis of independence, like the classical coefficients? This article answers this question in the affirmative, by producing such a coefficient. No assumptions are needed on the distributions of the variables. There are several coefficients in the literature that converge to 0 if and only if the variables are independent, but none that satisfy any of the other properties mentioned above.
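A minimal NumPy sketch of the coefficient in the no-ties case: sort the pairs by the first variable, rank the second variable in that order, and penalize large jumps between consecutive ranks. (The sample sizes and distributions below are illustrative, not from the paper; note also that even for perfectly monotone data the finite-sample value is 1 - 3/(n+1), approaching 1 only as n grows.)

```python
import numpy as np

def xi_correlation(x, y):
    """The coefficient in the no-ties case: 1 - 3 * sum |r_{i+1} - r_i| / (n^2 - 1),
    where r_1, ..., r_n are the ranks of y after sorting the pairs by x."""
    n = len(x)
    order = np.argsort(x)                           # indices that sort x
    ranks = np.argsort(np.argsort(y[order])) + 1    # ranks of y, in x-sorted order
    return 1 - 3 * np.abs(np.diff(ranks)).sum() / (n**2 - 1)

rng = np.random.default_rng(0)
x = rng.normal(size=2000)
xi_fn = xi_correlation(x, x**2)                  # y a (non-monotone) function of x
xi_ind = xi_correlation(x, rng.normal(size=2000))  # y independent of x
print(xi_fn, xi_ind)                             # near 1, near 0
```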
Rigorous solution of strongly coupled SO(N) lattice gauge theory in the large N limit. Comm. Math. Phys. 2019.
Gauge-string duality is a general notion in physics which claims that gauge theories — which are theories of the quantum world — are sometimes dual to certain string theories, which are theories of gravity. The main result of this paper is a rigorous computation of Wilson loop expectations in strongly coupled SO(N) lattice gauge theory in the large N limit, in any dimension. The formula appears as an absolutely convergent sum over trajectories in a kind of string theory on the lattice, demonstrating an explicit gauge-string duality. This is the first rigorous formulation and proof of a gauge-string duality in large N lattice gauge theories.
The sample size required in importance sampling. Ann. Appl. Probab. 2018.
(Coauthored with Persi Diaconis.)
The goal of importance sampling is to estimate the expected value of a given function with respect to a probability measure Q using a random sample of size n drawn from a different probability measure P. If the two measures are nearly singular with respect to each other, which is often the case in practice, the sample size required for accurate estimation is large. The main result of this article shows that in a fairly general setting, a sample of size approximately exp(D) is necessary and sufficient for accurate estimation by importance sampling, where D is the Kullback-Leibler divergence of P from Q. In particular, the required sample size exhibits a kind of cut-off in the logarithmic scale. Although importance sampling is a very old and popular method, this exact sample size requirement was not known prior to this work.
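A toy numerical illustration (the Gaussian setup and all numbers are hypothetical, chosen only to make the divergence computable in closed form): with target Q = N(m, 1) and proposal P = N(0, 1), the relevant divergence is m²/2, so for m = 2 the threshold exp(D) is only about 7.4, and a sample size far above it gives an accurate estimate.

```python
import numpy as np

m = 2.0
kl = m**2 / 2        # KL divergence between N(m, 1) and N(0, 1)
n = 100_000          # n >> exp(kl) ~ 7.4, so estimation should succeed

rng = np.random.default_rng(0)
x = rng.normal(size=n)            # draws from the proposal P = N(0, 1)
w = np.exp(m * x - m**2 / 2)      # importance weights dQ/dP
est = np.mean(w * (x > m))        # estimates Q(X > m), which equals 1/2
print(kl, est)
```

Shrinking n toward exp(kl) (or increasing m, which grows kl quadratically) makes the same estimator wildly unstable, which is the cut-off phenomenon the paper quantifies.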
Nonlinear large deviations. Adv. Math. 2016.
(Coauthored with Amir Dembo.)
This is the first paper to compute large deviations for sparse random graphs. The techniques for dense graphs, based on Szemerédi's regularity lemma, are not applicable in the sparse regime. The technology developed here applies more broadly to large deviations for nonlinear functions of independent random variables, going beyond classical methods which cater mostly to linear functions.
Matrix estimation by Universal Singular Value Thresholding. Ann. Statist. 2015.
Consider the problem of estimating the entries of a large matrix, when the observed entries are noisy versions of a small random fraction of the original entries. This problem has received widespread attention in recent times. This paper introduces a simple estimation procedure, called Universal Singular Value Thresholding (USVT), that works for any matrix that has "a little bit of structure". Surprisingly, this simple estimator achieves the minimax optimal error rate up to a constant factor. Numerous examples are worked out.
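A minimal sketch of the USVT recipe: fill unobserved entries with zeros, threshold the singular values at a universal level of order sqrt(n * p), rescale by the estimated sampling fraction, and clip to the known entry range. The rank-2 example and sampling rate below are illustrative; the constant 2.02 and the [-1, 1] range are the choices commonly quoted from the paper's setting.

```python
import numpy as np

def usvt(y, mask, eta=0.02):
    """y: matrix with unobserved entries set to 0; mask: boolean array
    marking the observed entries. Entries assumed to lie in [-1, 1]."""
    n = max(y.shape)
    p_hat = mask.mean()                            # estimated sampling fraction
    u, s, vt = np.linalg.svd(y, full_matrices=False)
    keep = s >= (2 + eta) * np.sqrt(n * p_hat)     # universal threshold
    w = (u[:, keep] * s[keep]) @ vt[keep]          # keep only large singular values
    return np.clip(w / p_hat, -1.0, 1.0)           # rescale and clip

rng = np.random.default_rng(0)
n = 400
f = rng.choice([-1.0, 1.0], size=(n, 2)) / np.sqrt(2)
m = f @ f.T                                        # rank-2 target, entries in [-1, 1]
mask = rng.random((n, n)) < 0.5                    # observe about half the entries
mhat = usvt(np.where(mask, m, 0.0), mask)
mse = np.mean((mhat - m)**2)
print(mse)                                         # small reconstruction error
```

For comparison, estimating every entry by 0 would give a mean squared error of about 0.5 here, so the thresholded estimate recovers most of the structure.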
Invariant measures and the soliton resolution conjecture. Comm. Pure Appl. Math. 2014.
The soliton resolution conjecture for the focusing nonlinear Schrödinger (NLS) equation is the vaguely worded claim that a global solution of the NLS equation, for generic initial data, will eventually resolve into a radiation component that disperses like a linear solution, plus a localized component that behaves like a soliton or multi-soliton solution. This paper gives a tentative formulation and proof of the soliton resolution conjecture for the discrete NLS equation.
Superconcentration and Related Topics. Springer, Cham. 2014.
This monograph studies three features of Gaussian random fields, called superconcentration, chaos, and multiple valleys, and explores the relations between them. It is shown that superconcentration is equivalent to chaos, and chaos implies multiple valleys. Superconcentration has been a known feature in probability theory for a while (under different names). This book is the first to connect it to chaos and multiple valleys. Two main achievements of the book are proofs of the disorder chaos and multiple valley conjectures for mean-field spin glasses, by first proving superconcentration using a novel spectral approach, and then using the equivalence theorems.
It is a longstanding conjecture that in the model of first-passage percolation on a lattice, two important numbers, known as the fluctuation exponent and the wandering exponent, are related through a universal relation that does not depend on the dimension. This is sometimes called the KPZ relation. This paper gives a rigorous proof of the KPZ relation assuming that the exponents exist in a certain sense.
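For reference, the relation in question, in the notation standard in the literature (the symbols below are not fixed in the summary above): writing χ for the fluctuation exponent and ξ for the wandering exponent,

```latex
\[ \chi = 2\xi - 1 . \]
```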
The missing log in large deviations for triangle counts. Random Structures Algorithms. 2012.
This paper solves a well-known conjecture in random graph theory about the upper tail probabilities for the number of triangles in a random graph.
A new approach to strong embeddings. Probab. Theory Related Fields. 2012.
The KMT embedding theorem is a fundamental result of probability theory. It is also a famously hard result with an exceptionally complex proof. This paper gives a new and arguably simpler proof of the KMT embedding theorem for simple random walks.
The large deviation principle for the Erdős-Rényi random graph. European J. Combin. 2011.
(Coauthored with S. R. S. Varadhan.)
This paper gives the first development of a theory of large deviations for random graphs, using Szemerédi's regularity lemma and the theory of graph limits.
Gravitational allocation to Poisson points. Ann. Math. 2010.
Phase Transitions in Gravitational Allocation. Geom. Funct. Anal. 2010.
(Both papers coauthored with Ron Peled, Yuval Peres and Dan Romik.)
These two papers introduce and analyze a novel technique for allocating regions in space to the points of a Poisson point process, such that the regions have equal volume, are connected, and collectively form a partition of the whole space. Previous constructions of fair allocations had the undesirable feature of being unwieldy in various senses.
A new method of normal approximation. Ann. Probab. 2008.
Fluctuations of eigenvalues and second order Poincaré inequalities. Probab. Theory Related Fields. 2009.
These two papers introduce a new approach to proving central limit theorems for general nonlinear functions of independent random variables, and a new class of probabilistic inequalities known as second order Poincaré inequalities.
A generalization of the Lindeberg principle. Ann. Probab. 2006.
This paper generalizes Lindeberg's proof of the central limit theorem to an invariance principle for arbitrary smooth functions of independent and weakly dependent random variables. The result is then used to prove universality of limiting spectral distributions of certain kinds of random matrices. This was the first application of Lindeberg's method in random matrix theory, which has now developed into a standard approach.
Concentration inequalities with exchangeable pairs. PhD thesis, Stanford University. 2005.
Stein's method for concentration inequalities. Probab. Theory Related Fields. 2007.
The PhD thesis introduces the first version of Stein's method for proving concentration inequalities. Previously, Stein's method was used only for proving central limit theorems. The paper develops the method further with several new applications.