Poster

Sampling from the Random Linear Model via Stochastic Localization Up to the AMP Threshold

Joshua Agterberg


Abstract: Recently, Approximate Message Passing (AMP) has been integrated into stochastic localization (diffusion-based) samplers, where it provides a computationally efficient estimator of the posterior mean. Existing rigorous analyses typically establish the success of sampling only for sufficiently small noise, and determining the exact threshold poses several challenges. In this paper, we focus on sampling from the posterior in the linear inverse problem with an i.i.d. random design matrix, and show that the threshold for sampling coincides with that for posterior mean estimation. We prove convergence in smoothed KL divergence whenever the noise variance Δ is below Δ_AMP, the computational threshold for mean estimation introduced in (Barbier et al., 2020). We also show convergence in the Wasserstein distance below the same threshold, assuming a dimension-free bound on the operator norm of the posterior covariance matrix, a condition strongly suggested by recent breakthroughs on operator norm bounds in similar replica-symmetric systems. A key observation in our analysis is that no phase transition occurs along the sampling and interpolation paths when Δ < Δ_AMP.
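To make the mechanism concrete, here is a minimal illustrative sketch (not the authors' algorithm) of stochastic localization driven by an AMP approximation of the posterior mean, for the linear model y = Ax + √Δ·w under a Rademacher (±1) prior. The function names, the tanh denoiser, the empirical noise-level estimate, and the Euler discretization are all simplifying assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def amp_posterior_mean(y, A, theta, t, n_iter=50):
    # AMP approximation of E[x | y, theta_t] for y = A x + noise, with a
    # Rademacher prior tilted by the localization observation
    # theta_t = t * x + B_t. The quadratic tilt exp(-t x^2 / 2) is constant
    # for x = +-1, so t does not enter the denoiser below.
    m, n = A.shape
    delta = m / n
    x = np.zeros(n)
    z = y.copy()
    onsager = 0.0
    for _ in range(n_iter):
        z = y - A @ x + onsager * z       # residual with Onsager correction
        tau2 = max(z @ z / m, 1e-10)      # empirical effective noise level
        r = x + A.T @ z                   # behaves like x + N(0, tau2)
        x = np.tanh(r / tau2 + theta)     # tilted posterior-mean denoiser
        onsager = np.mean(1.0 - x**2) / (tau2 * delta)
    return x

def sl_sample(y, A, T=10.0, n_steps=100):
    # Euler scheme for the localization SDE
    #   d theta_t = E[x | y, theta_t] dt + dB_t,
    # with the drift approximated by AMP; theta_t / t localizes on a sample.
    n = A.shape[1]
    dt = T / n_steps
    theta = np.zeros(n)
    for k in range(n_steps):
        mhat = amp_posterior_mean(y, A, theta, k * dt)
        theta += mhat * dt + np.sqrt(dt) * rng.standard_normal(n)
    return np.sign(theta)

# Toy instance: i.i.d. Gaussian design, Rademacher signal, noise variance Delta.
n, m, Delta = 400, 800, 0.1
x0 = rng.choice([-1.0, 1.0], size=n)
A = rng.standard_normal((m, n)) / np.sqrt(m)  # unit-norm columns (standard AMP scaling)
y = A @ x0 + np.sqrt(Delta) * rng.standard_normal(m)
print("overlap with ground truth:", sl_sample(y, A) @ x0 / n)
```

For small Δ (well below the threshold) the drift estimate tracks the posterior mean and the localization process concentrates on a high-overlap sample; the abstract's claim is that this works for all Δ < Δ_AMP, not just sufficiently small noise.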
