SEMINAR by Mikolaj Kasprzak (MIT and University of Luxembourg)

November 18, 2022

11:30 - 12:30


ISBA - C115 (1st Floor)

"How good is your Laplace approximation? Finite-sample error bounds for a variety of useful divergences"

Abstract: The Laplace approximation is a popular method for providing mean and variance estimates for a Bayesian posterior. But can we trust these estimates for practical use? One might consider using rate-of-convergence bounds for the Bayesian Central Limit Theorem (BCLT) to provide quality guarantees for the Laplace approximation. But the bounds in existing versions of the BCLT either: require knowing the true data-generating parameter, are asymptotic in the number of samples, do not control the Bayesian posterior mean, or apply only to narrow classes of models. Our work provides the first closed-form, finite-sample quality bounds for the Laplace approximation that simultaneously (1) do not require knowing the true parameter, (2) control posterior means and variances, and (3) apply generally to models that satisfy the conditions of the asymptotic BCLT. In fact, our bounds work even in the presence of misspecification. We compute exact constants in our bounds for a variety of standard models, including logistic regression, and numerically demonstrate their utility. We also provide a framework for the analysis of more complex models.
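For readers unfamiliar with the method the talk evaluates: the Laplace approximation replaces a posterior with a Gaussian centered at the posterior mode, with covariance given by the inverse curvature (Hessian of the negative log density) at that mode. The sketch below is purely illustrative and is not the speakers' method; it uses a hypothetical Gamma-shaped "posterior" whose exact mean and variance are known, so the Laplace estimates can be checked by eye.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical target: an unnormalized Gamma(alpha, beta) density,
# chosen because its exact mean (alpha/beta) and variance (alpha/beta^2)
# are known in closed form.
alpha, beta = 20.0, 4.0

def neg_log_post(theta):
    # Negative log of the unnormalized Gamma(alpha, beta) density.
    return -((alpha - 1.0) * np.log(theta) - beta * theta)

# Step 1: find the posterior mode (MAP estimate).
res = minimize(neg_log_post, x0=np.array([1.0]), bounds=[(1e-6, None)])
mode = float(res.x[0])

# Step 2: curvature at the mode via a central finite difference
# (second derivative of the negative log density).
h = 1e-4
hess = (neg_log_post(mode + h) - 2.0 * neg_log_post(mode)
        + neg_log_post(mode - h)) / h**2

# Laplace approximation: N(mode, 1/hess).
laplace_mean, laplace_var = mode, 1.0 / hess

print(f"Laplace mean {laplace_mean:.3f} vs exact {alpha / beta:.3f}")
print(f"Laplace var  {laplace_var:.3f} vs exact {alpha / beta**2:.3f}")
```

For this target the mode is (alpha - 1)/beta = 4.75 and the Laplace variance is mode^2/(alpha - 1) ≈ 1.19, close to but not equal to the exact mean 5.0 and variance 1.25; the talk's subject is exactly how to bound such gaps with finite-sample guarantees.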

This is joint work with Ryan Giordano (MIT) and Tamara Broderick (MIT).
