18 Feb 2021 » The Extended Schwartz Theorem and strong posterior consistency
Schwartz’s theorem is a classical tool used to derive posterior consistency and is the foundation of many modern results for establishing frequentist consistency of Bayesian methods. An extension of Schwartz’s original theorem makes it much more broadly applicable. In this post, we review the extended Schwartz theorem, show how it can be used to recover the classical Schwartz theorem, and how it can be used to establish “strong” posterior consistency….
14 Feb 2021 » Schwartz’s Theorem and weak posterior consistency
Schwartz’s theorem and its extensions have been instrumental in the development of a rich suite of tools for analyzing frequentist consistency of Bayesian methods. Under mild regularity conditions on the prior, Schwartz’s theorem leads directly to posterior consistency with respect to the weak topology. In this post, we will state the theorem, discuss its conditions, show how the conditions are satisfied for the weak topology as well as a few situations where it’s easier to satisfy them, and then present a proof of the theorem….
07 Apr 2020 » Reading list on probabilistic numerics
Probabilistic numerical methods have seen a recent surge of interest. However, the field dates back to many key contributions made in the 1960s–80s. The goal of this post is to collect a relevant, organized reading list, which I hope to update…
16 Feb 2018 » Natural gradient descent and mirror descent
In this post, we discuss the natural gradient and present the main result of Raskutti and Mukherjee (2014), which shows that the mirror descent algorithm is equivalent to natural gradient descent in the dual Riemannian manifold….
21 Nov 2017 » The Johnson-Lindenstrauss Lemma
The so-called curse of dimensionality reflects the idea that many methods become more difficult in higher dimensions. This difficulty may be due to a number of issues that become more complicated in higher dimensions…
15 Nov 2017 » References on Bayesian nonparametrics
This post is a collection of references for Bayesian nonparametrics that I’ve found helpful or wish that I had known about earlier….
21 Oct 2017 » Wavelets and adaptive data analysis
For data that have a high signal-to-noise ratio, a nonparametric, adaptive method might be appropriate. In particular, we may want to fit the data to functions that are spatially inhomogeneous, i.e., the smoothness of the function \(f(x)\) varies a lot with \(x\). In this post, we will discuss wavelets, which can be used as an adaptive nonparametric estimation method….