## statistics and machine learning posts

### probability

• 04 Jan 2015 » Measure theory basics

Those interested in machine learning may be wondering why they should be familiar with measure theory. One of the main reasons is that measure theory provides the rigorous foundation for probability in general….

### statistical modeling and inference

• 16 Feb 2018 » Natural gradient descent and mirror descent

In this post, we discuss the natural gradient and present the main result of Raskutti and Mukherjee (2014), which shows that the mirror descent algorithm is equivalent to natural gradient descent in the dual Riemannian manifold….
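As a toy illustration of the mirror descent side of this equivalence (a sketch of my own, not from the post or the paper), here is mirror descent with the negative-entropy mirror map on the probability simplex, which reduces to the well-known exponentiated-gradient update:

```python
import numpy as np

def mirror_descent_simplex(grad, x0, step=0.1, iters=200):
    """Mirror descent with the negative-entropy mirror map (exponentiated gradient).

    The update x <- x * exp(-step * grad(x)), renormalized onto the simplex,
    is the mirror-descent step associated with the KL/entropy geometry.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = x * np.exp(-step * grad(x))   # multiplicative (dual-space) update
        x = x / x.sum()                   # project back onto the simplex
    return x

# Toy problem: minimize the linear loss <c, x> over the probability simplex.
c = np.array([3.0, 1.0, 2.0])
x = mirror_descent_simplex(lambda x: c, np.ones(3) / 3)
print(x)   # mass concentrates on the smallest-cost coordinate (index 1)
```

The function name and toy objective are hypothetical; the post and the Raskutti–Mukherjee result concern the general equivalence, not this particular example.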

• 21 Nov 2017 » The Johnson-Lindenstrauss Lemma

The so-called curse of dimensionality reflects the idea that many methods become more difficult in higher dimensions. This difficulty may be due to a number of issues that become more complicated in higher dimensions…
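A minimal numerical sketch of the lemma (my own illustration, assuming the standard Gaussian random-projection construction): project a point cloud from a high dimension down to a much lower one and check that all pairwise squared distances are preserved up to a factor of $1 \pm \epsilon$.

```python
import numpy as np

rng = np.random.default_rng(0)

n, d, k = 50, 1000, 300           # n points in R^d, projected down to R^k
eps = 0.5                         # target distortion

X = rng.standard_normal((n, d))   # an arbitrary point cloud

# Random Gaussian projection, scaled so squared norms are preserved in expectation.
P = rng.standard_normal((k, d)) / np.sqrt(k)
Y = X @ P.T

# Ratio of projected to original squared distance, for every pair of points.
ratios = []
for i in range(n):
    for j in range(i + 1, n):
        orig = np.sum((X[i] - X[j]) ** 2)
        proj = np.sum((Y[i] - Y[j]) ** 2)
        ratios.append(proj / orig)

ratios = np.array(ratios)
print(ratios.min(), ratios.max())  # each ratio should lie in [1 - eps, 1 + eps]
```

Note that the target dimension $k$ depends only on $n$ and $\epsilon$, not on the ambient dimension $d$, which is the striking part of the lemma.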

• 15 Nov 2017 » References on Bayesian nonparametrics

This post is a collection of references for Bayesian nonparametrics that I’ve found helpful or wish that I had known about earlier….

• 21 Oct 2017 » Wavelets and adaptive data analysis

For data that have a high signal-to-noise ratio, a nonparametric, adaptive method might be appropriate. In particular, we may want to fit the data with functions that are spatially inhomogeneous, i.e., the smoothness of the function $f(x)$ varies a lot with $x$. In this post, we will discuss wavelets, which can be used as an adaptive nonparametric estimation method….
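The spatial adaptivity can be seen in a minimal sketch (my own, using only the Haar wavelet; helper names are mine): one level of the Haar transform splits a signal into approximation and detail coefficients, and the detail coefficients are large only where the signal is rough.

```python
import numpy as np

def haar_step(x):
    """One level of the (orthonormal) Haar wavelet transform."""
    x = np.asarray(x, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)   # smooth/approximation coefficients
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)   # detail coefficients
    return approx, detail

def haar_inverse_step(approx, detail):
    """Invert one level of the Haar transform (perfect reconstruction)."""
    x = np.empty(2 * len(approx))
    x[0::2] = (approx + detail) / np.sqrt(2)
    x[1::2] = (approx - detail) / np.sqrt(2)
    return x

# A spatially inhomogeneous signal: flat on the left half, oscillating on the right.
t = np.linspace(0, 1, 64)
f = np.where(t < 0.5, 0.0, np.sin(20 * np.pi * t))

approx, detail = haar_step(f)
# Detail coefficients vanish on the smooth half and are large on the rough half;
# thresholding them is what makes wavelet estimators adapt to local smoothness.
print(np.abs(detail[:16]).max(), np.abs(detail[16:]).max())
```

A production analysis would use a library such as PyWavelets and a multi-level transform; this sketch only shows why the coefficients localize.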

• 25 Jan 2015 » Important inequalities for probabilistic methods

In statistics, machine learning, theoretical computer science, and any field that incorporates randomness, we are interested in studying the behavior of random quantities (e.g., asymptotic convergence rates, approximation error) using probabilistic tail bounds….
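As a quick empirical sketch of two of the most basic such bounds (my own example, not from the post): Markov's inequality $P(X \ge a) \le \mathbb{E}[X]/a$ for a nonnegative variable, and Chebyshev's inequality $P(|X - \mu| \ge t) \le \mathrm{Var}(X)/t^2$, checked against samples from an exponential distribution.

```python
import numpy as np

rng = np.random.default_rng(0)

# Nonnegative random variable with E[X] = 1 and Var(X) = 1.
X = rng.exponential(scale=1.0, size=100_000)

# Markov's inequality: P(X >= a) <= E[X] / a.
a = 3.0
empirical = np.mean(X >= a)
markov_bound = X.mean() / a
print(empirical, markov_bound)   # the empirical tail is far below the (loose) bound

# Chebyshev's inequality: P(|X - mu| >= t) <= Var(X) / t^2,
# tighter here because it uses second-moment information.
t_dev = 3.0
chebyshev_tail = np.mean(np.abs(X - X.mean()) >= t_dev)
chebyshev_bound = X.var() / t_dev**2
print(chebyshev_tail, chebyshev_bound)
```

Both bounds hold for essentially any distribution with the relevant moments, which is why they are so loose; sharper bounds (Chernoff, Hoeffding) exploit more structure.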