I develop flexible and interpretable nonparametric models, study their properties, and design scalable algorithms for them. Some of the fundamental building blocks for Bayesian nonparametric models include the Dirichlet process and the Gaussian process.
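As a concrete illustration of one of these building blocks, here is a minimal sketch of the truncated stick-breaking construction of Dirichlet process weights. This is a standard textbook construction, not the specific model from my work; the function name and the truncation level are illustrative choices.

```python
import numpy as np

def stick_breaking_weights(alpha, num_sticks, rng):
    """Truncated stick-breaking weights for a Dirichlet process DP(alpha, H).

    Each beta_k ~ Beta(1, alpha) is the fraction broken off the remaining
    stick; w_k = beta_k * prod_{j<k} (1 - beta_j).
    """
    betas = rng.beta(1.0, alpha, size=num_sticks)
    remaining = np.concatenate([[1.0], np.cumprod(1.0 - betas[:-1])])
    return betas * remaining

rng = np.random.default_rng(0)
w = stick_breaking_weights(alpha=2.0, num_sticks=100, rng=rng)
# Under truncation the weights are nonnegative and sum to just under 1;
# the leftover mass shrinks geometrically with the number of sticks.
print(w[:5], w.sum())
```

Pairing these weights with atoms drawn i.i.d. from a base measure H gives a draw from the DP, which is what makes mixture models with a growing number of components possible.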
Some existing work focuses on developing models and algorithms for:
- mixture models with a growing number of components,
- modeling graphs and relational data,
- reducing the computational cost of inference in Gaussian processes,
- Bayesian count sketching via nonparametric priors, and
- conditional kernel density estimation for inverse reinforcement learning applications.
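On the Gaussian process item above: exact GP regression costs O(n^3) in the number of observations, and a common way to reduce this is to condition on a small set of inducing points. The sketch below uses the classical subset-of-regressors (Nyström-style) predictive mean as one standard example of this idea; it is not the specific algorithm from my work, and the kernel, inducing-point locations, and noise level are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, Z, lengthscale=1.0):
    """Squared-exponential kernel matrix between the rows of X and Z."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

def sor_gp_mean(X, y, X_star, Z, noise=0.1):
    """Approximate GP posterior mean with m inducing points Z.

    Subset-of-regressors mean:
        mu(x*) = K_{*m} (sigma^2 K_{mm} + K_{mn} K_{nm})^{-1} K_{mn} y
    costing O(n m^2) instead of the exact O(n^3) solve.
    """
    Kmm = rbf_kernel(Z, Z)
    Kmn = rbf_kernel(Z, X)
    Ksm = rbf_kernel(X_star, Z)
    A = noise**2 * Kmm + Kmn @ Kmn.T          # m x m system only
    return Ksm @ np.linalg.solve(A, Kmn @ y)

rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)
Z = np.linspace(-3, 3, 15)[:, None]           # 15 inducing points
X_star = np.array([[0.0], [1.5]])
print(sor_gp_mean(X, y, X_star, Z))           # approx sin at the test points
```

The expensive n x n solve is replaced by an m x m one, which is the basic trade-off behind most scalable GP approximations.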
See also our NeurIPS 2018 workshop, All of Bayesian Nonparametrics, and References on Bayesian Nonparametrics for additional reading.