Bayesian nonparametrics

I develop flexible and interpretable nonparametric models, study their properties, and design scalable algorithms for fitting them. Some of the fundamental building blocks for nonparametric Bayesian models include the Dirichlet process and the Gaussian process.
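As a concrete illustration of one of these building blocks, the Dirichlet process can be simulated via its stick-breaking construction: repeatedly break off a Beta-distributed fraction of a unit-length stick to obtain the mixture weights. The sketch below is illustrative only (not code from this page); the function name, truncation level, and concentration parameter are all assumptions.

```python
import numpy as np

def stick_breaking(alpha, num_atoms, rng):
    """Truncated stick-breaking construction of Dirichlet process weights.

    Each weight is a Beta(1, alpha) fraction of the stick remaining
    after the previous breaks, so the weights sum to (nearly) one.
    """
    betas = rng.beta(1.0, alpha, size=num_atoms)
    # Length of stick remaining before each break: prod_{j<i} (1 - beta_j).
    remaining = np.concatenate([[1.0], np.cumprod(1.0 - betas[:-1])])
    return betas * remaining

rng = np.random.default_rng(0)
weights = stick_breaking(alpha=2.0, num_atoms=1000, rng=rng)
```

With a large truncation level, the leftover stick mass is negligible, so `weights` behaves like a draw of Dirichlet process mixture weights; smaller `alpha` concentrates mass on fewer atoms.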

Some of my existing work focuses on developing such models and the algorithms for fitting them.

See also our NeurIPS 2018 workshop, All of Bayesian Nonparametrics, and References on Bayesian Nonparametrics for additional references.

Publications

Kernel density Bayesian inverse reinforcement learning

Submitted, 2022

Finite mixture models do not reliably learn the number of components

Proceedings of the 38th International Conference on Machine Learning (ICML), 2021
Oral presentation (short)

PDF Poster ICML arXiv ICML talk BibTeX

A Bayesian nonparametric view on count-min sketch

Advances in Neural Information Processing Systems (NeurIPS), 2018

PDF Video BibTeX Poster

Exchangeable trait allocations

Electronic Journal of Statistics, 2018

PDF arXiv BibTeX

Edge-exchangeable graphs and sparsity

Advances in Neural Information Processing Systems (NeurIPS), 2016
ISBA@NeurIPS Award at NeurIPS Workshop on Bayesian Nonparametrics

PDF Video arXiv BibTeX Poster

Paintboxes and probability functions for edge-exchangeable graphs

NeurIPS Workshop on Adaptive and Scalable Nonparametric Methods in Machine Learning, 2016
Oral presentation

PDF Poster

Priors on exchangeable directed graphs

Electronic Journal of Statistics, 2016

PDF arXiv BibTeX

Completely random measures for modeling power laws in graphs

NeurIPS 2015 Workshop on Networks in the Social and Information Sciences, 2015
Spotlight presentation

Diana Cai

I am broadly interested in machine learning and statistics, and in particular in developing robust and reliable methods for modeling and inference.