-
Richard Samworth
Log-concave density estimation with applications
Log-concave densities on $\mathbb{R}^d$ form an attractive infinite-dimensional class
that includes many standard parametric families and has several useful properties.
For instance, in the context of density estimation, the log-concave maximum likelihood
estimator is a fully automatic nonparametric estimator, with no smoothing parameters
to choose. More generally, I will discuss the idea of log-concave projection and its relevance
to density estimation, regression, testing and Independent Component Analysis problems.
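As an illustration of the claim that the log-concave class contains many standard parametric families, concavity of a log-density can be checked numerically via second differences on a grid. This is a minimal sketch under my own choices of function name, grid and tolerance, not the estimator discussed in the talk:

```python
import numpy as np
from scipy import stats

def is_log_concave(logpdf, grid):
    """Check log-concavity numerically: on an equally spaced grid, the
    second difference of the log-density should be non-positive."""
    vals = logpdf(grid)
    second_diff = np.diff(vals, n=2)
    return bool(np.all(second_diff <= 1e-8))  # small tolerance for rounding

x = np.linspace(-5, 5, 1001)
xp = np.linspace(0.1, 10, 1001)
print(is_log_concave(stats.norm().logpdf, x))       # Gaussian: log-concave
print(is_log_concave(stats.laplace().logpdf, x))    # Laplace: log-concave
print(is_log_concave(stats.gamma(2.0).logpdf, xp))  # Gamma, shape >= 1: log-concave
print(is_log_concave(stats.cauchy().logpdf, x))     # Cauchy: not log-concave
```

The Cauchy density fails the check because its log-density is convex in the tails, so heavy-tailed families fall outside the class.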
Download the slides
-
Subhashis Ghosal
Bayesian methods for high dimensional models: Convergence issues and
computational challenges
In modern statistical applications, it is very common to encounter observations
that can only be represented as objects of very high dimension, typically far
exceeding the available sample size. Despite this complexity, a
key feature that allows valid statistical analysis is sparsity, which
essentially means that many components of the observations are irrelevant. Classical
statistical methods, typically based on penalization techniques, have been
developed and their convergence properties have been studied. More recently,
Bayesian methods for high dimensional observations have been considered. A
particularly satisfying feature of the Bayesian approach is that it assesses the uncertainty
of each conceivable submodel arising from all possible sparsity structures.
Sparsity is easily induced in a Bayesian framework by a mixture of a point mass
and a continuous distribution as a prior on the underlying parameters.
However, in most situations, the resulting posterior computation involves
Markov chain Monte Carlo sampling that must move over an enormous
space of models, and hence is not practically feasible. We discuss some examples,
such as additive regression, covariance estimation using graphical models,
covariance estimation using the Bayesian graphical lasso and nonparametric density
regression using a finite random series prior, where posterior computation does
not require Markov chain Monte Carlo methods but can instead be carried out using
conjugacy, Laplace approximation or direct sampling. We investigate convergence
rates and oracle properties for these problems.
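The spike-and-slab construction mentioned above, a point mass at zero mixed with a continuous distribution, can be sketched as follows; the mixing weight and Gaussian slab here are illustrative assumptions, not choices from the talk:

```python
import numpy as np

rng = np.random.default_rng(0)

def spike_and_slab_prior(p, w=0.1, slab_scale=1.0):
    """Draw a p-dimensional parameter from a spike-and-slab prior:
    each coordinate is exactly zero with probability 1 - w (the spike),
    and otherwise drawn from a N(0, slab_scale^2) slab."""
    active = rng.random(p) < w             # which coordinates are nonzero
    slab = rng.normal(0.0, slab_scale, p)  # continuous slab component
    return np.where(active, slab, 0.0)

theta = spike_and_slab_prior(p=1000, w=0.1)
# Most coordinates are exactly zero, so the prior draw is sparse:
print(np.mean(theta == 0.0))  # roughly 0.9
```

Because each coordinate is exactly zero with positive probability, the posterior puts mass on every sparsity pattern, which is what makes submodel uncertainty assessment possible but also what blows up the model space for MCMC.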
Download the slides