In an indirect Gaussian sequence space model, lower and
upper bounds are derived for the concentration rate of the posterior
distribution of the parameter of interest shrinking to the parameter
value $\theta^\circ$ that generates the data. While this establishes posterior
consistency, the concentration rate depends on both $\theta^\circ$
and a tuning parameter that enters the prior distribution. We first
provide an oracle-optimal choice of the tuning parameter, i.e., one
optimized for each $\theta^\circ$ separately. This optimal choice of the
prior distribution allows us to derive an oracle-optimal
concentration rate of the associated posterior distribution.
Moreover, for a given class of parameters and a suitable choice of the
tuning parameter, we show that the resulting uniform
concentration rate over the given class is optimal in a minimax
sense. Finally, we construct a hierarchical prior that is adaptive.
This means that, given a parameter $\theta^\circ$ or a
class of parameters, respectively, the posterior distribution
contracts at the oracle rate or at the minimax rate over the
class. Notably, the hierarchical prior depends neither on $\theta^\circ$ nor
on the given class. Moreover, convergence of the fully data-driven Bayes
estimator at the oracle or at the minimax rate is established.
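For concreteness, an indirect Gaussian sequence space model of the kind considered here can be sketched as follows; the notation ($\lambda_j$, $\varepsilon$, $\xi_j$) is illustrative and not taken verbatim from the paper:

```latex
% Sketch (illustrative notation): the parameter of interest
% \theta = (\theta_j)_{j \ge 1} is observed only after multiplication by a
% known sequence \lambda = (\lambda_j)_{j \ge 1} and corruption by Gaussian
% noise of level \varepsilon:
\[
  Y_j = \lambda_j\,\theta_j + \sqrt{\varepsilon}\,\xi_j,
  \qquad \xi_j \overset{\mathrm{iid}}{\sim} \mathcal{N}(0,1),
  \quad j \in \mathbb{N}.
\]
% The model is ``indirect'' because typically \lambda_j \to 0 as j \to \infty,
% so recovering \theta from (Y_j)_{j \ge 1} is an ill-posed inverse problem.
```

In this sketch the prior on $\theta$ would carry the tuning parameter mentioned above, and the concentration rate describes how fast the posterior mass accumulates around $\theta^\circ$ as $\varepsilon \to 0$.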
Reference:
Johannes, J., Simoni, A., and Schenk, R. (2015). Adaptive Bayesian estimation in indirect Gaussian sequence space models. arXiv:1502.00184.