Authors: Shashank Singh, Barnabás Póczos
Where published: NeurIPS 2014
ArXiv: 1603.08584
Abstract URL: http://arxiv.org/abs/1603.08584v1
We analyze a plug-in estimator for a large class of integral functionals of
one or more continuous probability densities. This class includes important
families of entropy, divergence, mutual information, and their conditional
versions. For densities on the $d$-dimensional unit cube $[0,1]^d$ that lie in
a $\beta$-H\"older smoothness class, we prove our estimator converges at the
rate $O \left( n^{-\frac{\beta}{\beta + d}} \right)$. Furthermore, we prove the
estimator is exponentially concentrated about its mean, whereas most previous
related results have proven only expected error bounds on estimators.
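To make the plug-in idea concrete, below is a minimal sketch in Python for the simplest member of this class of functionals, the Shannon entropy $H(p) = -\int p \log p$ on $[0,1]$: estimate the density from samples, then evaluate the functional on the estimate by numerical integration. The choices here (`scipy.stats.gaussian_kde`, grid-based integration, a Beta(2, 5) example) are illustrative assumptions, not the paper's construction; the estimator analyzed in the paper uses a boundary-corrected kernel density estimate on $[0,1]^d$ and covers far more general functionals.

```python
import numpy as np
from scipy.stats import gaussian_kde
from scipy.integrate import trapezoid

def plugin_entropy(samples, grid_size=1000):
    """Plug-in estimate of H(p) = -int p log p for a density on [0, 1].

    Sketch only: uses an off-the-shelf Gaussian KDE with no boundary
    correction, unlike the estimator analyzed in the paper.
    """
    kde = gaussian_kde(samples)           # density estimate p_hat
    grid = np.linspace(0.0, 1.0, grid_size)
    p_hat = np.maximum(kde(grid), 1e-12)  # clip to avoid log(0)
    # Plug p_hat into the functional and integrate numerically
    return -trapezoid(p_hat * np.log(p_hat), grid)

# Illustrative check: Beta(2, 5) has differential entropy ~ -0.485
rng = np.random.default_rng(0)
samples = rng.beta(2.0, 5.0, size=5000)
print(plugin_entropy(samples))
```

The same two-step pattern (estimate the density, then apply the functional to the estimate) extends to divergences and mutual information by plugging in estimates of the relevant densities; the paper's contribution is the convergence rate and the exponential concentration bound for this approach, not the pattern itself.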