Estimation of information-theoretic quantities and discrete distributions
A basic problem in coding theory is to estimate the mutual information
between two random variables, given limited samples (e.g., how much
information does a given visual cell carry about the pattern of light
on the retina?).
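For orientation, here is a minimal Python sketch of the naive "plug-in" approach, which simply substitutes empirical frequencies into the mutual-information formula. It is not one of the estimators developed in the papers below (the function name plugin_mutual_information is just illustrative); the point is that with limited samples this estimator is systematically biased upward, which is precisely the difficulty these papers address.

    import numpy as np

    def plugin_mutual_information(x, y):
        """Plug-in (maximum-likelihood) estimate of I(X;Y), in bits, from paired samples."""
        n = len(x)
        joint, px, py = {}, {}, {}
        for xi, yi in zip(x, y):
            joint[(xi, yi)] = joint.get((xi, yi), 0) + 1
            px[xi] = px.get(xi, 0) + 1
            py[yi] = py.get(yi, 0) + 1
        mi = 0.0
        for (xi, yi), c in joint.items():
            # p(x,y) * log2( p(x,y) / (p(x) p(y)) ), with all probabilities
            # replaced by empirical frequencies
            mi += (c / n) * np.log2(c * n / (px[xi] * py[yi]))
        return mi

    # Example: X and Y are independent, so the true mutual information is zero,
    # but the plug-in estimate from 100 samples is typically well above zero.
    rng = np.random.default_rng(0)
    x = rng.integers(0, 20, size=100)
    y = rng.integers(0, 20, size=100)
    print(plugin_mutual_information(x, y))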
Paninski, L. (2003). Estimation of entropy and mutual
information. Neural Computation 15: 1191-1254. Matlab code for
entropy estimation (implementing the method described here) is
available here; code for computing the exact bias and a useful
bound on the variance of a large class of entropy estimators, for
any discrete probability distribution, is available here as well.
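As a rough illustration of the bias computation mentioned above (a sketch only, not the released Matlab code, and covering just the plug-in estimator rather than the larger class treated there): because each bin count is marginally binomial, the expectation of the plug-in entropy estimate, and hence its exact bias, can be summed out in closed form for any known discrete distribution.

    import numpy as np
    from math import comb, log2

    def plugin_entropy_bias(p, N):
        """Exact bias E[H_hat] - H(p), in bits, of the plug-in entropy estimator
        given N i.i.d. samples from the known discrete distribution p."""
        p = np.asarray(p, dtype=float)
        H = -sum(pj * log2(pj) for pj in p if pj > 0)
        EH = 0.0
        for pj in p:
            # the count in this bin is marginally Binomial(N, pj)
            for k in range(1, N + 1):          # the k = 0 term contributes nothing
                prob = comb(N, k) * pj**k * (1 - pj)**(N - k)
                EH -= prob * (k / N) * log2(k / N)
        return EH - H

    # Example: uniform distribution on 100 bins, sampled 50 times -- the
    # plug-in estimator is biased downward by more than a bit.
    print(plugin_entropy_bias(np.ones(100) / 100, 50))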
Paninski, L. (2004). Estimating
entropy on m bins given fewer than m samples. IEEE
Transactions on Information Theory 50: 2200-2203.
Paninski, L. & Yajima, M. (2008). Undersmoothed kernel
entropy estimators. In press, IEEE Transactions on Information
Theory.
Similar ideas may also be used to analyze the problem of directly
estimating the underlying discrete distribution itself, rather than
just its entropy.
Paninski, L. (2004). Variational
minimax estimation of discrete distributions under Kullback-Leibler
loss. NIPS 17.
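To see why estimation under Kullback-Leibler loss is delicate in the sparse regime: the raw empirical frequencies assign zero probability to every unseen bin, so the KL loss is infinite whenever the sample fails to cover the support. The sketch below illustrates this with generic add-constant smoothing as the simplest fix; it is not the variational minimax estimator constructed in the paper, and the constant 1/2 (Krichevsky-Trofimov style) is just a common default.

    import numpy as np

    def add_constant_estimate(counts, a=0.5):
        """Smoothed estimate of a discrete distribution from bin counts."""
        counts = np.asarray(counts, dtype=float)
        return (counts + a) / (counts.sum() + a * len(counts))

    def kl_divergence(p, q):
        """KL(p || q) in bits; infinite if q puts zero mass where p does not."""
        p, q = np.asarray(p, float), np.asarray(q, float)
        mask = p > 0
        if np.any(q[mask] == 0):
            return np.inf
        return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

    rng = np.random.default_rng(0)
    m, n = 100, 30                         # many more bins than samples
    p = np.ones(m) / m                     # true distribution: uniform
    counts = np.bincount(rng.integers(0, m, size=n), minlength=m)
    mle = counts / n                       # empirical frequencies: zeros on unseen bins
    print(kl_divergence(p, mle))                              # infinite loss
    print(kl_divergence(p, add_constant_estimate(counts)))    # finite loss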
Paninski, L. (2008). A
coincidence-based test for uniformity given very sparsely-sampled
discrete data. In press, IEEE Transactions on Information Theory.
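For intuition only, here is a toy Python illustration of coincidence counting (a birthday-problem statistic) with a Monte Carlo null; it is not the test statistic or the analysis from the paper. With far fewer samples than bins, most bins are observed at most once, yet the number of pairs of samples landing in the same bin still discriminates uniform from non-uniform data.

    import numpy as np

    def coincidence_count(samples, m):
        """Number of pairs of samples that fall in the same bin (out of m bins)."""
        counts = np.bincount(samples, minlength=m)
        return int(np.sum(counts * (counts - 1) // 2))

    def uniformity_pvalue(samples, m, n_sim=10_000, rng=None):
        """Monte Carlo p-value: probability of at least this many coincidences
        if the samples were drawn uniformly from m bins."""
        rng = rng or np.random.default_rng()
        n = len(samples)
        observed = coincidence_count(samples, m)
        null = np.array([coincidence_count(rng.integers(0, m, size=n), m)
                         for _ in range(n_sim)])
        return float(np.mean(null >= observed))

    rng = np.random.default_rng(1)
    m, n = 1000, 60                                  # far fewer samples than bins
    skewed = rng.integers(0, m // 10, size=n)        # mass concentrated on 10% of bins
    print(uniformity_pvalue(skewed, m, rng=rng))                       # near zero: reject uniformity
    print(uniformity_pvalue(rng.integers(0, m, size=n), m, rng=rng))   # typically not small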