Entropy and Information in Probability

Spring 2009

**TUESDAY, MAY 5, 2-5pm**

Johannes Ruf *The central limit theorem in relative entropy*

Qinghua Li *Entropy and free probability*

Tomoyuki Ichiba *Extreme value distributions and relative entropy*

Petr Novotny *Universal portfolios*

Subhankar Sadhukhan *Large deviations for AR processes*

Timothy Teravainen *Freidlin-Wentzell large deviations theory*

Li Song *Entropy in time series models*

Chun Yip Yau *Weak consistency of the MDL principle*

**WEDNESDAY, MAY 6, 2-5pm**

Emilio Seijo *Concentration of measure via the entropy method*

Ivor Cribben *Bootstrap and maximum entropy distributions*

Henry Lam *Rare event simulation*

George Fellouris *Distributed hypothesis testing*

Greg Wayne *Information-theoretic ideas in control theory*

Kamiar Rahnama *Mutual information expansions*

G 8325 is a topics course offered by the Statistics Department.

CLASS TIMES: Tues/Thur 2:40-3:55 p.m.

LOCATION: room 1025 SSW bldg.

**Instructor:** Ioannis Kontoyiannis

Email: ik2241 at columbia.edu

Office hours: Thursdays 4-6 p.m., or by arrangement

**Description:**

The course will cover a subset of the following:

*Entropy and information*: typical strings and the "asymptotic equipartition property"; entropy as the fundamental compression limit; relative entropy as the optimal error exponent in hypothesis testing; Fisher information as the derivative of the entropy; maximum entropy distributions; basic inequalities.
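
The "asymptotic equipartition property" mentioned above is easy to check numerically: for a long i.i.d. string, the per-symbol log-probability concentrates around the entropy. A minimal sketch in Python (illustrative only, not part of the course materials; the function name `aep_demo` and its parameters are hypothetical):

```python
import math
import random

def aep_demo(p=0.3, n=100_000, seed=0):
    """Check the AEP empirically: for an i.i.d. Bernoulli(p) string X_1..X_n,
    -(1/n) log2 P(X_1..X_n) should be close to the entropy H(p)."""
    rng = random.Random(seed)
    k = sum(1 for _ in range(n) if rng.random() < p)  # number of ones
    # log2-probability of the observed string (product of i.i.d. terms)
    logp = k * math.log2(p) + (n - k) * math.log2(1 - p)
    empirical = -logp / n
    entropy = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))
    return empirical, entropy
```

For p = 0.3 the entropy is about 0.881 bits per symbol, and for n this large the empirical value should agree with it to roughly two decimal places.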

*Probability*: the method of types; the strong law of large numbers via entropy; the central limit theorem as a version of the second law of thermodynamics; large deviations and Sanov's theorem; high-dimensional projections and statistical mechanics; convergence of Markov chains.

*Special topics*: ergodicity and recurrence properties; the Shannon-McMillan-Breiman theorem; Poisson approximation bounds in terms of relative entropy; information in sigma-algebras and the Hewitt-Savage 0-1 law; entropy and the distribution of prime numbers.

**Reference texts**

Material will be drawn from various places in the literature, including the
books:

*Elements of Information Theory* by Cover and Thomas

*Information Theory* by Csiszár and Körner, and

*Information Theory and Statistics* by Kullback

**Course requirements/exams**

There will be homework assignments every 2-3 weeks. Instead of a final exam, students will have the option of either:

Giving an oral presentation in class; or

Doing a project and writing a project report.

A list of possible topics for presentations and projects will be provided by the instructor. Possible projects will cover the whole range from applied computational projects to purely theoretical questions in probability. New research topics will also be introduced along the way.

**Prerequisites**

Knowledge of basic probability and random processes is required. No previous knowledge of information theory is needed.

*Last modified: May 4, 2009*