Entropy and Information Theory
by Robert M. Gray
Publisher: Springer 2008
ISBN/ASIN: 1441979697
Number of pages: 313
Description:
This book is devoted to the theory of probabilistic information measures and their application to coding theorems for information sources and noisy channels. The eventual goal is a general development of Shannon's mathematical theory of communication, but much of the space is devoted to the tools and methods required to prove the Shannon coding theorems. This is the only up-to-date treatment of traditional information theory emphasizing ergodic theory.
Download or read it online for free here:
Download link
(1.5MB, PDF)
Similar books
Algorithmic Information Theory by Peter D. Gruenwald, Paul M.B. Vitanyi - CWI
We introduce algorithmic information theory, also known as the theory of Kolmogorov complexity. We explain this quantitative approach to defining information and discuss the extent to which Kolmogorov's and Shannon's theories have a common purpose.
Information Theory, Excess Entropy and Statistical Complexity by David Feldman - College of the Atlantic
This e-book is a brief tutorial on information theory, excess entropy and statistical complexity. From the table of contents: Background in Information Theory; Entropy Density and Excess Entropy; Computational Mechanics.
Information Theory and Statistical Physics by Neri Merhav - arXiv
Lecture notes for a graduate course focusing on the relations between Information Theory and Statistical Physics. The course is aimed at EE graduate students in the area of Communications and Information Theory, or graduate students in Physics.
Quantum Information Theory by Robert H. Schumann - arXiv
A short review of ideas in quantum information theory. Quantum mechanics is presented together with some useful tools for treating open quantum systems. The treatment is pedagogical and suitable for beginning graduates in the field.