A Short Course in Information Theory
by David J. C. MacKay
Publisher: University of Cambridge 1995
Description:
Is it possible to communicate reliably from one point to another when only a noisy communication channel is available? How can the information content of a random variable be measured? This course discusses the remarkable theorems of Claude Shannon, starting from the source coding theorem, which motivates entropy as the measure of information, and culminating in the noisy channel coding theorem. Along the way it studies simple examples of codes for data compression and error correction.
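The entropy that the source coding theorem singles out as the measure of information is H(X) = -Σ p(x) log2 p(x), measured in bits. As a minimal illustration (not taken from the course itself), a short Python sketch:

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum_x p(x) * log2 p(x), in bits.

    Terms with p(x) = 0 contribute nothing, by the convention
    0 * log 0 = 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit per toss; a biased coin carries less,
# which is what makes its outcomes compressible.
print(entropy([0.5, 0.5]))   # 1.0
print(entropy([0.9, 0.1]))   # about 0.469
```

A source with entropy H can be compressed to about H bits per symbol on average, but no further; that is the content of the source coding theorem.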
Download or read it online for free here:
Download link
(multiple PDF and PS files)
Similar books
A Mathematical Theory of Communication by Claude Shannon
Shannon presents results found nowhere else at the time, and many still regard it as the definitive exposition of the mathematical limits on communication. It laid the modern foundations for what is now called Information Theory.
Quantum Information Theory by Renato Renner - ETH Zurich
Processing of information is necessarily a physical process, so it is not surprising that physics and the theory of information are inherently connected. Quantum information theory is a research area whose goal is to explore this connection.
Conditional Rate Distortion Theory by Robert M. Gray - Information Systems Laboratory
The conditional rate-distortion function has proved useful in source coding problems involving side information. This book is an early work on conditional rate-distortion functions and related theory.
Generalized Information Measures and Their Applications by Inder Jeet Taneja - Universidade Federal de Santa Catarina
Contents: Shannon's Entropy; Information and Divergence Measures; Entropy-Type Measures; Generalized Information and Divergence Measures; M-Dimensional Divergence Measures and Their Generalizations; Unified (r,s)-Multivariate Entropies; etc.