A Short Course in Information Theory
by David J. C. MacKay
Publisher: University of Cambridge 1995
Is it possible to communicate reliably from one point to another if we only have a noisy communication channel? How can the information content of a random variable be measured? This course will discuss the remarkable theorems of Claude Shannon, starting from the source coding theorem, which motivates the entropy as the measure of information, and culminating in the noisy channel coding theorem. Along the way we will study simple examples of codes for data compression and error correction.
Download or read it online for free here: (multiple PDF/PS files)
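As a quick illustration of the course's starting point (a sketch of my own, not material from the course itself): the source coding theorem motivates the entropy H(X) = -Σ p(x) log2 p(x) as the measure of information content, in bits per symbol. A minimal Python version, estimating probabilities from symbol frequencies:

```python
import math
from collections import Counter

def entropy(symbols):
    """Empirical Shannon entropy, in bits per symbol.

    H = -sum over x of p(x) * log2(p(x)), with p(x) estimated
    from the frequency of each symbol in the input sequence.
    """
    counts = Counter(symbols)
    n = len(symbols)
    return sum(-(c / n) * math.log2(c / n) for c in counts.values())

# A fair coin carries 1 bit per flip; a constant source carries none.
print(entropy("HTHT"))  # 1.0
print(entropy("HHHH"))  # 0.0
```

Entropy is highest for a uniform distribution and zero for a deterministic one, which is why it serves as the limit on lossless compression.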
by Matt Mahoney - mattmahoney.net
This book is for the reader who wants to understand how data compression works, or who wants to write data compression software. Prior programming ability and some math skills will be needed. The book is intended to be self-contained.
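To see lossless compression in action before diving into the book (a standalone sketch, not code from the book): repetitive data has low entropy per symbol, so a general-purpose compressor such as Python's standard `zlib` shrinks it substantially, and decompression recovers the input exactly.

```python
import zlib

# Highly repetitive input: low entropy per byte, so it compresses well.
text = b"abracadabra " * 50

packed = zlib.compress(text)
assert zlib.decompress(packed) == text  # lossless: exact round trip
print(len(text), "->", len(packed), "bytes")
```

Random data, by contrast, is already near its entropy limit and will not shrink; that gap between repetitive and random inputs is precisely what the theory quantifies.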
by David J. C. MacKay - Cambridge University Press
A textbook on information theory, Bayesian inference, and learning algorithms, useful for undergraduate and postgraduate students and as a reference for researchers. Essential reading for students of electrical engineering and computer science.
by Robert M. Gray - Information Systems Laboratory
The conditional rate-distortion function has proved useful in source coding problems involving the possession of side information. This book represents an early work on conditional rate distortion functions and related theory.
by Abbas El Gamal, Young-Han Kim - arXiv
Network information theory deals with the fundamental limits on information flow in networks and the optimal coding techniques and protocols that achieve these limits. These notes provide broad coverage of key results, techniques, and open problems in network information theory.