A Short Course in Information Theory
by David J. C. MacKay
Publisher: University of Cambridge 1995
Description:
Is it possible to communicate reliably from one point to another if we only have a noisy communication channel? How can the information content of a random variable be measured? This course will discuss the remarkable theorems of Claude Shannon, starting from the source coding theorem, which motivates the entropy as the measure of information, and culminating in the noisy channel coding theorem. Along the way we will study simple examples of codes for data compression and error correction.
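As a quick reference (standard statements, not quoted from the course notes themselves): for a discrete random variable X with distribution p, the entropy is

    H(X) = -\sum_x p(x) \log_2 p(x)   bits,

and the source coding theorem says that N independent copies of X can be compressed into roughly N H(X) bits as N grows, but no fewer. The noisy channel coding theorem likewise characterizes the largest rate of reliable communication as the channel capacity

    C = \max_{p(x)} I(X;Y),

the maximum mutual information between the channel input X and output Y.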
Download or read it online for free here:
Download link
(multiple PDF, PS files)
Similar books

by Karl Petersen - AMS
The aim is to review the many facets of information, coding, and cryptography, including their uses throughout history and their mathematical underpinnings. Prerequisites include high-school mathematics and a willingness to deal with unfamiliar ideas.

by Raymond Yeung, S-Y Li, N Cai - Now Publishers Inc
A tutorial on the basics of the theory of network coding. It first presents network coding for transmission from a single source node, then treats the more general setting in which there are multiple source nodes. (A toy sketch of the core idea follows this entry.)
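As a minimal sketch of why coding at intermediate nodes helps (the classic butterfly example in Python; the variable names and toy packet values are illustrative assumptions, not drawn from the tutorial):

    # Butterfly network: a source sends packets a and b to two sinks over
    # unit-capacity links. The shared bottleneck forwards a XOR b instead
    # of picking one packet, so both sinks can recover both packets.
    a, b = 0b1010, 0b0110      # two toy 4-bit source packets
    coded = a ^ b              # the bottleneck link carries a XOR b

    # Sink 1 hears a directly plus the coded packet; sink 2 hears b.
    assert coded ^ a == b      # sink 1 recovers b
    assert coded ^ b == a      # sink 2 recovers a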

by Martin Tomlinson, et al. - Springer
This book discusses both the theory and the practical applications of error-correcting codes, i.e. codes that make data self-correcting; a toy example follows this entry. The applications included demonstrate the importance of these codes in a wide range of everyday technologies.
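As a minimal illustration of error correction (a toy (3,1) repetition code in Python; the book treats far more powerful codes, and this sketch is not drawn from it):

    def encode(bits):
        # (3,1) repetition code: transmit each bit three times
        return [b for bit in bits for b in (bit, bit, bit)]

    def decode(received):
        # Majority vote over each group of three received bits;
        # this corrects any single flipped bit per group.
        return [1 if sum(received[i:i + 3]) >= 2 else 0
                for i in range(0, len(received), 3)]

    msg = [1, 0, 1, 1]
    tx = encode(msg)
    tx[4] ^= 1                  # the channel flips one bit
    assert decode(tx) == msg    # the error is corrected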

by Alexander Shen - arXiv.org
Algorithmic information theory studies description complexity and randomness. This text covers its basic notions: Kolmogorov complexity, Solomonoff's universal a priori probability, effective Hausdorff dimension, etc. (The central definition is sketched after this entry.)
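For orientation (a standard definition, not quoted from the text): the Kolmogorov complexity of a binary string x with respect to a universal machine U is

    K_U(x) = \min \{\, |p| : U(p) = x \,\},

the length of a shortest program that outputs x. By the invariance theorem, switching to another universal machine changes K_U(x) by at most an additive constant, so the choice of U is essentially immaterial.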