A Short Course in Information Theory
by David J. C. MacKay
Publisher: University of Cambridge 1995
Is it possible to communicate reliably from one point to another if we only have a noisy communication channel? How can the information content of a random variable be measured? This course will discuss the remarkable theorems of Claude Shannon, starting from the source coding theorem, which motivates the entropy as the measure of information, and culminating in the noisy channel coding theorem. Along the way we will study simple examples of codes for data compression and error correction.
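The entropy the course blurb refers to has a simple closed form, H(X) = -Σ p(x) log₂ p(x), measured in bits. A minimal sketch (the function name `entropy` is illustrative, not from the course materials):

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H(X) = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally unpredictable: one bit per toss.
print(entropy([0.5, 0.5]))   # 1.0
# A biased coin is more predictable, so its entropy is lower.
print(entropy([0.9, 0.1]))   # about 0.469
```

The source coding theorem says this quantity is the fundamental limit on lossless compression: on average, no fewer than H(X) bits per symbol suffice.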
by Frederic Barbaresco, Ali Mohammad-Djafari - MDPI AG
The aim of this book is to provide an overview of current research exploring the geometric structures of information and entropy. This survey will motivate readers to explore the emerging domain of the Science of Information.
by Claude Shannon
Shannon presents results found nowhere else, and many professors still regard it as the best exposition of the mathematical limits of communication. It laid the modern foundations for what is now called information theory.
by Robert M. Gray - Springer
The book covers the theory of probabilistic information measures and their application to coding theorems for information sources and noisy channels. This is an up-to-date treatment of traditional information theory emphasizing ergodic theory.
by Gregory J. Chaitin - World Scientific
In this mathematical autobiography, Gregory Chaitin presents a technical survey of his work and a non-technical discussion of its significance. The technical survey contains many new results, including a detailed discussion of LISP program size.