
Information Theory and Coding
by John Daugman
Publisher: University of Cambridge 2009
Number of pages: 75
Description:
The aim of this course is to introduce the principles and applications of information theory. The course studies how information is measured in terms of probability and entropy, and the relationships among conditional and joint entropies; how these are used to calculate the capacity of a communication channel, with and without noise; coding schemes, including error-correcting codes; how discrete channels and measures of information generalize to their continuous forms; and more.
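As a rough orientation to two of the quantities mentioned in the description, the short Python sketch below computes the Shannon entropy of a discrete distribution and the capacity of a binary symmetric channel. It is an illustration only, not material from the book, and the function names entropy and bsc_capacity are invented for this example.

import math

def entropy(probs):
    # Shannon entropy H(X) = -sum p*log2(p), in bits; zero-probability terms contribute nothing.
    return -sum(p * math.log2(p) for p in probs if p > 0)

def bsc_capacity(eps):
    # Capacity of a binary symmetric channel with crossover probability eps: C = 1 - H(eps),
    # where H is the binary entropy function.
    return 1.0 - entropy([eps, 1.0 - eps])

print(entropy([0.5, 0.5]))   # 1.0 bit for a fair coin
print(bsc_capacity(0.1))     # roughly 0.531 bits per channel use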
Download or read it online for free here:
Download link
(1.4MB, PDF)
Similar books
Generalized Information Measures and Their Applications by Inder Jeet Taneja - Universidade Federal de Santa Catarina
Contents: Shannon's Entropy; Information and Divergence Measures; Entropy-Type Measures; Generalized Information and Divergence Measures; M-Dimensional Divergence Measures and Their Generalizations; Unified (r,s)-Multivariate Entropies; etc. (A brief illustrative sketch of the basic Kullback-Leibler divergence follows this entry.)
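As a minimal illustration of the divergence measures listed in the contents above, the Python sketch below computes the Kullback-Leibler divergence, the basic measure that the generalized measures in Taneja's book extend. Nothing here is taken from the book, and the function name kl_divergence is invented for this example.

import math

def kl_divergence(p, q):
    # D(P || Q) = sum p*log2(p/q), in bits; assumes q[i] > 0 wherever p[i] > 0.
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

print(kl_divergence([0.5, 0.5], [0.9, 0.1]))   # about 0.737 bits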
Data Compression Explained by Matt Mahoney - mattmahoney.net
This book is for the reader who wants to understand how data compression works, or who wants to write data compression software. Prior programming ability and some math skills are needed. The book is intended to be self-contained.
The Limits of Mathematics by Gregory J. Chaitin - Springer
The final version of a course on algorithmic information theory and the epistemology of mathematics. The book discusses the nature of mathematics in the light of information theory, and sustains the thesis that mathematics is quasi-empirical.
A Mathematical Theory of Communication by Claude Shannon
Shannon presents results previously found nowhere else, and today many professors regard it as the best exposition of the mathematical limits on communication. It laid the modern foundations for what is now known as information theory.