Information Theory and Coding
by John Daugman
Publisher: University of Cambridge 2009
Number of pages: 75
Description:
The aim of this course is to introduce the principles and applications of information theory. The course studies how information is measured in terms of probability and entropy, and the relationships among conditional and joint entropies; how these are used to calculate the capacity of a communication channel, with and without noise; coding schemes, including error-correcting codes; how discrete channels and measures of information generalize to their continuous forms; and more.
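As a small illustration of two quantities mentioned above, here is a minimal Python sketch (not taken from the book; the function names, the example distribution, and the 0.1 crossover probability are assumptions chosen for the example) that computes the Shannon entropy of a discrete distribution and the capacity of a binary symmetric channel:

import math

def entropy(probs):
    # Shannon entropy H(X) = -sum p*log2(p), in bits; zero-probability terms contribute nothing.
    return -sum(p * math.log2(p) for p in probs if p > 0)

def bsc_capacity(crossover):
    # Capacity of a binary symmetric channel: C = 1 - H(p) bits per channel use.
    return 1.0 - entropy([crossover, 1.0 - crossover])

print(entropy([0.5, 0.5]))   # 1.0 bit for a fair coin
print(bsc_capacity(0.1))     # roughly 0.531 bits per use at crossover probability 0.1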
Download or read it online for free here (1.4MB, PDF).
Similar books

by Peter D. Gruenwald, Paul M.B. Vitanyi - CWI
We introduce algorithmic information theory, also known as the theory of Kolmogorov complexity. We explain this quantitative approach to defining information and discuss the extent to which Kolmogorov's and Shannon's theories have a common purpose.

by Matt Mahoney - mattmahoney.net
This book is for the reader who wants to understand how data compression works, or who wants to write data compression software. Prior programming ability and some math skills will be needed. This book is intended to be self-contained.

by Robert M. Gray - Springer
The book covers the theory of probabilistic information measures and their application to coding theorems for information sources and noisy channels. This is an up-to-date treatment of traditional information theory emphasizing ergodic theory.

by Frederic Barbaresco, Ali Mohammad-Djafari - MDPI AG
The aim of this book is to provide an overview of current research on the geometric structures of information and entropy. This survey should motivate readers to explore the emerging domain of the Science of Information.