Information Theory and Coding
by John Daugman
Publisher: University of Cambridge 2009
Number of pages: 75
Description:
The aim of this course is to introduce the principles and applications of information theory. The course studies how information is measured in terms of probability and entropy, and the relationships among conditional and joint entropies; how these quantities are used to calculate the capacity of a communication channel, with and without noise; coding schemes, including error-correcting codes; and how discrete channels and measures of information generalize to their continuous forms.
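As a concrete illustration of the course's central quantity (a minimal sketch, not drawn from the book, using arbitrary example probabilities), the Shannon entropy of a discrete distribution can be computed in a few lines of Python:

    import math

    def shannon_entropy(probs):
        # H(X) = -sum p(x) * log2 p(x), measured in bits.
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # A fair coin carries 1 bit per toss; a biased coin carries less.
    print(shannon_entropy([0.5, 0.5]))   # 1.0
    print(shannon_entropy([0.9, 0.1]))   # about 0.469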
Download or read it online for free (1.4MB, PDF).
Similar books

by Venkatesan Guruswami, Atri Rudra, Madhu Sudan - University at Buffalo
Error-correcting codes are clever ways of representing data so that one can recover the original information even if parts of it are corrupted. The basic idea is to introduce redundancy so that the original information can be recovered ...
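As a toy illustration of that redundancy idea (a sketch for this listing, not an example from the book), a 3-fold repetition code recovers each bit by majority vote as long as at most one of its three copies is corrupted:

    def encode(bits):
        # Repeat every bit three times: 1 -> 1,1,1 and 0 -> 0,0,0.
        return [b for bit in bits for b in (bit, bit, bit)]

    def decode(received):
        # Majority vote over each block of three received bits.
        return [1 if sum(received[i:i + 3]) >= 2 else 0
                for i in range(0, len(received), 3)]

    codeword = encode([1, 0, 1])   # [1,1,1, 0,0,0, 1,1,1]
    codeword[4] = 1                # corrupt one bit in the middle block
    print(decode(codeword))        # [1, 0, 1] -- original data recovered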

by Karl Petersen - AMS
The aim is to review the many facets of information, coding, and cryptography, including their uses throughout history and their mathematical underpinnings. Prerequisites include high-school mathematics and a willingness to deal with unfamiliar ideas.

by Renato Renner - ETH Zurich
The processing of information is necessarily a physical process, so it is not surprising that physics and the theory of information are inherently connected. Quantum information theory is a research area whose goal is to explore this connection.

by Alexander Shen - arXiv.org
Algorithmic information theory studies description complexity and randomness. This text covers the basic notions of algorithmic information theory: Kolmogorov complexity, Solomonoff universal a priori probability, effective Hausdorff dimension, etc.