Entropy and Information Theory
by Robert M. Gray

Publisher: Springer
ISBN/ASIN: 1441979697
Number of pages: 313

Description:
This book is devoted to the theory of probabilistic information measures and their application to coding theorems for information sources and noisy channels. The eventual goal is a general development of Shannon's mathematical theory of communication, but much of the space is devoted to the tools and methods required to prove the Shannon coding theorems. This is the only up-to-date treatment of traditional information theory emphasizing ergodic theory.
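
As a minimal illustration of the probabilistic information measures the book builds on, here is a sketch of the standard Shannon entropy formula in Python (not code taken from the book itself):

    import math

    def shannon_entropy(probs):
        # Entropy in bits: H = -sum_x p(x) * log2 p(x); zero-probability
        # outcomes contribute nothing to the sum.
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(shannon_entropy([0.5, 0.5]))  # fair coin: 1.0 bit
    print(shannon_entropy([0.9, 0.1]))  # biased coin: about 0.47 bits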

Download or read it online for free (1.5 MB, PDF).

Similar books

A Short Course in Information Theory
by - University of Cambridge
This text discusses the theorems of Claude Shannon, starting from the source coding theorem, and culminating in the noisy channel coding theorem. Along the way we will study simple examples of codes for data compression and error correction.
(13635 views)
A primer on information theory, with applications to neuroscience
by - arXiv
This chapter gives a short introduction to the fundamentals of information theory, especially suited to readers with a less firm background in mathematics and probability theory. The focus is on neuroscientific topics.
(8844 views)
Essential Coding Theory
by - University at Buffalo
Error-correcting codes are clever ways of representing data so that one can recover the original information even if parts of it are corrupted. The basic idea is to introduce redundancy so that the original information can be recovered ... (a minimal repetition-code sketch illustrating this idea appears after this list).
(9136 views)
Information Theory, Excess Entropy and Statistical Complexity
by - College of the Atlantic
This e-book is a brief tutorial on information theory, excess entropy and statistical complexity. From the table of contents: Background in Information Theory; Entropy Density and Excess Entropy; Computational Mechanics.
(13701 views)
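
Referenced from the Essential Coding Theory entry above: a minimal Python sketch (not taken from any of these books) of a 3-repetition code, the simplest way to add redundancy so that a single corrupted bit can be recovered by majority vote:

    def encode(bits):
        # Repeat every bit three times: [1, 0] -> [1, 1, 1, 0, 0, 0]
        return [b for b in bits for _ in range(3)]

    def decode(received):
        # Majority vote over each group of three copies; this corrects any
        # single flipped bit within a group.
        return [1 if sum(received[i:i+3]) >= 2 else 0
                for i in range(0, len(received), 3)]

    codeword = encode([1, 0, 1])   # [1, 1, 1, 0, 0, 0, 1, 1, 1]
    codeword[4] = 1                # corrupt one transmitted bit
    print(decode(codeword))        # [1, 0, 1] -- original message recovered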