
Entropy and Information Theory

Publisher: Springer
ISBN/ASIN: 1441979697
Number of pages: 313

Description:
This book is devoted to the theory of probabilistic information measures and their application to coding theorems for information sources and noisy channels. The eventual goal is a general development of Shannon's mathematical theory of communication, but much of the space is devoted to the tools and methods required to prove the Shannon coding theorems. This is the only up-to-date treatment of traditional information theory emphasizing ergodic theory.
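
As a quick illustration of the probabilistic information measures at the heart of the book, here is a minimal sketch in plain Python (added for this listing, not taken from the book) that computes the Shannon entropy H(X) = -sum_x p(x) log2 p(x) of a discrete source:

    import math

    def shannon_entropy(probs):
        # Entropy in bits of a discrete distribution given as a list of probabilities.
        # Terms with p = 0 are skipped, following the convention 0 * log2(0) = 0.
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # A biased binary source carries less than 1 bit per symbol ...
    print(shannon_entropy([0.9, 0.1]))  # ~0.469 bits
    # ... while a fair coin attains the 1-bit maximum.
    print(shannon_entropy([0.5, 0.5]))  # 1.0 bit

The coding theorems developed in the book relate this quantity to the best achievable rates for compressing the source.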

Download or read it online for free (1.5MB, PDF).

Similar books

Around Kolmogorov Complexity: Basic Notions and Results
by - arXiv.org
Algorithmic information theory studies description complexity and randomness. This text covers the basic notions of algorithmic information theory: Kolmogorov complexity, Solomonoff universal a priori probability, effective Hausdorff dimension, etc.
Network Coding Theory
by - Now Publishers Inc
A tutorial on the basics of the theory of network coding. It presents network coding for transmission from a single source node, then treats the more general setting in which there are multiple source nodes.
Information Theory, Inference, and Learning Algorithms
by - Cambridge University Press
A textbook on information theory, Bayesian inference, and learning algorithms, useful for undergraduate and postgraduate students and as a reference for researchers. Essential reading for students of electrical engineering and computer science.
A Short Course in Information Theory
by - University of Cambridge
This text discusses the theorems of Claude Shannon, starting from the source coding theorem, and culminating in the noisy channel coding theorem. Along the way we will study simple examples of codes for data compression and error correction.
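
As a small companion to the noisy channel coding theorem mentioned in the description above, the sketch below (plain Python, an illustration added to this listing rather than material from any of the books) evaluates the capacity C = 1 - H2(p) of a binary symmetric channel with crossover probability p; the theorem guarantees that reliable communication is possible at any rate below C.

    import math

    def binary_entropy(p):
        # H2(p) in bits, with the convention 0 * log2(0) = 0.
        if p in (0.0, 1.0):
            return 0.0
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    def bsc_capacity(p):
        # Capacity of a binary symmetric channel with crossover probability p.
        return 1.0 - binary_entropy(p)

    # With a 10% crossover probability, rates up to ~0.531 bits per channel use are achievable.
    print(bsc_capacity(0.1))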