Entropy and Information Theory
by Robert M. Gray

Publisher: Springer
ISBN/ASIN: 1441979697
Number of pages: 313

Description:
This book is devoted to the theory of probabilistic information measures and their application to coding theorems for information sources and noisy channels. The eventual goal is a general development of Shannon's mathematical theory of communication, but much of the space is devoted to the tools and methods required to prove the Shannon coding theorems. This is the only up-to-date treatment of traditional information theory emphasizing ergodic theory.
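
To make "probabilistic information measures" concrete, the most basic such measure is the Shannon entropy of a discrete distribution, H(X) = -sum_x p(x) log2 p(x), measured in bits. A minimal Python sketch (illustrative only, not taken from the book; the function name and example values are ours):

import math

def shannon_entropy(probs):
    # Entropy in bits of a discrete distribution given as a list of probabilities.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly one bit of uncertainty; a biased coin carries less.
print(shannon_entropy([0.5, 0.5]))  # 1.0
print(shannon_entropy([0.9, 0.1]))  # about 0.469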

Download or read it online for free here:
Download link (1.5MB, PDF)

Similar books

The Limits of Mathematics
by Gregory J. Chaitin - Springer
The final version of a course on algorithmic information theory and the epistemology of mathematics. The book discusses the nature of mathematics in the light of information theory, and sustains the thesis that mathematics is quasi-empirical.

Network Coding Theory
by Raymond W. Yeung, S.-Y. R. Li, N. Cai, Z. Zhang - Now Publishers Inc
A tutorial on the basics of the theory of network coding. It presents network coding for the transmission from a single source node, and deals with the problem under the more general circumstances when there are multiple source nodes.

Logic and Information
by - ESSLLI
An introductory, comparative account of three mathematical approaches to information: the classical quantitative theory of Claude Shannon, a qualitative theory developed by Fred Dretske, and a qualitative theory introduced by Barwise and Perry.

Around Kolmogorov Complexity: Basic Notions and Results
by Alexander Shen - arXiv.org
Algorithmic information theory studies description complexity and randomness. This text covers the basic notions of algorithmic information theory: Kolmogorov complexity, Solomonoff universal a priori probability, effective Hausdorff dimension, etc.