Entropy and Information Theory


Publisher: Springer
ISBN/ASIN: 1441979697
Number of pages: 313

Description:
This book is devoted to the theory of probabilistic information measures and their application to coding theorems for information sources and noisy channels. The eventual goal is a general development of Shannon's mathematical theory of communication, but much of the space is devoted to the tools and methods required to prove the Shannon coding theorems. This is the only up-to-date treatment of traditional information theory emphasizing ergodic theory.
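As a small illustration of the probabilistic information measures the book develops, here is a minimal sketch of Shannon entropy for a discrete distribution (the function name and examples are illustrative, not taken from the book):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(X) = -sum_i p_i * log2(p_i), in bits.

    Terms with p = 0 are skipped, using the convention 0 * log 0 = 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly one bit per toss.
print(shannon_entropy([0.5, 0.5]))      # 1.0

# A biased coin is more predictable, so its entropy is lower.
print(shannon_entropy([0.9, 0.1]))

# A uniform distribution over 4 outcomes gives log2(4) = 2 bits.
print(shannon_entropy([0.25] * 4))      # 2.0
```

The coding theorems the book proves relate quantities like this one to the best achievable compression rate for an information source.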

Download or read it online for free (1.5 MB, PDF).

Similar books

Information, Entropy and Their Geometric Structures
Publisher: MDPI AG
The aim of this book is to provide an overview of current work addressing topics of research that explore the geometric structures of information and entropy. This survey will motivate readers to explore the emerging domain of Science of Information.
The Limits of Mathematics
Publisher: Springer
The final version of a course on algorithmic information theory and the epistemology of mathematics. The book discusses the nature of mathematics in the light of information theory, and sustains the thesis that mathematics is quasi-empirical.
Around Kolmogorov Complexity: Basic Notions and Results
Publisher: arXiv.org
Algorithmic information theory studies description complexity and randomness. This text covers the basic notions of algorithmic information theory: Kolmogorov complexity, Solomonoff universal a priori probability, effective Hausdorff dimension, etc.
Information Theory and Coding
Publisher: University of Cambridge
This course introduces the principles and applications of information theory: how information is measured in terms of probability and entropy, and the relationships among conditional and joint entropies.