Algorithmic Information Theory
by Gregory J. Chaitin
Publisher: Cambridge University Press 2003
Number of pages: 236
The aim of this book is to present the strongest possible version of Gödel's incompleteness theorem, using an information-theoretic approach based on the size of computer programs. One half of the book is concerned with studying Omega, the halting probability of a universal computer whose program is chosen by tossing a coin. The other half is concerned with encoding Omega as an algebraic equation in integers, a so-called exponential diophantine equation. Although the ideas in this book are not easy, it tries to present the material in the most concrete and direct fashion possible, giving many examples and computer programs for key algorithms. In particular, the theory of program size in LISP presented in Chapter 5 and Appendix B, which has not appeared elsewhere, is intended as an illustration of the more abstract ideas in the following chapters.
Download or read it online for free here:
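To make the idea of a halting probability concrete, here is a hedged toy sketch (my own illustration, not from Chaitin's book): Omega is the probability that a universal prefix machine halts when its program bits are chosen by coin tosses. The `halts` rule below is an arbitrary made-up stand-in for a real machine; summing 2^(-len(p)) over a prefix-free set of halting programs yields a lower bound on this toy machine's Omega.

```python
from itertools import product

def halts(program: str) -> bool:
    # Toy stand-in for a halting test: any program beginning with '1' halts.
    # (For a real universal machine this question is undecidable.)
    return program.startswith("1")

def omega_lower_bound(max_len: int) -> float:
    total = 0.0
    claimed = []  # prefix-free set of halting programs counted so far
    for n in range(1, max_len + 1):
        for bits in product("01", repeat=n):
            p = "".join(bits)
            # Skip extensions of already-counted programs so the set stays
            # prefix-free and the Kraft sum stays at most 1.
            if any(p.startswith(q) for q in claimed):
                continue
            if halts(p):
                claimed.append(p)
                total += 2.0 ** (-n)
    return total

print(omega_lower_bound(8))  # 0.5: only the single program "1" is counted
```

For a genuine universal machine one can only enumerate halting programs by dovetailing, obtaining better and better lower bounds that converge to Omega uncomputably slowly.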
by Peter D. Gruenwald, Paul M.B. Vitanyi - CWI
We introduce algorithmic information theory, also known as the theory of Kolmogorov complexity. We explain this quantitative approach to defining information and discuss the extent to which Kolmogorov's and Shannon's theories share a common purpose.
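A minimal sketch of the quantitative idea (my own illustration, not from the survey): the Kolmogorov complexity K(x) is uncomputable, but any compressor gives a computable upper bound, since a fixed decompressor plus the compressed data constitutes a program that prints x. Patterned strings get short descriptions; random strings do not.

```python
import os
import zlib

def compressed_len(data: bytes) -> int:
    """Length in bytes of the zlib-compressed form of `data`:
    a computable upper bound (up to a constant) on its complexity."""
    return len(zlib.compress(data, 9))

regular = b"ab" * 500     # highly patterned: a short description suffices
noise = os.urandom(1000)  # typical random bytes: essentially incompressible

print(compressed_len(regular))  # small
print(compressed_len(noise))    # close to (or slightly above) 1000
```

No compressor can give a matching lower bound: proving K(x) is large for a specific x runs into the incompleteness phenomena the literature above discusses.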
by Abbas El Gamal, Young-Han Kim - arXiv
Network information theory deals with the fundamental limits on information flow in networks and the coding schemes and protocols that achieve them. These notes provide broad coverage of key results, techniques, and open problems in network information theory.
by David J. C. MacKay - University of Cambridge
This text discusses the theorems of Claude Shannon, starting from the source coding theorem, and culminating in the noisy channel coding theorem. Along the way we will study simple examples of codes for data compression and error correction.
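The two endpoints of that story can be sketched in a few lines (my own illustration, not MacKay's code): the binary entropy H2(p) is the source coding limit for a biased coin, and C = 1 - H2(f) is the capacity of a binary symmetric channel with flip probability f, the noisy-channel coding limit.

```python
from math import log2

def binary_entropy(p: float) -> float:
    """Shannon entropy in bits of a coin with heads probability p."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(f: float) -> float:
    """Capacity in bits per use of a binary symmetric channel
    that flips each transmitted bit with probability f."""
    return 1.0 - binary_entropy(f)

print(binary_entropy(0.5))  # 1.0 bit: a fair coin is incompressible
print(bsc_capacity(0.1))    # ~0.531 bits per channel use
```

Shannon's theorems say these numbers are achievable in the limit of long block lengths; the codes studied along the way (Hamming, Huffman, and their successors) are attempts to approach them in practice.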
by David Feldman - College of the Atlantic
This e-book is a brief tutorial on information theory, excess entropy, and statistical complexity. From the table of contents: Background in Information Theory; Entropy Density and Excess Entropy; Computational Mechanics.
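The central quantities can be estimated empirically; here is a hedged sketch (my own, not Feldman's code). The entropy density (entropy rate) is the limit of H(L) - H(L-1), where H(L) is the Shannon entropy of length-L blocks, estimated below from block frequencies of a sample sequence.

```python
from collections import Counter
from math import log2

def block_entropy(seq: str, L: int) -> float:
    """Empirical Shannon entropy, in bits, of length-L blocks of seq."""
    blocks = Counter(seq[i:i + L] for i in range(len(seq) - L + 1))
    n = sum(blocks.values())
    return -sum((c / n) * log2(c / n) for c in blocks.values())

# Period-2 sequence: once one symbol is seen, the rest is determined,
# so the entropy rate is 0 while H(L) saturates at 1 bit.
seq = "01" * 200
for L in (1, 2, 3):
    print(L, block_entropy(seq, L))
# H(L) stays near 1 bit for every L, so the estimated entropy
# rate H(2) - H(1) is approximately 0.
```

The excess entropy is then the amount by which the block entropies H(L) exceed L times the entropy rate; for this periodic example it converges to 1 bit, the memory stored in the phase of the pattern.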