**Algorithmic Information Theory**

by Gregory J. Chaitin

**Publisher**: Cambridge University Press 2003
**ISBN/ASIN**: 0521616042
**ISBN-13**: 9780521616041
**Number of pages**: 236

**Description**:

The aim of this book is to present the strongest possible version of Gödel's incompleteness theorem, using an information-theoretic approach based on the size of computer programs. One half of the book studies Omega, the halting probability of a universal computer whose program is chosen by tossing a coin. The other half encodes Omega as an algebraic equation in integers, a so-called exponential diophantine equation. Although the ideas in this book are not easy, the author has tried to present the material in the most concrete and direct fashion possible, with many examples and computer programs for key algorithms. In particular, the theory of program size in LISP presented in Chapter 5 and Appendix B, which has not appeared elsewhere, is intended as an illustration of the more abstract ideas in the later chapters.
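For orientation, the central object of the book can be stated compactly. In the standard setting (assumed here, not quoted from the book), the universal computer U is prefix-free, and the halting probability is:

```latex
\Omega_U \;=\; \sum_{p \,:\, U(p)\ \text{halts}} 2^{-|p|}
```

Here |p| is the length of program p in bits, so each halting program contributes the probability that coin tosses produce exactly that program; prefix-freeness ensures, via the Kraft inequality, that 0 < Ω < 1.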

Download or read it online for free here:

**Download link** (0.9MB, PDF)

## Similar books

**Algorithmic Information Theory**

by **Peter D. Gruenwald, Paul M.B. Vitanyi** - **CWI**

We introduce algorithmic information theory, also known as the theory of Kolmogorov complexity. We explain this quantitative approach to defining information and discuss the extent to which Kolmogorov's and Shannon's theory have a common purpose.
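The quantitative notion referred to here has a short standard definition (stated in common notation, as an orientation rather than a quote from the text): for a fixed universal machine U, the Kolmogorov complexity of a string x is the length of its shortest program,

```latex
K_U(x) \;=\; \min\{\, |p| \;:\; U(p) = x \,\}
```

so a string is "random" in this sense when no program much shorter than the string itself can produce it.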

(**4938** views)

**Lecture Notes on Network Information Theory**

by **Abbas El Gamal, Young-Han Kim** - **arXiv**

Network information theory deals with the fundamental limits on information flow in networks and optimal coding and protocols. These notes provide a broad coverage of key results, techniques, and open problems in network information theory.

(**8267** views)

**A Short Course in Information Theory**

by **David J. C. MacKay** - **University of Cambridge**

This text discusses the theorems of Claude Shannon, starting from the source coding theorem, and culminating in the noisy channel coding theorem. Along the way we will study simple examples of codes for data compression and error correction.
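The noisy channel coding theorem mentioned here asserts that reliable communication is possible at any rate below the channel capacity; in standard notation (an orientation, not a formula taken from these notes),

```latex
C \;=\; \max_{p(x)} I(X;Y)
```

where the maximum of the mutual information between channel input X and output Y is taken over all input distributions p(x).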

(**7357** views)

**Information Theory, Excess Entropy and Statistical Complexity**

by **David Feldman** - **College of the Atlantic**

This e-book is a brief tutorial on information theory, excess entropy and statistical complexity. From the table of contents: Background in Information Theory; Entropy Density and Excess Entropy; Computational Mechanics.
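The two quantities named in the title have compact standard definitions (given here in common notation as an orientation, not quoted from the tutorial): with H(L) the Shannon entropy of length-L blocks of a stationary process, the entropy density (entropy rate) and excess entropy are

```latex
h_\mu \;=\; \lim_{L \to \infty} \frac{H(L)}{L},
\qquad
\mathbf{E} \;=\; \lim_{L \to \infty} \bigl[\, H(L) - h_\mu L \,\bigr]
```

so h_mu measures the irreducible per-symbol randomness, while E measures the total structure, the amount by which short blocks appear more random than the process really is.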

(**7116** views)