**Algorithmic Information Theory**

by Peter D. Gruenwald, Paul M.B. Vitanyi

**Publisher**: CWI 2007

**Number of pages**: 37

**Description**:

We introduce algorithmic information theory, also known as the theory of Kolmogorov complexity. We explain the main concepts of this quantitative approach to defining 'information'. We discuss the extent to which Kolmogorov's and Shannon's information theory have a common purpose, and where they are fundamentally different.
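The contrast the description draws can be made concrete with a small sketch. Kolmogorov complexity itself is uncomputable, but the length of any lossless compression of a string is an upper bound on it (up to an additive constant); here zlib is used as a crude stand-in compressor, purely as an illustrative assumption, to show that a regular string has a short description while typical random bytes do not.

```python
import os
import zlib

# A highly regular string: it has a short description ("'ab' repeated 500 times"),
# so its Kolmogorov complexity is small, and a generic compressor finds this.
regular = b"ab" * 500

# Typical random bytes: with high probability no description much shorter
# than the string itself exists, so compression barely helps.
random_ish = os.urandom(1000)

len_regular = len(zlib.compress(regular, 9))
len_random = len(zlib.compress(random_ish, 9))

print(len_regular)  # small: the repeating pattern compresses well
print(len_random)   # near (or above) 1000: essentially incompressible
```

This only bounds Kolmogorov complexity from above; no algorithm can compute it exactly, which is one of the points where the algorithmic and the Shannon-style probabilistic notions of information part ways.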

Download or read it online for free here:

**Download link**

(330KB, PDF)

## Similar books

**A primer on information theory, with applications to neuroscience**

by **Felix Effenberger** - **arXiv**

This chapter gives a short introduction to the fundamentals of information theory, especially suited for readers with a less firm background in mathematics and probability theory. The focus is on neuroscientific topics.

(**4655** views)

**The Limits of Mathematics**

by **Gregory J. Chaitin** - **Springer**

The final version of a course on algorithmic information theory and the epistemology of mathematics. The book discusses the nature of mathematics in the light of information theory, and sustains the thesis that mathematics is quasi-empirical.

(**7875** views)

**Algorithmic Information Theory**

by **Gregory J. Chaitin** - **Cambridge University Press**

The book presents the strongest possible version of Gödel's incompleteness theorem, using an information-theoretic approach based on the size of computer programs. The author tried to present the material in the most direct fashion possible.

(**8449** views)

**Logic and Information**

by **Keith Devlin** - **ESSLLI**

An introductory, comparative account of three mathematical approaches to information: the classical quantitative theory of Claude Shannon, a qualitative theory developed by Fred Dretske, and a qualitative theory introduced by Barwise and Perry.

(**7176** views)