**Algorithmic Information Theory**

by Peter D. Grünwald, Paul M.B. Vitányi

**Publisher**: CWI 2007

**Number of pages**: 37

**Description**:

We introduce algorithmic information theory, also known as the theory of Kolmogorov complexity. We explain the main concepts of this quantitative approach to defining 'information'. We discuss the extent to which Kolmogorov's and Shannon's information theory have a common purpose, and where they are fundamentally different.
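The Kolmogorov-complexity idea described above can be illustrated with a toy sketch (an assumption for illustration, not code from the paper): any real compressor gives a crude, computable upper bound on the uncomputable Kolmogorov complexity, so a highly regular string compresses to far fewer bytes than a random one of the same length.

```python
import random
import zlib

# Toy illustration (not from the paper): compressed length is a computable
# upper bound on Kolmogorov complexity, which itself is uncomputable.
regular = b"ab" * 500  # describable by a short program: print "ab" 500 times

random.seed(0)
irregular = bytes(random.randrange(256) for _ in range(1000))  # little structure

print(len(regular), "->", len(zlib.compress(regular)))      # compresses well
print(len(irregular), "->", len(zlib.compress(irregular)))  # barely compresses
```

The gap between the two compressed sizes is the intuition behind defining the information content of an individual object by the length of its shortest description.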

Download or read it online for free here:

**Download link** (330KB, PDF)

## Similar books

**Network Coding Theory**

by **Raymond Yeung, S-Y Li, N Cai** - **Now Publishers Inc**

A tutorial on the basics of the theory of network coding. It presents network coding for transmission from a single source node, then treats the more general setting with multiple source nodes.

(12841 views)

**Information Theory and Coding**

by **John Daugman** - **University of Cambridge**

The aims of this course are to introduce the principles and applications of information theory. The course studies how information is measured in terms of probability and entropy, and the relationships among conditional and joint entropies, among other topics.
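The entropy quantities this course covers can be sketched in a few lines (a minimal illustration under standard definitions, not material from the course itself): entropy measures average surprise in bits, and for independent variables the joint entropy is the sum of the marginal entropies, a special case of the chain rule H(X, Y) = H(X) + H(Y | X).

```python
import math

def entropy(p):
    # Shannon entropy in bits; zero-probability outcomes contribute nothing.
    return -sum(q * math.log2(q) for q in p if q > 0)

# A fair coin carries exactly 1 bit per toss; a biased coin carries less.
print(entropy([0.5, 0.5]))   # 1.0
print(entropy([0.9, 0.1]))   # ~0.469

# For independent X and Y, H(Y | X) = H(Y), so the chain rule reduces to
# H(X, Y) = H(X) + H(Y): the joint entropy is just the sum.
px, py = [0.5, 0.5], [0.25, 0.75]
joint = [a * b for a in px for b in py]
print(entropy(joint))        # equals entropy(px) + entropy(py)
```

Here `entropy` is a hypothetical helper named for illustration; the numeric check of the chain rule mirrors the identities the course derives analytically.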

(16065 views)

**A Short Course in Information Theory**

by **David J. C. MacKay** - **University of Cambridge**

This text discusses the theorems of Claude Shannon, starting from the source coding theorem, and culminating in the noisy channel coding theorem. Along the way we will study simple examples of codes for data compression and error correction.

(9258 views)

**A primer on information theory, with applications to neuroscience**

by **Felix Effenberger** - **arXiv**

This chapter gives a short introduction to the fundamentals of information theory, especially suited for readers with a less firm background in mathematics and probability theory. The focus is on neuroscientific topics.

(5145 views)