**Logic and Information**

by Keith Devlin

**Publisher**: ESSLLI 2001
**ISBN/ASIN**: 0521499712

**Description**:

An introductory, comparative account of three mathematical approaches to information: the classical quantitative theory of Claude Shannon, developed in the 1940s and 50s; a quantitatively based qualitative theory developed by Fred Dretske in the 1970s; and a qualitative theory introduced by Jon Barwise and John Perry in the early 1980s and pursued by Barwise, Israel, Devlin, Seligman, and others in the 1990s.

Download or read it online for free here:

**Download link**

(multiple PDF files)

## Similar books

**Generalized Information Measures and Their Applications**

by **Inder Jeet Taneja** - **Universidade Federal de Santa Catarina**

Contents: Shannon's Entropy; Information and Divergence Measures; Entropy-Type Measures; Generalized Information and Divergence Measures; M-Dimensional Divergence Measures and Their Generalizations; Unified (r,s)-Multivariate Entropies; etc.

(6436 views)

**Essential Coding Theory**

by **Venkatesan Guruswami, Atri Rudra, Madhu Sudan** - **University at Buffalo**

Error-correcting codes are clever ways of representing data so that one can recover the original information even if parts of it are corrupted. The basic idea is to introduce redundancy so that the original information can be recovered ...
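The redundancy idea described above can be sketched with the simplest possible error-correcting code, a 3-fold repetition code (an illustrative example, not from the book; real codes such as Hamming or Reed-Solomon codes are far more efficient):

```python
# A minimal sketch of redundancy-based error correction: each bit is
# sent three times, and a majority vote over each group of three
# recovers the original bit even if one copy is flipped by noise.

def encode(bits):
    """Repeat every bit three times."""
    return [b for b in bits for _ in range(3)]

def decode(received):
    """Majority-vote each group of three received bits."""
    return [1 if sum(received[i:i + 3]) >= 2 else 0
            for i in range(0, len(received), 3)]

message = [1, 0, 1, 1]
sent = encode(message)           # [1,1,1, 0,0,0, 1,1,1, 1,1,1]
corrupted = sent[:]
corrupted[1] ^= 1                # flip one copy of the first bit
corrupted[3] ^= 1                # flip one copy of the second bit
assert decode(corrupted) == message  # both errors are corrected
```

The price of this robustness is rate: three channel bits carry one message bit, and the theory surveyed in books like this one studies how to do much better.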

(3944 views)

**A Mathematical Theory of Communication**

by **Claude Shannon**

Shannon presents results previously found nowhere else, and today many professors refer to it as the best exposition of the mathematical limits on communication. It laid the modern foundations for what is now called information theory.

(53320 views)

**Information Theory and Coding**

by **John Daugman** - **University of Cambridge**

The aims of this course are to introduce the principles and applications of information theory. The course will study how information is measured in terms of probability and entropy, and the relationships among conditional and joint entropies; etc.
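The course description mentions measuring information in terms of probability and entropy; the core quantity is Shannon entropy, H(X) = -Σ p(x) log₂ p(x), which this small sketch (not from the course materials) computes in bits:

```python
import math

def entropy(probs):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: one bit per toss.
print(entropy([0.5, 0.5]))   # 1.0
# A biased coin is more predictable, so each toss carries less information.
print(entropy([0.9, 0.1]))   # about 0.47 bits
```

Conditional and joint entropies, which the course also covers, are built from the same -p log₂ p terms applied to conditional and joint distributions.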

(14703 views)