
Logic and Information by Keith Devlin


Publisher: ESSLLI
ISBN/ASIN: 0521499712

Description:
An introductory, comparative account of three mathematical approaches to information: the classical quantitative theory of Claude Shannon, developed in the 1940s and 1950s; a qualitative theory grounded in that quantitative work, developed by Fred Dretske in the 1970s; and a qualitative theory introduced by Jon Barwise and John Perry in the early 1980s and pursued by Barwise, Israel, Devlin, Seligman, and others in the 1990s.


Download or read it online for free here:
Download link
(multiple PDF files)

Similar books

Generalized Information Measures and Their Applications
by - Universidade Federal de Santa Catarina
Contents: Shannon's Entropy; Information and Divergence Measures; Entropy-Type Measures; Generalized Information and Divergence Measures; M-Dimensional Divergence Measures and Their Generalizations; Unified (r,s)-Multivariate Entropies; etc.
(6436 views)
Essential Coding Theory
by - University at Buffalo
Error-correcting codes are clever ways of representing data so that one can recover the original information even if parts of it are corrupted. The basic idea is to introduce redundancy so that the original information can be recovered ...
(3944 views)
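The redundancy idea described in the blurb above can be illustrated with the simplest possible error-correcting code, a 3-fold repetition code (a toy sketch for illustration, not an example taken from the book itself):

```python
def encode(bits):
    """Add redundancy by repeating each bit three times."""
    return [b for b in bits for _ in range(3)]

def decode(coded):
    """Recover each original bit by majority vote over its three copies."""
    return [int(sum(coded[i:i + 3]) >= 2) for i in range(0, len(coded), 3)]

message = [1, 0, 1]
sent = encode(message)          # [1, 1, 1, 0, 0, 0, 1, 1, 1]
sent[4] = 1                     # corrupt one transmitted bit
assert decode(sent) == message  # the majority vote still recovers the message
```

Because each bit is sent three times, any single corrupted copy is outvoted by the other two; this is the redundancy-for-recovery trade-off that more sophisticated codes make far more efficient.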
A Mathematical Theory of Communication
by Claude Shannon
Shannon presents results previously found nowhere else, and today many professors refer to it as the best exposition on the subject of the mathematical limits on communication. It laid the modern foundations for what is now known as information theory.
(53320 views)
Information Theory and Coding
by - University of Cambridge
The aim of this course is to introduce the principles and applications of information theory: how information is measured in terms of probability and entropy, the relationships among conditional and joint entropies, and so on.
(14703 views)
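The course blurb above mentions measuring information in terms of probability and entropy; Shannon's entropy formula H(X) = -Σ p(x) log₂ p(x) can be computed directly (a minimal sketch, not code from the course):

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum(p * log2(p)) in bits; zero-probability
    outcomes contribute nothing and are skipped."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

fair = entropy([0.5, 0.5])    # a fair coin carries exactly 1 bit per toss
biased = entropy([0.9, 0.1])  # a predictable coin carries less, about 0.469 bits
```

The more predictable the source, the lower its entropy, which is why entropy is the natural measure of how much information each outcome conveys.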