Logic and Information
by Keith Devlin
Publisher: ESSLLI 2001
An introductory, comparative account of three mathematical approaches to information: the classical quantitative theory of Claude Shannon, developed in the 1940s and 50s; the quantitatively based qualitative theory developed by Fred Dretske in the 1970s; and the qualitative theory introduced by Jon Barwise and John Perry in the early 1980s and pursued by Barwise, Israel, Devlin, Seligman, and others in the 1990s.
Generalized Information Measures and Their Applications
by Inder Jeet Taneja - Universidade Federal de Santa Catarina
Contents: Shannon's Entropy; Information and Divergence Measures; Entropy-Type Measures; Generalized Information and Divergence Measures; M-Dimensional Divergence Measures and Their Generalizations; Unified (r,s)-Multivariate Entropies; etc.
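The first two measures in that list have one-line definitions; as a minimal sketch (an illustration added here, not drawn from the book, with function names of our own choosing), Shannon's entropy and the Kullback-Leibler divergence can be computed as:

    import math

    def shannon_entropy(p):
        # H(p) = -sum_i p_i * log2(p_i), in bits; terms with p_i = 0 contribute 0.
        return -sum(x * math.log2(x) for x in p if x > 0)

    def kl_divergence(p, q):
        # D(p || q) = sum_i p_i * log2(p_i / q_i); assumes q_i > 0 wherever p_i > 0.
        return sum(x * math.log2(x / y) for x, y in zip(p, q) if x > 0)

    print(shannon_entropy([0.5, 0.5]))            # 1.0 bit for a fair coin
    print(kl_divergence([0.5, 0.5], [0.9, 0.1]))  # positive: the distributions differ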
Essential Coding Theory
by Venkatesan Guruswami, Atri Rudra, Madhu Sudan - University at Buffalo
Error-correcting codes are clever ways of representing data so that one can recover the original information even if parts of it are corrupted. The basic idea is to introduce redundancy so that the original information can be recovered ...
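To make the redundancy idea concrete, here is a minimal sketch (our illustration, not an excerpt from the book) of a 3-fold repetition code, the simplest error-correcting code: each bit is sent three times, and a majority vote corrects any single flipped bit per block.

    def encode(bits):
        # Redundancy by repetition: each bit is transmitted three times.
        return [b for bit in bits for b in (bit, bit, bit)]

    def decode(received):
        # Majority vote over each block of three recovers the original bit
        # as long as at most one copy per block was corrupted.
        return [1 if sum(received[i:i + 3]) >= 2 else 0
                for i in range(0, len(received), 3)]

    codeword = encode([1, 0, 1])  # [1, 1, 1, 0, 0, 0, 1, 1, 1]
    corrupted = codeword[:]
    corrupted[1] = 0              # corrupt one bit of the first block
    assert decode(corrupted) == [1, 0, 1]  # the original data is recovered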
A Mathematical Theory of Communication
by Claude Shannon
Shannon presents results previously found nowhere else, and today many professors refer to it as the best exposition on the subject of the mathematical limits on communication. It laid the modern foundations for what is now known as information theory.
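One such limit is channel capacity. As an added illustration (not from the paper's text, though the formula is the standard one for this channel), the capacity of a binary symmetric channel with crossover probability p is C = 1 - H(p) bits per use, where H is the binary entropy function:

    import math

    def bsc_capacity(p):
        # Capacity of a binary symmetric channel with crossover probability p:
        # C = 1 - H(p) = 1 + p*log2(p) + (1 - p)*log2(1 - p) bits per channel use.
        if p in (0.0, 1.0):
            return 1.0
        return 1.0 + p * math.log2(p) + (1 - p) * math.log2(1 - p)

    print(bsc_capacity(0.0))   # 1.0: a noiseless channel carries one bit per use
    print(bsc_capacity(0.11))  # ~0.5: noise halves the usable rate
    print(bsc_capacity(0.5))   # 0.0: output independent of input, no information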
Information Theory and Coding
by John Daugman - University of Cambridge
The aim of this course is to introduce the principles and applications of information theory. The course studies how information is measured in terms of probability and entropy, the relationships among conditional and joint entropies, and related topics.
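As a small sketch of those relationships (our illustration, not course material), the chain rule H(X,Y) = H(X) + H(Y|X) can be checked numerically on a toy joint distribution:

    import math
    from collections import defaultdict

    # Toy joint distribution p(x, y) over two binary random variables.
    joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}

    def H(dist):
        # Shannon entropy (bits) of a distribution given as {outcome: probability}.
        return -sum(p * math.log2(p) for p in dist.values() if p > 0)

    # Marginal p(x), then H(Y|X) = sum_x p(x) * H(Y | X = x).
    px = defaultdict(float)
    for (x, y), p in joint.items():
        px[x] += p
    h_y_given_x = sum(
        px[x] * H({y: joint[(x, y)] / px[x] for y in (0, 1)}) for x in px)

    # Chain rule: H(X, Y) = H(X) + H(Y|X).
    assert abs(H(joint) - (H(px) + h_y_given_x)) < 1e-9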