Logic and Information
by Keith Devlin
Publisher: ESSLLI 2001
An introductory, comparative account of three mathematical approaches to information: the classical quantitative theory of Claude Shannon, developed in the 1940s and 1950s; a qualitative theory, grounded in Shannon's quantitative framework, developed by Fred Dretske in the 1970s; and a qualitative theory introduced by Jon Barwise and John Perry in the early 1980s and pursued by Barwise, Israel, Devlin, Seligman, and others in the 1990s.
by Gregory J. Chaitin - Cambridge University Press
The book presents the strongest possible version of Gödel's incompleteness theorem, using an information-theoretic approach based on the size of computer programs. The author presents the material in the most direct fashion possible.
by David Feldman - College of the Atlantic
This e-book is a brief tutorial on information theory, excess entropy and statistical complexity. From the table of contents: Background in Information Theory; Entropy Density and Excess Entropy; Computational Mechanics.
by John Daugman - University of Cambridge
The aim of this course is to introduce the principles and applications of information theory. It studies how information is measured in terms of probability and entropy, the relationships among conditional and joint entropies, and related topics.
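The idea of measuring information in terms of probability and entropy can be sketched in a few lines. This is a generic illustration of Shannon entropy, not material taken from the course itself:

```python
from math import log2

def entropy(probs):
    """Shannon entropy H(X) = -sum(p * log2(p)), measured in bits.

    Terms with p == 0 contribute nothing, so they are skipped.
    """
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: one full bit per toss.
print(entropy([0.5, 0.5]))   # 1.0

# A heavily biased coin is more predictable, so each toss
# carries less information (about 0.47 bits).
print(entropy([0.9, 0.1]))
```

The biased coin shows the core intuition: the less surprising an outcome, the less information observing it conveys.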
by Martin Tomlinson, et al. - Springer
This book discusses both the theory and practical applications of self-correcting data, commonly known as error-correcting codes. The applications included demonstrate the importance of these codes in a wide range of everyday technologies.
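The principle behind error-correcting codes can be illustrated with the simplest example, a triple-repetition code; this toy sketch is not a code from the book, which covers far more efficient constructions:

```python
def encode(bits):
    # Repetition code: transmit each bit three times.
    return [b for bit in bits for b in (bit, bit, bit)]

def decode(coded):
    # Majority vote over each group of three corrects any
    # single flipped bit within that group.
    return [1 if sum(coded[i:i + 3]) >= 2 else 0
            for i in range(0, len(coded), 3)]

msg = [1, 0, 1]
sent = encode(msg)
sent[1] ^= 1                  # the channel corrupts one bit
assert decode(sent) == msg    # the error is corrected
```

Practical codes achieve the same protection with far less redundancy than tripling every bit, which is the subject the book develops.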