Conditional Rate Distortion Theory
by Robert M. Gray
Publisher: Information Systems Laboratory 1972
Number of pages: 22
The conditional rate-distortion function has proved useful in source coding problems involving side information. This report is an early work on conditional rate-distortion functions and related theory, and it provided supporting details for published papers on the topic.
by Peter D. Gruenwald, Paul M.B. Vitanyi - CWI
We introduce algorithmic information theory, also known as the theory of Kolmogorov complexity. We explain this quantitative approach to defining information and discuss the extent to which Kolmogorov's and Shannon's theories have a common purpose.
by Gregory J. Chaitin - World Scientific
In this mathematical autobiography, Gregory Chaitin presents a technical survey of his work and a non-technical discussion of its significance. The technical survey contains many new results, including a detailed discussion of LISP program size.
by Gregory J. Chaitin - Springer
The final version of a course on algorithmic information theory and the epistemology of mathematics. The book discusses the nature of mathematics in the light of information theory, and sustains the thesis that mathematics is quasi-empirical.
by Inder Jeet Taneja - Universidade Federal de Santa Catarina
Contents: Shannon's Entropy; Information and Divergence Measures; Entropy-Type Measures; Generalized Information and Divergence Measures; M-Dimensional Divergence Measures and Their Generalizations; Unified (r,s)-Multivariate Entropies; etc.