Around Kolmogorov Complexity: Basic Notions and Results
by Alexander Shen
Publisher: arXiv.org, 2015
Number of pages: 51
Algorithmic information theory studies description complexity and randomness and is now a well-known field of theoretical computer science and mathematical logic. This report covers the basic notions of algorithmic information theory: Kolmogorov complexity (plain, conditional, prefix), Solomonoff's universal a priori probability, notions of randomness, and effective Hausdorff dimension.
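The Kolmogorov complexity of a string (the length of its shortest description) is uncomputable, but the output length of any real compressor gives a computable upper bound on it, up to an additive constant depending on the decompressor. A minimal Python sketch of that intuition, using zlib; the particular test strings are illustrative choices, not examples from the report:

```python
import os
import zlib

def compressed_length(s: bytes) -> int:
    """Length of the zlib-compressed string: a computable upper bound
    (up to an additive constant for the decompressor) on the
    Kolmogorov complexity of s."""
    return len(zlib.compress(s, 9))

# A highly regular string has a short description...
regular = b"ab" * 500
# ...while random bytes are incompressible with overwhelming probability.
irregular = os.urandom(1000)

assert compressed_length(regular) < compressed_length(irregular)
```

The inequality illustrates the core idea: strings with visible regularity admit short descriptions, while "random" strings do not, and a compressor certifies only the upper bound, never the true complexity.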
by Inder Jeet Taneja - Universidade Federal de Santa Catarina
Contents: Shannon's Entropy; Information and Divergence Measures; Entropy-Type Measures; Generalized Information and Divergence Measures; M-Dimensional Divergence Measures and Their Generalizations; Unified (r,s)-Multivariate Entropies; etc.
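Shannon's entropy and the divergence measures listed above share a common form as sums over a probability distribution. A small self-contained Python sketch of the two simplest instances, H(X) = -Σ p(x) log2 p(x) and the Kullback-Leibler divergence D(p‖q) = Σ p_i log2(p_i/q_i); the function names and test data are illustrative, not from the book:

```python
import math
from collections import Counter

def shannon_entropy(data) -> float:
    """Empirical entropy H(X) = -sum p(x) log2 p(x), in bits per symbol."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def kl_divergence(p, q) -> float:
    """D(p || q) = sum p_i log2(p_i / q_i); assumes q_i > 0 wherever p_i > 0."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# A uniform source over 4 symbols carries exactly 2 bits per symbol.
assert abs(shannon_entropy("abcd" * 100) - 2.0) < 1e-9
# KL divergence is zero iff the distributions coincide, positive otherwise.
assert kl_divergence([0.5, 0.5], [0.5, 0.5]) == 0.0
assert kl_divergence([0.5, 0.5], [0.25, 0.75]) > 0.0
```

The generalized (r,s)-entropies treated in the book recover Shannon's entropy as a limiting case, so this pair of functions is the base point for the whole family of measures in the contents list.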
by John Watrous - University of Calgary
The focus is on the mathematical theory of quantum information. We will begin with basic principles and methods for reasoning about quantum information, and then move on to a discussion of various results concerning quantum information.
by David J. C. MacKay - Cambridge University Press
A textbook on information theory, Bayesian inference, and learning algorithms, useful for undergraduate and postgraduate students and as a reference for researchers. Essential reading for students of electrical engineering and computer science.
by Robert M. Gray - Information Systems Laboratory
The conditional rate-distortion function has proved useful in source coding problems involving side information. This book represents an early work on conditional rate-distortion functions and the related theory.