
A primer on information theory, with applications to neuroscience
by Felix Effenberger
Publisher: arXiv 2013
Number of pages: 58
Description:
This chapter gives a short introduction to the fundamentals of information theory, aimed especially (though not exclusively) at readers with a less firm background in mathematics and probability theory. Regarding applications, the focus is on neuroscientific topics.
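As a taste of the fundamentals such a primer covers, the Shannon entropy of a discrete distribution can be computed in a few lines. This is a minimal sketch; the function name and example distributions are illustrative and not taken from the book:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(X) = -sum_i p_i * log2(p_i), in bits.

    Terms with p = 0 are skipped, following the convention 0 * log 0 = 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit of uncertainty.
print(shannon_entropy([0.5, 0.5]))  # 1.0
# A biased coin is more predictable, so its entropy is lower.
print(shannon_entropy([0.9, 0.1]))
```

Entropy is maximal for the uniform distribution and drops as the outcome becomes more predictable, which is the intuition behind its use as a measure of neural coding capacity.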
Download or read it online for free: Download link (1MB, PDF)
Similar books
Conditional Rate Distortion Theory by Robert M. Gray - Information Systems Laboratory
The conditional rate-distortion function has proved useful in source coding problems involving the possession of side information. This book represents an early work on conditional rate distortion functions and related theory.
Generalized Information Measures and Their Applications by Inder Jeet Taneja - Universidade Federal de Santa Catarina
Contents: Shannon's Entropy; Information and Divergence Measures; Entropy-Type Measures; Generalized Information and Divergence Measures; M-Dimensional Divergence Measures and Their Generalizations; Unified (r,s)-Multivariate Entropies; etc.
A Short Course in Information Theory by David J. C. MacKay - University of Cambridge
This text discusses the theorems of Claude Shannon, starting from the source coding theorem, and culminating in the noisy channel coding theorem. Along the way we will study simple examples of codes for data compression and error correction.
Data Compression - Wikibooks
Data compression is useful in many situations because compressed data saves time (in reading and transmission) and space compared to the unencoded information it represents. In this book, we describe the decompressor first.