Information Theory
This page contains resources about Information Theory in general.
More specific resources are collected on the pages for each subfield.
Subfields and Concepts
See Category:Information Theory for some of its subfields.
- Shannon entropy / Information entropy (entropy, cross entropy, and KL divergence are sketched after this list)
- Cross entropy / Joint entropy
- Conditional entropy
- Differential entropy
- Information content
- Mutual Information
- Relative entropy / Kullback-Leibler divergence / Information gain
- Perplexity
- Entropy encoding
- Huffman coding (sketched after this list)
- Arithmetic coding
- Algorithmic Information Theory
- Kolmogorov Complexity / Algorithmic Complexity
- Rademacher Complexity
- Algorithmic Probability / Solomonoff Probability
- Universal Search (by Levin)
- Algorithmic Randomness (by Martin-Löf)
- Solomonoff's Theory of Inductive Inference
- Epicurus' Principle of Multiple Explanations
- Occam's Razor
- Bayes' rule
- Universality probability
- Universal Turing Machine
- Minimum Description Length (MDL) principle
- Minimum Message Length (MML)
- Algorithmic Statistics
- Principle of Maximum Entropy
- Hamming distance
- Hamming code (sketched after this list)
- Wavelets
- Information bottleneck
- Neural Network Compression / Model Compression (pruning and quantization are sketched after this list)
- Node pruning
- Weight pruning / Connection pruning
- Quantization of weights
- Deep Compression
- Dynamic Network Surgery
- SqueezeNet Architecture
- Structured Sparsity Learning
- Soft-weight sharing
- Bayesian Compression
- Variational Dropout
- Coding Theory
- Data Compression / Source Coding
- Lossy Compression
- Lossless Compression
- Probabilistic Data Compression
- Prediction by partial matching (PPM)
- Sequence Memoizer
- Bayesian Networks
- Shannon's Source Coding Theorem / Noiseless Coding Theorem
- Error Correction / Channel Coding
- Cryptographic Coding
- Shannon–Hartley Theorem
- Noisy-Channel Coding Theorem
- Shannon Limit / Shannon Capacity
- Applications
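
Several of the quantities near the top of this list come down to one-line formulas over a discrete distribution. The following minimal Python sketch (independent of the libraries listed below; the function names and example distributions are illustrative only) computes Shannon entropy, cross entropy, KL divergence, and perplexity:

```python
import math

def entropy(p):
    """Shannon entropy H(p) = -sum_i p_i log2 p_i, in bits (0 log 0 taken as 0)."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def cross_entropy(p, q):
    """Cross entropy H(p, q) = -sum_i p_i log2 q_i (infinite if some q_i = 0 where p_i > 0)."""
    return -sum(pi * math.log2(qi) for pi, qi in zip(p, q) if pi > 0)

def kl_divergence(p, q):
    """Relative entropy D_KL(p || q) = H(p, q) - H(p); nonnegative, zero iff p = q."""
    return cross_entropy(p, q) - entropy(p)

p = [0.5, 0.25, 0.25]
q = [1/3, 1/3, 1/3]
print(entropy(p))           # 1.5 bits
print(cross_entropy(p, q))  # ~1.585 bits (log2 of 3)
print(kl_divergence(p, q))  # ~0.085 bits
print(2 ** entropy(p))      # perplexity of p: ~2.83
```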
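Huffman coding, the entropy-encoding scheme listed above, greedily merges the two least frequent subtrees until a single tree remains, yielding a prefix-free code that is optimal among symbol codes. A minimal sketch using Python's heapq; the symbol frequencies are made up for illustration:

```python
import heapq
from itertools import count

def huffman_code(freqs):
    """Build a prefix-free code {symbol: bitstring} from {symbol: frequency}."""
    # Heap entries are (total frequency, tiebreaker, tree); a tree is either
    # a symbol (leaf) or a (left, right) pair of subtrees.
    tiebreak = count()
    heap = [(f, next(tiebreak), sym) for sym, f in freqs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        f1, _, t1 = heapq.heappop(heap)
        f2, _, t2 = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, next(tiebreak), (t1, t2)))
    code = {}
    def walk(tree, prefix):
        if isinstance(tree, tuple):        # internal node: recurse both ways
            walk(tree[0], prefix + "0")
            walk(tree[1], prefix + "1")
        else:                              # leaf: a symbol
            code[tree] = prefix or "0"     # lone-symbol edge case
    walk(heap[0][2], "")
    return code

print(huffman_code({"a": 45, "b": 13, "c": 12, "d": 16, "e": 9, "f": 5}))
# e.g. {'a': '0', 'c': '100', 'b': '101', 'f': '1100', 'e': '1101', 'd': '111'}
```

Per Shannon's source coding theorem, the expected codeword length of such a code lies within one bit of the entropy of the frequency distribution.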
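Hamming distance and the single-error-correcting Hamming(7,4) code are also small enough to sketch directly. The bit layout below (parity bits at positions 1, 2, and 4) is one standard convention, not the only one; with it, the syndrome of a received word equals the 1-based position of a single flipped bit:

```python
def hamming_distance(a, b):
    """Number of positions at which two equal-length bit sequences differ."""
    assert len(a) == len(b)
    return sum(x != y for x, y in zip(a, b))

def hamming74_encode(d):
    """Encode 4 data bits [d1, d2, d3, d4] as 7 bits [p1, p2, d1, p3, d2, d3, d4]."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4    # even parity over positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4    # even parity over positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4    # even parity over positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c):
    """Correct at most one flipped bit in a 7-bit word, then return the 4 data bits."""
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3   # 1-based position of the error, 0 if none
    if syndrome:
        c = list(c)
        c[syndrome - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]

word = [1, 0, 1, 1]
sent = hamming74_encode(word)
received = list(sent)
received[5] ^= 1                          # flip one bit in transit
print(hamming_distance(sent, received))   # 1
print(hamming74_decode(received) == word) # True
```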
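Of the model-compression techniques above, magnitude-based weight pruning and weight quantization (combined, for example, in Deep Compression) can be illustrated in a few lines of NumPy. The random weight matrix, the 90% sparsity target, and the 16 quantization levels are arbitrary choices for this sketch, not values taken from the cited papers:

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(64, 64)).astype(np.float32)   # stand-in weight matrix

# Weight pruning: zero out the 90% of weights with smallest magnitude.
threshold = np.quantile(np.abs(W), 0.90)
W_pruned = np.where(np.abs(W) >= threshold, W, 0.0).astype(np.float32)

# Quantization: snap the surviving weights onto 16 evenly spaced levels,
# leaving pruned positions exactly zero.
mask = W_pruned != 0
vals = W_pruned[mask]
lo, hi = vals.min(), vals.max()
step = (hi - lo) / 15                               # 16 levels -> 15 steps
W_quant = np.zeros_like(W_pruned)
W_quant[mask] = np.round((vals - lo) / step) * step + lo

print("nonzero fraction:", np.count_nonzero(W_pruned) / W.size)   # ~0.10
print("distinct nonzero values:", np.unique(W_quant[mask]).size)  # <= 16
```

Real pipelines typically retrain between the pruning and quantization steps to recover accuracy, as in the Han et al. (2015) paper cited below.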
Online Courses
Video Lectures
- Information Theory and Coding by S. N. Merchant
- Information Theory, Pattern Recognition, and Neural Networks by David MacKay
- Information Theory by Raymond W. Yeung
- Probability, Information Theory and Bayesian Inference by Joaquin Quiñonero Candela
- Information, Entropy and Computation by Paul Penfield and Seth Lloyd (Notes)
Lecture Notes
- Information Theory by Tsachy Weissman
- Information Theory by Muriel Médard
- Information Theory by Yao Xie
- Advanced Topics in Information Theory by Stefan M. Moser
- A Short Course in Information Theory by David J.C. MacKay
- Information Theory by Radford Neal
- Information theory in computer science by Mark Braverman
- Information Theory in Computer Science by Anup Rao
- Information Theory and its applications in theory of computation by Venkatesan Guruswami and Mahdi Cheraghchi
- Information Theory by Iain Murray
- Information Theory by Cong Ling
- Network Information Theory by Abbas El Gamal
Books
See also Textbooks.
Introductory
- Moser, S. M., & Chen, P. N. (2012). A Student's Guide to Coding and Information Theory. Cambridge University Press.
- Gray, R. M. (2011). Entropy and Information Theory. Springer.
- Yeung, R. W. (2008). Information Theory and Network Coding. Springer.
- Cover, T. M., & Thomas, J. A. (2006). Elements of Information Theory. John Wiley & Sons.
Specialized
- El Gamal, A., & Kim, Y. H. (2011). Network Information Theory. Cambridge University Press.
- Merhav, N. (2010). Lecture Notes on Information Theory and Statistical Physics. Foundations and Trends® in Communications and Information Theory 6(1–2): 1–212.
- Anderson, D. R. (2008). "Chapter 3: Information Theory and Entropy". Model Based Inference in the Life Sciences. Springer New York.
- MacKay, D. J. (2003). Information Theory, Inference and Learning Algorithms. Cambridge University Press.
Scholarly Articles
- Louizos, C., Ullrich, K., & Welling, M. (2017). Bayesian Compression for Deep Learning. In Advances in Neural Information Processing Systems (pp. 3290-3300).
- Ullrich, K., Meeds, E., & Welling, M. (2017). Soft Weight-Sharing for Neural Network Compression. arXiv preprint arXiv:1702.04008.
- Molchanov, D., Ashukha, A., & Vetrov, D. (2017). Variational Dropout Sparsifies Deep Neural Networks. arXiv preprint arXiv:1701.05369.
- Wen, W., Wu, C., Wang, Y., Chen, Y., & Li, H. (2016). Learning Structured Sparsity in Deep Neural Networks. In Advances in Neural Information Processing Systems (pp. 2074-2082).
- Han, S., Mao, H., & Dally, W. J. (2015). Deep Compression: Compressing Deep Neural Networks with Pruning, Trained Quantization and Huffman Coding. arXiv preprint arXiv:1510.00149.
- Steinruecken, C. (2014). Lossless Data Compression. PhD dissertation, University of Cambridge.
- Alajaji, F., & Chen, P. N. (2013). Lecture Notes in Information Theory: Part I.
- Tishby, N., Pereira, F. C., & Bialek, W. (2000). The Information Bottleneck Method. arXiv preprint physics/0004057.
Software
- Information Theory Toolbox - MATLAB
- Octave-Information_Theory - Octave
- Module pyentropy - Python
- List of Compression Algorithms - Python
- Module PyNLPl.statistics - Python
- Information Theory and Signal Processing Library (libit) - C
- NSB Entropy Estimation
See also
- Compressed Sensing
- Machine Learning
- Signal Processing
- Stochastic Processes
- Probability Theory
- Statistical Learning Theory
Other Resources
- Information Theory - Google Scholar Metrics (Top Publications)
- Information Theory - Nature
- Video Tutorials - YouTube channel of 'Mathematical Monk'
- Soft weight-sharing for Neural Network Compression - GitHub
- Bayesian Compression for Deep Learning - GitHub
- Dynamic Network Surgery - GitHub
- Software by IEEE Information Theory Society
- Programming notes for Information Theory
- Information Theory on Wikiversity
- Information Theory - Notebooks