Information theory is the scientific study of the quantification, storage, and communication of information.[1] The field was originally established by the works of Harry Nyquist and Ralph Hartley in the 1920s, and Claude Shannon in the 1940s.[2]: vii  The field is at the intersection of probability theory, statistics, computer science, statistical mechanics, information engineering, and electrical engineering.

A key measure in information theory is entropy. Entropy quantifies the amount of uncertainty involved in the value of a random variable or the outcome of a random process. For example, identifying the outcome of a fair coin flip (with two equally likely outcomes) provides less information (lower entropy) than specifying the outcome from a roll of a die (with six equally likely outcomes). Some other important measures in information theory are mutual information, channel capacity, error exponents, and relative entropy. Important sub-fields of information theory include source coding, algorithmic complexity theory, algorithmic information theory and information-theoretic security.
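For concreteness, the entropy of a discrete distribution with probabilities p_i is H = −Σ p_i log₂ p_i, measured in bits. The following minimal Python sketch (the helper name shannon_entropy is used here only for illustration) reproduces the coin-versus-die comparison above:

```python
import math

def shannon_entropy(probabilities):
    """Entropy H = -sum(p * log2(p)) in bits, skipping zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin: two equally likely outcomes -> exactly 1 bit of entropy.
print(shannon_entropy([0.5, 0.5]))    # 1.0

# A fair six-sided die: six equally likely outcomes -> log2(6) ≈ 2.585 bits.
print(shannon_entropy([1/6] * 6))     # ~2.585
```

The fair coin yields exactly 1 bit, while the fair die yields log₂ 6 ≈ 2.585 bits, matching the intuition that the die's outcome is the more uncertain of the two.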

Applications of fundamental topics of information theory include source coding/data compression (e.g. for ZIP files), and channel coding/error detection and correction (e.g. for DSL). Its impact has been crucial to the success of the Voyager missions to deep space, the invention of the compact disc, the feasibility of mobile phones and the development of the Internet. The theory has also found applications in other areas, including statistical inference,[3] cryptography, neurobiology,[4] perception,[5] linguistics, the evolution[6] and function[7] of molecular codes (bioinformatics), thermal physics,[8] molecular dynamics,[9] quantum computing, black holes, information retrieval, intelligence gathering, plagiarism detection,[10] pattern recognition, anomaly detection[11] and even art creation.
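As a simple illustration of the source-coding point, the sketch below uses Python's standard-library zlib module (an implementation of DEFLATE, the same algorithm used in ZIP files) to show that highly redundant data compresses to far fewer bytes than data that is already close to random:

```python
import os
import zlib

redundant = b"ABAB" * 1000        # highly redundant input: 4000 bytes
random_like = os.urandom(4000)    # essentially incompressible: 4000 random bytes

print(len(zlib.compress(redundant)))     # a few dozen bytes
print(len(zlib.compress(random_like)))   # roughly 4000 bytes, i.e. no gain
```

The redundant input has low entropy per symbol, so a lossless coder can shrink it dramatically; the random input is already near the entropy limit, so no lossless coder can compress it significantly.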

Contents
1 Overview
2 Historical background
3 Quantities of information
3.1 Entropy of an information source
3.2 Joint entropy
3.3 Conditional entropy (equivocation)
3.4 Mutual information (transinformation)
3.5 Kullback–Leibler divergence (information gain)
3.6 Directed information
3.7 Other quantities
4 Coding theory
4.1 Source theory
4.1.1 Rate
4.2 Channel capacity
4.2.1 Capacity of particular channel models
4.2.2 Channels with memory and directed information
5 Applications to other fields
5.1 Intelligence uses and secrecy applications
5.2 Pseudorandom number generation
5.3 Seismic exploration
5.4 Semiotics
5.5 Integrated process organization of neural information
5.6 Miscellaneous applications
6 See also
6.1 Applications
6.2 History
6.3 Theory
6.4 Concepts
7 References
8 Further reading
8.1 The classic work
8.2 Other journal articles
8.3 Textbooks on information theory
8.4 Other books
