Timeline of information theory
A timeline of events related to information theory, quantum information theory and statistical physics, data compression, error-correcting codes and related subjects.
- 1872 – Ludwig Boltzmann presents his H-theorem, and with it the formula Σpᵢ log pᵢ for the entropy of a single gas particle
- 1878 – J. Willard Gibbs defines the Gibbs entropy: the probabilities in the entropy formula are now taken as probabilities of the state of the whole system
- 1924 – Harry Nyquist discusses quantifying "intelligence" and the speed at which it can be transmitted by a communication system
- 1927 – John von Neumann defines the von Neumann entropy, extending the Gibbs entropy to quantum mechanics
- 1928 – Ralph Hartley introduces Hartley information as the logarithm of the number of possible messages, with information being communicated when the receiver can distinguish one sequence of symbols from any other (regardless of any associated meaning)
- 1929 – Leó Szilárd analyses Maxwell's demon, showing how a Szilard engine can sometimes transform information into the extraction of useful work
- 1940 – Alan Turing introduces the deciban as a measure of information inferred about the German Enigma machine cypher settings by the Banburismus process
- 1944 – Claude Shannon's theory of information is substantially complete
- 1947 – Richard W. Hamming invents Hamming codes for error detection and correction (to protect patent rights, the result is not published until 1950)
- 1948 – Claude E. Shannon publishes A Mathematical Theory of Communication
- 1949 – Claude E. Shannon publishes Communication in the Presence of Noise – Nyquist–Shannon sampling theorem and Shannon–Hartley law
- 1949 – Claude E. Shannon's Communication Theory of Secrecy Systems is declassified
- 1949 – Robert M. Fano publishes Transmission of Information. M.I.T. Press, Cambridge, Massachusetts – Shannon–Fano coding
- 1949 – Leon G. Kraft discovers Kraft's inequality, which shows the limits of prefix codes
- 1949 – Marcel J. E. Golay introduces Golay codes for forward error correction
- 1951 – Solomon Kullback and Richard Leibler introduce the Kullback–Leibler divergence
- 1951 – David A. Huffman invents Huffman encoding, a method of finding optimal prefix codes for lossless data compression
- 1953 – August Albert Sardinas and George W. Patterson devise the Sardinas–Patterson algorithm, a procedure to decide whether a given variable-length code is uniquely decodable
- 1954 – Irving S. Reed and David E. Muller propose Reed–Muller codes
- 1955 – Peter Elias introduces convolutional codes
- 1957 – Eugene Prange first discusses cyclic codes
- 1959 – Alexis Hocquenghem, and independently the next year Raj Chandra Bose and Dwijendra Kumar Ray-Chaudhuri, discover BCH codes
- 1960 – Irving S. Reed and Gustave Solomon propose Reed–Solomon codes
- 1962 – Robert G. Gallager proposes low-density parity-check codes; they are unused for 30 years due to technical limitations
- 1965 – Dave Forney discusses concatenated codes
- 1966 – Fumitada Itakura (Nagoya University) and Shuzo Saito (Nippon Telegraph and Telephone) develop linear predictive coding (LPC), a form of speech coding[1]
- 1967 – Andrew Viterbi reveals the Viterbi algorithm, making decoding of convolutional codes practicable
- 1968 – Elwyn Berlekamp invents the Berlekamp–Massey algorithm; its application to decoding BCH and Reed–Solomon codes is pointed out by James L. Massey the following year
- 1968 – Chris Wallace and David M. Boulton publish the first of many papers on Minimum Message Length (MML) statistical and inductive inference
- 1970 – Valerii Denisovich Goppa introduces Goppa codes
- 1972 – Jørn Justesen proposes Justesen codes, an improvement of Reed–Solomon codes
- 1972 – Nasir Ahmed proposes the discrete cosine transform (DCT), which he develops with T. Natarajan and K. R. Rao in 1973;[2] the DCT later became the most widely used lossy compression algorithm, the basis for multimedia formats such as JPEG, MPEG and MP3
- 1973 – David Slepian and Jack Wolf discover and prove the Slepian–Wolf coding limits for distributed source coding[3]
- 1976 – Gottfried Ungerboeck gives the first paper on trellis modulation; a more detailed exposition in 1982 leads to a raising of analogue modem POTS speeds from 9.6 kbit/s to 33.6 kbit/s
- 1976 – Richard Pasco and Jorma J. Rissanen develop effective arithmetic coding techniques
- 1977 – Abraham Lempel and Jacob Ziv develop Lempel–Ziv compression (LZ77)
- 1982 – Valerii Denisovich Goppa introduces algebraic geometry codes
- 1989 – Phil Katz publishes the .zip format including DEFLATE (LZ77 + Huffman coding); later to become the most widely used archive container
- 1993 – Claude Berrou, Alain Glavieux and Punya Thitimajshima introduce Turbo codes
- 1994 – Michael Burrows and David Wheeler publish the Burrows–Wheeler transform, later to find use in bzip2
- 1995 – Benjamin Schumacher coins the term qubit and proves the quantum noiseless coding theorem
- 2003 – David J. C. MacKay shows the connection between information theory, inference and machine learning in his book Information Theory, Inference, and Learning Algorithms
- 2006 – Jarosław Duda introduces the first asymmetric numeral systems (ANS) entropy coding; since 2014 a popular replacement for Huffman and arithmetic coding in compressors such as Facebook's Zstandard, Apple's LZFSE, CRAM and JPEG XL
- 2008 – Erdal Arıkan introduces polar codes, the first practical construction of codes that achieves capacity for a wide array of channels
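The 1872–1878 entries fix the entropy formula that Shannon's 1948 paper later reuses in bits. A minimal sketch in Python (the function name is mine, not from any of the cited works):

```python
from math import log2

def shannon_entropy(probs):
    """Entropy in bits: H = -sum(p * log2(p)) over the nonzero probabilities."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit of entropy; a fair 4-sided die carries 2 bits.
print(shannon_entropy([0.5, 0.5]))          # 1.0
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
```

Boltzmann's and Gibbs's versions differ only in interpretation (single particle vs. whole system) and in carrying a physical constant in front of the sum.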
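Hamming's 1947 codes (the 1947 entry) add parity bits so that any single flipped bit can be located and corrected. A sketch of the classic Hamming(7,4) layout, assuming the usual convention of parity bits at positions 1, 2 and 4:

```python
def hamming74_encode(data):
    """Encode 4 data bits as 7 bits; positions 1, 2, 4 (1-based) hold even-parity checks."""
    d1, d2, d3, d4 = data
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_correct(received):
    """Recompute the checks; their pattern is the 1-based position of a single flipped bit."""
    r = list(received)
    s1 = r[0] ^ r[2] ^ r[4] ^ r[6]   # covers positions 1, 3, 5, 7
    s2 = r[1] ^ r[2] ^ r[5] ^ r[6]   # covers positions 2, 3, 6, 7
    s3 = r[3] ^ r[4] ^ r[5] ^ r[6]   # covers positions 4, 5, 6, 7
    syndrome = s1 + 2 * s2 + 4 * s3
    if syndrome:
        r[syndrome - 1] ^= 1         # flip the erroneous bit back
    return [r[2], r[4], r[5], r[6]]  # extract the data bits

codeword = hamming74_encode([1, 0, 1, 1])
```

Any one of the seven transmitted bits can be flipped and the decoder still recovers the original four data bits.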
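Kraft's 1949 inequality (the 1949 entry) states that codeword lengths l₁,…,lₙ can belong to a binary prefix code if and only if Σ 2^(−lᵢ) ≤ 1. A minimal check, with the function name my own:

```python
def kraft_sum(lengths, q=2):
    """Sum of q**(-l) over codeword lengths; <= 1 iff a q-ary prefix code with these lengths exists."""
    return sum(q ** -l for l in lengths)

# Lengths of the prefix code {0, 10, 110, 111} exactly saturate the bound.
print(kraft_sum([1, 2, 3, 3]))  # 1.0

# Two one-bit codewords plus anything else cannot be prefix-free.
print(kraft_sum([1, 1, 2]) <= 1)  # False
```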
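Huffman's 1951 method (the 1951 entry) builds an optimal prefix code by repeatedly merging the two least-frequent subtrees. A compact sketch, assuming symbol weights are given as a dict (the tie-breaking counter is an implementation detail of mine, not part of the algorithm):

```python
import heapq
from itertools import count

def huffman_code(freqs):
    """Return {symbol: bitstring} built by merging the two lightest trees until one remains."""
    tie = count()  # tie-breaker so heapq never has to compare the dicts themselves
    heap = [(w, next(tie), {s: ""}) for s, w in freqs.items()]
    heapq.heapify(heap)
    if len(heap) == 1:  # degenerate one-symbol alphabet still needs a nonempty codeword
        return {s: "0" for s in heap[0][2]}
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)
        w2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + bits for s, bits in c1.items()}
        merged.update({s: "1" + bits for s, bits in c2.items()})
        heapq.heappush(heap, (w1 + w2, next(tie), merged))
    return heap[0][2]

# The textbook example: the most frequent symbol gets a 1-bit codeword.
code = huffman_code({"a": 45, "b": 13, "c": 12, "d": 16, "e": 9, "f": 5})
```

For these weights the total encoded length is 224 bits, which no prefix code can beat.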
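The 1994 Burrows–Wheeler transform permutes a string so that equal characters cluster, which is what makes the later compression stages in bzip2 effective. A naive sketch using a "$" sentinel (assumed to sort before every other character); production implementations build the rotation order via suffix arrays instead:

```python
def bwt(s):
    """Last column of the sorted rotations of s + '$' (the Burrows-Wheeler transform)."""
    s = s + "$"
    rotations = sorted(s[i:] + s[:i] for i in range(len(s)))
    return "".join(r[-1] for r in rotations)

def ibwt(last_column):
    """Invert by repeatedly prepending the last column and re-sorting (O(n^2 log n), for clarity)."""
    table = [""] * len(last_column)
    for _ in range(len(last_column)):
        table = sorted(last_column[i] + table[i] for i in range(len(last_column)))
    return next(row for row in table if row.endswith("$"))[:-1]

print(bwt("banana"))  # annb$aa  -- note the clustered 'a's and 'n's
```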