• Media type: E-Article
  • Title: Coding theory
  • Contributor: Stine, Robert A.
  • Published: Wiley, 2009
  • Published in: WIREs Computational Statistics, 1 (2009) 3, pages 261-270
  • Language: English
  • DOI: 10.1002/wics.42
  • ISSN: 1939-5108; 1939-0068
  • Keywords: Statistics and Probability
  • Description: Abstract: Coding theory is a portion of information theory concerned with the explicit representation of data as a sequence of symbols, usually a sequence of bits. The entropy of a probability distribution measures the information content of data, giving a lower bound on the number of bits necessary to encode data. Source coding also defines a measure, the divergence, of the cost of using a suboptimal representation. Channel coding describes representations that maximize the rate at which information can be communicated through a noisy medium. Important applications of coding theory include data compression, signal processing, and the comparison of statistical models. Copyright © 2009 Wiley Periodicals, Inc. This article is categorized under: Applications of Computational Statistics > Signal and Image Processing and Coding
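  • Note: The abstract names two quantities that are easy to make concrete: the entropy of a distribution, which lower-bounds the average number of bits per symbol needed to encode a source, and the divergence, which measures the extra cost of coding as if the data followed a distribution q when it actually follows p. The following is a minimal sketch (not taken from the article); the distributions p and q are hypothetical values chosen only for illustration.

    from math import log2

    def entropy(p):
        """Shannon entropy in bits: H(p) = -sum_i p_i * log2(p_i)."""
        return -sum(pi * log2(pi) for pi in p if pi > 0)

    def divergence(p, q):
        """Kullback-Leibler divergence in bits: D(p||q) = sum_i p_i * log2(p_i / q_i)."""
        return sum(pi * log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

    # Hypothetical example: dyadic source p coded under a uniform assumption q.
    p = [0.5, 0.25, 0.125, 0.125]   # true source distribution
    q = [0.25, 0.25, 0.25, 0.25]    # assumed (suboptimal) coding distribution

    print(f"H(p)    = {entropy(p):.3f} bits/symbol")      # lower bound on average code length
    print(f"D(p||q) = {divergence(p, q):.3f} bits/symbol")  # penalty for coding with q
    print(f"H+D     = {entropy(p) + divergence(p, q):.3f} bits/symbol")  # average cost when using q

    For this example the entropy is 1.75 bits/symbol and the divergence is 0.25 bits/symbol, so coding under the uniform assumption costs 2.0 bits/symbol on average rather than the optimal 1.75.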