
Information theory

Overview

There are many theories that make precise specific features of the naive or intuitive notion of information.

These theories are developed across philosophy, mathematics (type theory, measure theory / probability theory / statistics), computer science, physics, the communication sciences, psychology and sociology, economics, semiotics, cybernetics, and other fields.

Information theories methodologically close to mathematics include, e.g.:

  • statistical information theory

  • semantic information theory

  • algorithmic information theory

  • constructive-type-theoretical information theory?

Related entries:

  • analytic principle?

  • coding theory?

  • computational complexity theory?

  • infon?

  • meaning

  • knowledge

  • structure
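To give a flavour of the statistical branch of the subject: the central quantity of Shannon's theory is the entropy of a probability distribution, H(p) = −∑ᵢ pᵢ log₂ pᵢ, measuring the average information content (in bits) of a sample. A minimal sketch in Python (the function name is ours, for illustration):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(p) = -sum_i p_i log2(p_i), in bits.

    Terms with p_i = 0 contribute nothing (by the convention
    0 log 0 = 0), so they are skipped.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit of information per toss:
print(shannon_entropy([0.5, 0.5]))   # → 1.0

# A biased coin is more predictable, hence less informative:
print(shannon_entropy([0.9, 0.1]))   # ≈ 0.469 bits
```

The uniform distribution maximizes entropy, matching the intuition that a maximally unpredictable source conveys the most information per observation.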

Contributors to information theory

  • Statistical information theory: Claude Shannon, Warren Weaver, Ronald Fisher

  • Semantic information theory: Yehoshua Bar-Hillel, Rudolf Carnap

  • Algorithmic information theory: Ray Solomonoff, Andrey Kolmogorov, Gregory Chaitin

  • Constructive-type-theoretical information theory: Giuseppe Primiero, Tijn Borghuis, Fairouz Kamareddine, Rob Nederpelt?

  • Miscellaneous: Fred Dretske, Keith Devlin, Jon Barwise, Jeremy Seligman, Jaakko Hintikka


Revised on August 7, 2012 00:36:24 by Toby Bartels (98.16.182.220)