nLab information theory

Information theory

Overview

There are many theories explicating specific features of naive or intuitive notions of information.

These theories are developed across philosophy, mathematics (type theory, measure theory / probability theory / statistics), computer science, physics, the communication sciences, psychology / sociology, economics, semiotics, cybernetics, and so on.

Information theories in close methodological proximity to mathematics include:

  • statistical information theory

  • semantic information theory

  • algorithmic information theory

  • constructive-type-theoretical information theory
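To make the first item on this list concrete: the central quantity of statistical information theory is the Shannon entropy $H(p) = -\sum_i p_i \log_2 p_i$, measuring the average information content (in bits) of a random source. A minimal illustrative sketch (the function name and example distributions are chosen here for illustration, not taken from a specific reference):

```python
from math import log2

def shannon_entropy(probs):
    """Shannon entropy H(p) = -sum_i p_i * log2(p_i), in bits.

    Terms with p_i = 0 are skipped, following the convention
    0 * log2(0) = 0.
    """
    return -sum(p * log2(p) for p in probs if p > 0)

# A uniform distribution on 4 outcomes carries 2 bits per outcome.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0

# A deterministic source carries no information.
print(shannon_entropy([1.0]))  # -0.0
```

Algorithmic information theory replaces this statistical average with the length of the shortest program producing a given string (Kolmogorov complexity), which is uncomputable in general, while semantic theories attempt to measure the informational content of statements rather than of random sources.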

Contributors to information theory

  • Statistical information theory: Claude Shannon, Warren Weaver, Ronald Fisher

  • Semantic information theory: Yehoshua Bar-Hillel, Rudolf Carnap

  • Algorithmic information theory: Ray Solomonoff, Andrey Kolmogorov, Gregory Chaitin

  • Constructive-type-theoretical information theory: Giuseppe Primiero, Tijn Borghuis, Fairouz Kamareddine, Rob Nederpelt

  • Miscellaneous: Fred Dretske, Keith Devlin, Jon Barwise, Jeremy Seligman, Jaakko Hintikka

Last revised on August 30, 2023 at 16:54:49.