Information content, the Glossary
In information theory, the information content, self-information, surprisal, or Shannon information is a basic quantity derived from the probability of a particular event occurring from a random variable.[1]
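Concretely, the self-information of an outcome x with probability p(x) is I(x) = -log p(x), where the base of the logarithm sets the unit (bits for base 2, nats for base e, hartleys for base 10). A minimal Python sketch of that formula (the helper name is illustrative, not from the source):

```python
import math

def self_information(p: float, base: float = 2.0) -> float:
    """Self-information (surprisal) of an event with probability p:
    bits for base 2, nats for base e, hartleys for base 10."""
    if not 0.0 < p <= 1.0:
        raise ValueError("p must be in (0, 1]")
    return -math.log(p, base)

# A fair coin landing heads (p = 1/2) carries exactly 1 bit of surprisal;
# rolling a 6 on a fair die (p = 1/6) carries about 2.585 bits.
print(self_information(0.5))     # 1.0
print(self_information(1 / 6))   # ~2.585
```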
Table of Contents
74 relations: A Mathematical Theory of Communication, A priori and a posteriori, Additive map, Bernoulli trial, Categorical variable, Cauchy's functional equation, Claude Shannon, Coin flipping, Combination, Convolution, Degeneracy (mathematics), Degenerate distribution, Deterministic system, Dice, Differential entropy, Dirac measure, Discrete uniform distribution, E (mathematical constant), Entropy (information theory), Equiprobability, Event (probability theory), Expected value, Fair coin, Finite measure, George Carlin, Hartley (unit), Independence (probability theory), Independent and identically distributed random variables, Information theory, Integer, Isomorphism, Joint probability distribution, Kolmogorov complexity, Law of total probability, Likelihood function, Logarithm, Logit, Lottery, Lottery mathematics, Measure (mathematics), Measure space, Monotonic function, Multinomial distribution, Mutual information, Myron Tribus, Nat (unit), Natural logarithm, Normalization (statistics), Number, Obverse and reverse, Odds, One half, Outcome (probability), Polar regions of Earth, Probability, Probability distribution, Probability mass function, Probability measure, Probability space, Probability theory, Random variable, Random variate, Relationships among probability distributions, Sampling (signal processing), Scoring rule, Shannon (unit), Shannon's source coding theorem, Sigma-additive set function, Support (mathematics), Surprisal analysis, Surprise (emotion), Units of information, Variance, Without loss of generality.
A Mathematical Theory of Communication
"A Mathematical Theory of Communication" is an article by mathematician Claude E. Shannon published in Bell System Technical Journal in 1948. Information content and a Mathematical Theory of Communication are information theory.
See Information content and A Mathematical Theory of Communication
A priori and a posteriori
A priori ('from the earlier') and a posteriori ('from the later') are Latin phrases used in philosophy to distinguish types of knowledge, justification, or argument by their reliance on experience.
See Information content and A priori and a posteriori
Additive map
In algebra, an additive map, Z-linear map or additive function is a function f that preserves the addition operation: f(x + y) = f(x) + f(y).
See Information content and Additive map
Bernoulli trial
In the theory of probability and statistics, a Bernoulli trial (or binomial trial) is a random experiment with exactly two possible outcomes, "success" and "failure", in which the probability of success is the same every time the experiment is conducted.
See Information content and Bernoulli trial
Categorical variable
In statistics, a categorical variable (also called qualitative variable) is a variable that can take on one of a limited, and usually fixed, number of possible values, assigning each individual or other unit of observation to a particular group or nominal category on the basis of some qualitative property.
See Information content and Categorical variable
Cauchy's functional equation
Cauchy's functional equation is the functional equation f(x + y) = f(x) + f(y).
See Information content and Cauchy's functional equation
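Cauchy's functional equation is what ties additivity to the logarithm here: if surprisal is to add over independent events while probabilities multiply, a change of variables reduces the requirement to Cauchy's equation, whose monotone solutions are linear. A sketch of the standard argument (not spelled out in this glossary):

```latex
\begin{align*}
  I(p\,q) &= I(p) + I(q) && \text{(additivity over independent events)}\\
  g(s+t)  &= g(s) + g(t), \quad g(t) := I\!\left(e^{-t}\right) && \text{(Cauchy's functional equation)}\\
  g(t)    &= k\,t \ \Rightarrow\ I(p) = -k \ln p, \ k > 0 && \text{(monotone solutions are linear)}
\end{align*}
```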
Claude Shannon
Claude Elwood Shannon (April 30, 1916 – February 24, 2001) was an American mathematician, electrical engineer, computer scientist and cryptographer known as the "father of information theory" and as the "father of the Information Age". Information content and Claude Shannon are both topics in information theory.
See Information content and Claude Shannon
Coin flipping
Coin flipping, coin tossing, or heads or tails is the practice of throwing a coin in the air and checking which side is showing when it lands, in order to randomly choose between two alternatives, heads or tails, sometimes used to resolve a dispute between two parties.
See Information content and Coin flipping
Combination
In mathematics, a combination is a selection of items from a set that has distinct members, such that the order of selection does not matter (unlike permutations).
See Information content and Combination
Convolution
In mathematics (in particular, functional analysis), convolution is a mathematical operation on two functions (f and g) that produces a third function (f*g).
See Information content and Convolution
Degeneracy (mathematics)
In mathematics, a degenerate case is a limiting case of a class of objects which appears to be qualitatively different from (and usually simpler than) the rest of the class; "degeneracy" is the condition of being a degenerate case.
See Information content and Degeneracy (mathematics)
Degenerate distribution
In mathematics, a degenerate distribution (sometimes also Dirac distribution) is, according to some, a probability distribution in a space with support only on a manifold of lower dimension, and according to others a distribution with support only at a single point.
See Information content and Degenerate distribution
Deterministic system
In mathematics, computer science and physics, a deterministic system is a system in which no randomness is involved in the development of future states of the system.
See Information content and Deterministic system
Dice
Dice (singular: die) are small, throwable objects with marked sides that can rest in multiple positions.
See Information content and Dice
Differential entropy
Differential entropy (also referred to as continuous entropy) is a concept in information theory that began as an attempt by Claude Shannon to extend the idea of (Shannon) entropy (a measure of average surprisal) of a random variable to continuous probability distributions. Information content and differential entropy are both topics in entropy and information and in information theory.
See Information content and Differential entropy
Dirac measure
In mathematics, a Dirac measure assigns a size to a set based solely on whether it contains a fixed element x or not.
See Information content and Dirac measure
Discrete uniform distribution
In probability theory and statistics, the discrete uniform distribution is a symmetric probability distribution wherein a finite number of values are equally likely to be observed; every one of n values has equal probability 1/n.
See Information content and Discrete uniform distribution
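Under a discrete uniform distribution every outcome is equally surprising, so the self-information of each outcome reduces to a single number, as in this worked example:

```latex
I(x) = -\log_2 \tfrac{1}{n} = \log_2 n \ \text{bits for each of the } n \text{ outcomes;}
\qquad \text{e.g. a fair six-sided die gives } \log_2 6 \approx 2.585 \text{ bits.}
```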
E (mathematical constant)
The number e is a mathematical constant approximately equal to 2.71828 that can be characterized in many ways.
See Information content and E (mathematical constant)
Entropy (information theory)
In information theory, the entropy of a random variable is the average level of "information", "surprise", or "uncertainty" inherent to the variable's possible outcomes. Information content and entropy (information theory) are both topics in entropy and information and in information theory.
See Information content and Entropy (information theory)
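Entropy is the expected self-information of a draw, H(X) = E[I(X)] = -\sum_x p(x) \log p(x). A minimal Python sketch (the function name is illustrative):

```python
import math

def entropy(pmf, base=2.0):
    """Shannon entropy: the expected self-information of a draw from pmf.
    pmf is a sequence of probabilities summing to 1; zero entries contribute 0."""
    return -sum(p * math.log(p, base) for p in pmf if p > 0)

print(entropy([0.5, 0.5]))    # fair coin: 1.0 bit
print(entropy([1/6] * 6))     # fair die: ~2.585 bits
print(entropy([0.9, 0.1]))    # biased coin: ~0.469 bits
```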
Equiprobability
Equiprobability is a property for a collection of events that each have the same probability of occurring.
See Information content and Equiprobability
Event (probability theory)
In probability theory, an event is a set of outcomes of an experiment (a subset of the sample space) to which a probability is assigned.
See Information content and Event (probability theory)
Expected value
In probability theory, the expected value (also called expectation, expectancy, expectation operator, mathematical expectation, mean, expectation value, or first moment) is a generalization of the weighted average.
See Information content and Expected value
Fair coin
In probability theory and statistics, a sequence of independent Bernoulli trials with probability 1/2 of success on each trial is metaphorically called a fair coin.
See Information content and Fair coin
Finite measure
In measure theory, a branch of mathematics, a finite measure or totally finite measure is a special measure that always takes on finite values.
See Information content and Finite measure
George Carlin
George Denis Patrick Carlin (May 12, 1937 – June 22, 2008) was an American stand-up comedian, social critic, actor, and author.
See Information content and George Carlin
Hartley (unit)
The hartley (symbol Hart), also called a ban, or a dit (short for "decimal digit"), is a logarithmic unit that measures information or entropy, based on base 10 logarithms and powers of 10.
See Information content and Hartley (unit)
Independence (probability theory)
Independence is a fundamental notion in probability theory, as in statistics and the theory of stochastic processes.
See Information content and Independence (probability theory)
Independent and identically distributed random variables
In probability theory and statistics, a collection of random variables is independent and identically distributed if each random variable has the same probability distribution as the others and all are mutually independent.
See Information content and Independent and identically distributed random variables
Information theory
Information theory is the mathematical study of the quantification, storage, and communication of information.
See Information content and Information theory
Integer
An integer is the number zero (0), a positive natural number (1, 2, 3,...), or the negation of a positive natural number (−1, −2, −3,...). The negations or additive inverses of the positive natural numbers are referred to as negative integers.
See Information content and Integer
Isomorphism
In mathematics, an isomorphism is a structure-preserving mapping between two structures of the same type that can be reversed by an inverse mapping.
See Information content and Isomorphism
Joint probability distribution
Given two random variables that are defined on the same probability space, the joint probability distribution is the corresponding probability distribution on all possible pairs of outputs.
See Information content and Joint probability distribution
Kolmogorov complexity
In algorithmic information theory (a subfield of computer science and mathematics), the Kolmogorov complexity of an object, such as a piece of text, is the length of a shortest computer program (in a predetermined programming language) that produces the object as output. Information content and Kolmogorov complexity are both topics in information theory.
See Information content and Kolmogorov complexity
Law of total probability
In probability theory, the law (or formula) of total probability is a fundamental rule relating marginal probabilities to conditional probabilities.
See Information content and Law of total probability
Likelihood function
A likelihood function (often simply called the likelihood) measures how well a statistical model explains observed data by calculating the probability of seeing that data under different parameter values of the model.
See Information content and Likelihood function
Logarithm
In mathematics, the logarithm is the inverse function to exponentiation.
See Information content and Logarithm
Logit
In statistics, the logit function is the quantile function associated with the standard logistic distribution.
See Information content and Logit
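One link to information content (a standard identity rather than anything stated in this glossary): measured in nats, the logit of a probability p is the surprisal of the complementary event minus the surprisal of the event itself. A small Python check:

```python
import math

def surprisal_nats(p):
    return -math.log(p)          # self-information in nats

def logit(p):
    return math.log(p / (1 - p))

p = 0.8
# The logit of p equals the surprisal of the complementary event minus
# the surprisal of the event itself (both measured in nats).
print(logit(p))                                    # ~1.386
print(surprisal_nats(1 - p) - surprisal_nats(p))   # ~1.386
```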
Lottery
A lottery (or lotto) is a form of gambling that involves the drawing of numbers at random for a prize.
See Information content and Lottery
Lottery mathematics
Lottery mathematics is used to calculate probabilities of winning or losing a lottery game.
See Information content and Lottery mathematics
Measure (mathematics)
In mathematics, the concept of a measure is a generalization and formalization of geometrical measures (length, area, volume) and other common notions, such as magnitude, mass, and probability of events.
See Information content and Measure (mathematics)
Measure space
A measure space is a basic object of measure theory, a branch of mathematics that studies generalized notions of volumes.
See Information content and Measure space
Monotonic function
In mathematics, a monotonic function (or monotone function) is a function between ordered sets that preserves or reverses the given order.
See Information content and Monotonic function
Multinomial distribution
In probability theory, the multinomial distribution is a generalization of the binomial distribution.
See Information content and Multinomial distribution
Mutual information
In probability theory and information theory, the mutual information (MI) of two random variables is a measure of the mutual dependence between the two variables. Information content and mutual information are both topics in entropy and information and in information theory.
See Information content and Mutual information
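Mutual information can be computed directly from a joint distribution as the expected log-ratio of the joint to the product of the marginals. A minimal Python sketch under that standard definition (the helper name is illustrative):

```python
import math

def mutual_information(joint, base=2.0):
    """I(X;Y) = sum_{x,y} p(x,y) * log( p(x,y) / (p(x) p(y)) ).
    joint is a 2-D list of probabilities p(x, y) summing to 1."""
    px = [sum(row) for row in joint]
    py = [sum(col) for col in zip(*joint)]
    mi = 0.0
    for i, row in enumerate(joint):
        for j, pxy in enumerate(row):
            if pxy > 0:
                mi += pxy * math.log(pxy / (px[i] * py[j]), base)
    return mi

# Independent fair bits share no information; perfectly correlated
# fair bits share exactly 1 bit.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))      # 1.0
```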
Myron Tribus
Myron Tribus (October 30, 1921 – August 31, 2016) was an American organizational theorist, who was the director of the Center for Advanced Engineering Study at MIT from 1974 to 1986.
See Information content and Myron Tribus
Nat (unit)
The natural unit of information (symbol: nat), sometimes also nit or nepit, is a unit of information or information entropy, based on natural logarithms and powers of e, rather than the powers of 2 and base 2 logarithms, which define the shannon.
See Information content and Nat (unit)
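The nat, hartley, and shannon differ only in the base of the logarithm, so converting an amount of information between them is a change-of-base computation. A small Python illustration:

```python
import math

# 1 nat = 1/ln(2) shannons (bits); 1 hartley = log2(10) shannons.
NAT_IN_BITS = 1 / math.log(2)        # ~1.4427 bits per nat
HARTLEY_IN_BITS = math.log2(10)      # ~3.3219 bits per hartley

p = 1 / 6                            # e.g. rolling a 6 on a fair die
bits = -math.log2(p)
nats = -math.log(p)
hartleys = -math.log10(p)
print(bits, nats * NAT_IN_BITS, hartleys * HARTLEY_IN_BITS)  # all ~2.585
```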
Natural logarithm
The natural logarithm of a number is its logarithm to the base of the mathematical constant e, which is an irrational and transcendental number approximately equal to 2.71828.
See Information content and Natural logarithm
Normalization (statistics)
In statistics and applications of statistics, normalization can have a range of meanings.
See Information content and Normalization (statistics)
Number
A number is a mathematical object used to count, measure, and label.
See Information content and Number
Obverse and reverse
The obverse and reverse are the two flat faces of coins and some other two-sided objects, including paper money, flags, seals, medals, drawings, old master prints and other works of art, and printed fabrics.
See Information content and Obverse and reverse
Odds
In probability theory, odds provide a measure of the probability of a particular outcome.
See Information content and Odds
One half
One half is the irreducible fraction resulting from dividing one (1) by two (2), or the fraction resulting from dividing any number by its double.
See Information content and One half
Outcome (probability)
In probability theory, an outcome is a possible result of an experiment or trial.
See Information content and Outcome (probability)
Polar regions of Earth
The polar regions, also called the frigid zones or polar zones, of Earth are Earth's polar ice caps, the regions of the planet that surround its geographical poles (the North and South Poles), lying within the polar circles.
See Information content and Polar regions of Earth
Probability
Probability is the branch of mathematics concerning events and numerical descriptions of how likely they are to occur.
See Information content and Probability
Probability distribution
In probability theory and statistics, a probability distribution is the mathematical function that gives the probabilities of occurrence of possible outcomes for an experiment.
See Information content and Probability distribution
Probability mass function
In probability and statistics, a probability mass function (sometimes called probability function or frequency function) is a function that gives the probability that a discrete random variable is exactly equal to some value.
See Information content and Probability mass function
Probability measure
In mathematics, a probability measure is a real-valued function defined on a set of events in a σ-algebra that satisfies measure properties such as countable additivity.
See Information content and Probability measure
Probability space
In probability theory, a probability space or a probability triple (\Omega, \mathcal{F}, P) is a mathematical construct that provides a formal model of a random process or "experiment".
See Information content and Probability space
Probability theory
Probability theory or probability calculus is the branch of mathematics concerned with probability.
See Information content and Probability theory
Random variable
A random variable (also called random quantity, aleatory variable, or stochastic variable) is a mathematical formalization of a quantity or object which depends on random events.
See Information content and Random variable
Random variate
In probability and statistics, a random variate or simply variate is a particular outcome or realization of a random variable; the random variates which are other outcomes of the same random variable might have different values (random numbers).
See Information content and Random variate
Relationships among probability distributions
In probability theory and statistics, there are several relationships among probability distributions.
See Information content and Relationships among probability distributions
Sampling (signal processing)
In signal processing, sampling is the reduction of a continuous-time signal to a discrete-time signal.
See Information content and Sampling (signal processing)
Scoring rule
In decision theory, a scoring rule provides evaluation metrics for probabilistic predictions or forecasts.
See Information content and Scoring rule
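The logarithmic scoring rule is the scoring rule most directly tied to information content: it scores a forecast by the surprisal it assigned to the outcome that actually occurred (sign and base conventions vary by author). A hypothetical Python sketch:

```python
import math

def log_score(forecast: dict, observed) -> float:
    """Logarithmic score of a probabilistic forecast: the surprisal (in bits)
    the forecast assigned to the outcome that actually occurred.
    Lower is better under this surprisal-style convention."""
    return -math.log2(forecast[observed])

forecast = {"rain": 0.7, "dry": 0.3}
print(log_score(forecast, "rain"))   # ~0.515 bits: the forecast was confident
print(log_score(forecast, "dry"))    # ~1.737 bits: a more surprising outcome
```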
Shannon (unit)
The shannon (symbol: Sh) is a unit of information named after Claude Shannon, the founder of information theory.
See Information content and Shannon (unit)
Shannon's source coding theorem
In information theory, Shannon's source coding theorem (or noiseless coding theorem) establishes the statistical limits to possible data compression for data whose source is an independent identically-distributed random variable, and the operational meaning of the Shannon entropy. Information content and Shannon's source coding theorem are both topics in information theory.
See Information content and Shannon's source coding theorem
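One standard way the theorem gives entropy its operational meaning, stated here in a textbook formulation for an optimal uniquely decodable symbol code over an i.i.d. source (not quoted from the source):

```latex
H(X) \;\le\; \mathbb{E}\!\left[\ell(X)\right] \;<\; H(X) + 1,
\qquad \text{where } \ell(x) \text{ is the codeword length (in bits) assigned to symbol } x.
```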
Sigma-additive set function
In mathematics, an additive set function is a function \mu mapping sets to numbers, with the property that its value on a union of two disjoint sets equals the sum of its values on these sets, namely, \mu(A \cup B) = \mu(A) + \mu(B).
See Information content and Sigma-additive set function
Support (mathematics)
In mathematics, the support of a real-valued function f is the subset of the function domain containing the elements which are not mapped to zero.
See Information content and Support (mathematics)
Surprisal analysis
Surprisal analysis is an information-theoretical analysis technique that integrates and applies principles of thermodynamics and maximal entropy. Information content and surprisal analysis are both topics in information theory.
See Information content and Surprisal analysis
Surprise (emotion)
Surprise is a rapid, fleeting, mental and physiological state.
See Information content and Surprise (emotion)
Units of information
In digital computing and telecommunications, a unit of information is the capacity of some standard data storage system or communication channel, used to measure the capacities of other systems and channels. Information content and units of information are both topics in information theory.
See Information content and Units of information
Variance
In probability theory and statistics, variance is the expected value of the squared deviation from the mean of a random variable.
See Information content and Variance
Without loss of generality
Without loss of generality (often abbreviated to WOLOG, WLOG or w.l.o.g.; less commonly stated as without any loss of generality or with no loss of generality) is a frequently used expression in mathematics.
See Information content and Without loss of generality
See also
Entropy and information
- Akaike information criterion
- Approximate entropy
- Ascendency
- Binary entropy function
- Conditional entropy
- Conditional mutual information
- Cross-entropy
- Differential entropy
- Dudley's entropy integral
- Entropy (information theory)
- Entropy coding
- Entropy estimation
- Entropy in thermodynamics and information theory
- Entropy of network ensembles
- Gibbs algorithm
- Inequalities in information theory
- Information content
- Information fluctuation complexity
- Information gain ratio
- Joint entropy
- Kullback–Leibler divergence
- Landauer's principle
- Maximum entropy probability distribution
- Mean dimension
- Measure-preserving dynamical system
- Molecular demon
- Mutual information
- Negentropy
- Nonextensive entropy
- Partition function (mathematics)
- Perplexity
- Pointwise mutual information
- Principle of maximum caliber
- Principle of maximum entropy
- Rényi entropy
- Random number generation
- Softplus
- Topological entropy
- Transfer entropy
- Tsallis entropy
- Variation of information
References
[1] https://en.wikipedia.org/wiki/Information_content
Also known as Self inform, Self information, Self informed, Self informing, Self informs, Self-entropy, Self-inform, Self-information, Self-informed, Self-informing, Self-informs, Shannon information, Shannon information content, Surprisal, Surprisals.