Statistics/Probability/Bayesian
Bayesian analysis is the branch of statistics based on the idea that we have some knowledge in advance about the probabilities we are interested in, the so-called a priori probabilities. These might reflect your degree of belief in a particular event, the results of previous studies, or a generally agreed-upon starting value for a probability. The term "Bayesian" comes from Bayes' rule (also called Bayes' theorem), a law about conditional probabilities. The opposite of "Bayesian" is sometimes referred to as "classical" or "frequentist" statistics.
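For two events $A$ and $B$ with $P(B) > 0$, Bayes' rule states:

$$
P(A \mid B) = \frac{P(B \mid A)\,P(A)}{P(B)}
$$

Here $P(A)$ is the a priori probability of $A$, and $P(A \mid B)$ is the a posteriori probability of $A$ after observing $B$. The example below applies this rule with $A$ the choice of a particular coin and $B$ the observed number of heads.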
Example
Consider a box with 3 coins, with probabilities of showing heads of 1/4, 1/2 and 3/4 respectively. We choose one of the coins at random, so we take 1/3 as the a priori probability of having chosen coin number 1. After 5 throws, in which heads came up X=4 times, it seems less likely that the coin is coin number 1. We calculate the a posteriori probability that the coin is coin number 1 as:

$$
P(C=1 \mid X=4) = \frac{P(X=4 \mid C=1)\,P(C=1)}{P(X=4)}
= \frac{\binom{5}{4}\left(\tfrac{1}{4}\right)^4\left(\tfrac{3}{4}\right)\cdot\tfrac{1}{3}}{P(X=4)}
= \frac{\tfrac{15}{1024}\cdot\tfrac{1}{3}}{\tfrac{580}{3072}}
= \frac{15}{580} = \frac{3}{116} \approx 0.026,
$$

where the denominator is found by summing over the three coins:

$$
P(X=4) = \sum_{i=1}^{3} P(X=4 \mid C=i)\,P(C=i)
= \frac{1}{3}\left(\frac{15}{1024}+\frac{160}{1024}+\frac{405}{1024}\right)
= \frac{580}{3072}.
$$
In words:
- The probability that the coin is the first coin, given that heads came up 4 times, is equal to the probability that heads comes up 4 times given that it is the first coin, times the a priori probability that the coin is the first coin, all divided by the overall probability that heads comes up 4 times.
In the same way we find:

$$
P(C=2 \mid X=4) = \frac{\binom{5}{4}\left(\tfrac{1}{2}\right)^5\cdot\tfrac{1}{3}}{P(X=4)} = \frac{160}{580} = \frac{8}{29} \approx 0.276
$$

and

$$
P(C=3 \mid X=4) = \frac{\binom{5}{4}\left(\tfrac{3}{4}\right)^4\left(\tfrac{1}{4}\right)\cdot\tfrac{1}{3}}{P(X=4)} = \frac{405}{580} = \frac{81}{116} \approx 0.698.
$$
This shows that, after examining the outcome of the five throws, it is most likely that we chose coin number 3.
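The calculation above can be sketched in a few lines of Python. This is an illustrative script, not part of the original example; the variable names are our own. It uses the binomial likelihood for each coin, combines it with the uniform 1/3 prior, and normalizes by the total probability of observing 4 heads.

```python
from math import comb

# Head probabilities of the three coins (from the example)
p_heads = [1/4, 1/2, 3/4]
prior = [1/3, 1/3, 1/3]  # each coin equally likely a priori

n, x = 5, 4  # 5 throws, 4 heads observed

# Binomial likelihood P(X=4 | coin i) = C(5,4) p^4 (1-p)
likelihood = [comb(n, x) * p**x * (1 - p)**(n - x) for p in p_heads]

# Evidence P(X=4) = sum_i P(X=4 | coin i) * P(coin i)
evidence = sum(l * pr for l, pr in zip(likelihood, prior))

# A posteriori probabilities via Bayes' rule
posterior = [l * pr / evidence for l, pr in zip(likelihood, prior)]

for i, post in enumerate(posterior, start=1):
    print(f"P(coin {i} | X=4) = {post:.3f}")
# Prints approximately 0.026, 0.276 and 0.698: coin 3 is most likely.
```

Note that the evidence term only rescales the three numerators, so the ranking of the coins is already determined by likelihood times prior.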