Statistics/Probability/Bayesian


Bayesian analysis is the branch of statistics based on the idea that we have some knowledge in advance about the probabilities we are interested in, the so-called a priori (or prior) probabilities. A prior might be your degree of belief in a particular event, the results of previous studies, or a generally agreed-upon starting value for a probability. The name "Bayesian" comes from Bayes' rule (also called Bayes' theorem), a law about conditional probabilities. The opposite of "Bayesian" is sometimes referred to as "classical" (or frequentist) statistics.
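The rule referred to here, for events A and B with P(B) > 0, states:

$$P(A \mid B) = \frac{P(B \mid A)\,P(A)}{P(B)}$$

Here P(A) plays the role of the a priori probability, and P(A | B) is the a posteriori probability after observing B.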

Example

Consider a box with 3 coins, with probabilities of showing heads of 1/4, 1/2 and 3/4 respectively. We choose one of the coins at random, so we take 1/3 as the a priori probability P(C1) of having chosen coin number 1 (and likewise for coins 2 and 3). After 5 throws, in which heads came up X = 4 times, it seems less likely that the coin is coin number 1. We calculate the a posteriori probability that the coin is coin number 1 as:

$$P(C_1 \mid X=4) = \frac{P(X=4 \mid C_1)\,P(C_1)}{P(X=4)} = \frac{\binom{5}{4}\left(\frac14\right)^4 \frac34 \cdot \frac13}{\binom{5}{4}\left(\frac14\right)^4 \frac34 \cdot \frac13 + \binom{5}{4}\left(\frac12\right)^4 \frac12 \cdot \frac13 + \binom{5}{4}\left(\frac34\right)^4 \frac14 \cdot \frac13}$$

In words:

The probability that the coin is the first coin, given that heads came up 4 times, equals the probability that heads comes up 4 times given that it is the first coin, times the a priori probability that the coin is the first coin, all divided by the overall probability that heads comes up 4 times.
Cancelling the common factor $\binom{5}{4} \cdot \frac13 \cdot \left(\frac14\right)^5$ from numerator and denominator, this reduces to:

$$P(C_1 \mid X=4) = \frac{3}{3+32+81} = \frac{3}{116}$$

In the same way we find:

$$P(C_2 \mid X=4) = \frac{32}{3+32+81} = \frac{32}{116}$$

and

$$P(C_3 \mid X=4) = \frac{81}{3+32+81} = \frac{81}{116}.$$


This shows that, after examining the outcome of the five throws, we most likely chose coin number 3: its probability has risen from the prior 1/3 to a posterior of 81/116 ≈ 0.70.
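The calculation above can be checked numerically. The sketch below (plain Python; the variable names are illustrative) computes the posterior for each coin from the binomial likelihood of observing 4 heads in 5 throws:

```python
from math import comb

# Heads probabilities of the three coins and their uniform priors
heads_prob = [1/4, 1/2, 3/4]
prior = [1/3, 1/3, 1/3]

n, k = 5, 4  # 5 throws, 4 heads observed

# Binomial likelihood P(X=4 | coin i) = C(5,4) * p^4 * (1-p)
likelihood = [comb(n, k) * p**k * (1 - p)**(n - k) for p in heads_prob]

# Bayes' rule: posterior = likelihood * prior / evidence,
# where the evidence P(X=4) normalises the posteriors to sum to 1
evidence = sum(l * pr for l, pr in zip(likelihood, prior))
posterior = [l * pr / evidence for l, pr in zip(likelihood, prior)]

print(posterior)  # [3/116, 32/116, 81/116] ≈ [0.0259, 0.2759, 0.6983]
```

The same uniform prior 1/3 multiplies every term, so it cancels in the normalisation; only the likelihood ratio 3 : 32 : 81 determines the posteriors.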