The chaos and the doubt
By Guillaume Filion, filed under law of large numbers, Bayesian statistics, coherence, probability.
• 17 August 2012 •
Probability is said to have been born of the correspondence between Pierre de Fermat and Blaise Pascal, sometime in the middle of the 17th century. Somewhat surprisingly, many texts retrace the history of the concept only up to the 20th century; yet it has gone through major transformations since then. Probability always describes what we don't know about the world, but the focus has shifted from the world to what we don't know.
In Science et Méthode (1908), Henri Poincaré investigates why chance would ever happen in a deterministic world. Like most of his contemporaries, Poincaré believed in absolute determinism: there is no phenomenon without a cause, even though our limited minds may fail to understand or see it. He distinguishes two flavors of randomness, of which he gives examples.
If a cone stands on its point we know that it will fall but we do not know which way (...) A very small cause, which escapes us, determines a considerable effect that we cannot but see, and then we say that this effect is due to chance.
And a little bit later he continues.
How do we represent a container filled with gas? Countless molecules, moving at great velocities, crisscross this vessel in all directions; at every moment they hit the walls or collide with each other, and these shocks occur in a variety of conditions. What strikes us most here is not the smallness of the causes but their complexity.
In short, for Poincaré, chance is an illusion produced by the limitations of our measuring instruments. An absolutely precise or complete knowledge of the initial conditions would dissipate the illusion and reveal the truly deterministic nature of the phenomenon. A spectacular example of this was given by the mathematician and illusionist Persi Diaconis, who could predict the complex movements of a coin flip using ultra-precise measurements.
Yet, in current terms, this view is the definition of chaotic (not stochastic) systems. Chaos theory, of which Henri Poincaré is considered the grandfather, has since moved away from randomness and deals instead with sensitivity to initial conditions, but the quotes above show that in the early 20th century the two terms were somewhat synonymous.
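To make sensitivity to initial conditions concrete, here is a minimal sketch (my own illustration, not part of Poincaré's text) using the logistic map, a textbook chaotic system; the parameter value of 4 and the initial gap of $(10^{-9})$ are arbitrary choices.

```python
# Two trajectories of the deterministic logistic map x -> r*x*(1-x),
# started a hair's breadth apart. The microscopic difference in the
# initial conditions is amplified at every step until the two
# trajectories bear no resemblance to each other -- Poincare's
# "very small cause" producing "a considerable effect".

r = 4.0                    # fully chaotic regime of the logistic map
x, y = 0.3, 0.3 + 1e-9     # initial conditions differing by 10^-9

for step in range(1, 51):
    x = r * x * (1 - x)
    y = r * y * (1 - y)
    if step % 10 == 0:
        print(f"step {step:2d}: |x - y| = {abs(x - y):.9f}")
```

After a few dozen iterations the two trajectories are as far apart as two unrelated ones, even though every step is perfectly deterministic.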
The modern theory of probability is the work of Paul Lévy and especially Andrey Kolmogorov. In his landmark Grundbegriffe der Wahrscheinlichkeitsrechnung (1933), Kolmogorov lays out the axiomatic theory of probability, founded on measure theory. The success of this approach comes from a theorem known as the Strong Law of Large Numbers, which gives a condition for the mean of a sequence of measurements to converge to its expected value (you can check out the technical section below if you are interested in the formal definition). It turns out that this condition is met for the average frequency of an event. In other words, Kolmogorov proved that under independent sampling, the observed frequency tends to the probability of the event. This was a liberating result: it meant that there is no need to understand what probabilities represent, because there is an objective way of measuring them.
For a sequence of independent random variables $(X_1, X_2, \ldots)$, Kolmogorov's criterion states that the strong law of large numbers holds provided that
$$ \sum_{k=1}^\infty \frac{\mathrm{Var}(X_k)}{k^2} < \infty. $$
In the case of observed events, each $(X_k)$ is a Bernoulli variable in $(\{0,1\})$ (so the mean is the average observed frequency of the event). The variance of a Bernoulli variable is bounded above by $(1/4)$, so the above criterion is always satisfied because
$$ \sum_{k=1}^\infty \frac{\mathrm{Var}(X_k)}{k^2} \leq \sum_{k=1}^\infty \frac{1}{4k^2} = \frac{\pi^2}{24} < \infty. $$
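To see the strong law at work, here is a quick simulation (my own sketch, not part of the original argument): the running frequency of an event of probability 0.3, an arbitrary choice, under independent sampling.

```python
import random

random.seed(1)

p = 0.3      # true probability of the event (arbitrary choice)
hits = 0

for n in range(1, 100001):
    hits += random.random() < p          # one Bernoulli(p) draw
    if n in (10, 100, 1000, 10000, 100000):
        print(f"n = {n:6d}   frequency = {hits / n:.4f}")

# Because Var(X_k) <= 1/4, Kolmogorov's criterion is satisfied and
# the printed frequencies converge to p almost surely.
```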
Meanwhile, a young Italian statistician called Bruno de Finetti was following a radically different path. In Sul significato soggettivo della probabilità (1931) he introduced the concept of coherent beliefs (I give the formal definition in the technical section below). According to de Finetti, beliefs are incoherent if betting against them allows an unconditional gain or loss. For instance, if a bookmaker offers one-to-one odds on a boxer and two-to-one odds on his opponent, you can bet $90 on the first and $60 on the second for an unconditional gain of $30, so his beliefs are incoherent.
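As a sanity check on the arithmetic, here is that Dutch book in a few lines of Python (my own sketch; the stakes are the ones from the example above):

```python
# One-to-one odds on boxer #1, two-to-one odds on boxer #2.
# Stake $90 on the first and $60 on the second: whichever bet
# wins, it returns $180 for a total outlay of $150.

stake_1, stake_2 = 90, 60
outlay = stake_1 + stake_2

return_if_1_wins = stake_1 * (1 + 1)   # stake plus 1:1 winnings
return_if_2_wins = stake_2 * (1 + 2)   # stake plus 2:1 winnings

print("profit if boxer #1 wins:", return_if_1_wins - outlay)   # 30
print("profit if boxer #2 wins:", return_if_2_wins - outlay)   # 30
```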
De Finetti calls an assessment $(\tilde{P})$ of the events $(A_1, \ldots, A_n)$ coherent if, for every choice of stakes $(c_1, \ldots, c_n)$,
$$ \max_{\omega \in \Omega} \sum_{i=1}^n c_i \left( 1_{A_i}(\omega) - \tilde{P}(A_i) \right) \geq 0. $$
The numbers $(c_1, \ldots, c_n)$ can be thought of as stakes in a bet, and the formula above can be interpreted as saying that there is no way to secure an unconditional win against you. To follow up on the example given in the text above, say that $(A_1)$ is the event "boxer #1 wins", $(A_2)$ is the event "boxer #2 wins", and that their respective assessments are $(p)$ and $(q)$. Because the events are mutually exclusive, $(\sum_{i=1}^n c_i \left( 1_{A_i}(\omega) - \tilde{P}(A_i) \right))$ equals $(c_1 (1 - p) - qc_2)$ if boxer #1 wins, and $(-pc_1 + (1-q) c_2)$ if boxer #2 wins.
If $(q = 1-p)$, it is easy to check that one of these values has to be non-negative (and both are null if $(c_1 = c_2)$), which means that the beliefs are coherent. But if, say, $(q < 1-p)$, setting $(c_1 = -1)$ and $(c_2 = -1)$ gives $(q - (1 - p))$ and $(p - (1 - q))$, which are both negative, and the beliefs are incoherent.
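The same computation can be scripted (again my own sketch; the numeric values of $(p)$, $(q)$ and the stakes are arbitrary): with a coherent assessment at least one outcome yields a non-negative gain, while with $(q < 1-p)$ the stakes $(c_1 = c_2 = -1)$ lose in both outcomes.

```python
def gains(p, q, c1, c2):
    """Gains of the two bets in each of the two possible outcomes."""
    if_boxer_1_wins = c1 * (1 - p) - c2 * q
    if_boxer_2_wins = -c1 * p + c2 * (1 - q)
    return if_boxer_1_wins, if_boxer_2_wins

# Coherent assessment (q = 1 - p): whatever the stakes, at least
# one outcome yields a non-negative gain.
g1, g2 = gains(p=0.6, q=0.4, c1=1.0, c2=-2.0)
print(f"{g1:.2f}, {g2:.2f}")     # 1.20, -1.80

# Incoherent assessment (q < 1 - p): both outcomes lose, an
# unconditional win against the belief holder.
g1, g2 = gains(p=0.6, q=0.3, c1=-1.0, c2=-1.0)
print(f"{g1:.2f}, {g2:.2f}")     # -0.10, -0.10
```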
Surprisingly, this subjective vision of probability, which only requires coherence in one's beliefs, is equivalent to Kolmogorov's axiomatic definition. A probability measure can be seen as a coherent belief. So what's the difference, you will ask? To me, it is that de Finetti's interpretation of probability does not aim to model the external world, but rather the human way of thinking. Chance is no longer seen as an imperfection of our measurements, but as a cognitive process at work in our minds. These two views of probability, as either a model for chaos or a model for doubt, are part of the ongoing Bayesian versus frequentist fight that I introduced in The reverend's gambit and that I will have other occasions to talk about.