Mathursday: Dynkin’s π-λ Theorem and CDF (Part 2)

The aim of this post is to prove that a cumulative distribution function (CDF) uniquely determines the probability distribution. This is a well-known, fundamental result that is quite intuitive but requires some advanced techniques to prove. In this post, we provide a complete proof using Dynkin’s π-λ Theorem. The previous post on this topic covered the basics of topology and measure theory. If you are unfamiliar with concepts like measure, open set, or Borel σ-algebra, then please see the previous post.

The main result we want to prove is the following.

Theorem: Let f: (X_1, \Sigma_1, P_1) \rightarrow (\mathbb{R}, \mathbb{B}(\mathbb{R})) and g: (X_2, \Sigma_2, P_2) \rightarrow (\mathbb{R}, \mathbb{B}(\mathbb{R})) be two random variables with CDFs F and G respectively. If F(x) = G(x), \,\, \forall x \in \mathbb{R} then P_1(f^{-1}(A))=P_2(g^{-1}(A)), \,\, \forall A \in \mathbb{B}(\mathbb{R}).

Notice that we want to prove P_1(f^{-1}(A))=P_2(g^{-1}(A)) for every A \in \mathbb{B}(\mathbb{R}). Proving this directly for every member of a σ-algebra is difficult. Instead, we will prove it for a suitable sub-collection of the σ-algebra and then show that it holds for the entire σ-algebra. The last step will be achieved using the powerful Dynkin’s π-λ Theorem. This is going to be our first stop.

Dynkin’s π-λ Theorem

To state the theorem we will have to define two more things:

Definition (π-system): A collection of subsets S of X is called a π-system if it is nonempty and closed under intersection, i.e., \forall A, B \in S, \,\, A \cap B \in S.

It is trivial to verify that every algebra of subsets is a π-system.
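
As a quick sanity check outside the proof, here is a minimal Python sketch (with a hypothetical finite example) that brute-forces the closure-under-intersection condition; the finite "rays" {1}, {1,2}, ..., {1,...,5} play the role of the closed rays we will meet later.

# Brute-force check that a finite collection of subsets is a pi-system:
# it must be nonempty and closed under pairwise intersection.
def is_pi_system(sets):
    family = [frozenset(s) for s in sets]
    return len(family) > 0 and all(a & b in family for a in family for b in family)

# Hypothetical example: finite "rays" {1}, {1,2}, ..., {1,...,5}.
# The intersection of two rays is the shorter ray, so closure holds.
rays = [set(range(1, x + 1)) for x in range(1, 6)]
print(is_pi_system(rays))  # True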

Definition (λ-system): A collection of subsets S of X is called a λ-system if:
1. X \in S.
2. For any A, B \in S such that A \subseteq B we have B - A \in S.
3. For any \{A_i\}_{i=1}^\infty with A_i \in S and A_i \subseteq A_{i+1} for every i \in \mathbb{N}, we have \cup_{i=1}^\infty A_i \in S.

This definition looks quite similar to that of a σ-algebra. Like a σ-algebra, a λ-system is closed under complement. Why? Let S be a λ-system and let A \in S. Then, as X \in S and A \subseteq X, we have A^c = X - A \in S. However, unlike a σ-algebra, which is closed under countable unions of arbitrary sets, condition (3) above only covers increasing sequences of sets (A_i \subseteq A_{i+1}). It is however straightforward to show the following:

Lemma 1: Every σ-algebra is both a π-system and a λ-system
Proof: Trivial
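
Neither notion alone gives a σ-algebra; in particular, a λ-system need not be a π-system. Here is a small brute-force Python sketch of a standard example on a hypothetical four-element universe: the collection below is a λ-system but not a π-system, and hence not a σ-algebra. On a finite universe an increasing sequence of sets eventually stabilizes, so condition (3) holds automatically and only conditions (1) and (2) need checking.

X = frozenset({1, 2, 3, 4})
# Hypothetical example: X, the empty set, and four two-element subsets.
S = {frozenset(), X, frozenset({1, 2}), frozenset({3, 4}),
     frozenset({1, 3}), frozenset({2, 4})}

def is_pi_system(S):
    # nonempty and closed under pairwise intersection
    return bool(S) and all(a & b in S for a in S for b in S)

def is_lambda_system(X, S):
    # (1) contains X; (2) closed under differences of nested sets;
    # (3) is automatic here because increasing sequences stabilize.
    return X in S and all(b - a in S for a in S for b in S if a <= b)

print(is_lambda_system(X, S))  # True
print(is_pi_system(S))         # False: {1,2} & {1,3} = {1} is missing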

With these two definitions in hand, we are ready to state Dynkin’s π-λ Theorem.

Dynkin’s π-λ Theorem: If A is a π-system, B is a λ-system, and A \subseteq B, then A \subseteq \sigma(A) \subseteq B.

The proof of this theorem is short but tricky. We defer the proof for now to first demonstrate the usefulness of the above result.

Usefulness: The way we use Dynkin’s π-λ Theorem is by showing that the property of interest holds on a π-system that generates the σ-algebra of interest. Then we show that the collection of sets where the property holds is a λ-system. Dynkin’s π-λ Theorem then shows that the property holds on the entire σ-algebra.

CDF Uniquely Determines the Distribution

Theorem: Let f: (X_1, \Sigma_1, P_1) \rightarrow (\mathbb{R}, \mathbb{B}(\mathbb{R})) and g: (X_2, \Sigma_2, P_2) \rightarrow (\mathbb{R}, \mathbb{B}(\mathbb{R})) be two random variables with CDFs F and G respectively. If F(x) = G(x), \,\, \forall x \in \mathbb{R} then P_1(f^{-1}(A))=P_2(g^{-1}(A)), \,\, \forall A \in \mathbb{B}(\mathbb{R}).

Proof: Let S be the collection of sets where the two probability measures agree, i.e., S = \left\{ A \in \mathbb{B}(\mathbb{R}) \mid P_1(f^{-1}(A)) = P_2(g^{-1}(A))\right\}. We will construct a π-system where the property holds.

Claim 1: Let T = \left\{ \left(-\infty, x\right] \mid x \in (-\infty, \infty] \right\} be the collection of closed rays on the real number line. Then T is a π-system, T \subseteq S, and \sigma(T) = \mathbb{B}(\mathbb{R}).
Proof of Claim 1: The proof directly exploits the relation between closed rays and the CDF. For any x \in \mathbb{R} we have

P_1(f^{-1}((-\infty, x])) = F(x) = G(x) = P_2(g^{-1}((-\infty, x])).

For x = \infty the ray (-\infty, \infty] is all of \mathbb{R} and both sides equal 1, so the equality holds there as well. Hence, every closed ray is in S, i.e., T \subseteq S. Secondly, T is closed under intersection, as the intersection of two closed rays is another closed ray (\,\, (-\infty, x] \cap (-\infty, y] = (-\infty, \min(x,y)] \,\,). As T is nonempty, it is therefore a π-system. Finally, as we proved in the last post, the collection of closed rays generates the Borel σ-algebra, i.e., \sigma(T) = \mathbb{B}(\mathbb{R}).

Great! So now we have a π-system on which the desired result holds and which generates \mathbb{B}(\mathbb{R}), the σ-algebra of interest in this case.
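
Before moving on, here is a tiny numeric sketch of what we have gained (the CDF F below is a hypothetical stand-in, the Exp(1) distribution): the probability of every closed ray is pinned down by the CDF, and the difference operation that a λ-system permits already pins down every half-open interval as well. Dynkin’s theorem will extend this agreement to all Borel sets.

import math

def F(x):
    # Hypothetical CDF used purely for illustration: the Exp(1) distribution.
    return 0.0 if x < 0 else 1.0 - math.exp(-x)

def prob_ray(x):
    # P((-inf, x]) equals F(x): this is exactly the statement T ⊆ S.
    return F(x)

def prob_interval(a, b):
    # (a, b] = (-inf, b] - (-inf, a] is a difference of nested rays,
    # so its probability is already determined by the CDF alone.
    return prob_ray(b) - prob_ray(a)

print(prob_interval(1.0, 2.0))  # ~0.2325, independent of the underlying sample space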

Claim 2: S is a λ-system
Proof of Claim 2: To lighten notation, we write P_1(A) as shorthand for P_1(f^{-1}(A)) and P_2(A) for P_2(g^{-1}(A)); this is harmless because preimages preserve set differences and increasing unions. We prove each required property of a λ-system below.

  • From the unitarity property of a probability measure we have P_1(\mathbb{R}) = P_1(X_1) = 1 = P_2(X_2) = P_2(\mathbb{R}). Therefore, \mathbb{R} \in S (recall that members of S are Borel subsets of \mathbb{R}).
  • Let A, B \in S with A \subseteq B, and let C = B - A. Then C and A are disjoint and C \cup A = B. Therefore, for any probability measure P we have P(B) = P(C \cup A) = P(A) + P(C), i.e., P(C) = P(B) - P(A). Then
    P_1(C) = P_1(B) - P_1(A) = P_2(B) - P_2(A) = P_2(C). Therefore, C = B - A \in S.
  • Let \{A_i\}_{i=1}^\infty be an increasing sequence of sets with A_i \in S for every i \in \mathbb{N}. For any probability measure P, continuity from below gives P(\cup_{i=1}^\infty A_i) = \lim_{n \rightarrow \infty} P(A_n). We have P_1(A_n) = P_2(A_n) for every value of n, and two identical sequences have the same limit. Hence, \lim_{n \rightarrow \infty} P_1(A_n) = \lim_{n \rightarrow \infty} P_2(A_n) \Rightarrow P_1(\cup_{i=1}^\infty A_i) = P_2(\cup_{i=1}^\infty A_i). Therefore, \cup_{i=1}^\infty A_i \in S. (A small numeric sketch of continuity from below follows this list.)
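
Here is the promised numeric sketch of continuity from below, again with the hypothetical Exp(1) CDF from the earlier sketch: the sets A_n = (-\infty, n] increase to all of \mathbb{R}, and the probabilities P(A_n) = F(n) climb to P(\mathbb{R}) = 1.

import math

def F(x):
    # Hypothetical Exp(1) CDF, as in the earlier sketch.
    return 0.0 if x < 0 else 1.0 - math.exp(-x)

# A_n = (-inf, n] is an increasing sequence of sets whose union is all of R.
# Continuity from below says P(A_n) -> P(R) = 1, i.e. F(n) -> 1.
for n in (1, 2, 5, 10, 20):
    print(n, F(n))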

Wrapping up: We showed that T \subseteq S, where T is a π-system and S is a λ-system. Therefore, using Dynkin’s π-λ theorem we get \sigma(T) = \mathbb{B}(\mathbb{R}) \subseteq S. What does this mean? It means that for any A \in \mathbb{B}(\mathbb{R}) we have A \in S and therefore, by definition of S, P_1(f^{-1}(A)) = P_2(g^{-1}(A)). This is exactly what we wanted to prove.
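
As a sanity check of the statement itself (not of the proof), here is a small discrete Python sketch with two hypothetical probability spaces: a fair coin whose outcome f maps to {0, 1}, and a fair die whose outcome g is reduced to its parity. The two random variables share the same CDF, and the pushforward probabilities agree on every (finite) Borel set we test.

from fractions import Fraction

# Hypothetical space 1: a fair coin; f sends heads to 1 and tails to 0.
P1 = {"H": Fraction(1, 2), "T": Fraction(1, 2)}
f = {"H": 1, "T": 0}

# Hypothetical space 2: a fair die; g sends an outcome to its parity.
P2 = {k: Fraction(1, 6) for k in range(1, 7)}
g = {k: k % 2 for k in range(1, 7)}

def pushforward(P, rv, A):
    # P(rv^{-1}(A)): total mass of the outcomes that rv maps into A.
    return sum(p for omega, p in P.items() if rv[omega] in A)

# A handful of Borel sets; here it suffices to test subsets of the range {0, 1}.
for A in (set(), {0}, {1}, {0, 1}):
    assert pushforward(P1, f, A) == pushforward(P2, g, A)
print("The pushforward measures agree on every tested set.")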

Proof of Dynkin’s π-λ Theorem

In order to prove this result we will make use of a few basic lemmas:

Lemma 2: Intersection of λ-systems is a λ-system.
Proof: The proof is trivial; we verify the second property as an example. Let \{S_i\}_{i=1}^\infty be a collection of λ-systems (so this is a set of sets of sets #inception). Now let A, B \in \cap_{i=1}^\infty S_i with A \subseteq B. As each S_i is a λ-system, B - A \in S_i for every i \in \mathbb{N}. This implies B - A \in \cap_{i=1}^\infty S_i. The remaining properties are verified in the same way.

Lemma 3: Intersection of σ-algebras is a σ-algebra
Proof: Trivial to verify.

Lemma 4: If a collection of subsets of X is both a π-system and a λ-system then it is a σ-algebra
Proof:  Let S be both a π-system and a λ-system. Then we will show it satisfies all three properties of a σ-algebra

  • As S is a λ-system, X \in S. Similarly, using the second property of a λ-system, X \subseteq X \Rightarrow \emptyset = X - X \in S.
  • Let E \in S. Then E \subseteq X \Rightarrow E^c = X - E \in S.
  • Let \{E_i\}_{i=1}^\infty be a countable collection of sets in S. Define a sequence of increasing sets as follows: Y_1 = E_1 and Y_i = E_i \cup Y_{i-1}. Then Y_1 \subseteq Y_2 \subseteq Y_3 \cdots and \cup_{i=1}^\infty Y_i = \cup_{i=1}^\infty E_i. If we could show that every Y_i \in S, then property (3) of a λ-system would give \cup_{i=1}^\infty E_i = \cup_{i=1}^\infty Y_i \in S. Note that we have not used the π-system property so far; if we could also prove Y_i \in S without it, we would have shown that every λ-system is a σ-algebra, which is false. As we will now see, this is exactly where the π-system property is needed. We make an inductive argument: Y_1 = E_1 \in S. Assume Y_k \in S for all k \leq i. Now Y_{i+1} = E_{i+1} \cup Y_i = \left( E_{i+1}^c \cap Y_i^c \right)^c. As S is closed under complement (point 2 above) and intersection (S is a π-system), we get Y_{i+1} \in S. Hence, proved. (A tiny sketch of this increasing-union trick follows this list.)
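
Here is the tiny sketch of the increasing-union trick mentioned above, with hypothetical sets E_i; the running unions Y_i increase and have the same overall union as the E_i.

from itertools import accumulate

# Hypothetical arbitrary sequence of sets E_1, E_2, E_3, E_4.
E = [{1, 3}, {2}, {3, 5}, {4}]

# Y_1 = E_1 and Y_i = E_i ∪ Y_{i-1}: an increasing sequence with the same union.
Y = list(accumulate(E, lambda acc, e: acc | e))

print(Y)                                      # [{1, 3}, {1, 2, 3}, {1, 2, 3, 5}, {1, 2, 3, 4, 5}]
print(all(a <= b for a, b in zip(Y, Y[1:])))  # True: the Y_i are increasing
print(set().union(*E) == Y[-1])               # True: same overall union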

Dynkin’s π-λ Theorem: If A is a π-system, B is a λ-system, and A \subseteq B, then A \subseteq \sigma(A) \subseteq B.

Proof: Let \ell(A) be the smallest λ-system containing A. As an intersection of λ-systems is another λ-system (Lemma 2), we can define \ell(A) as the intersection of every λ-system containing A. As the power set of X is a λ-system, \ell(A) is well defined. Now, B is another λ-system containing A, and we claim \ell(A) \subseteq B: if this were not the case, then the intersection \ell(A) \cap B would be a λ-system containing A that is strictly smaller than \ell(A), which is a contradiction. This argument will be repeated again and again, so make sure you understand it!

Now, every σ-algebra is a λ-system (Lemma 1) and \sigma(A) contains A; therefore, \ell(A) \subseteq \sigma(A). Here we have used the above argument again to show containment. At this point we have shown \ell(A) \subseteq B and \ell(A) \subseteq \sigma(A). If we can show that \ell(A) = \sigma(A) then we will be done. This is exactly what we will show.

Now this is where the proof gets tricky. We first define a rather unintuitive set:

For any E \subseteq X we define D_E = \{ F \mid F \subseteq X, E \cap F \in \ell(A) \}. We will now make a series of claims:

Claim 1: D_E is a λ-system  for any E \in \ell(A).
Proof of Claim 1:
We prove each property of a λ-system below:

  • X \cap E = E \in \ell(A) therefore, X \in D_E
  • Let F_1, F_2 \in D_E and F_1 \subseteq F_2. This implies F_1 \cap E \in \ell(A) and F_2 \cap E \in \ell(A). Further, F_1 \subseteq F_2 \Rightarrow F_1 \cap  E \subseteq F_2 \cap E.
    As \ell(A) is a λ-system therefore, F_2 \cap E - F_1 \cap E = (F_2 - F_1) \cap E \in \ell(A). From the definition of D_E this implies F_2 - F_1 \in D_E.
  • Let \{F_i\}_{i=1}^\infty be an increasing sequence of sets in D_E, i.e., F_i \subseteq F_{i+1} and F_i \cap E \in \ell(A) for every i. The sets \{F_i \cap E\}_{i=1}^\infty are also increasing, and as \ell(A) is a λ-system therefore, \cup_{i=1}^\infty (F_i \cap E) = \left( \cup_{i=1}^\infty F_i \right) \cap E \in \ell(A). This implies, \cup_{i=1}^\infty F_i \in D_E.

So far we have not used the fact that A is a π-system. The next claim will exploit this property:

Claim 2: A \subseteq D_E for every E \in A.
Proof of Claim 2: As A is a π-system therefore, F \in A \Rightarrow F \cap E \in A. As A \subseteq \ell(A) therefore, F \cap E \in \ell(A). This implies F \in D_E.

Claim 3: Define A \cap \ell(A) = \left\{ E \cap F \mid E \in A, F \in \ell(A)\right\}. Then A \cap \ell(A) \subseteq \ell(A).
Proof of Claim 3:  For E \in A \subseteq \ell(A) we have D_E as a λ-system (Claim 1) containing A (Claim 2). This implies \ell(A) \subseteq D_E for every E \in A. Therefore, for any F \in \ell(A) we have F \in D_E which implies F \cap E \in \ell(A).

Using Claim 3 we will now strengthen Claim 2.

Claim 4: A \subseteq D_E for every E \in \ell(A).
Proof of Claim 4:  Let E \in \ell(A) and F \in A then F \cap E \in \ell(A) (Claim 3). This implies F \in D_E.

Claim 5: \ell(A) is closed under intersection.
Proof of Claim 5: For E \in \ell(A) we have D_E as a λ-system (Claim 1) containing A (Claim 4). This implies \ell(A) \subseteq D_E for every E \in \ell(A). Now let F \in \ell(A). Then F \in D_E and therefore, by definition of D_E we have F \cap E \in \ell(A). Proof is completed by observing that E, F are arbitrary members of \ell(A).

At this point, take a moment to observe the interesting pattern in Claim 2-5.

Wrapping Up: Claim 5 implies \ell(A) is both a π-system and a λ-system. From Lemma 4, this implies it is a σ-algebra. Further, A \subseteq \ell(A), hence we must have \sigma(A) \subseteq \ell(A); otherwise, \ell(A) \cap \sigma(A) would be a σ-algebra (Lemma 3) containing A that is strictly smaller than \sigma(A), a contradiction. However, we also showed earlier that \ell(A) \subseteq \sigma(A). This implies \sigma(A) = \ell(A).

At the beginning of the proof, we showed A \subseteq \ell(A) and \ell(A) \subseteq B. Therefore, we have A \subseteq \ell(A) = \sigma(A) \subseteq B. Hence, proved.
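
For a finite universe the whole theorem can be brute-forced; here is a Python sketch on a hypothetical four-point set, with A the π-system of finite "rays" {1}, {1,2}, {1,2,3}, {1,2,3,4}. We compute \ell(A) by closing A under the λ-operations and \sigma(A) by closing it under complements and unions, and the two collections coincide, as the theorem predicts.

X = frozenset({1, 2, 3, 4})

# Hypothetical pi-system: finite "rays" inside a four-point universe.
A = {frozenset(range(1, k + 1)) for k in range(1, 5)}

def lambda_closure(X, A):
    # Smallest family containing A and X that is closed under differences of
    # nested sets; increasing unions stabilize on a finite universe.
    S = set(A) | {X}
    while True:
        new = {b - a for a in S for b in S if a <= b} - S
        if not new:
            return S
        S |= new

def sigma_closure(X, A):
    # Smallest family containing A and X that is closed under complement and
    # pairwise (hence all finite) unions; on a finite X this is a sigma-algebra.
    S = set(A) | {X}
    while True:
        new = ({X - a for a in S} | {a | b for a in S for b in S}) - S
        if not new:
            return S
        S |= new

l_A = lambda_closure(X, A)
s_A = sigma_closure(X, A)
print(l_A == s_A)  # True: for a pi-system A, l(A) = sigma(A)
print(len(s_A))    # 16: the rays separate points, so sigma(A) is the full power set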

Further Reading

If you are interested in learning more about probability theory, then John Pike's lecture notes are a useful resource.
