The aim of this post is to prove that a cumulative distribution function (CDF) uniquely determines the probability distribution. This is a well known fundamental result that is quite intuitive but requires some advanced techniques to prove. In this post, we provide a complete proof using Dynkin's π-λ Theorem. The previous post on this topic covered the basics of topology and measure theory. If you are unfamiliar with concepts like measures, open sets, or the Borel σ-algebra, then please see the previous post.

The main result we want to prove is the following.

**Theorem:** Let $X$ and $Y$ be two random variables with CDFs $F_X$ and $F_Y$ respectively. If $F_X = F_Y$ then $\mathbb{P}_X = \mathbb{P}_Y$.

Notice that we want to prove $\mathbb{P}_X = \mathbb{P}_Y$, which essentially means proving $\mathbb{P}_X(A) = \mathbb{P}_Y(A)$ for every $A \in \mathcal{B}(\mathbb{R})$. Proving this for every member of the σ-algebra directly is difficult. Instead we will prove it for a subset of the σ-algebra and then show that it holds for the entire σ-algebra. The last step will be achieved using the powerful Dynkin's π-λ Theorem. This is going to be our first stop.

## Dynkin’s π-λ Theorem

To state the theorem we will have to define two more things:

**Definition (π-system):** A collection $\mathcal{P}$ of subsets of $\Omega$ is called a π-system if it is nonempty and closed under intersection, i.e., $A, B \in \mathcal{P} \Rightarrow A \cap B \in \mathcal{P}$.

It is trivial to verify that every algebra of subsets is a π-system.

**Definition (λ-system):** A collection $\mathcal{L}$ of subsets of $\Omega$ is called a λ-system if:

1. $\Omega \in \mathcal{L}$.

2. For any $A, B \in \mathcal{L}$ such that $A \subseteq B$, we have $B \setminus A \in \mathcal{L}$.

3. For any $A_1, A_2, \ldots \in \mathcal{L}$ where $A_n \subseteq A_{n+1}$ for all $n$, we have $\bigcup_{n=1}^{\infty} A_n \in \mathcal{L}$.

This definition looks quite similar to that of a σ-algebra. Like a σ-algebra, a λ-system is closed under complement. Why? Let $\mathcal{L}$ be a λ-system and let $A \in \mathcal{L}$. Then $A \subseteq \Omega$ and $\Omega \in \mathcal{L}$, and therefore $A^c = \Omega \setminus A \in \mathcal{L}$. However, unlike a σ-algebra, which is closed under countable unions of arbitrary sets, condition (3) above only holds for an increasing sequence of sets ($A_n \subseteq A_{n+1}$). It is however straightforward to show the following:
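On a small finite universe, these definitions can be checked mechanically, which makes the difference between the two concrete. Below is a minimal sketch (the universe, the collections, and the helper names are all illustrative, not from the post); note that on a finite universe property (3) of a λ-system is automatic, since increasing chains of subsets stabilize.

```python
from itertools import combinations

def powerset(omega):
    """All subsets of omega, as frozensets."""
    items = list(omega)
    return {frozenset(c) for r in range(len(items) + 1)
            for c in combinations(items, r)}

def is_pi_system(col):
    # Nonempty and closed under pairwise intersection.
    return bool(col) and all(a & b in col for a in col for b in col)

def is_lambda_system(omega, col):
    # (1) contains omega; (2) closed under differences B \ A for A ⊆ B.
    # Property (3) (increasing unions) holds automatically here, since an
    # increasing chain of subsets of a finite set is eventually constant.
    return omega in col and all(b - a in col
                                for a in col for b in col if a <= b)

omega = frozenset({1, 2, 3, 4})

# The power set is both a pi-system and a lambda-system.
assert is_pi_system(powerset(omega))
assert is_lambda_system(omega, powerset(omega))

# A lambda-system that is NOT a pi-system: two complementary pairs
# together with the empty set and omega.
lam = {frozenset(), omega,
       frozenset({1, 2}), frozenset({3, 4}),
       frozenset({1, 3}), frozenset({2, 4})}
assert is_lambda_system(omega, lam)
assert not is_pi_system(lam)   # {1,2} ∩ {1,3} = {1} is missing
```

The last example shows the two notions genuinely differ: closure under complements and increasing unions does not force closure under intersection.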

**Lemma 1:** Every σ-algebra is both a π-system and a λ-system.

**Proof:** Trivial

With these two definitions, we are ready to state Dynkin's π-λ Theorem.

**Dynkin's π-λ Theorem:** If $\mathcal{P}$ is a π-system, $\mathcal{L}$ is a λ-system, and $\mathcal{P} \subseteq \mathcal{L}$, then $\sigma(\mathcal{P}) \subseteq \mathcal{L}$.

The proof of this theorem is short but tricky. We defer the proof for now to first demonstrate the usefulness of the above result.

**Usefulness:** The way we use Dynkin's π-λ Theorem is by showing that the property of interest holds for a π-system which generates the σ-algebra of interest. Then we show that the collection of sets where the property holds is a λ-system. Dynkin's π-λ Theorem then shows that the property holds for the entire σ-algebra.

## CDF Uniquely Determines the Distribution

**Theorem:** Let $X$ and $Y$ be two random variables with CDFs $F_X$ and $F_Y$ respectively. If $F_X = F_Y$ then $\mathbb{P}_X = \mathbb{P}_Y$.

**Proof:** Let $\mathcal{L}$ be the collection of sets where the two probability measures agree, i.e., $\mathcal{L} = \{A \in \mathcal{B}(\mathbb{R}) : \mathbb{P}_X(A) = \mathbb{P}_Y(A)\}$. We will construct a π-system where the property holds.

**Claim 1:** Let $\mathcal{P} = \{(-\infty, a] : a \in \mathbb{R}\}$ be the collection of closed rays on the real number line. Then $\mathcal{P}$ is a π-system, $\mathcal{P} \subseteq \mathcal{L}$, and $\sigma(\mathcal{P}) = \mathcal{B}(\mathbb{R})$.

**Proof of Claim 1:** The proof directly exploits the relation between closed rays and the CDF. For any value of $a \in \mathbb{R}$ we have

$$\mathbb{P}_X((-\infty, a]) = F_X(a) = F_Y(a) = \mathbb{P}_Y((-\infty, a]).$$

Hence, $(-\infty, a] \in \mathcal{L}$, i.e., all closed rays are in $\mathcal{L}$. Secondly, $\mathcal{P}$ is closed under intersection, as the intersection of closed rays is another closed ray ($(-\infty, a] \cap (-\infty, b] = (-\infty, \min(a, b)]$). As $\mathcal{P}$ is also nonempty, it is a π-system. As we proved in the last post, the collection of closed rays generates the Borel σ-algebra, i.e., $\sigma(\mathcal{P}) = \mathcal{B}(\mathbb{R})$.
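The two facts used in Claim 1 are easy to sanity-check numerically. Here is a hedged sketch, representing the ray $(-\infty, a]$ by its endpoint $a$; the exponential CDF and all helper names are illustrative choices, not from the post:

```python
import math

def intersect_rays(a, b):
    # (-inf, a] ∩ (-inf, b] = (-inf, min(a, b)]
    return min(a, b)

def ray_prob(cdf, a):
    # P((-inf, a]) = F(a), by the definition of the CDF.
    return cdf(a)

# Example CDF: standard exponential, F(x) = 1 - e^{-x} for x >= 0.
def F(x):
    return 1 - math.exp(-x) if x >= 0 else 0.0

assert intersect_rays(1.5, 2.0) == 1.5
assert abs(ray_prob(F, 1.0) - (1 - math.exp(-1))) < 1e-12
# Two variables whose CDFs agree pointwise trivially assign equal
# probability to every closed ray, which is exactly P ⊆ L.
```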

Great! So now we have a π-system where the desired result holds and that generates $\mathcal{B}(\mathbb{R})$, which in this case is the σ-algebra of interest.

**Claim 2:** $\mathcal{L}$ is a λ-system.

**Proof of Claim 2:** We prove each required property of a λ-system below.

- From the unitarity property of a probability measure we have $\mathbb{P}_X(\mathbb{R}) = 1 = \mathbb{P}_Y(\mathbb{R})$. Therefore, $\mathbb{R} \in \mathcal{L}$.
- Let $A, B \in \mathcal{L}$ with $A \subseteq B$. Then $A$ and $B \setminus A$ are disjoint and $A \cup (B \setminus A) = B$. Therefore, for any probability measure $\mathbb{P}$ we have $\mathbb{P}(B \setminus A) = \mathbb{P}(B) - \mathbb{P}(A)$. Then $\mathbb{P}_X(B \setminus A) = \mathbb{P}_X(B) - \mathbb{P}_X(A) = \mathbb{P}_Y(B) - \mathbb{P}_Y(A) = \mathbb{P}_Y(B \setminus A)$. Therefore, $B \setminus A \in \mathcal{L}$.
- Let $A_1 \subseteq A_2 \subseteq \cdots$ be a sequence of increasing sets in $\mathcal{L}$ and let $A = \bigcup_{n=1}^{\infty} A_n$. For any probability measure $\mathbb{P}$, continuity from below gives $\mathbb{P}(A) = \lim_{n \to \infty} \mathbb{P}(A_n)$. We have $\mathbb{P}_X(A_n) = \mathbb{P}_Y(A_n)$ for every value of $n$. If two sequences have the same members then they have the same limit. Hence, $\mathbb{P}_X(A) = \mathbb{P}_Y(A)$. Therefore, $A \in \mathcal{L}$.
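The three measure-theoretic facts used above (unitarity, the difference rule $\mathbb{P}(B \setminus A) = \mathbb{P}(B) - \mathbb{P}(A)$ for $A \subseteq B$, and continuity from below) can be illustrated with a small discrete measure. A minimal sketch, with the measure and names being illustrative:

```python
from fractions import Fraction

# A discrete probability measure on Omega = {0, 1, 2, 3}.
weight = {0: Fraction(1, 2), 1: Fraction(1, 4),
          2: Fraction(1, 8), 3: Fraction(1, 8)}

def P(A):
    return sum(weight[x] for x in A)

omega = set(weight)
assert P(omega) == 1                       # unitarity
A, B = {0}, {0, 1, 2}
assert P(B - A) == P(B) - P(A)             # difference rule for A ⊆ B
# Continuity from below: A_n = {0, ..., n} increases to Omega, and
# P(A_n) increases to P(Omega) = 1.
probs = [P(set(range(n + 1))) for n in range(4)]
assert probs == sorted(probs) and probs[-1] == 1
```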

**Wrapping up:** We showed that $\mathcal{P} \subseteq \mathcal{L}$, where $\mathcal{P}$ is a π-system and $\mathcal{L}$ is a λ-system. Therefore, using Dynkin's π-λ Theorem we get $\mathcal{B}(\mathbb{R}) = \sigma(\mathcal{P}) \subseteq \mathcal{L}$. What does this mean? It means that for any $A \in \mathcal{B}(\mathbb{R})$ we have $A \in \mathcal{L}$ and therefore, by definition of $\mathcal{L}$, we have $\mathbb{P}_X(A) = \mathbb{P}_Y(A)$. This is exactly what we wanted to prove.

## Proof of Dynkin's π-λ Theorem

In order to prove this result we will make use of a couple of basic lemmas:

**Lemma 2:** An intersection of λ-systems is a λ-system.

**Proof:** The proof is trivial; we verify one property as an example. Let $\{\mathcal{L}_i\}_{i \in I}$ be a set of λ-systems (so this is a set of sets of sets #inception) and let $\mathcal{L} = \bigcap_{i \in I} \mathcal{L}_i$. Now let $A, B \in \mathcal{L}$ with $A \subseteq B$. As each $\mathcal{L}_i$ is a λ-system, therefore $B \setminus A \in \mathcal{L}_i$ for every $i \in I$. This implies $B \setminus A \in \mathcal{L}$. The other two properties follow similarly.

**Lemma 3:** An intersection of σ-algebras is a σ-algebra.

**Proof:** Trivial to verify.

**Lemma 4:** If a collection of subsets of $\Omega$ is both a π-system and a λ-system, then it is a σ-algebra.

**Proof:** Let $\mathcal{S}$ be both a π-system and a λ-system. We will show that it satisfies all three properties of a σ-algebra.

- As $\mathcal{S}$ is a λ-system, therefore $\Omega \in \mathcal{S}$. Similarly, using the second property of a λ-system we have $\emptyset = \Omega \setminus \Omega \in \mathcal{S}$.
- Let $A \in \mathcal{S}$; then $A^c = \Omega \setminus A \in \mathcal{S}$.
- Let $B_1, B_2, \ldots$ be a countable collection of sets in $\mathcal{S}$. Define a sequence of increasing sets as follows: $A_1 = B_1$ and $A_{i+1} = A_i \cup B_{i+1}$. Further, $\bigcup_{i=1}^{\infty} A_i = \bigcup_{i=1}^{\infty} B_i$. If somehow we could show that $A_i \in \mathcal{S}$ for every $i$, then property (3) of a λ-system would give $\bigcup_{i=1}^{\infty} B_i = \bigcup_{i=1}^{\infty} A_i \in \mathcal{S}$. This would be interesting, since we would then have shown that every λ-system is a σ-algebra, as we haven't used the property of a π-system so far. However, as we will see, we need that property to show $A_i \in \mathcal{S}$. We make an inductive argument: $A_1 = B_1 \in \mathcal{S}$. Assume $A_k \in \mathcal{S}$ for all values of $k \le i$. Now $A_{i+1} = A_i \cup B_{i+1} = \left(A_i^c \cap B_{i+1}^c\right)^c$. As $\mathcal{S}$ is closed under complement (point 2 above) and intersection ($\mathcal{S}$ is a π-system), therefore $A_{i+1} \in \mathcal{S}$. Hence, proved.
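The inductive construction in the last bullet can be traced on finite sets. A small sketch (the sets and names are illustrative) verifying both that the $A_i$ are increasing with the same union as the $B_i$, and the De Morgan step $A_{i+1} = (A_i^c \cap B_{i+1}^c)^c$ that needs both closure properties:

```python
omega = frozenset(range(6))
B = [frozenset({0, 1}), frozenset({1, 3}), frozenset({2}), frozenset({4, 5})]

# A_1 = B_1 and A_{i+1} = A_i ∪ B_{i+1}: increasing, with the same union.
A = [B[0]]
for b in B[1:]:
    A.append(A[-1] | b)

assert all(A[i] <= A[i + 1] for i in range(len(A) - 1))  # increasing
assert frozenset().union(*A) == frozenset().union(*B)    # same union

# De Morgan step: A_{i+1} = (A_i^c ∩ B_{i+1}^c)^c, which uses closure
# under complement (lambda-system) and intersection (pi-system).
for i in range(len(B) - 1):
    assert A[i + 1] == omega - ((omega - A[i]) & (omega - B[i + 1]))
```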

**Dynkin's π-λ Theorem:** If $\mathcal{P}$ is a π-system, $\mathcal{L}$ is a λ-system, and $\mathcal{P} \subseteq \mathcal{L}$, then $\sigma(\mathcal{P}) \subseteq \mathcal{L}$.

**Proof:** Let $\ell(\mathcal{P})$ be the smallest λ-system containing $\mathcal{P}$. As an intersection of λ-systems is another λ-system (Lemma 2), we can define $\ell(\mathcal{P})$ as the intersection of every λ-system containing $\mathcal{P}$. As the power set of $\Omega$ is a λ-system containing $\mathcal{P}$, therefore $\ell(\mathcal{P})$ is well defined. As $\mathcal{L}$ is another λ-system containing $\mathcal{P}$, we claim $\ell(\mathcal{P}) \subseteq \mathcal{L}$: if this wasn't the case, then the intersection $\ell(\mathcal{P}) \cap \mathcal{L}$ would be a smaller λ-system containing $\mathcal{P}$, which is a contradiction. This argument will be repeated again and again, so make sure you understand it!
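On a finite universe, the smallest λ-system containing a collection can actually be computed by closing under the λ-operations until nothing new appears. A hedged sketch (the function and seed collection are illustrative):

```python
def lambda_closure(omega, seed):
    # Smallest lambda-system containing `seed`: start from seed ∪ {omega}
    # and repeatedly close under proper differences B \ A for A ⊆ B until
    # a fixed point. (On a finite universe, increasing unions add nothing
    # new, since increasing chains of subsets stabilize.)
    col = set(seed) | {omega}
    while True:
        new = {b - a for a in col for b in col if a <= b} - col
        if not new:
            return col
        col |= new

omega = frozenset({1, 2, 3, 4})
lam = lambda_closure(omega, {frozenset({1, 2})})
# The closure of a single set is {∅, {1,2}, {3,4}, Ω}.
assert lam == {frozenset(), frozenset({1, 2}), frozenset({3, 4}), omega}
```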

Now every σ-algebra is a λ-system, and as $\sigma(\mathcal{P})$ contains $\mathcal{P}$, therefore $\ell(\mathcal{P}) \subseteq \sigma(\mathcal{P})$. Here we have used the above argument to show containment. At this point we have shown $\ell(\mathcal{P}) \subseteq \mathcal{L}$ and $\ell(\mathcal{P}) \subseteq \sigma(\mathcal{P})$. If we can show that $\sigma(\mathcal{P}) \subseteq \ell(\mathcal{P})$ then we will be done. This is exactly what we will show.

Now this is where the proof gets tricky. We first define a rather unintuitive set:

For any $A \in \ell(\mathcal{P})$ we define $\mathcal{G}_A = \{B \subseteq \Omega : A \cap B \in \ell(\mathcal{P})\}$. We will now make a couple of claims:

**Claim 1:** $\mathcal{G}_A$ is a λ-system for any $A \in \ell(\mathcal{P})$.

**Proof of Claim 1:** We prove each property of a λ-system below:

- $A \cap \Omega = A \in \ell(\mathcal{P})$, therefore $\Omega \in \mathcal{G}_A$.
- Let $B_1, B_2 \in \mathcal{G}_A$ with $B_1 \subseteq B_2$. This implies $A \cap B_1 \in \ell(\mathcal{P})$ and $A \cap B_2 \in \ell(\mathcal{P})$, with $A \cap B_1 \subseteq A \cap B_2$. Further, $A \cap (B_2 \setminus B_1) = (A \cap B_2) \setminus (A \cap B_1)$. As $\ell(\mathcal{P})$ is a λ-system, therefore $(A \cap B_2) \setminus (A \cap B_1) \in \ell(\mathcal{P})$. From the definition of $\mathcal{G}_A$ this implies $B_2 \setminus B_1 \in \mathcal{G}_A$.
- Let $B_1 \subseteq B_2 \subseteq \cdots$ be a countable increasing collection of sets in $\mathcal{G}_A$. This implies $A \cap B_1 \subseteq A \cap B_2 \subseteq \cdots$ with each $A \cap B_n \in \ell(\mathcal{P})$. As $\ell(\mathcal{P})$ is a λ-system, therefore $\bigcup_n (A \cap B_n) = A \cap \bigcup_n B_n \in \ell(\mathcal{P})$. This implies $\bigcup_n B_n \in \mathcal{G}_A$.
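The two set identities doing the work here, $A \cap (B_2 \setminus B_1) = (A \cap B_2) \setminus (A \cap B_1)$ and distributivity of intersection over unions, can be spot-checked on small sets (the particular sets below are illustrative):

```python
A  = frozenset({1, 2, 3, 5})
B1 = frozenset({2, 3})
B2 = frozenset({2, 3, 4, 5})   # B1 ⊆ B2

# Identity behind the second bullet.
assert A & (B2 - B1) == (A & B2) - (A & B1)

# Distributivity behind the third bullet: A ∩ (∪_n B_n) = ∪_n (A ∩ B_n).
Bs = [frozenset({1}), frozenset({1, 2}), frozenset({1, 2, 4})]
union = frozenset().union(*Bs)
assert A & union == frozenset().union(*(A & b for b in Bs))
```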

So far we have not used the fact that $\mathcal{P}$ is a π-system. The next claim will exploit this property:

**Claim 2:** $\mathcal{P} \subseteq \mathcal{G}_A$ for every $A \in \mathcal{P}$.

**Proof of Claim 2:** Let $B \in \mathcal{P}$. As $\mathcal{P}$ is a π-system, therefore $A \cap B \in \mathcal{P}$. As $\mathcal{P} \subseteq \ell(\mathcal{P})$, therefore $A \cap B \in \ell(\mathcal{P})$. This implies $B \in \mathcal{G}_A$.

**Claim 3:** Define $\mathcal{G} = \bigcap_{A \in \mathcal{P}} \mathcal{G}_A$. Then $\ell(\mathcal{P}) \subseteq \mathcal{G}$.

**Proof of Claim 3:** For $A \in \mathcal{P}$ we have $\mathcal{G}_A$ as a λ-system (Claim 1) containing $\mathcal{P}$ (Claim 2). This implies $\ell(\mathcal{P}) \subseteq \mathcal{G}_A$ for every $A \in \mathcal{P}$. Therefore, for any $B \in \ell(\mathcal{P})$ we have $B \in \mathcal{G}_A$ for every $A \in \mathcal{P}$, which implies $B \in \mathcal{G}$.

Using Claim 3 we will now strengthen Claim 2.

**Claim 4:** $\mathcal{P} \subseteq \mathcal{G}_B$ for every $B \in \ell(\mathcal{P})$.

**Proof of Claim 4:** Let $A \in \mathcal{P}$ and $B \in \ell(\mathcal{P})$; then $B \in \mathcal{G}_A$ (Claim 3), i.e., $A \cap B \in \ell(\mathcal{P})$. This implies $A \in \mathcal{G}_B$.

**Claim 5:** $\ell(\mathcal{P})$ is closed under intersection.

**Proof of Claim 5:** For $B \in \ell(\mathcal{P})$ we have $\mathcal{G}_B$ as a λ-system (Claim 1) containing $\mathcal{P}$ (Claim 4). This implies $\ell(\mathcal{P}) \subseteq \mathcal{G}_B$ for every $B \in \ell(\mathcal{P})$. Now let $A, B \in \ell(\mathcal{P})$. Then $A \in \mathcal{G}_B$ and therefore, by definition of $\mathcal{G}_B$, we have $A \cap B \in \ell(\mathcal{P})$. The proof is completed by observing that $A$ and $B$ are arbitrary members of $\ell(\mathcal{P})$.

At this point, take a moment to observe the interesting pattern in Claims 2–5.

**Wrapping Up:** Claim 5 implies $\ell(\mathcal{P})$ is both a π-system and a λ-system. From Lemma 4, this implies it is a σ-algebra. Further, $\mathcal{P} \subseteq \ell(\mathcal{P})$, hence we must have $\sigma(\mathcal{P}) \subseteq \ell(\mathcal{P})$: otherwise, $\sigma(\mathcal{P}) \cap \ell(\mathcal{P})$ would be a σ-algebra which is smaller than $\sigma(\mathcal{P})$ and contains $\mathcal{P}$ (Lemma 3). However, we also showed earlier that $\ell(\mathcal{P}) \subseteq \sigma(\mathcal{P})$. This implies $\ell(\mathcal{P}) = \sigma(\mathcal{P})$.

At the beginning of the proof, we showed $\ell(\mathcal{P}) \subseteq \mathcal{L}$ and $\ell(\mathcal{P}) \subseteq \sigma(\mathcal{P})$. Therefore, we have $\sigma(\mathcal{P}) = \ell(\mathcal{P}) \subseteq \mathcal{L}$. Hence, proved.

## Further Reading

If you are interested in learning more about probability theory, then John Pike's lecture notes are a useful resource:

- Elementary Probability Theory, John Pike
- Probability Theory 1, John Pike
- Probability Theory 2, John Pike