May 12, 2022, 1:49 p.m.
Expected Value of a Conditional Expectation
By Maurice Ticas
In Section 3.7, titled "Conditional distributions and conditional expectation", of Geoffrey Grimmett's Probability and Random Processes text, we are introduced to the probabilistic tool that
\[ \mathbb{E}[\psi (X)] = \mathbb{E}[Y].\]
Here we elaborate on the basic idea behind this probabilistic tool.
Let \( X,Y \) be two random variables defined on the same probability space \( (\Omega, \mathcal{B}, \mathbb{P}) \). To understand this tool, let \( \psi(X) \) denote the conditional expectation of \(Y\) given \(X\): for a realization \(x\) of \(X\), we have \( \psi(x) = \mathbb{E}[Y|X=x] \). Thus the two random variables \(X, Y\) give rise to a new random variable \( \psi(X) \).
The expected value of this new random variable is the expected value of the conditional expectation of \(Y\) given \(X\). In symbols,
\[ \mathbb{E}[\psi(X)] = \mathbb{E}[\mathbb{E}(Y|X)]. \]
The probabilistic tool is that \( \mathbb{E}[\psi(X)] \) is equal to \( \mathbb{E}[Y] \). In Grimmett's text, the tool is stated as a theorem.
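To see the theorem in action before any algebra, here is a minimal Monte Carlo sketch (a toy model of my own, assuming numpy; it is not an example from the text): take \(X\) uniform on \([0,1]\) and, given \(X = x\), let \(Y\) be normal with mean \(x\), so that \( \psi(x) = x \). The two sample averages should agree.

```python
# Toy check of E[psi(X)] = E[Y]: X ~ Uniform(0, 1) and Y | X = x ~ Normal(x, 1),
# so psi(x) = E[Y | X = x] = x and both expectations equal 1/2.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

x = rng.uniform(0.0, 1.0, size=n)   # realizations of X
y = rng.normal(loc=x, scale=1.0)    # draw Y given each realization of X
psi = x                             # psi(X) = E[Y | X] is just X here

print(psi.mean())  # estimate of E[psi(X)], close to 0.5
print(y.mean())    # estimate of E[Y], also close to 0.5
```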
Grimmett further illustrates the use of this tool with an example of a chicken that lays \(N\) eggs, where \(N\) has a Poisson distribution with parameter \(\lambda\). Each laid egg independently hatches into a chick with probability \(p\), so that, given \(N = n\) laid eggs, the number of chicks \(K\) has a binomial distribution with parameters \(n\) and \(p\).
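Before turning to the harder question below, note what the tool buys us in the forward direction. Since \( \mathbb{E}(K|N) = pN \) for a binomial count, a single line with no algebra at all gives
\[ \mathbb{E}[K] = \mathbb{E}[\mathbb{E}(K|N)] = \mathbb{E}[pN] = p\lambda. \]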
The interesting question is the one that travels backward in time: given that we know the number \(k\) of chicks, what is the expected number \(n\) of eggs that our mama chicken has laid?
In answering this question, we must first find the conditional mass function \( f_{N|K} \). Some heavy algebra shows that, for \(n \geq k\), \( f_{N|K}(n|k) \) is equal to
\[
\dfrac{\binom{n}{k} p^{k} (1-p)^{n-k} (\lambda^{n}/n!) e^{-\lambda}}{\sum_{m \geq k} \binom{m}{k} p^{k} (1-p)^{m-k} (\lambda^{m}/m!) e^{-\lambda}} = \dfrac{(q\lambda)^{n-k}}{(n-k)!} e^{-q\lambda},
\]
where \(q = 1-p\).
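For the curious, the bulk of that heavy algebra lives in the denominator. Writing \( \binom{m}{k}/m! = 1/(k!\,(m-k)!) \) and re-indexing the sum with \( j = m-k \) collapses it to a closed form:
\[
\sum_{m \geq k} \binom{m}{k} p^{k} q^{m-k} \dfrac{\lambda^{m}}{m!} e^{-\lambda}
= \dfrac{(p\lambda)^{k}}{k!} e^{-\lambda} \sum_{j \geq 0} \dfrac{(q\lambda)^{j}}{j!}
= \dfrac{(p\lambda)^{k}}{k!} e^{-p\lambda}.
\]
Dividing the numerator by this expression gives the right-hand side above. (As a bonus, the closed form says that \(K\) itself is Poisson with parameter \(p\lambda\), consistent with \( \mathbb{E}[K] = p\lambda \) from earlier.)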
After obtaining the conditional mass function, further algebra must be performed to finally arrive at the conditional expectation. (Alternatively, notice that the mass function above says that, given \(K = k\), the distribution of \(N - k\) is Poisson with parameter \(q\lambda\), so the answer is immediate.) Either way,
\[
\mathbb{E}(N|K = k) = \sum_{n \geq k} n \dfrac{(q\lambda)^{n-k}}{(n-k)!} e^{-q\lambda} = k + q\lambda.
\]
Therefore, we get our final conclusion that \( \psi(K) = K + q\lambda \). As a sanity check, the tool itself confirms the answer: \( \mathbb{E}[\psi(K)] = \mathbb{E}[K] + q\lambda = p\lambda + q\lambda = \lambda = \mathbb{E}[N] \), exactly as the theorem demands.
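For readers who, like me, distrust heavy algebra, here is a rough simulation check (my own sketch, again assuming numpy): simulate many \((N, K)\) pairs, keep the runs where exactly \(k\) chicks hatched, and average \(N\) over those runs.

```python
# Simulation check of E(N | K = k) = k + q*lambda for the egg/chick model.
import numpy as np

rng = np.random.default_rng(1)
lam, p, k = 4.0, 0.3, 2
q = 1.0 - p

eggs = rng.poisson(lam, size=1_000_000)   # N ~ Poisson(lambda)
chicks = rng.binomial(eggs, p)            # K | N = n is Binomial(n, p)

estimate = eggs[chicks == k].mean()       # Monte Carlo E(N | K = k)
print(estimate, k + q * lam)              # both should be near 4.8
```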
As we now know, the mechanics behind many of our probabilistic tools require heavy, daunting algebra. How can we alleviate such mechanical tasks to better illuminate the probabilistic ideas? Your guess is as good as mine.