Variance of Discrete Random Variable from PGF

Theorem

Let $X$ be a discrete random variable whose probability generating function is $\map {\Pi_X} s$.


Then the variance of $X$ can be obtained from the second derivative of $\map {\Pi_X} s$ with respect to $s$ at $s = 1$:

$\var X = \map {\Pi''_X} 1 + \mu - \mu^2$

where $\mu = \expect X$ is the expectation of $X$.
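As a quick illustration (an example not part of the original statement), take $X$ to be Poisson distributed with parameter $\lambda$, whose PGF is known to be $e^{\lambda \paren {s - 1} }$:

```latex
% Worked check: X ~ Poisson(lambda), with PGF Pi_X(s) = e^{lambda (s - 1)}
\Pi'_X(s)  = \lambda   e^{\lambda (s - 1)} \implies \mu = \Pi'_X(1) = \lambda \\
\Pi''_X(s) = \lambda^2 e^{\lambda (s - 1)} \implies \Pi''_X(1) = \lambda^2 \\
\operatorname{var}(X) = \Pi''_X(1) + \mu - \mu^2 = \lambda^2 + \lambda - \lambda^2 = \lambda
```

which recovers the familiar fact that the variance of a Poisson random variable equals its mean.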


Proof

From the definition of the probability generating function:

$\ds \map {\Pi_X} s = \sum_{x \mathop \ge 0} \map p x s^x$


From Derivatives of Probability Generating Function at One:

$\ds \map {\Pi''_X} s = \sum_{x \mathop \ge 2} x \paren {x - 1} \map p x s^{x - 2}$

This also holds when the terms for $x = 0$ and $x = 1$ are included in the sum, as the factor $x \paren {x - 1}$ makes both of those terms vanish.

So:

$\ds \map {\Pi''_X} s = \sum_{x \mathop \ge 0} x \paren {x - 1} \map p x s^{x - 2}$


Plugging in $s = 1$ gives:

\(\ds \map {\Pi''_X} 1\) \(=\) \(\ds \sum_{x \mathop \ge 0} x \paren {x - 1} \map p x 1^{x - 2}\)
\(\ds \) \(=\) \(\ds \sum_{x \mathop \ge 0} \paren {x^2 - x} \map p x\)
\(\ds \) \(=\) \(\ds \sum_{x \mathop \ge 0} x^2 \map p x - \sum_{x \mathop \ge 0} x \map p x\)
\(\ds \) \(=\) \(\ds \expect {X^2} - \expect X\)


The result follows from the definition of variance:

$\var X = \expect {X^2} - \paren {\expect X}^2$

Substituting $\expect {X^2} = \map {\Pi''_X} 1 + \expect X$ from the above and writing $\mu = \expect X$:

$\var X = \map {\Pi''_X} 1 + \mu - \mu^2$

$\blacksquare$


Motivation

This result shows how to find the variance of a discrete random variable without going through what might be a complicated and fiddly summation.

All you need to do is differentiate its PGF twice, plug in $s = 1$, and combine the result with the mean $\mu$.

Assuming, of course, you know what its PGF is.
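The recipe can also be carried out mechanically with a computer algebra system. Below is a minimal sketch assuming SymPy is available, applied to a binomial random variable, whose PGF $\paren {1 - p + p s}^n$ is standard:

```python
# Sketch (assumes SymPy): differentiate the PGF twice, plug in s = 1,
# and combine with the mean, per var X = Pi''(1) + mu - mu^2.
import sympy as sp

s, n, p = sp.symbols('s n p', positive=True)

# PGF of a Binomial(n, p) random variable
pgf = (1 - p + p * s) ** n

mu = sp.diff(pgf, s).subs(s, 1)            # mu = Pi'(1)
second = sp.diff(pgf, s, 2).subs(s, 1)     # Pi''(1)

var = sp.simplify(second + mu - mu ** 2)   # var X = Pi''(1) + mu - mu^2

# Agrees with the known binomial variance n*p*(1 - p)
assert sp.simplify(var - n * p * (1 - p)) == 0
print(var)
```

The same three lines of calculus work for any distribution whose PGF you can write down in closed form.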


Sources

  • 1986: Geoffrey Grimmett and Dominic Welsh: Probability: An Introduction ... (previous) ... (next): $\S 4.3$: Moments: $(20)$
  • 1998: David Nelson: The Penguin Dictionary of Mathematics (2nd ed.) ... (previous) ... (next): probability generating function
  • 2008: David Nelson: The Penguin Dictionary of Mathematics (4th ed.) ... (previous) ... (next): probability generating function