Definition – Moment Generating Function

Contents
1. Definition
2. M^{(r)}(0) = E(X^r)
3. Moment generating function of sum of iid random variables

1. Definition

Let X be a random variable. The moment generating function of X, denoted M(t), is defined as,

M(t) = E(e^{tX})

That is, in the discrete case, with probability mass function p(x),

M(t) = \sum_x e^{tx}p(x)

And in the continuous case, with probability density function f(x),

M(t) = \int_{-\infty}^{\infty} e^{tx}f(x)~dx
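As a quick sanity check of the definition, the expectation E(e^{tX}) can be approximated by averaging e^{tX} over many samples of X. The sketch below (assuming Python with NumPy, which is not part of the original notes) estimates M(t) for a standard normal X and compares it against the known closed form M(t) = e^{t^2/2}.

```python
import numpy as np

# Monte Carlo sketch of the definition M(t) = E(e^{tX}):
# average e^{tX} over many samples of X ~ N(0, 1), whose MGF
# is known in closed form: M(t) = exp(t**2 / 2).

rng = np.random.default_rng(seed=0)
samples = rng.standard_normal(1_000_000)

for t in [0.0, 0.5, 1.0]:
    estimate = np.mean(np.exp(t * samples))  # Monte Carlo estimate of E(e^{tX})
    exact = np.exp(t**2 / 2)                 # closed-form MGF of N(0, 1)
    print(f"t={t}: estimate={estimate:.4f}, exact={exact:.4f}")
```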

2. M^{(r)}(0) = E(X^r)

For the discrete case, differentiating r times with respect to t (assuming the derivative may be taken inside the sum) gives,

M^{(r)}(t) = \sum_x x^r e^{tx}p(x)

And so,

M^{(r)}(0) = \sum_x x^rp(x) = E(X^r)

A similar derivation holds for the continuous case, where differentiating under the integral sign gives M^{(r)}(0) = \int_{-\infty}^{\infty} x^r f(x)~dx = E(X^r).
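This property can be checked symbolically for a distribution with a known MGF. The sketch below (assuming Python with SymPy, not part of the original notes) differentiates the MGF of an Exponential(lam) random variable, M(t) = lam/(lam - t), r times at t = 0 and compares the result with the known moments E(X^r) = r!/lam^r.

```python
import sympy as sp

# Symbolic check of M^{(r)}(0) = E(X^r) for X ~ Exponential(lam),
# whose MGF is M(t) = lam / (lam - t) for t < lam, and whose
# r-th moment is known to be r! / lam**r.

t, lam = sp.symbols('t lam', positive=True)
M = lam / (lam - t)  # MGF of the exponential distribution

for r in range(1, 4):
    moment = sp.diff(M, t, r).subs(t, 0)      # M^{(r)}(0)
    expected = sp.factorial(r) / lam**r       # known r-th moment
    print(r, sp.simplify(moment - expected))  # prints 0 when they agree
```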

3. Moment generating function of sum of iid random variables

Theorem – Let S = X_1 + X_2 + ... + X_n be the sum of n iid random variables X_i, each with moment generating function M_{X_1}(t). Then, M_S(t) = (M_{X_1}(t))^n.

Proof –

Let S = X_1 + X_2 + ... + X_n be a random variable such that the X_i's are independent and identically distributed random variables. Then, it follows that

M_S(t) = E(e^{tS}) = E(e^{t(X_1 + X_2 + ... + X_n)}) = E(e^{tX_1})\times E(e^{tX_2}) \times ... \times E(e^{tX_n})\quad\text{(1)}

where the final equality holds because the X_i are independent, so the expectation of the product factors into the product of expectations.

Since the X_i are identically distributed, each factor E(e^{tX_i}) equals the common moment generating function M_{X_1}(t), and so (1) can be written as M_S(t) = E(e^{tX_1})^n = (M_{X_1}(t))^n.
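The theorem can also be checked numerically. In the sketch below (assuming Python with NumPy, not part of the original notes), each X_i is Bernoulli(p), so the common MGF is M(t) = 1 - p + p e^t and the theorem predicts M_S(t) = (1 - p + p e^t)^n; a Monte Carlo estimate of E(e^{tS}) should agree.

```python
import numpy as np

# Numerical check of M_S(t) = M(t)^n for S = X_1 + ... + X_n,
# with X_i iid Bernoulli(p). Each X_i has MGF M(t) = 1 - p + p*e^t,
# so the theorem predicts M_S(t) = (1 - p + p*e^t)**n.

rng = np.random.default_rng(seed=1)
n, p, t = 10, 0.3, 0.5

S = rng.binomial(1, p, size=(1_000_000, n)).sum(axis=1)  # samples of S
estimate = np.mean(np.exp(t * S))                        # Monte Carlo E(e^{tS})
predicted = (1 - p + p * np.exp(t)) ** n                 # M(t)^n from the theorem

print(f"E(e^(tS)) ~= {estimate:.4f}, M(t)^n = {predicted:.4f}")
```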