I never quite understood the moment generating function (MGF) before. Today I finally dug into it, and writing this summary is what made the roles of moments and the MGF click for me. If you don't understand them either, take a few minutes to read this article.

First, what is a moment?

In statistics, a moment is an expectation such as $E(X), E(X^2), \dots$, which are respectively the first moment, second moment, and so on of a distribution.

So what are these expectations good for in statistics? The first two are relatively simple: everyone knows the first moment is the mean of a distribution, and the second moment to some extent reflects its variance, since $\sigma^2 = E(X^2) - E(X)^2$. But the third and fourth moments are also important parameters of a distribution. The third moment reflects the asymmetry of the distribution, also called its skewness:
$$skewness = E((x-\mu)^3)/\sigma^3$$
The fourth moment reflects the kurtosis (how heavy the tails are):
$$kurtosis = E((x-\mu)^4)/\sigma^4$$
So for us, moments are very important parameters for describing the distribution of a sample.
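To make these four quantities concrete, here is a minimal sketch (assuming NumPy is available; the small data array is my own made-up example) that computes the mean, variance, skewness, and kurtosis of a sample directly from their definitions:

```python
import numpy as np

# A small hypothetical sample, skewed to the right by the outlier 9.0
x = np.array([1.0, 2.0, 2.0, 3.0, 4.0, 4.0, 5.0, 9.0])

mu = x.mean()                                    # first moment: E(X)
var = ((x - mu) ** 2).mean()                     # second central moment: variance
sigma = var ** 0.5
skewness = ((x - mu) ** 3).mean() / sigma ** 3   # third standardized moment
kurtosis = ((x - mu) ** 4).mean() / sigma ** 4   # fourth standardized moment

print(mu, var, skewness, kurtosis)
```

For this sample the skewness comes out positive, matching the long right tail caused by the outlier.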

Next, what is a moment generating function (MGF)? My understanding is that the MGF is a tool built to help you compute moments (as its name says, it generates moments). First, recall how a moment is computed directly:
$$E(X^n) = \int_{-\infty}^{\infty} x^n \times pdf(x)\, dx$$

And the MGF is:
$$MGF(t) = E(e^{tx}) = \begin{cases} \sum e^{tx} \times p(x), & x \text{ discrete}; \; p(x): \text{pmf} \\ \int e^{tx} \times f(x)\, dx, & x \text{ continuous}; \; f(x): \text{pdf} \end{cases}$$
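To make the discrete case concrete, here is a minimal sketch (pure standard library; the fair six-sided die is my own example distribution) of computing an MGF directly from the sum in the definition:

```python
import math

# pmf of a fair six-sided die (a hypothetical example distribution)
pmf = {k: 1 / 6 for k in range(1, 7)}

def mgf(t):
    # MGF(t) = sum over outcomes of e^{t*x} * p(x)
    return sum(math.exp(t * x) * p for x, p in pmf.items())

print(mgf(0.0))  # M(0) = E(e^0) = 1 for any distribution
```

Note that $MGF(0) = E(e^0) = 1$ always, which is a handy sanity check on any MGF you derive.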

At first glance this may look confusing: we asked for the moment $E(X^n)$, not $E(e^{tx})$. So keep looking. From our past knowledge of statistics, we know that the best way to get a moment from the MGF is to take the $n$-th derivative of the MGF and then substitute $t = 0$, which yields $E(X^n)$.
n-th moment:
$$E(X^n) = \frac{d^n}{dt^n} MGF(t) \Big|_{t=0}$$
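As a quick sanity check of this rule, here is a sketch using SymPy (my choice of tool) on the standard normal distribution, whose MGF is the known closed form $e^{t^2/2}$:

```python
import sympy as sp

t = sp.symbols('t')
M = sp.exp(t**2 / 2)  # MGF of the standard normal distribution

# n-th moment = n-th derivative of the MGF, evaluated at t = 0
moments = [sp.diff(M, t, n).subs(t, 0) for n in range(1, 5)]
print(moments)  # E(X)=0, E(X^2)=1, E(X^3)=0, E(X^4)=3
```

The recovered moments match the known values for the standard normal: mean 0, variance 1, skewness 0, and fourth moment 3.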

But why does this work? Next we give the derivation. Using the Taylor expansion, we get:

$$e^{tx} = 1 + tx + \frac{(tx)^2}{2!} + \frac{(tx)^3}{3!} + \frac{(tx)^4}{4!} + \dots + \frac{(tx)^n}{n!} + \dots$$
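A quick numerical check of this expansion (truncated at ten terms; the value 0.7 for the product $tx$ is arbitrary):

```python
import math

tx = 0.7  # an arbitrary value for the product t*x
# Truncated Taylor series: sum of (tx)^n / n! for n = 0..9
approx = sum(tx**n / math.factorial(n) for n in range(10))
print(approx, math.exp(tx))  # the two agree to many decimal places
```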

Then we can take the expectation of both sides:

$$E(e^{tx}) = E(1) + tE(x) + \frac{t^2}{2!}E(x^2) + \frac{t^3}{3!}E(x^3) + \frac{t^4}{4!}E(x^4) + \dots + \frac{t^n}{n!}E(x^n) + \dots$$

Now take the first derivative of this expectation with respect to $t$:

$$\frac{d}{dt}E(e^{tx}) = \frac{d}{dt}E(1) + \frac{d}{dt}tE(x) + \frac{d}{dt}\frac{t^2}{2!}E(x^2) + \frac{d}{dt}\frac{t^3}{3!}E(x^3) + \frac{d}{dt}\frac{t^4}{4!}E(x^4) + \dots + \frac{d}{dt}\frac{t^n}{n!}E(x^n) + \dots$$

Substituting $t = 0$:

$$= 0 + E(x) + 0 + \dots + 0 = E(X)$$

What we get in the end is the expectation of $X$, i.e. the first moment. If we want the second moment, we differentiate once more before substituting $t = 0$. Likewise, the n-th moment is obtained by differentiating $n$ times on this basis.
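The "differentiate, then set $t = 0$" recipe can also be checked numerically with finite differences. A minimal sketch (pure standard library; the Bernoulli distribution and step size are my own choices), using $M(t) = 1 - p + pe^t$, for which $E(X) = p$ and $E(X^2) = p$ (since $X^2 = X$ for 0/1 values):

```python
import math

p = 0.3  # Bernoulli success probability (hypothetical example)
M = lambda t: (1 - p) + p * math.exp(t)  # MGF of Bernoulli(p)

h = 1e-4
# Central differences approximate the first and second derivatives at t = 0.
m1 = (M(h) - M(-h)) / (2 * h)            # ~ E(X)   = p
m2 = (M(h) - 2 * M(0.0) + M(-h)) / h**2  # ~ E(X^2) = p
print(m1, m2)
```

Both approximations land on $p = 0.3$ up to the discretization error of the finite differences.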

You may still wonder: why not just compute $E(X^n)$ directly instead of going through the MGF? The answer is also obvious: compared with solving a higher-order integral, taking derivatives is much easier.

Let's take an example. Suppose our pdf is an exponential distribution, so we have:
$$f(x) = \begin{cases} \lambda e^{-\lambda x}, & x > 0 \\ 0, & \text{else} \end{cases}$$

Then its MGF is:

$$MGF(t) = E(e^{tx}) = \int_{0}^{\infty} e^{tx} \times \lambda e^{-\lambda x}\, dx$$
$$= \lambda \int_{0}^{\infty} e^{(t-\lambda)x}\, dx, \quad (t - \lambda < 0)$$
$$= \lambda \left[ \frac{1}{t-\lambda} e^{(t-\lambda)x} \right]_{0}^{\infty}$$
$$= \frac{\lambda}{\lambda - t}$$

Isn't the final result simple? Now any moment comes from differentiating this simple expression. For example, the third moment is
$$E(x^3) = \frac{d^3}{dt^3}\left(\frac{\lambda}{\lambda - t}\right)\Bigg|_{t=0},$$
whereas computing it from the definition of the third moment would require
$$E(x^3) = \int_{0}^{\infty} x^3 \lambda e^{-\lambda x}\, dx.$$
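To see the comparison concretely, here is a sketch with SymPy (my choice of tool); both routes should give $6/\lambda^3$:

```python
import sympy as sp

x, t, lam = sp.symbols('x t lam', positive=True)

# Route 1: differentiate the MGF lam/(lam - t) three times, then set t = 0.
via_mgf = sp.diff(lam / (lam - t), t, 3).subs(t, 0)

# Route 2: the defining integral E(X^3) = int_0^oo x^3 * lam * e^(-lam*x) dx.
via_integral = sp.integrate(x**3 * lam * sp.exp(-lam * x), (x, 0, sp.oo))

print(sp.simplify(via_mgf), sp.simplify(via_integral))  # both equal 6/lam**3
```

The answers agree, but the derivative route is three applications of the quotient rule, while the integral route needs three rounds of integration by parts.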

So, a derivative versus an integral: which would you rather compute?
