Definition

Let $X$ be a random variable with pdf $f(x;\theta)$, where $\theta$ is an unknown parameter.

Then the Fisher information is defined as:

$$
I(\theta) = E\left[\left(\frac{\partial}{\partial \theta} \ln f(X;\theta)\right)^2\right]
$$
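
As a concrete worked example (an illustration, not part of the definition), take $X \sim \mathrm{Bernoulli}(\theta)$, so that $\ln f(x;\theta) = x \ln \theta + (1-x)\ln(1-\theta)$. Then

$$
\frac{\partial}{\partial \theta} \ln f(x;\theta) = \frac{x}{\theta} - \frac{1-x}{1-\theta},
\qquad
I(\theta) = \theta \cdot \frac{1}{\theta^2} + (1-\theta) \cdot \frac{1}{(1-\theta)^2} = \frac{1}{\theta(1-\theta)}
$$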

Remark

> [!important]
> The bigger the Fisher information $I(\theta)$, the more information the sample provides about $\theta$.

Under regularity conditions, the Fisher information can equivalently be written as:

$$
I(\theta) = -E\left[\frac{\partial^2}{\partial \theta^2} \ln f(X;\theta)\right]
$$
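
Continuing the Bernoulli example, the second-derivative form gives the same value, as it should:

$$
\frac{\partial^2}{\partial \theta^2} \ln f(x;\theta) = -\frac{x}{\theta^2} - \frac{1-x}{(1-\theta)^2},
\qquad
-E\left[\frac{\partial^2}{\partial \theta^2} \ln f(X;\theta)\right] = \frac{\theta}{\theta^2} + \frac{1-\theta}{(1-\theta)^2} = \frac{1}{\theta(1-\theta)}
$$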

For a random sample $X_1, \ldots, X_n$ from $f(x;\theta)$, the Fisher information is:

$$
I_n(\theta) = nI(\theta) = -nE\left[\frac{\partial^2}{\partial \theta^2} \ln f(X;\theta)\right]
$$

> [!note]
>
> Fisher information measures the amount of information that the sample carries about the parameter $\theta$. It is the weighted mean of $\left(\frac{\partial}{\partial \theta} \ln f(x;\theta)\right)^2$, where the weights are given by the pdf $f(x;\theta)$.
>
> The greater these derivatives are on average, the more information we get about $\theta$. If the derivatives were equal to zero (so that $\theta$ would not appear in $\ln f(x;\theta)$), there would be zero information about $\theta$.
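
As a quick sanity check, here is a minimal numerical sketch (assuming only NumPy; the parameter value `theta` and the finite-difference step `h` are illustrative choices) that evaluates both forms of $I(\theta)$ for the Bernoulli example by exact enumeration over $x \in \{0, 1\}$ and compares them to the closed form $1/(\theta(1-\theta))$:

```python
import numpy as np

theta = 0.3   # illustrative parameter value
h = 1e-5      # finite-difference step (illustrative choice)

def log_f(x, t):
    """Log-pmf of Bernoulli(t): ln f(x;t) = x ln t + (1 - x) ln(1 - t)."""
    return x * np.log(t) + (1 - x) * np.log(1 - t)

def score(x, t):
    """Central-difference estimate of (d/dt) ln f(x;t)."""
    return (log_f(x, t + h) - log_f(x, t - h)) / (2 * h)

def second_deriv(x, t):
    """Central-difference estimate of (d^2/dt^2) ln f(x;t)."""
    return (log_f(x, t + h) - 2 * log_f(x, t) + log_f(x, t - h)) / h**2

xs = np.array([0.0, 1.0])           # support of the Bernoulli
pmf = np.array([1 - theta, theta])  # weights f(x;theta)

I_score = np.sum(pmf * score(xs, theta) ** 2)    # E[(d/dtheta ln f)^2]
I_curv = -np.sum(pmf * second_deriv(xs, theta))  # -E[d^2/dtheta^2 ln f]

print(I_score, I_curv, 1 / (theta * (1 - theta)))  # all approx. 4.7619
```

The agreement of the two estimates illustrates the equivalence of the two forms under the regularity conditions mentioned above.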