I'm in this course called Statistical Inference and I was hoping someone could help me out with this exercise.

The trouble I'm having is with part 4, but I'll give the answers to parts 1-3 to make it easier for the reader.

This is the exercise:

1.

$$\begin{array}{lcl}

\mbox{Let }L(\theta;y) \mbox{ be the likelihood}\\

\mbox{Let }l(\theta;y) \mbox{ be the log-likelihood }(l(\theta;y) = \ln(L(\theta;y)))\\

\mbox{Let }u(\theta;y) \mbox{ be the score function }(u(\theta;y) = \frac{\partial l(\theta;y)}{\partial \theta})\\

\mbox{Then it can be shown that}\\

u(\theta;y)= \frac{n}{\theta}-\sum^n_{i=1}\ln(y_i)

\end{array}$$
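The density itself isn't shown above, but this score is consistent with a Pareto-type density $f(y;\theta) = \theta y^{-(\theta+1)}$ for $y \geq 1$ (that's my assumption, not something stated in the exercise). A quick numerical sketch under that assumption, checking that $u$ really is the derivative of $l$:

```python
import math

# Assumption (not stated in the post): the score in part 1 is consistent with
# a Pareto-type density f(y; theta) = theta * y**(-(theta + 1)) for y >= 1.
def loglik(theta, ys):
    n = len(ys)
    return n * math.log(theta) - (theta + 1) * sum(math.log(y) for y in ys)

def score(theta, ys):
    # u(theta; y) = n/theta - sum(ln y_i), as in part 1
    return len(ys) / theta - sum(math.log(y) for y in ys)

# Central finite difference of l should match u
ys = [1.5, 2.0, 3.7, 1.1]  # made-up sample with y_i >= 1
theta, h = 0.8, 1e-6
numeric = (loglik(theta + h, ys) - loglik(theta - h, ys)) / (2 * h)
print(abs(numeric - score(theta, ys)) < 1e-5)  # True
```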

2.

$$\begin{array}{lcl}

\mbox{Let }\hat{\theta} \mbox{ be the MLE}\\

\mbox{Then it should hold that }u(\hat{\theta};y) = 0\\

\mbox{Solving this gives}\\

\hat{\theta} = \frac{n}{\sum^n_{i=1}\ln(y_i)}

\end{array}$$
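As a sanity check (my own sketch, not part of the exercise), plugging $\hat{\theta}$ back into the score from part 1 should give zero:

```python
import math

def score(theta, ys):
    # u(theta; y) = n/theta - sum(ln y_i), from part 1
    return len(ys) / theta - sum(math.log(y) for y in ys)

def mle(ys):
    # theta_hat = n / sum(ln y_i), from part 2
    return len(ys) / sum(math.log(y) for y in ys)

ys = [1.5, 2.0, 3.7, 1.1]  # made-up sample with y_i >= 1
theta_hat = mle(ys)
print(abs(score(theta_hat, ys)) < 1e-12)  # True: the score vanishes at the MLE
```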

3.

$$\begin{array}{lcl}

\mbox{The expected Fisher information } I(\theta) \mbox{ is given as }\\

I(\theta) = -E\left[\frac{\partial u}{\partial \theta}(\theta;y)\right]\\

\mbox{By computation:}\\

I(\theta) = \frac{n}{\theta^2}

\end{array}$$
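Note that here $\frac{\partial u}{\partial \theta} = -\frac{n}{\theta^2}$ does not depend on $y$ at all, so the expectation is trivial and the observed and expected information coincide. A quick finite-difference sketch of that:

```python
import math

def score(theta, ys):
    # u(theta; y) = n/theta - sum(ln y_i), from part 1
    return len(ys) / theta - sum(math.log(y) for y in ys)

# -du/dtheta via central differences; it equals n/theta**2 for any sample ys,
# so no expectation is actually needed
ys = [1.5, 2.0, 3.7, 1.1]  # made-up sample with y_i >= 1
theta, h = 0.8, 1e-6
observed = -(score(theta + h, ys) - score(theta - h, ys)) / (2 * h)
print(abs(observed - len(ys) / theta**2) < 1e-3)  # True
```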

So, for exercise 4, I can find that

I also have the answer, which uses the change-of-variables technique with x := ln(y) and the expected Fisher information of a single observation, i(theta), but I don't have a clue what they're doing there.
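If the density really is the Pareto-type one consistent with part 1, $f(y;\theta) = \theta y^{-(\theta+1)}$ for $y \geq 1$ (again, my assumption), then the transformation they use makes sense: $P(\ln Y > x) = P(Y > e^x) = e^{-\theta x}$, so $X := \ln(Y)$ is Exponential with rate $\theta$. A small simulation sketch of that claim, using inverse-CDF sampling:

```python
import math
import random

# Assumption: Y ~ Pareto-type density f(y; theta) = theta * y**(-(theta + 1)),
# y >= 1. Then P(Y <= y) = 1 - y**(-theta), so Y = (1 - U)**(-1/theta) for
# U ~ Uniform(0, 1), and X = ln(Y) should behave like Exponential(theta).
random.seed(0)
theta = 2.0
ys = [(1 - random.random()) ** (-1 / theta) for _ in range(100_000)]
xs = [math.log(y) for y in ys]
mean_x = sum(xs) / len(xs)
print(mean_x)  # should be close to 1/theta = 0.5, the Exponential(theta) mean
```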

Could anyone help me out?

reinout-g Oct 30, 2014