Asymptotic approach

As above, the likelihood is


\begin{displaymath}
l(\theta \vert {\mbox{\boldmath$x$}}) \propto \exp \left(
\frac{-n}{2\sigma^2} (\theta - \bar x)^2
\right)
\end{displaymath} (34)

so the log-likelihood is


\begin{displaymath}
L(\theta \vert {\mbox{\boldmath$x$}}) = C -
\frac{n}{2\sigma^2} (\theta - \bar x)^2
\end{displaymath} (35)

where $C$ is a constant. Differentiating and equating to zero gives


\begin{displaymath}
\frac{n}{\sigma^2} (\theta - \bar x) = 0
\end{displaymath} (36)

so that


\begin{displaymath}
\hat \theta = \bar x
\end{displaymath} (37)

and a second differentiation gives the curvature at the mode, so that the asymptotic variance $\sigma_n^2$ satisfies


\begin{displaymath}
\sigma_n^{-2}=\frac{n}{\sigma^2}
\end{displaymath} (38)

giving, in this case, $\hat\theta = 3.508$ and $\sigma_n = 0.00316$.
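The computation above can be sketched numerically. This is a minimal illustration, not the thesis code: the sample size $n$ and known $\sigma$ below are hypothetical values chosen so that $\sigma/\sqrt{n}$ reproduces the quoted $\sigma_n = 0.00316$; the actual data behind $\hat\theta = 3.508$ are not shown in this section.

```python
import math
import random

random.seed(0)

# Hypothetical data: n draws from a normal with known sigma.
# (n = 1000 and sigma = 0.1 are illustrative assumptions; the section's
#  values come from the thesis data, which are not reproduced here.)
n, sigma, true_mu = 1000, 0.1, 3.508
x = [random.gauss(true_mu, sigma) for _ in range(n)]

# MLE of the mean, equation (37): theta_hat = x_bar.
theta_hat = sum(x) / n

# Asymptotic standard deviation, equation (38): sigma_n = sigma / sqrt(n).
sigma_n = sigma / math.sqrt(n)

print(theta_hat, sigma_n)
```

With these assumed values, `sigma_n` comes out as $0.1/\sqrt{1000} \approx 0.00316$, matching the figure quoted in the text.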

A histogram of 1000 samples from $N(3.508, 0.00316^2)$ is shown in figure 3. While not a good density estimator, a histogram is useful for the gross comparison of two samples needed here. Note that this histogram appears to come from a distribution with a smaller standard deviation than that shown in figure 1; the predictive sample is a safer estimate, as it allows for the uncertainty in the point estimate of the density.
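The estimative sample behind figure 3 can be reproduced as follows. This is a sketch only: the crude text histogram and its bin width are illustrative choices, not the plotting code used for the figure.

```python
import random

random.seed(1)

# Draw 1000 samples from the asymptotic (estimative) density
# N(3.508, 0.00316^2), as used for the histogram in figure 3.
mu_hat, sigma_n = 3.508, 0.00316
samples = [random.gauss(mu_hat, sigma_n) for _ in range(1000)]

# Crude text histogram; bin width 0.002 is an arbitrary illustrative choice.
width = 0.002
counts = {}
for s in samples:
    k = round((s - mu_hat) / width)
    counts[k] = counts.get(k, 0) + 1
for k in sorted(counts):
    print(f"{mu_hat + k * width:.3f} {'#' * (counts[k] // 5)}")
```

The sample standard deviation of these draws is close to $\sigma_n = 0.00316$, visibly tighter than the predictive sample of figure 1, which is the comparison the text draws.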

Figure: Histogram of the estimative density (maximum-likelihood approximation).
\begin{figure}
\centering
\psfig{figure=../../thesis/pics/hist_ass.ps,width=4in,angle=270}
\end{figure}

danny 2009-07-23