Let \(m\left( \sigma \right) = \log \sigma\). Then
\(m'\left( \sigma \right) = \frac{1}{\sigma}\) and \({\left( m'\left( \sigma \right) \right)^2} = \frac{1}{\sigma^2}\).
First, calculate the Fisher information \(I\left( \sigma \right)\) in X as follows:
The normal distribution with parameters \(\mu = 0\) and variance \(\sigma^2\) has p.d.f.
\(\begin{align}f\left( x \mid \sigma \right) &= \frac{1}{\sigma\sqrt{2\pi}}\exp\left( -\frac{x^2}{2\sigma^2} \right)\\ \lambda\left( x \mid \sigma \right) &= \log f\left( x \mid \sigma \right)\\ &= \log\left( \frac{1}{\sigma\sqrt{2\pi}}\exp\left( -\frac{x^2}{2\sigma^2} \right) \right)\end{align}\)
\(\lambda \left( {x|\sigma } \right) = - \log \sigma - \frac{{{x^2}}}{{2{\sigma ^2}}} + const\)
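Here the constant is \(-\frac{1}{2}\log\left( 2\pi \right)\). As a quick check of this simplification (not part of the derivation itself), the identity can be verified symbolically; the following Python sketch assumes the sympy library is available:

```python
import sympy as sp

x, sigma = sp.symbols("x sigma", positive=True)

# Density of N(0, sigma^2); note the normalizing constant 1/(sigma*sqrt(2*pi)).
f = 1 / (sigma * sp.sqrt(2 * sp.pi)) * sp.exp(-x**2 / (2 * sigma**2))

# The claimed simplification, with const = -log(2*pi)/2 written out explicitly.
claim = -sp.log(sigma) - x**2 / (2 * sigma**2) - sp.log(2 * sp.pi) / 2

# Expanding the logarithms and subtracting should leave exactly zero.
print(sp.simplify(sp.expand_log(sp.log(f) - claim, force=True)))  # 0
```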
Differentiate \(\lambda \left( {x|\sigma } \right)\) with respect to \(\sigma \)
\(\begin{align}{\lambda ^{'}}\left( {x|\sigma } \right) &= \frac{\partial }{{\partial \sigma }}\left( { - \log \sigma - \frac{{{x^2}}}{{2{\sigma ^2}}} + const} \right)\\ &= - \frac{1}{\sigma } + \frac{{{x^2}}}{{{\sigma ^3}}}\end{align}\)
Differentiate \(\lambda'\left( x \mid \sigma \right)\) with respect to \(\sigma\) again:
\(\begin{align}\lambda''\left( x \mid \sigma \right) &= \frac{\partial}{\partial \sigma}\left( -\frac{1}{\sigma} + \frac{x^2}{\sigma^3} \right)\\ &= \frac{1}{\sigma^2} - \frac{3x^2}{\sigma^4}\end{align}\)
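Both derivatives can be confirmed symbolically as well; again a short sympy sketch (an assumed tool, not required by the solution):

```python
import sympy as sp

x, sigma = sp.symbols("x sigma", positive=True)
lam = -sp.log(sigma) - x**2 / (2 * sigma**2)  # the constant differentiates away

print(sp.diff(lam, sigma))     # expect -1/sigma + x**2/sigma**3
print(sp.diff(lam, sigma, 2))  # expect 1/sigma**2 - 3*x**2/sigma**4
```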
Hence, from the definition of Fisher information, \(I\left( \sigma \right)\) in X is
\(\begin{align}I\left( \sigma \right) &= -E_\sigma\left( \lambda''\left( x \mid \sigma \right) \right)\\ &= -E_\sigma\left( \frac{1}{\sigma^2} - \frac{3x^2}{\sigma^4} \right)\\ &= -\frac{1}{\sigma^2} + \frac{3E\left( x^2 \right)}{\sigma^4}\end{align}\)
Since \(E\left( x^2 \right) = Var\left( x \right) + \left( E\left( x \right) \right)^2 = \sigma^2 + 0 = \sigma^2\),
\(\begin{align}I\left( \sigma \right) &= -\frac{1}{\sigma^2} + \frac{3\sigma^2}{\sigma^4}\\ &= \frac{2}{\sigma^2}\end{align}\)
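As a numerical sanity check: since \(I\left( \sigma \right) = -E_\sigma\left( \lambda''\left( X \mid \sigma \right) \right)\), a Monte Carlo average of \(-\lambda''\) over draws from \(N\left( 0, \sigma^2 \right)\) should be close to \(2/\sigma^2\). A minimal sketch, assuming numpy:

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 1.5
x = rng.normal(0.0, sigma, size=1_000_000)  # draws from N(0, sigma^2)

lam2 = 1 / sigma**2 - 3 * x**2 / sigma**4   # second derivative found above
print(-lam2.mean())   # Monte Carlo estimate of I(sigma)
print(2 / sigma**2)   # exact value: 0.888...
```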
Hence, if T is an unbiased estimator of \(\log \sigma\), it follows from the Cramér-Rao (information) inequality that
\(\begin{align}Var\left( T \right) &\ge \frac{\left( m'\left( \sigma \right) \right)^2}{nI\left( \sigma \right)}\\ &= \frac{1}{\sigma^2} \times \frac{\sigma^2}{2n}\\ &= \frac{1}{2n}\end{align}\)
Therefore, the lower bound specified by the information inequality for the variance of any unbiased estimator of \(\log \sigma\) is \(\frac{1}{2n}\).
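To see the bound in action, one can simulate the plug-in estimator \(\log \hat\sigma\) with \(\hat\sigma^2 = \frac{1}{n}\sum x_i^2\); it is only asymptotically unbiased, but for large n its variance comes close to \(1/\left( 2n \right)\). A rough illustration (an assumed setup, not part of the original solution):

```python
import numpy as np

rng = np.random.default_rng(1)
sigma, n, reps = 2.0, 400, 20_000

# reps independent samples of size n from N(0, sigma^2)
samples = rng.normal(0.0, sigma, size=(reps, n))

# Plug-in estimator of log(sigma): half the log of the mean of squares
log_sigma_hat = 0.5 * np.log((samples**2).mean(axis=1))

print(log_sigma_hat.var())  # close to the bound below
print(1 / (2 * n))          # information bound: 0.00125
```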