
It is referred to in *An invariant form for the prior probability in estimation problems* (Jeffreys).

  • Your link unfortunately returns a "file expired" error for me. I assume you know that $d\sigma/\sigma$ is the same as $d\ln(\sigma)$, so a change of variable may give a useful simplification. 2010-12-15

1 Answer


I don't know exactly what you want to know about $d\sigma/\sigma$.

Mathematically, it is like Lebesgue measure $dx$ on $\mathbb{R}$, in the following sense:

$dx$ does not change under an additive shift of the variable: if $y = x + a$ for a constant $a$, then $dy = dx$.

Similarly, $d\sigma/\sigma$ does not change under a multiplicative rescaling of the variable: if $y = c\sigma$ for a constant $c > 0$, then $dy/y = d\sigma/\sigma$.

Both $dx$ on $\mathbb{R}$ and $d\sigma/\sigma$ on $\mathbb{R}^+$ are examples of Haar measures [i.e., invariant measures] on $\mathbb{R}$ and $\mathbb{R}^+$, respectively, considered as groups acting on themselves, as indicated.
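The rescaling invariance is easy to check numerically. The sketch below (the interval endpoints and the scale factor are arbitrary choices) approximates the mass that $d\sigma/\sigma$ assigns to an interval $[a,b]$ and to its rescaled image $[ca, cb]$; both come out to $\log(b/a)$.

```python
import math

def haar_mass(a, b, n=100_000):
    """Midpoint Riemann sum approximating the mass the measure dσ/σ
    assigns to [a, b], i.e. ∫_a^b (1/σ) dσ."""
    h = (b - a) / n
    return sum(h / (a + (i + 0.5) * h) for i in range(n))

a, b, c = 1.0, 5.0, 7.3          # arbitrary interval and scale factor
m1 = haar_mass(a, b)             # mass of [a, b]
m2 = haar_mass(c * a, c * b)     # mass of the rescaled interval [ca, cb]
print(m1, m2, math.log(b / a))   # all three ≈ log(5) ≈ 1.609
```

By contrast, Lebesgue measure $dx$ would give the rescaled interval mass $c(b-a)$ rather than $b-a$; it is shifts, not rescalings, that leave $dx$ unchanged.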

As for Jeffreys: many [non-orthodox] Bayesians consider $dx$ to be like a uniform [prior] distribution on $\mathbb{R}$. [It is the limit, in some sense, of the uniform distribution on $[-A, A]$ as $A \to\infty$.] It is considered to be an 'objective prior', in that everyone [supposedly] agrees that it is THE uniform distribution on $\mathbb{R}$.

That $dx$ is improper (it is not actually a probability distribution on $\mathbb{R}$) doesn't seem to faze these folks; as long as one comes up with a proper posterior distribution when using it [as is often the case], it can be taken as a 'prior'. An example is inference about the unknown mean $\mu$ of a normal distribution with $\sigma = 1$. If $X$ is one observation from this distribution, the conditional pdf of $X \mid \mu$ is

$$f_{X|\mu}(x|\mu) = \frac{1}{\sqrt{2\pi}}e^{-\frac{1}{2}(x-\mu)^2},\quad -\infty < x < \infty.$$

Multiplying by the 'uniform' prior 'pdf' of $\mu$, the [improper] joint pdf of $(X,\mu)$ is

$$f_{X,\mu}(x,\mu) = f_{X|\mu}(x|\mu)\,{\mathbb{1}}_{(-\infty,\infty)}(\mu) = \frac{1}{\sqrt{2\pi}}e^{-\frac{1}{2}(x-\mu)^2},\quad -\infty < x, \mu <\infty.$$

Note that the [improper] prior 'pdf' for $\mu$ is $f_\mu(\mu)={\mathbb{1}}_{(-\infty,\infty)}(\mu)$, in that

$$\mathrm{'prob'}(a< \mu< b) = \int_a^b f_\mu(\mu)\, d\mu = \int_a^b d\mu = b-a.$$

Then, integrating out $\mu$ in the joint pdf, one gets that the marginal 'pdf' of $X$ is

$$f_X(x) = \int_{-\infty}^{\infty} f_{X|\mu}(x|\mu)\,d\mu = \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}}e^{-\frac{1}{2}(x-\mu)^2}\,d\mu = {\mathbb{1}}_{(-\infty,\infty)}(x),$$

so $X$ marginally has an improper distribution as well. Finally, the posterior pdf for $\mu$ is

$$f_{\mu|X}(\mu|x)= f_{X,\mu}(x,\mu)/f_{X}(x) = \frac{1}{\sqrt{2\pi}}e^{-\frac{1}{2}(x-\mu)^2},\quad -\infty < \mu < \infty,$$

so $\mu \mid X$ is normal with mean $X$ and SD $= 1$.
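This posterior calculation can be checked numerically: with the flat prior, the posterior is just the normalized likelihood. A small sketch, where the observed value $x = 1.7$ and the grid are arbitrary choices:

```python
import math

def like(x, mu):
    """Normal(mu, 1) density at x: the likelihood for one observation."""
    return math.exp(-0.5 * (x - mu) ** 2) / math.sqrt(2 * math.pi)

x = 1.7                                        # hypothetical observed value
h = 0.001                                      # grid step for mu
mus = [x - 10 + i * h for i in range(20001)]   # grid over [x-10, x+10]

# With the flat prior, posterior ∝ likelihood; the normalizer is f_X(x).
Z = h * sum(like(x, m) for m in mus)                  # ≈ 1, matching f_X(x) = 1
post_mean = h * sum(m * like(x, m) for m in mus) / Z  # ≈ x, the posterior mean
print(Z, post_mean)
```

The normalizer coming out as $1$ is exactly the statement $f_X(x) = \mathbb{1}_{(-\infty,\infty)}(x)$ above, and the posterior mean equals the observation, as it should for a normal posterior centered at $X$.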

With this example in mind, perhaps you can persuade yourself that $d\sigma/\sigma$ is some sort of analog of a uniform distribution on $\mathbb{R}^+$. [If one considers the parameter $\nu=\log\sigma$, this turns into the [Lebesgue] uniform distribution for $\nu$, since, as pointed out by @hardmath, $d\nu=d(\log\sigma)=d\sigma/\sigma$.]
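The change of variables can also be seen by simulation: drawing $\sigma$ from the density proportional to $1/\sigma$ on an interval $[a,b]$ makes $\nu = \log\sigma$ uniform on $[\log a, \log b]$. A sketch with arbitrary endpoints, using inverse-CDF sampling:

```python
import math
import random

random.seed(0)
a, b = 1.0, 100.0
# The density ∝ 1/σ on [a, b] has CDF F(σ) = log(σ/a) / log(b/a),
# so inverse-CDF sampling gives σ = a * (b/a)**U for U ~ Uniform(0, 1).
draws = [a * (b / a) ** random.random() for _ in range(100_000)]
logs = [math.log(s) for s in draws]

# ν = log σ should be uniform on [log a, log b] = [0, log 100];
# its sample mean should be near the midpoint (log a + log b) / 2.
mean_nu = sum(logs) / len(logs)
print(mean_nu)   # ≈ log(10) ≈ 2.303
```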

Whether or not you buy into this depends, I guess, on your propensity for being a 'true believer'.

  • Thank you very much for this answer. I guess I am a 'true believer.' 2010-12-17