## Probability manipulations with Mathematica…

There comes a time when a statistician needs to do some analytic calculations. There are more than a few tools for the job, but I usually prefer Mathematica or Maple. Today, I'm going to use Mathematica for a simple demonstration.

Let's build this example around the  $U(2 \theta _1-\theta _2\leq x\leq 2 \theta _1+\theta _2)$  distribution.

pfun = PDF[UniformDistribution[{2*Subscript[θ, 1] - Subscript[θ, 2],
2*Subscript[θ, 1] + Subscript[θ, 2]}], x]

$\begin{cases} \frac{1}{2 \theta _2} & 2 \theta _1-\theta _2\leq x\leq 2 \theta _1+\theta _2 \\ 0 & \text{True} \end{cases}$

One of the most intensive calculations is the characteristic function (or, equivalently, the moment generating function). This is straightforward to derive.

cfun=CharacteristicFunction[UniformDistribution[
{2*Subscript[θ, 1]-Subscript[θ, 2],2*Subscript[θ, 1]+Subscript[θ, 2]}],x]

$-\frac{i \left(-e^{i x \left(2 \theta _1-\theta _2\right)}+e^{i x \left(2 \theta _1+\theta _2\right)}\right)}{2 x \theta _2}$.
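The closed form above can be sanity-checked numerically. Here is a Python sketch (the values chosen for $\theta_1$ and $\theta_2$ are arbitrary test values, not from the post) comparing it against a direct numerical computation of $E\left[e^{ixX}\right]$:

```python
import cmath

t1, t2 = 1.5, 0.7                 # arbitrary test values (t2 > 0)
a, b = 2*t1 - t2, 2*t1 + t2       # support endpoints of the uniform

def cf_closed(x):
    # The closed form returned by CharacteristicFunction[] above
    return -1j * (-cmath.exp(1j*x*a) + cmath.exp(1j*x*b)) / (2 * x * t2)

def cf_midpoint(x, steps=100_000):
    # E[exp(i x X)] via a midpoint Riemann sum over the support
    h = (b - a) / steps
    s = sum(cmath.exp(1j * x * (a + (j + 0.5) * h)) for j in range(steps))
    return s * h / (b - a)

for x in (0.3, 1.0, 2.5):
    assert abs(cf_closed(x) - cf_midpoint(x)) < 1e-6
```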

The Table[] command computes the raw moments of our distribution for us.

Table[Limit[D[cfun, {x, n}], x -> 0]/I^n, {n, 4}]

$\left\{2 \theta _1,\frac{1}{3} \left(12 \theta _1^2+\theta _2^2\right),2 \theta _1 \left(4 \theta _1^2+\theta _2^2\right),16 \theta _1^4+8 \theta _1^2 \theta _2^2+\frac{\theta _2^4}{5}\right\}$.
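As an independent check, the same raw moments can be verified in plain Python, using the closed form $(b^{n+1}-a^{n+1})/\left((n+1)(b-a)\right)$ for the $n$-th raw moment of $U(a,b)$. This is a sketch; the values for $\theta_1, \theta_2$ are arbitrary:

```python
# The n-th raw moment of U(a, b) is (b**(n+1) - a**(n+1)) / ((n+1)*(b - a)).
def uniform_raw_moment(a, b, n):
    return (b**(n + 1) - a**(n + 1)) / ((n + 1) * (b - a))

t1, t2 = 1.5, 0.7            # arbitrary test values (t2 > 0)
a, b = 2*t1 - t2, 2*t1 + t2  # support endpoints

# Closed forms from the Table[...] output above:
closed = [
    2*t1,
    (12*t1**2 + t2**2) / 3,
    2*t1 * (4*t1**2 + t2**2),
    16*t1**4 + 8*t1**2*t2**2 + t2**4 / 5,
]

for n, c in enumerate(closed, start=1):
    assert abs(uniform_raw_moment(a, b, n) - c) < 1e-9
```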

Calculate the sample statistics.

T=List[8.23,6.9,1.05,4.8,2.03,6.95];
{Mean[T],Variance[T]}

$\{4.99333,8.46171\}$.

Now, we can use a simple moment matching technique to get estimates for the parameters.

Solve[{Mean[T] - 2*Subscript[θ, 1] == 0,
  -(2*Subscript[θ, 1])^2 + 1/3 (12 Subscript[θ, 1]^2 + Subscript[θ, 2]^2) -
  Variance[T] == 0}, {Subscript[θ, 2], Subscript[θ, 1]}]

$\left\{\left\{\theta _1\to 2.49667,\theta _2\to -5.03836\right\},\left\{\theta _1\to 2.49667,\theta _2\to 5.03836\right\}\right\}$.

Check the valid range for $\theta _2$.

Reduce[2 Subscript[θ, 1]-Subscript[θ, 2]<=2 Subscript[θ, 1]+Subscript[θ, 2],
Subscript[θ, 2]]

$\theta _1\in \text{Reals}\&\&\theta _2\geq 0 .$

Hence the admissible solution is $\left\{\left\{\theta _1\to 2.49667,\theta _2\to 5.03836\right\}\right\}$.
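Since the mean of this uniform is $2\theta_1$ and its variance is $\theta_2^2/3$, the moment-matching system also solves in closed form, which makes a quick cross-check easy. A Python sketch with the same sample:

```python
import statistics as st

T = [8.23, 6.9, 1.05, 4.8, 2.03, 6.95]

# For U(2*t1 - t2, 2*t1 + t2): mean = 2*t1 and variance = t2**2 / 3,
# so the moment-matching estimates have closed forms.
t1_hat = st.mean(T) / 2
t2_hat = (3 * st.variance(T)) ** 0.5   # st.variance is the sample variance, like Mathematica's Variance[]

print(round(t1_hat, 5), round(t2_hat, 5))  # 2.49667 5.03836
```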

Categories: probability, statistics

## (Im)Perfect Detectors…

Reliably detecting an event is surely something to worry about in applied science. One of the main models used is the perfect-imperfect detector pair, with a Poisson process underneath. Two detectors count events generated by a source (e.g. a photon device). The first one detects the events efficiently (perfect), while the other one lacks efficiency.

Then $X\sim \text{Poi}(\lambda)$ and $Y\sim \text{Poi}(\lambda p)$, where $p$ is the efficiency of the imperfect detector, and estimation is (almost) trivial. Assume that $m, r$ are the total counts of $X, Y$ respectively. Furthermore, let $k$ be the total number of observations and $n$ the number of observations of $X$.

The MLE of $\lambda$ is $m/n$, as usual. What about $p$?

The likelihood for $p$ is proportional to $\exp \left[ -\lambda p\left( k-n \right) \right]{{\left( \lambda p \right)}^{r}}$, so taking logarithms and differentiating with respect to $p$ gives $\hat{p}=\frac{r}{\hat{\lambda}(k-n)}=\frac{nr}{m(k-n)}$.

The real question is: what if $\frac{nr}{m(k-n)}>1$?
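A quick simulation sketch of the two-detector setup (Python, with made-up values for $\lambda$, $p$, $n$ and $k$) shows the estimator $\hat{p}=r/\bigl(\hat{\lambda}(k-n)\bigr)$ at work; note that sampling noise can push $\hat{p}$ above 1 when the true $p$ is close to it:

```python
import math
import random

random.seed(7)
lam, p = 4.0, 0.6          # made-up "true" parameter values
n, k = 500, 1200           # n perfect-detector observations, k observations in total

def poisson(mu):
    # Knuth's Poisson sampler (kept dependency-free on purpose)
    j, prod, L = 0, 1.0, math.exp(-mu)
    while True:
        j += 1
        prod *= random.random()
        if prod <= L:
            return j - 1

X = [poisson(lam) for _ in range(n)]          # perfect detector:   X ~ Poi(lam)
Y = [poisson(lam * p) for _ in range(k - n)]  # imperfect detector: Y ~ Poi(lam * p)

m, r = sum(X), sum(Y)                # total counts of X and Y
lam_hat = m / n                      # MLE of lambda
p_hat = r / (lam_hat * (k - n))      # = n*r / (m*(k - n))

print(lam_hat, p_hat)
```

With these sample sizes both estimates land close to the true values; shrinking $k-n$ or pushing $p$ toward 1 makes $\hat{p}>1$ a realistic outcome.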

Categories: statistics