Unbiasedness/ Unbiased Estimators
 

Definition: An estimator, i.e. a function of the sample \small T_{n}(x_{1},x_{2},x_{3},\ldots,x_{n}), is called an unbiased estimator of a function of the parameter \small \theta, say \small \gamma(\theta), if

\small E(T_{n})=\gamma(\theta), where \small \theta belongs to the parameter space \small \Theta

 

Bias: if \small b_{\theta} denotes the bias, then

\small b_{\theta}=E(T_{n})-\gamma(\theta)

 

Remarks: 

 

1. If \small E(T_{n}) > \theta, the estimator is said to be positively biased.

2. If \small E(T_{n}) < \theta, the estimator is said to be negatively biased.
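
As a quick numerical illustration of bias, here is a minimal Monte Carlo sketch in Python (not part of the original notes; it assumes NumPy, and the helper name empirical_bias is our own). It approximates \small b_{\theta}=E(T_{n})-\gamma(\theta) by simulation for the divide-by-n sample variance, which Example 5 below shows is negatively biased.

```python
import numpy as np

rng = np.random.default_rng(0)

def empirical_bias(estimator, gamma_theta, n=10, reps=100_000):
    """Monte Carlo estimate of b_theta = E(T_n) - gamma(theta),
    using samples of size n from N(mu=2, sigma^2=1)."""
    samples = rng.normal(loc=2.0, scale=1.0, size=(reps, n))
    return estimator(samples).mean() - gamma_theta

# Divide-by-n sample variance: E(s1^2) = (n-1)/n * sigma^2 < sigma^2,
# so the bias is negative (Remark 2); here it is about -sigma^2/n = -0.1.
s1_squared = lambda x: x.var(axis=1)      # ddof=0, i.e. divide by n
print(empirical_bias(s1_squared, gamma_theta=1.0))
```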

 

Examples:
 

1. \small (x_{1},x_{2},x_{3},\ldots,x_{n}) is a random sample from a normal population \small N(\mu,1). Show that \small t=\frac{\sum_{i=1}^{n} x_{i}^2}{n} is an unbiased estimator of \small \mu ^2 +1.

 

Ans: According to the question,

\small E(x_{i})=\mu and \small V(x_{i})=1 for all \small i=1,2,3,\ldots,n

\small E(x_{i}^2)=V(x_{i}) +[E(x_{i})]^2 = 1+\mu^2

\small E(t)=E\left(\frac{\sum_{i=1}^{n} x_{i}^2}{n}\right) = \frac{1}{n}\sum_{i=1}^{n}E(x_{i}^2) = \frac{n(1+\mu^2)}{n} = 1+\mu^2

Hence \small t is an unbiased estimator of \small 1+\mu^2.
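
A quick simulation check of this result (a sketch only, assuming NumPy; the values \small \mu=2 and \small n=25 are our own choices):

```python
import numpy as np

rng = np.random.default_rng(1)
mu, n, reps = 2.0, 25, 200_000

# Draw many samples of size n from N(mu, 1) and compute t = sum(x_i^2)/n.
x = rng.normal(loc=mu, scale=1.0, size=(reps, n))
t = (x**2).mean(axis=1)

print(t.mean(), mu**2 + 1)   # both close to 5.0
```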

 

 

2. If \small T is an unbiased estimator of \small \theta, show that \small T^2 is a biased estimator of \small \theta ^2.

 

Ans: According to the question, 

\small E(T)= \theta and,

\small V(T)= E(T^2) - [E(T)]^2

or, \small E(T^2)= \theta ^2+ V(T)

or, \small E(T^2) \neq \theta ^2, since \small V(T)>0 for any non-degenerate estimator.

Hence \small T^2 is a biased estimator of \small \theta ^2, with positive bias \small b_{\theta}=V(T).
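
The argument uses only \small V(T)>0, so any non-degenerate unbiased \small T shows the effect. A sketch with \small T=\bar x from \small N(\mu,1) (our choice of \small T; the original does not fix one):

```python
import numpy as np

rng = np.random.default_rng(2)
mu, n, reps = 2.0, 10, 200_000

# T = sample mean: unbiased for mu, but T^2 overshoots mu^2 by Var(T).
xbar = rng.normal(loc=mu, scale=1.0, size=(reps, n)).mean(axis=1)

print(xbar.mean())        # ~ 2.0 : E(T) = mu
print((xbar**2).mean())   # ~ 4.1 : E(T^2) = mu^2 + Var(T) = 4 + 1/10
```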

 

 

3. Show that \small \frac{\sum x_{i}(\sum x_{i}-1)}{n(n-1)} is an unbiased estimator of \small \theta ^2 for the sample \small x_{1}, x_{2},\ldots, x_{n} drawn on \small X, which takes the value 1 with probability \small \theta and the value 0 with probability \small (1-\theta).

 

Ans: \small T=\sum_{i=1}^{n}x_{i}\sim B(n,\theta)

\small E(T)=n\theta

\small V(T)=n \theta (1 -\theta)

So, 

\small E\left[\frac{\sum x_{i}(\sum x_{i}-1)}{n(n-1)}\right]=\frac{E[T(T-1)]}{n(n-1)}

= \small \frac{ E(T^2)-E(T)}{n(n-1)}

=\small \frac{V(T)+[E(T)]^2-E(T)}{n(n-1)}

=\small \frac{n\theta (1-\theta)+n^2\theta^2-n\theta}{n(n-1)}

=\small \frac{n^2\theta^2-n\theta^2}{n(n-1)} = \frac{n(n-1)\theta^2}{n(n-1)}

=\small \theta ^2

So, \small \frac{\sum x_{i}(\sum x_{i}-1)}{n(n-1)} is an unbiased estimator of \small \theta^2.
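
A simulation check (a sketch assuming NumPy; \small \theta=0.3 and \small n=20 are our own choices), exploiting the fact that \small T\sim B(n,\theta) can be drawn directly:

```python
import numpy as np

rng = np.random.default_rng(3)
theta, n, reps = 0.3, 20, 200_000

# T = sum of the sample = number of ones, so T ~ B(n, theta).
t = rng.binomial(n, theta, size=reps)
estimate = t * (t - 1) / (n * (n - 1))

print(estimate.mean(), theta**2)   # both close to 0.09
```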


4. Suppose \small X and \small Y are independent random variables with the same unknown mean \small \mu and the same variance. Let \small T= aX+bY be an estimator of \small \mu.

 

i. Show that \small T is an unbiased estimator of \small \mu if a+b=1.

[Hint: \small E(T)=a\mu+b\mu=(a+b)\mu=\mu]

ii. Find \small Var(T) when \small a=\frac{1}{3},\ b=\frac{2}{3}.

[Hint: \small Var(T)=a^2 Var(X)+ b^2 Var(Y), with \small Var(X)=Var(Y)=36, so \small Var(T)=\frac{36}{9}+\frac{4\times 36}{9}=4+16=20]
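
This can also be checked by simulation (a sketch; we take \small X and \small Y to be \small N(\mu,36) with \small \mu=5 for concreteness, which the original leaves unspecified):

```python
import numpy as np

rng = np.random.default_rng(4)
mu, sigma, reps = 5.0, 6.0, 500_000   # Var(X) = Var(Y) = 36

x = rng.normal(mu, sigma, size=reps)
y = rng.normal(mu, sigma, size=reps)
t = x / 3 + 2 * y / 3                 # a = 1/3, b = 2/3 (a + b = 1)

print(t.mean())   # ~ 5.0 : T is unbiased for mu
print(t.var())    # ~ 20  : a^2*36 + b^2*36 = 4 + 16
```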

 

5. Examine the unbiasedness of the following estimators.

i. \small s_{1}^2=\frac{\sum_{i=1}^{n} (x_{i} - \bar x)^2}{n}

ii. \small s_{2}^2=\frac{\sum_{i=1}^{n} (x_{i} - \mu)^2}{n}

[Hint(i)]

[\small E(s_{1}^2)=\frac{n-1}{n} \sigma ^2 \neq \sigma^2, so \small s_{1}^2 is negatively biased]

[Hint(ii)]

[\small E(s_{2}^2)=\sigma ^2, so \small s_{2}^2 is unbiased for \small \sigma^2]
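
Both hints can be verified by simulation (a sketch, with \small \mu=2, \small \sigma^2=4, \small n=10 chosen by us):

```python
import numpy as np

rng = np.random.default_rng(5)
mu, sigma2, n, reps = 2.0, 4.0, 10, 200_000

x = rng.normal(mu, np.sqrt(sigma2), size=(reps, n))

s1 = ((x - x.mean(axis=1, keepdims=True))**2).mean(axis=1)  # about xbar
s2 = ((x - mu)**2).mean(axis=1)                             # about mu

print(s1.mean())   # ~ 3.6 = (n-1)/n * sigma^2 : biased
print(s2.mean())   # ~ 4.0 = sigma^2           : unbiased
```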

 

6. If \small X_{1},X_{2},X_{3},\ldots, X_{n} is a random sample of size \small n drawn from a normal population with mean \small \mu and variance \small \sigma ^2, and

\small T=\frac{1}{n}\sum_{i=1}^{n}\left | x_{i}-\bar x \right |,

examine whether \small T is an unbiased estimator of \small \sigma. If not, obtain an unbiased estimator of \small \sigma.

[Hint: \small E(T)= \frac{\sum_{i=1}^{n}E\left | x_{i}-\bar x \right |}{n} = \sqrt{\frac{2}{\pi}}\,\sigma \neq \sigma, so \small T is biased and \small \sqrt{\frac{\pi}{2}}\,T is an unbiased estimator of \small \sigma]
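
A simulation check of the hint (a sketch; note that \small \sqrt{\frac{2}{\pi}}\,\sigma is the exact value for deviations taken about \small \mu and holds approximately for deviations about \small \bar x when \small n is large, so a large \small n is used here):

```python
import numpy as np

rng = np.random.default_rng(6)
sigma, n, reps = 2.0, 500, 50_000

x = rng.normal(0.0, sigma, size=(reps, n))
t = np.abs(x - x.mean(axis=1, keepdims=True)).mean(axis=1)  # T per sample

print(t.mean())                          # ~ sqrt(2/pi)*sigma = 1.596
print((np.sqrt(np.pi / 2) * t).mean())   # ~ sigma = 2.0 : corrected estimator
```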



 

 




