Kerala University 2005 B.C.A Computer Application estimation - Question Paper
RegNo........................................................................................................(3 pages) 6016
Name................................................................................................................
FOURTH SEMESTER B.C.A. DEGREE EXAMINATION, APRIL/MAY 2005
(Vocational Course) Optional Subject: Statistics Paper VII: ESTIMATION
Time : Three Hours Maximum : 90 Marks
Each unit carries 50 marks.
Not more than 30 marks will be awarded from each unit.
Statistical tables will be provided on request.
1. Let $X_1, X_2, \ldots, X_n$ be a random sample from a population with mean $\mu$ and finite variance $\sigma^2$. Show that $\bar{X} = (X_1 + X_2 + \cdots + X_n)/n$ is unbiased for $\mu$, but $S^2 = \frac{1}{n}\sum_{i=1}^{n}(X_i - \bar{X})^2$ is not unbiased for $\sigma^2$. Suggest an unbiased estimator for $\sigma^2$.
(8 marks)
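A quick simulation sketch (an editorial illustration, not part of the paper) of the bias in Question 1: with an assumed population variance $\sigma^2 = 4$ and $n = 5$, the average of the divisor-$n$ estimator $S^2$ settles near $\frac{n-1}{n}\sigma^2 = 3.2$ rather than 4.

```python
import random

# Monte Carlo check (illustrative values: sigma^2 = 4, n = 5):
# S^2 with divisor n underestimates sigma^2 by the factor (n - 1)/n.
random.seed(0)
n, reps, sigma = 5, 20000, 2.0
total = 0.0
for _ in range(reps):
    xs = [random.gauss(0.0, sigma) for _ in range(n)]
    xbar = sum(xs) / n
    total += sum((x - xbar) ** 2 for x in xs) / n   # divisor n, not n - 1
mean_s2 = total / reps   # close to (n - 1)/n * sigma^2 = 3.2, not 4
```

Rescaling $S^2$ by $n/(n-1)$, i.e. using divisor $n-1$, removes the bias.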
2. Define consistency of an estimator. State and prove a sufficient condition for consistency.
(8 marks)
3. If there exist two unbiased estimators for a parameter $\theta$, show that there exist infinitely many unbiased estimators for $\theta$.
(6 marks)
4. State the Fisher-Neyman factorisation criterion. Show that the sample mean is a sufficient estimator for $\theta$ in the case of a Poisson population with mean $\theta$.
(6 marks)
5. Give an example of a sufficient estimator which is neither unbiased nor consistent. (6 marks)
6. If $X_1, X_2$ is a random sample from a normal population with mean $\mu$ and variance 1, show that $T_1 = (X_1 + X_2)/2$ and $T_2 = 2X_1 - X_2$ are unbiased estimators of $\mu$. Compare their efficiencies.
(8 marks)
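The efficiency comparison in Question 6 reduces to two variance computations; a minimal sketch of the arithmetic (my own check, assuming $X_1, X_2$ independent with unit variance):

```python
# Var((X1 + X2)/2) and Var(2*X1 - X2) for independent X1, X2 with variance 1.
var_T1 = (1 + 1) / 4               # Var(T1) = (Var X1 + Var X2)/4 = 0.5
var_T2 = 4 * 1 + 1                 # Var(T2) = 4 Var X1 + Var X2 = 5
rel_efficiency = var_T1 / var_T2   # T2 is only 10% as efficient as T1
```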
7. Let $X_1, X_2, \ldots, X_n$ be a random sample from $f(x, \theta) = 1/\theta$ for $0 < x < \theta$, and $0$ elsewhere, where $\theta > 0$. Show that $T = \max(X_1, X_2, \ldots, X_n)$ is sufficient for $\theta$.
(8 marks)
8. State and prove Cramer-Rao inequality. (7 marks)
9. Distinguish between minimum variance bound estimator and uniformly minimum variance unbiased estimator. Give an example in which these are equal.
(7 marks)
10. State and prove Rao-Blackwell theorem. (7 marks)
11. Let $X_1, X_2, \ldots, X_n$ be a random sample from $f(x, \theta) = \theta x^{\theta - 1}$ for $0 < x < 1$, and $0$ elsewhere, where $0 < \theta < 1$. Obtain the Cramer-Rao lower bound to the variance of an unbiased estimator of $\theta^2$.
(6 marks)
12. Explain the minimum chi-square method of estimation. (4 marks)
13. Let $f(x, \theta) = 2x/\theta^2$ for $0 < x < \theta$, and $0$ elsewhere, where $\theta > 0$. Obtain the MLE of $\theta$ based on a sample of size $n$.
(6 marks)
14. Let $X_1, X_2, \ldots, X_n$ be a random sample from $f(x) = \frac{m^p}{\Gamma(p)} e^{-mx} x^{p-1}$ for $0 < x < \infty$, and $0$ elsewhere, where $m > 0$, $p > 0$. Obtain moment estimators of $m$ and $p$.
(6 marks)
15. Let $X_1, X_2, \ldots, X_n$ be a random sample from $f(x, \theta) = \theta x^{\theta - 1}$ for $0 < x < 1$, and $0$ elsewhere, where $\theta > 0$. Obtain the MLE of $\theta$. Check whether the MLE is unbiased and consistent.
(7 marks)
16. Distinguish between point estimation and interval estimation. (6 marks)
17. Stating the assumptions clearly, obtain a confidence interval with confidence coefficient 0.95 for the parameter $\mu$ in
$f(x; \mu, \sigma) = \frac{1}{\sigma\sqrt{2\pi}} \exp\left[-\frac{(x - \mu)^2}{2\sigma^2}\right]$, $-\infty < x < \infty$.
(8 marks)
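With $\sigma$ known, the standard interval in Question 17 is $\bar{x} \pm 1.96\,\sigma/\sqrt{n}$; a sketch with made-up numbers (the values 50, 4 and 25 are illustrative, not from the paper):

```python
import math

def normal_mean_ci(xbar, sigma, n, z=1.96):
    # 95% CI for mu when sigma is known: xbar +/- z * sigma / sqrt(n).
    half = z * sigma / math.sqrt(n)
    return (xbar - half, xbar + half)

lo, hi = normal_mean_ci(xbar=50.0, sigma=4.0, n=25)   # (48.432, 51.568)
```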
18. A population follows a normal distribution with mean $\mu$ and variance 25. A sample of size 10 from the population has mean 58. For the confidence interval (52, 64) for $\mu$, compute the confidence coefficient.
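A sketch of the computation Question 18 calls for, using the stated $\sigma^2 = 25$, $n = 10$, and the half-width 6 of the interval (52, 64) about the sample mean 58:

```python
import math

sigma, n, half = 5.0, 10, 6.0                  # sd 5, sample size 10, half-width 6
z = half / (sigma / math.sqrt(n))              # standardised half-width, about 3.79
phi = lambda t: 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))  # standard normal CDF
conf_coeff = phi(z) - phi(-z)                  # confidence coefficient, near 0.9999
```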
19. Let $X_1, X_2, \ldots, X_n$ be a random sample from a normal population with mean $\mu$ and variance $\sigma^2$. Obtain a 95% confidence interval for $\sigma^2$, when (a) $\mu$ is known; (b) $\mu$ is unknown.
(10 marks)
20. Obtain a 90% confidence interval for the population correlation coefficient $\rho$, given that the sample correlation based on a sample of size 18 is 0.58.
(8 marks)
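One standard route for Question 20 is Fisher's z-transform; a sketch (the $1/\sqrt{n-3}$ standard error and the 1.645 multiplier are the usual large-sample approximations, to be checked against tables):

```python
import math

r, n = 0.58, 18
z = math.atanh(r)                     # Fisher z of the sample correlation
se = 1.0 / math.sqrt(n - 3)           # approximate standard error of z
half = 1.645 * se                     # two-sided 90% normal multiplier
lo, hi = math.tanh(z - half), math.tanh(z + half)   # back to the rho scale
```

This gives an interval of roughly (0.23, 0.80) for $\rho$.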
21. In an exit poll, 170 out of 300 voters favour candidate A. Obtain a 95% confidence interval for the proportion of votes that candidate A will get in the actual election.
(6 marks)
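A sketch of the large-sample (Wald) interval that Question 21 points at:

```python
import math

x, n = 170, 300
p_hat = x / n                                   # sample proportion, about 0.567
se = math.sqrt(p_hat * (1 - p_hat) / n)         # estimated standard error
lo, hi = p_hat - 1.96 * se, p_hat + 1.96 * se   # roughly (0.511, 0.623)
```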
22. Stating the assumptions clearly, explain how you would construct large-sample confidence intervals.
(6 marks)