Answer:
Correct option:
b. converges on the true parameter μ as the sample size increases.
Step-by-step explanation:
An estimator is a sample statistic used to estimate the true value of a population parameter.
A consistent estimator converges in probability to the true parameter value as the sample size is increased.
That is, as n → ∞ the estimator → parameter.
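Written out formally (this is the standard textbook definition, stated here for completeness): an estimator [tex]\hat{\theta}_n[/tex] computed from a sample of size n is consistent for a parameter θ if, for every [tex]\epsilon > 0[/tex],

[tex]\lim_{n \to \infty} P(|\hat{\theta}_n - \theta| > \epsilon) = 0[/tex]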
The term "converges in probability" means the following: for two random variables, say X and Y, and any small tolerance ε > 0, consider the probability that they differ by more than ε:

[tex]P(|X-Y|>\epsilon)[/tex]

This probability measures how likely it is that X is far from Y. If X converges in probability to Y as the sample size n increases, then [tex]P(|X-Y|>\epsilon)[/tex] tends to 0 for every ε > 0.
So, as the sample size increases, the probability that the sample mean differs from the population mean by more than any fixed ε tends to 0.
So, the sample mean is a consistent estimator for the population mean, μ.
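This behavior is easy to see in a quick simulation. The sketch below (my own illustration, not part of the original question; the true mean mu = 5 and standard deviation sigma = 2 are arbitrary choices) draws samples of increasing size and shows that the sample mean's distance from μ shrinks:

```python
import numpy as np

# Illustrative simulation (mu, sigma, and the seed are arbitrary choices):
# the sample mean of draws from a population with true mean mu = 5
# should get closer to mu as the sample size n grows.
rng = np.random.default_rng(0)
mu, sigma = 5.0, 2.0

errors = {}
for n in [10, 1_000, 100_000]:
    sample = rng.normal(mu, sigma, size=n)
    errors[n] = abs(sample.mean() - mu)
    print(f"n = {n:>7}: |sample mean - mu| = {errors[n]:.4f}")
```

A single run does not prove convergence, of course, but the error at n = 100,000 will be on the order of sigma/√n ≈ 0.006, far smaller than the typical error at n = 10.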
The correct option is:
"b. converges on the true parameter μ as the sample size increases."