Mar 9, 2010

Consistency and Convergence

A consistent sequence of estimators is a sequence of estimators that converges in probability to the quantity being estimated as the index (usually the sample size) grows without bound. In other words, increasing the sample size increases the probability of the estimator being close to the population parameter. Mathematically, a sequence of estimators \{t_n;\ n \ge 0\} is consistent for the parameter \theta if and only if, for every \epsilon > 0, no matter how small, we have

 
\lim_{n\to\infty}\Pr\left\{\left|t_n-\theta\right|<\epsilon\right\}=1.
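
To make the definition concrete, here is a small simulation sketch (not part of the original post): it assumes i.i.d. Normal(\theta, \sigma^2) draws and takes the sample mean as the estimator t_n, then estimates \Pr\{|t_n - \theta| < \epsilon\} by repeated sampling. The function name coverage_probability and the chosen values of \epsilon and n are only illustrative.

import numpy as np

rng = np.random.default_rng(0)

def coverage_probability(n, theta=0.0, sigma=1.0, eps=0.1, trials=5_000):
    """Estimate Pr{|t_n - theta| < eps} by simulation, where t_n is the
    sample mean of n i.i.d. Normal(theta, sigma^2) draws (an assumed example)."""
    samples = rng.normal(theta, sigma, size=(trials, n))
    t_n = samples.mean(axis=1)                     # the estimator t_n in each trial
    return np.mean(np.abs(t_n - theta) < eps)      # fraction of trials within eps

# The estimated probability should climb toward 1 as n grows,
# which is exactly what weak consistency asserts.
for n in (10, 100, 1_000, 10_000):
    print(n, coverage_probability(n))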

The consistency defined above may be called weak consistency. The sequence is strongly consistent if it converges almost surely to the true value. To say that the sequence X_n converges almost surely (equivalently, almost everywhere, with probability 1, or strongly) to X means that


    \operatorname{Pr}\left( \lim_{n\to\infty} X_n = X \right) = 1.
This means that the values of X_n approach the value of X, in the sense that the events on which X_n does not converge to X together have probability 0. Using the probability space (\Omega, \mathcal{F}, P) and viewing a random variable as a function from \Omega to \mathbb{R}, this is equivalent to the statement

    \operatorname{Pr}\Big( \omega \in \Omega : \lim_{n \to \infty} X_n(\omega) = X(\omega) \Big) = 1.
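
The distinction from weak consistency is that almost sure convergence is a statement about individual sample paths \omega, not just about probabilities at each fixed n. The following sketch (again illustrative, assuming i.i.d. Normal draws with mean \theta = 0 and the running sample mean as X_n; the helper name running_mean_path is made up here) inspects a few simulated paths, each of which should settle near \theta, as the strong law of large numbers guarantees.

import numpy as np

rng = np.random.default_rng(1)

def running_mean_path(n_max, theta=0.0, sigma=1.0):
    """One realization omega: the running sample means X_1(omega), ..., X_{n_max}(omega)."""
    draws = rng.normal(theta, sigma, size=n_max)
    return np.cumsum(draws) / np.arange(1, n_max + 1)

# Each individual path should converge to theta = 0, not merely be close
# with high probability at any single n. Check the tail of a few paths.
for k in range(3):
    path = running_mean_path(100_000)
    tail_dev = np.max(np.abs(path[-10_000:]))      # worst deviation over the last 10,000 steps
    print(f"path {k}: final value {path[-1]:+.4f}, max tail deviation {tail_dev:.4f}")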
