In some cases, outliers are very close to the mean, and thus our estimate of the mean can converge quickly on the true mean as we look at new samples. In other cases, outliers can be several orders of magnitude away from the mean, and our estimate converges very slowly or not at all.
I think this passage confuses several different things. Let me try to untangle it.
First, all outliers, by definition, are rare and are “far away from the mean” (compared to the rest of the data points).
Second, whether your data points are “close” to the mean or “several orders of magnitude” away from it is a function of the width (dispersion, variance, standard deviation, volatility) of the underlying distribution. The width determines how precise your mean estimate from a fixed-size sample will be, but it does not affect the speed of the convergence.
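To make the distinction concrete, here is a minimal sketch (using the textbook fact that the standard error of the sample mean of a normal distribution is σ/√n): a distribution a hundred times wider gives estimates a hundred times less precise at every sample size, but both estimates improve at exactly the same 1/√n rate.

```python
import numpy as np

# Standard error of the sample mean for a normal distribution: sigma / sqrt(n).
# Width (sigma) scales the precision at every n, but the convergence *rate*
# -- proportional to 1/sqrt(n) -- is the same for both distributions.
def standard_error(sigma, n):
    return sigma / np.sqrt(n)

ns = np.array([10, 100, 1000, 10000])
narrow = standard_error(1.0, ns)    # sigma = 1
wide = standard_error(100.0, ns)    # sigma = 100, "orders of magnitude" wider

# The wide distribution's estimate is always 100x less precise...
print(wide / narrow)          # [100. 100. 100. 100.]
# ...but both estimates sharpen by the same factor (sqrt(100) = 10)
# when the sample grows 100x:
print(narrow[0] / narrow[2])  # 10.0
print(wide[0] / wide[2])      # 10.0
```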
The speed of the convergence is a function of the shape of the underlying distribution. If it’s normal (Gaussian), your mean estimate will converge at the same rate regardless of how high or low the variance of the distribution is. If it’s, say, a Cauchy distribution, then the mean estimate will never converge — the Cauchy distribution has no defined mean, so the law of large numbers doesn’t apply and there is nothing for the estimate to converge to.
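A quick simulation shows the contrast (a sketch with an arbitrary seed; the qualitative behavior is the point, not the exact numbers): the running mean of normal draws settles near the true mean, while the running mean of Cauchy draws keeps getting yanked around by occasional enormous draws no matter how many samples you take.

```python
import numpy as np

# Running sample means: normal vs. Cauchy draws.
# The normal's running mean settles near its true mean of 0.
# The Cauchy's never settles: rare, huge draws keep dominating the average.
rng = np.random.default_rng(42)
n = 100_000
normal_draws = rng.normal(loc=0.0, scale=1.0, size=n)
cauchy_draws = rng.standard_cauchy(size=n)

counts = np.arange(1, n + 1)
normal_running_mean = np.cumsum(normal_draws) / counts
cauchy_running_mean = np.cumsum(cauchy_draws) / counts

print(normal_running_mean[-1])  # close to the true mean of 0
print(cauchy_running_mean[-1])  # wherever the last big jump left it
print(np.abs(cauchy_draws).max())  # typically in the hundreds or thousands
```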
Also, in small samples you generally don’t expect to get any outliers. If you do, your small-sample estimate is likely to be way out of whack and actually misleading.