Point Estimates
Definition: Suppose a random variable $X$ follows a statistical distribution or a law indexed by a parameter $\theta$. Then a function $T(X)$ from the sample space to the parameter space is called a point estimator of $\theta$.
In general, let $g(\theta)$ be any function of $\theta$. Then any function from the sample space to the range of $g$ will be called a point estimator of $g(\theta)$.
Definition: If $T(X)$ is a point estimator for $\theta$, then for a realization $x$ of the random variable $X$, the quantity $T(x)$ is called a point estimate of $\theta$ and is denoted $\hat{\theta}$.
Notice that the estimator $T(X)$ is a random variable (unlike the true parameter $\theta$), since it depends on the random sample $X$; the estimate $T(x)$ is the fixed number obtained once the data are observed.
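To make the estimator/estimate distinction concrete, here is a minimal Python sketch; the Normal(5, 4) sample, the sample size of 30, and the choice of the sample mean as the rule $T$ are illustrative assumptions, not part of the definitions above.

```python
import numpy as np

rng = np.random.default_rng(0)

# The estimator is a rule: a function T applied to the whole sample.
# Here T is the sample mean, one possible estimator of the distribution's mean.
def T(sample):
    return sample.mean()

# A realization x of the random sample X; the Normal(5, 2^2) data below are
# purely illustrative and not dictated by the definition above.
x = rng.normal(loc=5.0, scale=2.0, size=30)

# The point estimate is the single number obtained by applying T to the observed data.
theta_hat = T(x)
print(f"point estimate of the mean: {theta_hat:.3f}")
```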
Examples
- Suppose $X_1, \ldots, X_n$ are independent Normal$(\mu, \sigma^2)$ random variables. Then an estimator for the mean $\mu$ is the sample mean $\bar{X} = \frac{1}{n}\sum_{i=1}^{n} X_i$.
- Suppose $X_1, \ldots, X_n$ follow Uniform$[\theta, \theta+1]$. Then an estimator for $\theta$ is $\hat{\theta}_1 = \bar{X} - \tfrac{1}{2}$. Another is $\hat{\theta}_2 = X_{(1)} = \min_i X_i$. Yet another is $\hat{\theta}_3 = X_{(n)} - 1 = \max_i X_i - 1$ (see the simulation sketch after this list).
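A small simulation sketch of the Uniform$[\theta, \theta+1]$ example, with an arbitrary true value $\theta = 2.7$ and sample size 50 chosen only for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

theta = 2.7                                   # true parameter (unknown in practice)
x = rng.uniform(theta, theta + 1, size=50)    # X_1, ..., X_n ~ Uniform[theta, theta + 1]

# Three competing point estimators of theta applied to the same sample:
theta_hat_1 = x.mean() - 0.5    # uses E[X_i] = theta + 1/2
theta_hat_2 = x.min()           # smallest observation, X_(1)
theta_hat_3 = x.max() - 1.0     # largest observation shifted back, X_(n) - 1

print(theta_hat_1, theta_hat_2, theta_hat_3)
```

All three are legitimate point estimators under the definition above; which one is preferable is a question of goodness, taken up next.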
Notice that the above definition does not restrict point estimators to only the "good" ones. For example, according to the definition it is perfectly fine to estimate the mean in the above example by something absurd, say a fixed constant that ignores the data entirely. This freedom in the definition is deliberate. In general, however, when we form point estimators we take some measure of goodness into account. It should be kept in mind that a point estimator should always be targeted to be close to the parameter it estimates, intuitively and, where possible, formally.
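As a rough illustration of why a measure of goodness matters, the following sketch compares the sample mean with an absurd constant estimator over repeated samples; the particular values of $\mu$, $\sigma$, the constant 17, and the use of mean squared error as the criterion are all assumptions made here for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
mu, sigma, n, reps = 5.0, 2.0, 30, 10_000     # hypothetical values, chosen only for illustration

samples = rng.normal(mu, sigma, size=(reps, n))

sample_means = samples.mean(axis=1)           # the usual estimator of mu
absurd_guesses = np.full(reps, 17.0)          # an "estimator" that ignores the data entirely

# Mean squared distance from the true mu, one simple measure of goodness:
print("MSE of the sample mean:       ", np.mean((sample_means - mu) ** 2))
print("MSE of the constant estimator:", np.mean((absurd_guesses - mu) ** 2))
```

The sample mean concentrates near $\mu$, while the constant pays the same fixed squared penalty no matter how much data is observed.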