The basic idea behind the jackknife variance estimator lies in systematically recomputing the estimate of the statistic, leaving out one or more observations at a time from the sample. From the resulting set of replicates of the statistic, estimates of both the bias and the variance of the statistic can be calculated.
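As an illustration, the leave-one-out procedure can be sketched in Python for a generic statistic (the function names here are illustrative, not standard):

```python
import numpy as np

def jackknife_variance(x, statistic):
    """Leave-one-out jackknife estimate of the variance of `statistic`."""
    x = np.asarray(x)
    n = len(x)
    # Recompute the statistic with each observation removed in turn.
    replicates = np.array([statistic(np.delete(x, i)) for i in range(n)])
    mean_rep = replicates.mean()
    # Jackknife variance: (n-1)/n * sum over i of (theta_(i) - theta_bar)^2
    return (n - 1) / n * np.sum((replicates - mean_rep) ** 2)

rng = np.random.default_rng(0)
sample = rng.normal(size=100)
est = jackknife_variance(sample, np.mean)
```

For the sample mean, this estimate coincides exactly with the familiar formula <math>s^2/n</math>, which provides a simple check of the implementation.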
Instead of applying the jackknife to the variance directly, it may be applied to the logarithm of the variance. This transformation may result in better estimates, particularly when the distribution of the variance estimate itself is non-normal.
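One way to realise this idea, sketched here as an assumption rather than a prescribed recipe, is to jackknife the log of the sample variance and back-transform an interval to the original scale:

```python
import numpy as np

def jackknife_se_log_variance(x):
    """Jackknife standard error of log(sample variance) -- a sketch."""
    x = np.asarray(x)
    n = len(x)
    # Leave-one-out replicates computed on the log scale.
    reps = np.array([np.log(np.var(np.delete(x, i), ddof=1))
                     for i in range(n)])
    var_log = (n - 1) / n * np.sum((reps - reps.mean()) ** 2)
    return np.sqrt(var_log)

rng = np.random.default_rng(1)
sample = rng.exponential(size=200)  # a skewed population
se_log = jackknife_se_log_variance(sample)
s2 = np.var(sample, ddof=1)
# Approximate 95% interval formed on the log scale, then exponentiated;
# unlike an interval built directly on the variance, it is always positive.
ci = (s2 * np.exp(-1.96 * se_log), s2 * np.exp(1.96 * se_log))
```

Working on the log scale keeps the back-transformed interval positive and can make the sampling distribution of the replicates closer to normal for skewed data.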
For many statistical parameters the jackknife estimate of variance tends asymptotically to the true value almost surely. In technical terms one says that the jackknife estimate is [[Consistency (statistics)|consistent]]. The jackknife is consistent for the sample [[mean]]s, sample [[variance]]s, central and non-central t-statistics (with possibly non-normal populations), sample [[coefficient of variation]], [[maximum likelihood estimator]]s, least squares estimators, and [[regression coefficient]]s.
It is not consistent for the sample [[median]]. In the case of a unimodal variate, the ratio of the jackknife variance to the sample variance tends to be distributed as one half the square of a chi-square distribution with two [[degrees of freedom]].
The jackknife, like the original bootstrap, depends on the independence of the data. Extensions of the jackknife to allow for dependence in the data have been proposed.
Another extension is the delete-a-group method used in association with [[Poisson sampling]].