An important example of a log-concave density is a function constant inside a given convex body and vanishing outside; it corresponds to the uniform distribution on the convex body, which explains the term "central limit theorem for convex bodies".
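Written out (a supplementary formulation, with $K \subset \mathbb{R}^n$ denoting the convex body and $\operatorname{vol}(K)$ its volume), the density in question is

$$ f_K(x) = \frac{\mathbf{1}_K(x)}{\operatorname{vol}(K)}, $$

which is log-concave because it equals $e^{-g}$ for the function $g$ that is constant on $K$ and $+\infty$ outside; that $g$ is convex precisely because $K$ is convex.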
An example application of the method of moments is to estimate polynomial probability density distributions. In this case, an approximating polynomial of order $n$ is defined on an interval $[a, b]$. The method of moments then yields a system of equations, whose solution involves the inversion of a Hankel matrix. [2]
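A minimal numerical sketch of this setup (the function name and the concrete moments used in the example are illustrative, not from the source): if $p(x) = \sum_j c_j x^j$ on $[a, b]$, then each raw moment $m_k = \int_a^b x^k p(x)\,dx = \sum_j c_j (b^{k+j+1} - a^{k+j+1})/(k+j+1)$ depends on the coefficients through a matrix whose entries depend only on $k + j$, i.e. a Hankel matrix.

```python
import numpy as np

def polynomial_density_from_moments(moments, a, b):
    """Recover coefficients c_0..c_n of p(x) = sum_j c_j * x**j on [a, b]
    from its raw moments m_k = integral_a^b x**k p(x) dx, k = 0..n.

    Since m_k = sum_j c_j * (b**(k+j+1) - a**(k+j+1)) / (k + j + 1),
    the coefficient matrix depends only on k + j, i.e. it is a Hankel matrix.
    """
    n = len(moments) - 1
    H = np.empty((n + 1, n + 1))
    for k in range(n + 1):
        for j in range(n + 1):
            H[k, j] = (b**(k + j + 1) - a**(k + j + 1)) / (k + j + 1)
    return np.linalg.solve(H, np.asarray(moments, dtype=float))

# Illustrative check: the uniform density on [0, 1] has moments m_k = 1 / (k + 1).
coeffs = polynomial_density_from_moments([1.0, 0.5, 1.0 / 3.0], 0.0, 1.0)
print(coeffs)  # approximately [1, 0, 0], i.e. p(x) = 1
```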
This theorem can be used to show, by contradiction, that the central limit theorem does not hold for the sequence $X_k$. The procedure involves proving that Lindeberg's condition fails for $X_k$.
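For reference, Lindeberg's condition (a standard statement, included here for context) for independent $X_k$ with means $\mu_k$, variances $\sigma_k^2$, and $s_n^2 = \sum_{k=1}^n \sigma_k^2$ requires that for every $\varepsilon > 0$

$$ \lim_{n \to \infty} \frac{1}{s_n^2} \sum_{k=1}^{n} \mathbb{E}\!\left[(X_k - \mu_k)^2 \, \mathbf{1}\{|X_k - \mu_k| > \varepsilon s_n\}\right] = 0, $$

so the contradiction argument amounts to exhibiting an $\varepsilon$ for which this limit is not zero.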
This section illustrates the central limit theorem via an example for which the computation can be done quickly by hand on paper, unlike the more computing-intensive example of the previous section.
Sum of all permutations of length 1 selected from the set of integers 1, 2, 3
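A small sketch in the same spirit (reading "permutations" as ordered selections with repetition, and choosing length 4 purely for illustration): enumerating all selections of a given length from {1, 2, 3} and tabulating how often each sum occurs shows the flat length-1 case turning into a bell-shaped distribution of sums as the length grows.

```python
from collections import Counter
from itertools import product

def sum_distribution(length):
    """Count how often each sum occurs among all ordered selections
    (with repetition) of the given length from {1, 2, 3}."""
    return Counter(sum(combo) for combo in product((1, 2, 3), repeat=length))

# Length 1: each of the sums 1, 2, 3 occurs exactly once (a flat distribution).
print(sum_distribution(1))  # Counter({1: 1, 2: 1, 3: 1})

# Length 4: the 3**4 = 81 selections already give a bell-shaped count of sums.
for s, count in sorted(sum_distribution(4).items()):
    print(s, "#" * count)
```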
The method of moments was introduced by Pafnuty Chebyshev for proving the central limit theorem; Chebyshev cited earlier contributions by Irénée-Jules Bienaymé. [2] More recently, it has been applied by Eugene Wigner to prove Wigner's semicircle law, and has since found numerous applications in the theory of random matrices. [3]
The central limit theorem gives only an asymptotic distribution. As an approximation for a finite number of observations, it provides a reasonable approximation only when close to the peak of the normal distribution; it requires a very large number of observations to stretch into the tails.
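A quick numerical sketch of this point (the Binomial(100, 0.5) example and the thresholds below are chosen only for illustration): comparing exact binomial tail probabilities with the normal approximation shows close agreement near the mean and a relative error that grows rapidly out in the tails.

```python
import math

def binomial_tail(n, p, k):
    """Exact P(X > k) for X ~ Binomial(n, p)."""
    return sum(math.comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1, n + 1))

def normal_tail(z):
    """P(Z > z) for a standard normal Z."""
    return 0.5 * math.erfc(z / math.sqrt(2))

n, p = 100, 0.5
mu, sigma = n * p, math.sqrt(n * p * (1 - p))

for k in (55, 65, 75):  # roughly 1, 3 and 5 standard deviations above the mean
    exact = binomial_tail(n, p, k)
    approx = normal_tail((k - mu) / sigma)
    print(f"k={k}: exact={exact:.3e}  normal approx={approx:.3e}  ratio={approx / exact:.2f}")
```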
The central limit theorem can provide more detailed information about the behavior of $M_N$ than the law of large numbers. For example, we can approximately find a tail probability of $M_N$ – the probability that $M_N$ is greater than some value $x$ – for a fixed value of $N$.
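A minimal sketch of such a tail estimate, assuming $M_N$ denotes the sample mean of $N$ i.i.d. observations with known mean and variance (the snippet does not define $M_N$, and the die-roll numbers below are purely illustrative):

```python
import math

def clt_tail_probability(x, n, mean, variance):
    """Normal approximation to P(M_N > x), where M_N is taken here to be the
    sample mean of n i.i.d. observations with the given mean and variance.
    By the CLT, M_N is approximately Normal(mean, variance / n)."""
    z = (x - mean) / math.sqrt(variance / n)
    return 0.5 * math.erfc(z / math.sqrt(2))

# Illustrative example: mean of 400 fair-die rolls (mean 3.5, variance 35/12);
# approximate probability that the sample mean exceeds 3.6.
print(clt_tail_probability(3.6, 400, 3.5, 35 / 12))
```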
This is justified by considering the central limit theorem in the log domain (sometimes called Gibrat's law). The log-normal distribution is the maximum entropy probability distribution for a random variate X for which the mean and variance of ln(X) are specified.
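A short simulation sketch of the log-domain argument (the uniform growth factors on [0.9, 1.1] and the sample sizes are illustrative choices, not from the source): multiplying many independent positive factors is the same as summing their logarithms, so by the central limit theorem the log of the product is approximately normal and the product itself approximately log-normal.

```python
import math
import random

random.seed(0)

def product_of_random_factors(n_factors):
    """Multiply n_factors i.i.d. positive random growth factors.
    By the CLT applied to their logs (Gibrat's law), the product is
    approximately log-normally distributed for large n_factors."""
    return math.prod(random.uniform(0.9, 1.1) for _ in range(n_factors))

samples = [product_of_random_factors(200) for _ in range(10_000)]
logs = [math.log(s) for s in samples]

# The logs should look approximately normal; summarize them briefly.
mean_log = sum(logs) / len(logs)
var_log = sum((v - mean_log) ** 2 for v in logs) / len(logs)
print(f"mean of log(product) = {mean_log:.4f}, variance = {var_log:.4f}")
```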