When.com Web Search


Search results

  1. Rate of convergence - Wikipedia

    en.wikipedia.org/wiki/Rate_of_convergence

    In asymptotic analysis in general, one sequence (a_k) that converges to a limit L is said to asymptotically converge to L with a faster order of convergence than another sequence (b_k) that converges to L in a shared metric space with distance metric |·|, such as the real numbers or complex numbers with the ordinary absolute difference metrics, if ...
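
    To make the comparison concrete, here is a minimal, hypothetical sketch (the sequences 1/k² and 1/k and the limit L = 0 are illustrative choices, not from the article): the sequence with the faster order of convergence has a distance-to-limit ratio that tends to 0.

    ```python
    # Hypothetical illustration: (a_k) = 1/k^2 converges to L = 0 with a faster
    # order of convergence than (b_k) = 1/k, so |a_k - L| / |b_k - L| -> 0.
    L = 0.0
    for k in (10, 100, 1_000, 10_000):
        a_k = 1.0 / k**2
        b_k = 1.0 / k
        print(k, abs(a_k - L) / abs(b_k - L))  # ratio shrinks like 1/k
    ```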

  2. Sieve estimator - Wikipedia

    en.wikipedia.org/wiki/Sieve_estimator

    Sieve estimators have been used extensively for estimating density functions in high-dimensional spaces such as in positron emission tomography (PET). The first exploitation of sieves in PET for solving the maximum-likelihood image reconstruction problem was by Donald Snyder and Michael Miller,[1] where they stabilized the time-of-flight PET problem originally solved by Shepp and Vardi.[2]

  3. Multivariate kernel density estimation - Wikipedia

    en.wikipedia.org/wiki/Multivariate_kernel...

    The rate of convergence of the MSE to 0 is necessarily the same as the MISE rate noted previously, O(n^(−4/(d+4))); hence the convergence rate of the density estimator to f is O_p(n^(−2/(d+4))), where O_p denotes order in probability. This establishes pointwise convergence.
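
    To see what these exponents mean in practice, a minimal sketch (the sample size n = 10,000 is an arbitrary choice for illustration) evaluating both rates as the dimension d grows:

    ```python
    # The rates quoted above for a kernel density estimator:
    #   MISE / MSE rate:       n^(-4/(d+4))
    #   pointwise (O_p) rate:  n^(-2/(d+4))
    # Both deteriorate quickly as the dimension d grows (curse of dimensionality).
    n = 10_000
    for d in (1, 2, 5, 10):
        mise_rate = n ** (-4 / (d + 4))
        pointwise_rate = n ** (-2 / (d + 4))
        print(f"d={d:2d}  MISE ~ {mise_rate:.2e}  pointwise ~ {pointwise_rate:.2e}")
    ```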

  4. Sieve analysis - Wikipedia

    en.wikipedia.org/wiki/Sieve_analysis

    A sieve analysis (or gradation test) is a practice or procedure used in geology, civil engineering, [1] and chemical engineering [2] to assess the particle size distribution (also called gradation) of a granular material by allowing the material to pass through a series of sieves of progressively smaller mesh size and weighing the amount of material that is stopped by each sieve as a fraction ...
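
    The arithmetic behind the procedure is simple: weigh the mass retained on each sieve, convert to percent retained, and accumulate percent passing. A minimal sketch with made-up sieve sizes and masses (not from the article):

    ```python
    # Made-up example data: mesh opening (mm) -> mass retained (grams);
    # "pan" collects whatever passes the finest sieve.
    retained_g = {4.75: 50.0, 2.00: 120.0, 0.425: 200.0, 0.075: 100.0, "pan": 30.0}

    total = sum(retained_g.values())
    passing = 100.0
    for size, mass in retained_g.items():
        pct_retained = 100.0 * mass / total
        passing -= pct_retained
        print(f"{size}: {pct_retained:.1f}% retained, {passing:.1f}% cumulative passing")
    ```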

  5. Asymptotic theory (statistics) - Wikipedia

    en.wikipedia.org/wiki/Asymptotic_theory_(statistics)

    The rate of convergence must be chosen carefully, though, usually h ∝ n^(−1/5). In many cases, highly accurate results for finite samples can be obtained via numerical methods (i.e. computers); even in such cases, though, asymptotic analysis can be useful. This point was made by Small (2010, §1.4), as follows.
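
    One common concrete instance of the h ∝ n^(−1/5) scaling is Silverman's rule-of-thumb bandwidth h = 1.06·σ̂·n^(−1/5) for a Gaussian kernel; the sketch below uses it purely for illustration (the article does not single out this rule).

    ```python
    import numpy as np

    # Silverman's rule-of-thumb bandwidth, h = 1.06 * sigma_hat * n^(-1/5),
    # shown only to illustrate the n^(-1/5) scaling mentioned above.
    rng = np.random.default_rng(0)
    for n in (100, 1_000, 10_000):
        sample = rng.normal(size=n)
        h = 1.06 * sample.std(ddof=1) * n ** (-1 / 5)
        print(f"n={n:6d}  h ≈ {h:.3f}")
    ```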

  6. Goldston–Pintz–Yıldırım sieve - Wikipedia

    en.wikipedia.org/wiki/Goldston–Pintz...

    The Goldston–Pintz–Yıldırım sieve (also called GPY sieve or GPY method) is a sieve method and variant of the Selberg sieve with generalized, multidimensional sieve weights. The sieve led to a series of important breakthroughs in analytic number theory. It is named after the mathematicians Dan Goldston, János Pintz and Cem Yıldırım. [1]

  7. Fundamental lemma of sieve theory - Wikipedia

    en.wikipedia.org/wiki/Fundamental_lemma_of_sieve...

    In number theory, the fundamental lemma of sieve theory is any of several results that systematize the process of applying sieve methods to particular problems. Halberstam & Richert[1]: 92–93 write: ...

  8. Richardson extrapolation - Wikipedia

    en.wikipedia.org/wiki/Richardson_extrapolation

    In numerical analysis, Richardson extrapolation is a sequence acceleration method used to improve the rate of convergence of a sequence of estimates of some value A* = lim_(h→0) A(h). In essence, given the value of A(h) for several values of h, we can estimate A* by extrapolating the ...
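
    A minimal sketch of the idea, using a central-difference derivative as the h-dependent estimate A(h) (the function, point, and step size are arbitrary illustrative choices):

    ```python
    import math

    # A(h) = (f(x+h) - f(x-h)) / (2h) estimates f'(x) with error O(h^2).
    # Richardson extrapolation combines A(h) and A(h/2) as
    # (4*A(h/2) - A(h)) / 3, cancelling the h^2 term and leaving O(h^4) error.
    def A(f, x, h):
        return (f(x + h) - f(x - h)) / (2 * h)

    f, x, h = math.sin, 1.0, 0.1
    plain = A(f, x, h / 2)
    extrapolated = (4 * A(f, x, h / 2) - A(f, x, h)) / 3
    exact = math.cos(x)
    print(abs(plain - exact), abs(extrapolated - exact))  # extrapolated error is much smaller
    ```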