Search results

  1. Integral probability metric - Wikipedia

    en.wikipedia.org/wiki/Integral_probability_metric

    In probability theory, integral probability metrics are types of distance functions between probability distributions, defined by how well a class of functions can distinguish the two distributions. Many important statistical distances are integral probability metrics, including the Wasserstein-1 distance and the total variation distance.
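
    As a quick restatement of the definition this snippet alludes to (notation assumed here: $\mathcal{F}$ is the function class, $P$ and $Q$ the two distributions), an integral probability metric takes the form

    $$d_{\mathcal{F}}(P, Q) = \sup_{f \in \mathcal{F}} \left| \operatorname{E}_{X \sim P}[f(X)] - \operatorname{E}_{Y \sim Q}[f(Y)] \right|.$$

    Taking $\mathcal{F}$ to be the 1-Lipschitz functions yields the Wasserstein-1 distance, and taking suitably bounded functions yields the total variation distance (up to normalization).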

  2. Statistical distance - Wikipedia

    en.wikipedia.org/wiki/Statistical_distance

    In statistics, probability theory, and information theory, a statistical distance quantifies the distance between two statistical objects, which can be two random variables, two probability distributions or samples, or an individual sample point and a population or a wider sample of points.

  3. Divergence (statistics) - Wikipedia

    en.wikipedia.org/wiki/Divergence_(statistics)

    The information geometry definition of divergence (the subject of this article) was initially referred to by alternative terms, including "quasi-distance" (Amari 1982, p. 369) and "contrast function" (Eguchi 1985), though "divergence" was used in Amari (1985) for the α-divergence, and has become standard for the general class. [1][2]

  4. Total variation distance of probability measures - Wikipedia

    en.wikipedia.org/wiki/Total_variation_distance...

    The total variation distance (or half the norm) arises as the optimal transportation cost, when the cost function is $c(x, y) = \mathbf{1}_{x \neq y}$, that is,

    $$\tfrac{1}{2}\|\mu - \nu\| = \delta(\mu, \nu) = \inf\bigl\{\, \mathbb{P}(X \neq Y) : \operatorname{Law}(X) = \mu,\ \operatorname{Law}(Y) = \nu \,\bigr\} = \inf_{\pi} \operatorname{E}_{\pi}\bigl[\mathbf{1}_{X \neq Y}\bigr],$$

    where the expectation is taken with respect to the probability measure $\pi$ on the space where $(x, y)$ lives, and the infimum is taken over all such $\pi$ with marginals $\mu$ and $\nu$, respectively.
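
    A minimal sketch of the discrete case in Python (an illustration, not from the article): for two probability vectors on a common finite support, the optimal-transport characterization above reduces to half the $\ell_1$ norm of the difference.

        import numpy as np

        def total_variation(p, q):
            """Total variation distance between two discrete distributions."""
            p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
            # Half the L1 norm of the difference of the probability vectors.
            return 0.5 * np.abs(p - q).sum()

        # Example: a fair coin vs. a biased coin.
        print(total_variation([0.5, 0.5], [0.7, 0.3]))  # 0.2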

  5. Wasserstein metric - Wikipedia

    en.wikipedia.org/wiki/Wasserstein_metric

    In mathematics, the Wasserstein distance or Kantorovich–Rubinstein metric is a distance function defined between probability distributions on a given metric space. It is named after Leonid Vaseršteĭn.
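
    For a quick illustrative usage in one dimension (assuming SciPy is available; scipy.stats.wasserstein_distance estimates the Wasserstein-1 distance from samples):

        import numpy as np
        from scipy.stats import wasserstein_distance

        rng = np.random.default_rng(0)
        x = rng.normal(loc=0.0, scale=1.0, size=5000)  # sample from N(0, 1)
        y = rng.normal(loc=1.0, scale=1.0, size=5000)  # sample from N(1, 1)

        # Shifting a distribution by c changes W1 by |c|, so this is ~1.0.
        print(wasserstein_distance(x, y))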

  6. Energy distance - Wikipedia

    en.wikipedia.org/wiki/Energy_distance

    Energy distance is a statistical distance between probability distributions. If X and Y are independent random vectors in $\mathbb{R}^d$ with cumulative distribution functions (cdf) F and G respectively, then the energy distance between the distributions F and G is defined to be the square root of $D^2(F, G) = 2\operatorname{E}\|X - Y\| - \operatorname{E}\|X - X'\| - \operatorname{E}\|Y - Y'\|$, where $X'$ and $Y'$ are independent and identically distributed copies of $X$ and $Y$.
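
    A minimal plug-in estimator from two samples, assuming the definition above and using mean pairwise Euclidean distances (a sketch, not the article's code; SciPy also ships scipy.stats.energy_distance for the one-dimensional case):

        import numpy as np
        from scipy.spatial.distance import cdist

        def energy_distance(x, y):
            """Estimate the energy distance between two samples of d-dim vectors."""
            x = np.asarray(x, dtype=float).reshape(len(x), -1)
            y = np.asarray(y, dtype=float).reshape(len(y), -1)
            a = cdist(x, y).mean()  # estimates E||X - Y||
            b = cdist(x, x).mean()  # estimates E||X - X'|| (zero diagonal included)
            c = cdist(y, y).mean()  # estimates E||Y - Y'||
            return np.sqrt(2.0 * a - b - c)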

  7. Probabilistic metric space - Wikipedia

    en.wikipedia.org/wiki/Probabilistic_metric_space

    Let $D^+$ be the set of all probability distribution functions $F$ such that $F(0) = 0$ ($F$ is a nondecreasing, left-continuous mapping from $\mathbb{R}$ into $[0, 1]$ such that $\max(F) = 1$). Then, given a non-empty set $S$ and a function $\mathcal{F} : S \times S \to D^+$, where we denote $\mathcal{F}(p, q)$ by $F_{p,q}$ for every $(p, q) \in S \times S$, the ordered pair $(S, \mathcal{F})$ is said to be a probabilistic metric space if $F_{p,q}$ satisfies axioms generalizing those of an ordinary metric.
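
    A standard concrete example may help in parsing this: any ordinary metric space $(S, d)$ induces a probabilistic metric space by taking the step function

    $$F_{p,q}(x) = \begin{cases} 0, & x \le d(p, q), \\ 1, & x > d(p, q), \end{cases}$$

    so that $F_{p,q}(x)$ reads as the probability that the distance between $p$ and $q$ is less than $x$, which here is 0 or 1 because the distance is deterministic.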

  8. Bhattacharyya angle - Wikipedia

    en.wikipedia.org/wiki/Bhattacharyya_angle

    In statistics, the Bhattacharyya angle, also called the statistical angle, is a measure of distance between two probability measures defined on a finite probability space. It is defined as $\Delta(p, q) = \arccos \operatorname{BC}(p, q)$, where $\operatorname{BC}(p, q)$ is the Bhattacharyya coefficient.
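
    A minimal sketch for finite distributions in Python, assuming $p$ and $q$ are probability vectors on the same support and using the Bhattacharyya coefficient $\operatorname{BC}(p, q) = \sum_i \sqrt{p_i q_i}$:

        import numpy as np

        def bhattacharyya_angle(p, q):
            """Bhattacharyya (statistical) angle between two finite distributions."""
            p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
            bc = np.sum(np.sqrt(p * q))              # Bhattacharyya coefficient
            return np.arccos(np.clip(bc, 0.0, 1.0))  # clip guards float rounding

        # Identical distributions have BC = 1, hence angle 0.
        print(bhattacharyya_angle([0.5, 0.5], [0.5, 0.5]))  # 0.0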