Search results

  1. Hellinger distance - Wikipedia

    en.wikipedia.org/wiki/Hellinger_distance

    In probability and statistics, the Hellinger distance (closely related to, although different from, the Bhattacharyya distance) is used to quantify the similarity between two probability distributions. It is a type of f-divergence. The Hellinger distance is defined in terms of the Hellinger integral, which was introduced by Ernst Hellinger in 1909.
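
    A minimal sketch of this definition for discrete distributions, assuming two probability vectors on the same finite support (the function name and test vectors are illustrative, not from the article):

    ```python
    import numpy as np

    def hellinger(p, q):
        """Hellinger distance H(p, q) = (1/sqrt(2)) * ||sqrt(p) - sqrt(q)||_2.

        For probability vectors this lies in [0, 1]: 0 iff p == q,
        and 1 when the supports are disjoint.
        """
        p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
        return np.sqrt(np.sum((np.sqrt(p) - np.sqrt(q)) ** 2)) / np.sqrt(2.0)

    print(hellinger([0.5, 0.5], [0.5, 0.5]))  # 0.0  (identical distributions)
    print(hellinger([1.0, 0.0], [0.0, 1.0]))  # 1.0  (disjoint supports)
    ```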

  2. f-divergence - Wikipedia

    en.wikipedia.org/wiki/F-divergence

    Notably, except for total variation distance, all others are special cases of f-divergence, or linear sums of f-divergences. For each f-divergence $D_f$, its generating function is not uniquely defined, but only up to $c \cdot (t - 1)$, where $c$ is any real constant.
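
    A small sketch of the non-uniqueness claim for discrete distributions: adding $c \cdot (t - 1)$ to the generator does not change $D_f$, because $\sum_i q_i (p_i/q_i - 1) = 0$. The vectors and constant below are illustrative:

    ```python
    import numpy as np

    def f_divergence(p, q, f):
        """D_f(p || q) = sum_i q_i * f(p_i / q_i), assuming q > 0 everywhere."""
        p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
        return np.sum(q * f(p / q))

    p = np.array([0.2, 0.3, 0.5])
    q = np.array([0.4, 0.4, 0.2])
    c = 7.0

    f_kl = lambda t: t * np.log(t)                     # generates KL divergence
    f_shifted = lambda t: t * np.log(t) + c * (t - 1)  # shifted generator

    # The c*(t - 1) term sums to zero against q, so both generators agree.
    print(np.isclose(f_divergence(p, q, f_kl),
                     f_divergence(p, q, f_shifted)))  # True
    ```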

  3. Kullback–Leibler divergence - Wikipedia

    en.wikipedia.org/wiki/Kullback–Leibler_divergence

    The term "divergence" is in contrast to a distance (metric), since the symmetrized divergence does not satisfy the triangle inequality. [10] Numerous references to earlier uses of the symmetrized divergence and to other statistical distances are given in Kullback (1959, pp. 6–7, §1.3 Divergence).
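
    A sketch of the triangle-inequality failure for the symmetrized divergence $J(P, Q) = D_{\mathrm{KL}}(P \| Q) + D_{\mathrm{KL}}(Q \| P)$, using a hand-picked two-point counterexample (the distributions are illustrative, not from the article):

    ```python
    import numpy as np

    def kl(p, q):
        """Kullback-Leibler divergence D(p || q) in nats."""
        p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
        return np.sum(p * np.log(p / q))

    def jeffreys(p, q):
        """Symmetrized divergence J(p, q) = D(p || q) + D(q || p)."""
        return kl(p, q) + kl(q, p)

    p = np.array([0.9, 0.1])
    q = np.array([0.1, 0.9])
    r = np.array([0.5, 0.5])

    # J(p, q) ~ 3.52 but J(p, r) + J(r, q) ~ 1.76: the direct route is
    # "longer" than the detour through r, so the triangle inequality fails.
    print(jeffreys(p, q) <= jeffreys(p, r) + jeffreys(r, q))  # False
    ```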

  4. Divergence (statistics) - Wikipedia

    en.wikipedia.org/wiki/Divergence_(statistics)

    The only divergence for probabilities over a finite alphabet that is both an f-divergence and a Bregman divergence is the Kullback–Leibler divergence. [8] The squared Euclidean divergence is a Bregman divergence (corresponding to the function $x^2$) but not an f-divergence.
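
    A sketch checking the first claim on a finite alphabet: computing KL once as an f-divergence (generator $t \log t$) and once as the Bregman divergence of negative entropy gives the same number. The test vectors are illustrative:

    ```python
    import numpy as np

    p = np.array([0.2, 0.3, 0.5])
    q = np.array([0.4, 0.4, 0.2])

    # KL as an f-divergence: sum_i q_i * f(p_i / q_i) with f(t) = t * log(t).
    kl_f = np.sum(q * (p / q) * np.log(p / q))

    # KL as the Bregman divergence of F(x) = sum_i x_i * log(x_i):
    # D_F(p, q) = F(p) - F(q) - <grad F(q), p - q>.
    F = lambda x: np.sum(x * np.log(x))
    grad_F = lambda x: np.log(x) + 1.0
    kl_bregman = F(p) - F(q) - np.dot(grad_F(q), p - q)

    print(np.isclose(kl_f, kl_bregman))  # True: both constructions give KL
    ```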

  5. Statistical distance - Wikipedia

    en.wikipedia.org/wiki/Statistical_distance

    Many terms are used to refer to various notions of distance; these are often confusingly similar, and may be used inconsistently between authors and over time, either loosely or with precise technical meaning. In addition to "distance", similar terms include deviance, deviation, discrepancy, discrimination, and divergence, as well as others ...

  6. Category:F-divergences - Wikipedia

    en.wikipedia.org/wiki/Category:F-divergences

    Hellinger distance; Kullback–Leibler divergence; Total variation distance of probability measures ...

  7. Normal distribution - Wikipedia

    en.wikipedia.org/wiki/Normal_distribution

    The Kullback–Leibler divergence of one normal distribution ... The Hellinger distance between the same distributions is equal to $H^2(P_1, P_2) = 1 - \sqrt{\frac{2\sigma_1\sigma_2}{\sigma_1^2 + \sigma_2^2}}\, e^{-\frac{(\mu_1 - \mu_2)^2}{4(\sigma_1^2 + \sigma_2^2)}}$. The Fisher ...
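
    A sketch that checks this closed form against the integral definition $H^2 = 1 - \int \sqrt{p(x) q(x)}\, dx$, using scipy for the numerical integration (the parameter values are illustrative):

    ```python
    import numpy as np
    from scipy.integrate import quad
    from scipy.stats import norm

    mu1, s1 = 0.0, 1.0
    mu2, s2 = 1.0, 2.0

    # Closed form for two normals:
    # H^2 = 1 - sqrt(2*s1*s2 / (s1^2 + s2^2))
    #           * exp(-(mu1 - mu2)^2 / (4 * (s1^2 + s2^2)))
    h2_closed = 1.0 - np.sqrt(2.0 * s1 * s2 / (s1**2 + s2**2)) * np.exp(
        -((mu1 - mu2) ** 2) / (4.0 * (s1**2 + s2**2))
    )

    # Integral definition via the Bhattacharyya coefficient BC = int sqrt(p*q) dx.
    bc, _ = quad(
        lambda x: np.sqrt(norm.pdf(x, mu1, s1) * norm.pdf(x, mu2, s2)),
        -np.inf, np.inf,
    )
    print(np.isclose(h2_closed, 1.0 - bc))  # True
    ```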

  8. Distribution learning theory - Wikipedia

    en.wikipedia.org/wiki/Distribution_learning_theory

    There are several ways to measure the distance between two distributions. The three most common possibilities are the Kullback–Leibler divergence, the total variation distance of probability measures, and the Kolmogorov distance. The strongest of these distances is the Kullback–Leibler divergence and the weakest is the Kolmogorov distance.
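
    A sketch of that ordering on a discrete example: the Kolmogorov distance (sup gap between CDFs) is bounded by total variation, which Pinsker's inequality bounds by $\sqrt{D_{\mathrm{KL}}/2}$. The distributions are illustrative:

    ```python
    import numpy as np

    p = np.array([0.3, 0.2, 0.5])
    q = np.array([0.2, 0.4, 0.4])

    kl = np.sum(p * np.log(p / q))                            # Kullback-Leibler
    tv = 0.5 * np.sum(np.abs(p - q))                          # total variation
    kolmogorov = np.max(np.abs(np.cumsum(p) - np.cumsum(q)))  # sup CDF gap

    # Kolmogorov <= TV always, and Pinsker gives TV <= sqrt(KL / 2):
    # convergence in KL forces convergence in TV, which forces Kolmogorov.
    print(kolmogorov <= tv <= np.sqrt(kl / 2.0))  # True (0.1 <= 0.2 <= ~0.22)
    ```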