  1. Hellinger distance - Wikipedia

    en.wikipedia.org/wiki/Hellinger_distance

    In probability and statistics, the Hellinger distance (closely related to, although different from, the Bhattacharyya distance) is used to quantify the similarity between two probability distributions. It is a type of f-divergence. The Hellinger distance is defined in terms of the Hellinger integral, which was introduced by Ernst Hellinger in 1909.
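
    For finite (discrete) distributions the definition reduces to a simple expression in the square roots of the probabilities. A minimal NumPy sketch, assuming two probability vectors `p` and `q` over the same support (the example numbers are arbitrary):

    ```python
    import numpy as np

    def hellinger(p, q):
        """Hellinger distance H(P, Q) = (1/sqrt(2)) * || sqrt(p) - sqrt(q) ||_2, in [0, 1]."""
        p = np.asarray(p, dtype=float)
        q = np.asarray(q, dtype=float)
        return float(np.sqrt(np.sum((np.sqrt(p) - np.sqrt(q)) ** 2)) / np.sqrt(2))

    p = np.array([0.1, 0.4, 0.5])
    q = np.array([0.2, 0.3, 0.5])
    print(hellinger(p, q))   # small positive value for nearby distributions
    print(hellinger(p, p))   # 0.0 for identical distributions
    ```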

  2. f-divergence - Wikipedia

    en.wikipedia.org/wiki/F-divergence

    Applying this theorem to the KL-divergence, ... In particular, this shows that the squared Hellinger distance and Jensen-Shannon divergence are symmetric.
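
    A sketch of the f-divergence construction for finite distributions, $D_f(P \parallel Q) = \sum_i q_i\, f(p_i/q_i)$, using two standard generators; note that normalization conventions for the Hellinger generator vary by a factor of two across sources:

    ```python
    import numpy as np

    def f_divergence(p, q, f):
        """D_f(P || Q) = sum_i q_i * f(p_i / q_i), for strictly positive q."""
        p = np.asarray(p, dtype=float)
        q = np.asarray(q, dtype=float)
        return float(np.sum(q * f(p / q)))

    p = np.array([0.1, 0.4, 0.5])
    q = np.array([0.2, 0.3, 0.5])

    kl_gen   = lambda t: t * np.log(t)            # generator for Kullback-Leibler
    hell_gen = lambda t: (np.sqrt(t) - 1.0) ** 2  # generator for (twice the) squared Hellinger distance

    # Squared Hellinger distance is symmetric in its arguments; KL generally is not.
    print(np.isclose(f_divergence(p, q, hell_gen), f_divergence(q, p, hell_gen)))  # True
    print(np.isclose(f_divergence(p, q, kl_gen),   f_divergence(q, p, kl_gen)))    # False here
    ```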

  3. Kullback–Leibler divergence - Wikipedia

    en.wikipedia.org/wiki/Kullback–Leibler_divergence

    The term "divergence" is in contrast to a distance (metric), since the symmetrized divergence does not satisfy the triangle inequality. [10] Numerous references to earlier uses of the symmetrized divergence and to other statistical distances are given in Kullback (1959 , pp. 6–7, §1.3 Divergence).

  4. Total variation distance of probability measures - Wikipedia

    en.wikipedia.org/wiki/Total_variation_distance...

    The total variation distance is related to the Kullback–Leibler divergence by Pinsker’s inequality: $\delta(P,Q) \le \sqrt{\tfrac{1}{2} D_{\mathrm{KL}}(P \parallel Q)}$. One also has the following inequality, due to Bretagnolle and Huber [2] (see also [3]), which has the advantage of providing a non-vacuous bound even when $D_{\mathrm{KL}}(P \parallel Q) > 2$: $\delta(P,Q) \le \sqrt{1 - e^{-D_{\mathrm{KL}}(P \parallel Q)}}$.
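
    Both bounds are easy to check numerically. A minimal sketch for finite distributions, using the conventions above ($\delta$ is the total variation distance with values in $[0, 1]$, $D_{\mathrm{KL}}$ is in nats); the example vectors are arbitrary:

    ```python
    import numpy as np

    def kl(p, q):
        """D_KL(P || Q) in nats, for strictly positive finite distributions."""
        return float(np.sum(p * np.log(p / q)))

    def total_variation(p, q):
        """delta(P, Q) = (1/2) * sum_i |p_i - q_i|, taking values in [0, 1]."""
        return 0.5 * float(np.sum(np.abs(p - q)))

    p = np.array([0.05, 0.05, 0.9])
    q = np.array([0.45, 0.45, 0.1])

    tv = total_variation(p, q)
    d  = kl(p, q)

    pinsker_bound = np.sqrt(0.5 * d)           # exceeds 1 (vacuous) once D_KL > 2
    bh_bound      = np.sqrt(1.0 - np.exp(-d))  # always strictly below 1

    print(tv <= pinsker_bound, tv <= bh_bound)  # True True
    print(tv, pinsker_bound, bh_bound)
    ```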

  5. Divergence (statistics) - Wikipedia

    en.wikipedia.org/wiki/Divergence_(statistics)

    The only divergence for probabilities over a finite alphabet that is both an f-divergence and a Bregman divergence is the Kullback–Leibler divergence.[8] The squared Euclidean divergence is a Bregman divergence (corresponding to the function $x^2$) but not an f-divergence.
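
    A sketch of the Bregman construction $D_F(p, q) = F(p) - F(q) - \langle \nabla F(q),\, p - q \rangle$, illustrating both cases mentioned in the snippet: the negative-entropy generator recovers the Kullback–Leibler divergence on probability vectors, and the squared-norm generator gives the squared Euclidean divergence.

    ```python
    import numpy as np

    def bregman(p, q, F, grad_F):
        """D_F(p, q) = F(p) - F(q) - <grad F(q), p - q>."""
        p = np.asarray(p, dtype=float)
        q = np.asarray(q, dtype=float)
        return float(F(p) - F(q) - np.dot(grad_F(q), p - q))

    p = np.array([0.1, 0.4, 0.5])
    q = np.array([0.2, 0.3, 0.5])

    # Negative entropy F(x) = sum_i x_i log x_i  ->  KL divergence (for normalized vectors).
    neg_entropy      = lambda x: float(np.sum(x * np.log(x)))
    grad_neg_entropy = lambda x: np.log(x) + 1.0

    # Squared Euclidean norm F(x) = ||x||^2  ->  squared Euclidean distance.
    sq_norm      = lambda x: float(np.dot(x, x))
    grad_sq_norm = lambda x: 2.0 * x

    kl = float(np.sum(p * np.log(p / q)))
    print(np.isclose(bregman(p, q, neg_entropy, grad_neg_entropy), kl))            # True
    print(np.isclose(bregman(p, q, sq_norm, grad_sq_norm), np.sum((p - q) ** 2)))  # True
    ```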

  6. Statistical distance - Wikipedia

    en.wikipedia.org/wiki/Statistical_distance

    - Bhattacharyya distance (despite its name it is not a distance, as it violates the triangle inequality)
    - f-divergence: generalizes several distances and divergences
    - Discriminability index, specifically the Bayes discriminability index: a positive-definite symmetric measure of the overlap of two distributions
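
    For example, the Bhattacharyya distance for finite distributions is usually written $D_B(P, Q) = -\ln \sum_i \sqrt{p_i q_i}$; a minimal sketch of that formula (the example vectors are arbitrary):

    ```python
    import numpy as np

    def bhattacharyya_distance(p, q):
        """D_B(P, Q) = -ln( sum_i sqrt(p_i * q_i) ); the sum is the Bhattacharyya coefficient."""
        p = np.asarray(p, dtype=float)
        q = np.asarray(q, dtype=float)
        bc = float(np.sum(np.sqrt(p * q)))  # Bhattacharyya coefficient, in (0, 1] for overlapping supports
        return -np.log(bc)

    p = np.array([0.1, 0.4, 0.5])
    q = np.array([0.2, 0.3, 0.5])
    print(bhattacharyya_distance(p, q))  # 0 iff P == Q; symmetric, but violates the triangle inequality
    ```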

  7. Information projection - Wikipedia

    en.wikipedia.org/wiki/Information_projection

    Viewing the Kullback–Leibler divergence as a measure of distance, the I-projection is the "closest" distribution to q of all the distributions in P. The I-projection is useful in setting up information geometry, notably because of the following inequality, valid when P is convex: [1]
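
    The snippet is truncated before the inequality itself. As best recalled from the article (treat this as a reconstruction, not a quotation), the statement is the Pythagorean-type bound for the I-projection $p^* = \arg\min_{p \in P} D_{\mathrm{KL}}(p \parallel q)$:

    ```latex
    % Pythagorean-type inequality for the I-projection p* of q onto a convex set P,
    % valid for every p in P:
    D_{\mathrm{KL}}(p \parallel q) \;\ge\; D_{\mathrm{KL}}(p \parallel p^{*}) + D_{\mathrm{KL}}(p^{*} \parallel q)
    ```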

  8. Fisher information metric - Wikipedia

    en.wikipedia.org/wiki/Fisher_information_metric

    By Chentsov’s theorem, the Fisher information metric on statistical models is the only Riemannian metric (up to rescaling) that is invariant under sufficient statistics.[3][4] It can also be understood to be the infinitesimal form of the relative entropy (i.e., the Kullback–Leibler divergence); specifically, it is the Hessian of the divergence with respect to one of its arguments, evaluated where the two arguments coincide.
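
    In symbols (under the usual regularity conditions, and with the understanding that this is a reconstruction of the relationship the snippet describes rather than a quotation):

    ```latex
    % Fisher information metric as the Hessian of the KL divergence at coinciding arguments:
    g_{jk}(\theta) \;=\; \left.\frac{\partial^{2}}{\partial \theta'_{j}\,\partial \theta'_{k}}
      D_{\mathrm{KL}}\!\bigl(P_{\theta} \parallel P_{\theta'}\bigr)\right|_{\theta' = \theta}

    % Equivalent expectation ("outer product of scores") form:
    g_{jk}(\theta) \;=\; \mathbb{E}_{\theta}\!\left[
      \frac{\partial \log p(x \mid \theta)}{\partial \theta_{j}}\,
      \frac{\partial \log p(x \mid \theta)}{\partial \theta_{k}}\right]
    ```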