In probability and statistics, the Hellinger distance (closely related to, although different from, the Bhattacharyya distance) is used to quantify the similarity between two probability distributions. It is a type of f-divergence. The Hellinger distance is defined in terms of the Hellinger integral, which was introduced by Ernst Hellinger in 1909.
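As a quick illustration (not part of the excerpt above), here is a minimal NumPy sketch of the Hellinger distance for discrete distributions, using the standard definition H(P, Q) = (1/√2)·‖√P − √Q‖₂; the function name is my own choice:

```python
import numpy as np

def hellinger(p, q):
    """Hellinger distance between two discrete probability distributions.

    H(P, Q) = (1/sqrt(2)) * sqrt(sum_i (sqrt(p_i) - sqrt(q_i))**2),
    which is symmetric and bounded in [0, 1].
    """
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.sqrt(np.sum((np.sqrt(p) - np.sqrt(q)) ** 2)) / np.sqrt(2))

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
h = hellinger(p, q)

# Identical distributions give 0; disjoint supports give exactly 1.
same     = hellinger(p, p)                  # 0.0
disjoint = hellinger([1.0, 0.0], [0.0, 1.0])  # 1.0
```

The 1/√2 normalization is a convention; some authors omit it, in which case the maximum distance is √2 rather than 1.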
Notably, except for the total variation distance, all the others are special cases of f-divergences, or linear sums of f-divergences. For each f-divergence D_f, its generating function f is not uniquely defined, but only up to an additive term c·(t − 1), where c is any real constant.
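The non-uniqueness of the generator can be checked numerically. The sketch below (my own illustration, not from the excerpt) computes the discrete f-divergence D_f(P‖Q) = Σᵢ qᵢ·f(pᵢ/qᵢ) and verifies that adding c·(t − 1) to the KL generator f(t) = t·ln t leaves the divergence unchanged, since Σᵢ qᵢ·c·(pᵢ/qᵢ − 1) = c·(Σpᵢ − Σqᵢ) = 0:

```python
import numpy as np

def f_divergence(p, q, f):
    """D_f(P || Q) = sum_i q_i * f(p_i / q_i) for discrete P, Q with q_i > 0."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.sum(q * f(p / q)))

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

# KL divergence via its standard generator f(t) = t * log(t) ...
kl = f_divergence(p, q, lambda t: t * np.log(t))
# ... and via the shifted generator f(t) + c*(t - 1) with c = 3.
kl_shifted = f_divergence(p, q, lambda t: t * np.log(t) + 3.0 * (t - 1))
# The c*(t - 1) term sums to zero over any pair of distributions,
# so both generators produce the same divergence value.
```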
The term "divergence" is in contrast to a distance (metric), since the symmetrized divergence does not satisfy the triangle inequality. [10] Numerous references to earlier uses of the symmetrized divergence and to other statistical distances are given in Kullback (1959, pp. 6–7, §1.3 Divergence).
The total variation distance is related to the Kullback–Leibler divergence by Pinsker's inequality: δ(P, Q) ≤ √(D_KL(P ∥ Q) / 2). One also has the following inequality, due to Bretagnolle and Huber [2] (see also [3]), which has the advantage of providing a non-vacuous bound even when D_KL(P ∥ Q) > 2: δ(P, Q) ≤ √(1 − e^(−D_KL(P ∥ Q))).
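The two bounds can be compared numerically. In this sketch (my own example, not from the excerpt), P and Q are chosen so that D_KL(P‖Q) > 2; there Pinsker's bound exceeds 1 and is vacuous (total variation is at most 1 by definition), while the Bretagnolle–Huber bound remains informative:

```python
import numpy as np

def tv(p, q):
    """Total variation distance between discrete distributions."""
    return 0.5 * float(np.abs(np.asarray(p, float) - np.asarray(q, float)).sum())

def kl(p, q):
    """Kullback-Leibler divergence D_KL(P || Q); Q must have full support."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0  # terms with p_i = 0 contribute 0 by convention
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

p = [0.5, 0.5]
q = [0.999, 0.001]            # chosen so that D_KL(P || Q) ~ 2.76 > 2

d = tv(p, q)                  # 0.499
D = kl(p, q)
pinsker = np.sqrt(D / 2)      # > 1 here: a vacuous bound on total variation
bh = np.sqrt(1 - np.exp(-D))  # < 1 here: still a usable bound
```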
The information geometry definition of divergence (the subject of this article) was initially referred to by alternative terms, including "quasi-distance" (Amari 1982, p. 369) and "contrast function" (Eguchi 1985), though "divergence" was used in Amari (1985) for the α-divergence, and has become standard for the general class.
Many terms are used to refer to various notions of distance; these are often confusingly similar, and may be used inconsistently between authors and over time, either loosely or with precise technical meaning. In addition to "distance", similar terms include deviance, deviation, discrepancy, discrimination, and divergence, as well as others ...
In statistics, the Bhattacharyya distance is a quantity which represents a notion of similarity between two probability distributions. [1] It is closely related to the Bhattacharyya coefficient, which is a measure of the amount of overlap between two statistical samples or populations.
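For discrete distributions, the coefficient is BC(P, Q) = Σᵢ √(pᵢ·qᵢ) and the distance is D_B = −ln BC. The sketch below (my own illustration, not from the excerpt) computes both and notes the algebraic link to the Hellinger distance, H(P, Q)² = 1 − BC(P, Q):

```python
import numpy as np

def bhattacharyya_coefficient(p, q):
    """BC(P, Q) = sum_i sqrt(p_i * q_i); the overlap, lying in [0, 1]."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.sum(np.sqrt(p * q)))

def bhattacharyya_distance(p, q):
    """D_B(P, Q) = -ln BC(P, Q); zero iff P = Q, but not a metric
    (it violates the triangle inequality)."""
    return float(-np.log(bhattacharyya_coefficient(p, q)))

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

bc = bhattacharyya_coefficient(p, q)
db = bhattacharyya_distance(p, q)
# Squared Hellinger distance falls out of the same quantity:
hellinger_sq = 1 - bc
```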