In probability and statistics, the Hellinger distance (closely related to, although different from, the Bhattacharyya distance) is used to quantify the similarity between two probability distributions. It is a type of f-divergence. The Hellinger distance is defined in terms of the Hellinger integral, which was introduced by Ernst Hellinger in 1909.
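As a concrete illustration of the discrete case, here is a minimal sketch in Python using NumPy; it computes the standard discrete form \(H(P,Q)=\frac{1}{\sqrt{2}}\bigl(\sum_i(\sqrt{p_i}-\sqrt{q_i})^2\bigr)^{1/2}\), and the example distributions are purely illustrative assumptions:

```python
import numpy as np

def hellinger(p, q):
    """Hellinger distance between two discrete distributions p and q.

    Uses the discrete form H(P, Q) = (1/sqrt(2)) * ||sqrt(p) - sqrt(q)||_2,
    which takes values in [0, 1].
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return np.sqrt(np.sum((np.sqrt(p) - np.sqrt(q)) ** 2)) / np.sqrt(2)

# Illustrative example: two distributions over the same three outcomes.
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(hellinger(p, q))  # small positive value; hellinger(p, p) == 0.0
```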
Applying this theorem to the KL-divergence, ... In particular, this shows that the squared Hellinger distance and Jensen-Shannon divergence are symmetric.
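The symmetry of the squared Hellinger distance can be read off directly from its discrete form, since each summand depends on the two distributions only through a squared difference:

\[
H^2(P,Q) = \frac{1}{2}\sum_i \bigl(\sqrt{p_i}-\sqrt{q_i}\bigr)^2 = H^2(Q,P).
\]

The Jensen–Shannon divergence is symmetric by construction, being the average \(\tfrac{1}{2}D_{\mathrm{KL}}(P\,\|\,M)+\tfrac{1}{2}D_{\mathrm{KL}}(Q\,\|\,M)\) with the mixture \(M=\tfrac{1}{2}(P+Q)\).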
The term "divergence" is in contrast to a distance (metric), since the symmetrized divergence does not satisfy the triangle inequality. [10] Numerous references to earlier uses of the symmetrized divergence and to other statistical distances are given in Kullback (1959 , pp. 6–7, §1.3 Divergence).
In statistics, probability theory, and information theory, a statistical distance quantifies the distance between two statistical objects, which can be two random variables, or two probability distributions or samples, or the distance can be between an individual sample point and a population or a wider sample of points.
The total variation distance (or half the \(L^1\) norm) arises as the optimal transportation cost when the cost function is \(c(x,y)=\mathbf{1}_{x\neq y}\), that is,
\[
\tfrac{1}{2}\|P-Q\|_1 = \delta(P,Q) = \inf\bigl\{\Pr(X\neq Y) : \operatorname{Law}(X)=P,\ \operatorname{Law}(Y)=Q\bigr\} = \inf_{\pi}\,\mathbb{E}_{\pi}\!\bigl[\mathbf{1}_{x\neq y}\bigr],
\]
where the expectation is taken with respect to the probability measure \(\pi\) on the space where \((x,y)\) lives, and the infimum is taken over all such \(\pi\) with marginals \(P\) and \(Q\), respectively.
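For discrete distributions the half-\(L^1\)-norm form is easy to compute directly; a minimal Python sketch follows, with purely illustrative example values:

```python
import numpy as np

def total_variation(p, q):
    """Total variation distance between discrete distributions p and q,
    computed as half the L1 norm of their difference."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return 0.5 * np.abs(p - q).sum()

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(total_variation(p, q))  # 0.1
```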
The information geometry definition of divergence (the subject of this article) was initially referred to by alternative terms, including "quasi-distance" (Amari 1982, p. 369) and "contrast function" (Eguchi 1985), though "divergence" was used in Amari (1985) for the α-divergence and has since become standard for the general class.
Viewing the Kullback–Leibler divergence as a measure of distance, the I-projection is the "closest" distribution to q of all the distributions in P. The I-projection is useful in setting up information geometry, notably because of the following inequality, valid when P is convex: [1]
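The inequality referred to here is, in standard treatments of information projections, the Pythagorean-type bound: writing \(p^{*} = \arg\min_{p \in P} D_{\mathrm{KL}}(p\,\|\,q)\) for the I-projection of \(q\) onto the convex set \(P\),

\[
D_{\mathrm{KL}}(p\,\|\,q) \;\geq\; D_{\mathrm{KL}}(p\,\|\,p^{*}) + D_{\mathrm{KL}}(p^{*}\,\|\,q) \qquad \text{for all } p \in P.
\]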
In statistics, the Bhattacharyya distance is a quantity which represents a notion of similarity between two probability distributions. [1] It is closely related to the Bhattacharyya coefficient , which is a measure of the amount of overlap between two statistical samples or populations.
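For the discrete case, a minimal Python sketch is shown below; it computes the Bhattacharyya coefficient \(BC(P,Q)=\sum_i \sqrt{p_i q_i}\), the Bhattacharyya distance \(D_B = -\ln BC\), and the related Hellinger distance \(H = \sqrt{1 - BC}\). The example values are illustrative assumptions.

```python
import numpy as np

def bhattacharyya_coefficient(p, q):
    """Bhattacharyya coefficient: the overlap sum_i sqrt(p_i * q_i), in [0, 1]."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return np.sum(np.sqrt(p * q))

def bhattacharyya_distance(p, q):
    """Bhattacharyya distance D_B = -ln BC(p, q); note it does not obey the triangle inequality."""
    return -np.log(bhattacharyya_coefficient(p, q))

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
bc = bhattacharyya_coefficient(p, q)
print(bc, bhattacharyya_distance(p, q), np.sqrt(1.0 - bc))  # coefficient, distance, Hellinger distance
```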