Search results

  1. Statistical distance - Wikipedia

    en.wikipedia.org/wiki/Statistical_distance

    In statistics, probability theory, and information theory, a statistical distance quantifies the distance between two statistical objects. These objects can be two random variables, two probability distributions or samples, or an individual sample point and a population or a wider sample of points.
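
    As a concrete illustration of the point-versus-population case (the specific measure is not named in the snippet and is chosen here only as an example), a sketch of the Mahalanobis distance of a point from a sample:

      import numpy as np

      def mahalanobis(x, sample):
          # Distance of point x from the population estimated from `sample`
          # (rows are observations); illustrative choice, not from the article.
          mu = sample.mean(axis=0)
          cov = np.cov(sample, rowvar=False)
          diff = x - mu
          return float(np.sqrt(diff @ np.linalg.inv(cov) @ diff))

      rng = np.random.default_rng(0)
      population = rng.normal(size=(500, 2))   # 500 two-dimensional observations
      print(mahalanobis(np.array([1.0, -2.0]), population))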

  2. Bhattacharyya distance - Wikipedia

    en.wikipedia.org/wiki/Bhattacharyya_distance

    In statistics, the Bhattacharyya distance is a quantity which represents a notion of similarity between two probability distributions. [1] It is closely related to the Bhattacharyya coefficient, which is a measure of the amount of overlap between two statistical samples or populations.
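
    A minimal sketch, assuming two discrete distributions given as probability vectors over the same support, of the Bhattacharyya coefficient and the distance derived from it:

      import numpy as np

      def bhattacharyya(p, q):
          # Coefficient BC = sum_i sqrt(p_i * q_i), the amount of overlap in [0, 1],
          # and distance D_B = -ln(BC); assumes p and q each sum to 1.
          p, q = np.asarray(p, float), np.asarray(q, float)
          bc = np.sum(np.sqrt(p * q))
          return bc, -np.log(bc)

      print(bhattacharyya([0.1, 0.4, 0.5], [0.2, 0.3, 0.5]))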

  3. Total variation distance of probability measures - Wikipedia

    en.wikipedia.org/wiki/Total_variation_distance...

    In probability theory, the total variation distance is a distance measure for probability distributions. It is an example of a statistical distance metric, and is sometimes called the statistical distance, statistical difference or variational distance.
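
    A minimal sketch of the total variation distance for two discrete distributions on a common finite support (the continuous case replaces the sum with an integral):

      import numpy as np

      def total_variation(p, q):
          # 0.5 * sum_i |p_i - q_i|; ranges from 0 (identical) to 1 (disjoint support).
          p, q = np.asarray(p, float), np.asarray(q, float)
          return 0.5 * np.abs(p - q).sum()

      print(total_variation([0.1, 0.4, 0.5], [0.2, 0.3, 0.5]))   # 0.1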

  4. Similarity measure - Wikipedia

    en.wikipedia.org/wiki/Similarity_measure

    Among general similarity measures the article lists the Hamming distance and the Jaro distance. Typical measures of similarity between two probability distributions are the Bhattacharyya distance and the Hellinger distance. Both provide a quantification of similarity for two probability distributions on the same domain, and they are mathematically closely linked.
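
    The close link mentioned above can be made concrete: for discrete distributions the Hellinger distance equals sqrt(1 - BC), where BC is the Bhattacharyya coefficient. A small sketch under that assumption:

      import numpy as np

      def bhattacharyya_coeff(p, q):
          return float(np.sum(np.sqrt(np.asarray(p, float) * np.asarray(q, float))))

      def hellinger(p, q):
          # Hellinger distance expressed through the Bhattacharyya coefficient.
          return np.sqrt(1.0 - bhattacharyya_coeff(p, q))

      p, q = [0.1, 0.4, 0.5], [0.2, 0.3, 0.5]
      print(bhattacharyya_coeff(p, q), hellinger(p, q))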

  5. Gaussian function - Wikipedia

    en.wikipedia.org/wiki/Gaussian_function

    Gaussian functions are widely used in statistics to describe normal distributions, in signal processing to define Gaussian filters, in image processing where two-dimensional Gaussians are used for Gaussian blurs, and in mathematics to solve heat and diffusion equations and to define the Weierstrass transform.
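
    A brief sketch of the Gaussian function g(x) = a * exp(-(x - b)^2 / (2 c^2)) and its use as a normalized 1-D filter kernel; the parameters a, b, c follow the usual height/centre/width convention and the kernel size is chosen here purely for illustration:

      import numpy as np

      def gaussian(x, a=1.0, b=0.0, c=1.0):
          # g(x) = a * exp(-(x - b)^2 / (2 c^2)): height a, centre b, width c.
          return a * np.exp(-((x - b) ** 2) / (2 * c ** 2))

      # A small 1-D Gaussian filter kernel, normalized to sum to 1, as used for
      # Gaussian blurs in signal and image processing.
      xs = np.arange(-3, 4)            # kernel taps at -3 ... 3
      kernel = gaussian(xs, c=1.5)
      kernel = kernel / kernel.sum()
      print(np.round(kernel, 3))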

  6. Pinsker's inequality - Wikipedia

    en.wikipedia.org/wiki/Pinsker's_inequality

    Writing V(p, q) = sum_x |p(x) - q(x)| for the (non-normalized) variation distance between two probability density functions p and q on the same alphabet, this form of Pinsker's inequality states that V(p, q) <= sqrt(2 D(p||q)), with the Kullback-Leibler divergence D measured in nats. [2] It shows that "convergence in divergence" is a stronger notion than "convergence in variation distance".
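
    A small numerical check of this form of Pinsker's inequality, V(p, q) <= sqrt(2 D(p||q)), on random discrete distributions (the alphabet size and variable names are illustrative):

      import numpy as np

      rng = np.random.default_rng(0)
      for _ in range(5):
          p = rng.dirichlet(np.ones(4))        # random distribution on a 4-symbol alphabet
          q = rng.dirichlet(np.ones(4))
          v = np.abs(p - q).sum()              # non-normalized variation distance
          kl = np.sum(p * np.log(p / q))       # D(p || q) in nats
          assert v <= np.sqrt(2 * kl) + 1e-12
          print(f"V = {v:.3f}  <=  sqrt(2 D) = {np.sqrt(2 * kl):.3f}")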

  7. Wasserstein metric - Wikipedia

    en.wikipedia.org/wiki/Wasserstein_metric

    This result generalises the earlier example of the Wasserstein distance between two point masses (at least in the case p = 2), since a point mass can be regarded as a normal distribution with covariance matrix equal to zero, in which case the trace term disappears and only the term involving the Euclidean distance between the means remains.
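
    A sketch of the closed-form 2-Wasserstein distance between two Gaussians, W2^2 = ||m1 - m2||^2 + tr(C1 + C2 - 2 (C2^(1/2) C1 C2^(1/2))^(1/2)); in the example the covariances are shrunk toward zero to mimic point masses, in which case only the Euclidean distance between the means remains:

      import numpy as np
      from scipy.linalg import sqrtm

      def w2_gaussians(m1, C1, m2, C2):
          # Closed-form 2-Wasserstein distance between N(m1, C1) and N(m2, C2).
          root_C2 = sqrtm(C2)
          cross = sqrtm(root_C2 @ C1 @ root_C2)
          trace_term = np.trace(C1 + C2 - 2 * np.real(cross))
          return float(np.sqrt(np.sum((m1 - m2) ** 2) + max(trace_term, 0.0)))

      m1, m2 = np.array([0.0, 0.0]), np.array([3.0, 4.0])
      eps = 1e-12                              # near-zero covariances mimic point masses
      print(w2_gaussians(m1, eps * np.eye(2), m2, eps * np.eye(2)))   # ~5.0, distance of the means
      print(w2_gaussians(m1, np.eye(2), m2, 2.0 * np.eye(2)))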

  8. Hellinger distance - Wikipedia

    en.wikipedia.org/wiki/Hellinger_distance

    In probability and statistics, the Hellinger distance (closely related to, although different from, the Bhattacharyya distance) is used to quantify the similarity between two probability distributions. It is a type of f-divergence. The Hellinger distance is defined in terms of the Hellinger integral, which was introduced by Ernst Hellinger in 1909.
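
    A sketch for two univariate normal distributions, computing the Hellinger distance both from the assumed closed form for the Gaussian case and directly from the Hellinger integral of sqrt(p(x) * q(x)):

      import numpy as np
      from scipy import integrate, stats

      def hellinger_normal(mu1, s1, mu2, s2):
          # Assumed closed form for two univariate normals N(mu1, s1^2), N(mu2, s2^2).
          bc = np.sqrt(2 * s1 * s2 / (s1**2 + s2**2)) * np.exp(
              -((mu1 - mu2) ** 2) / (4 * (s1**2 + s2**2)))
          return np.sqrt(1 - bc)

      def hellinger_numeric(mu1, s1, mu2, s2):
          # H^2 = 1 - integral of sqrt(p(x) * q(x)) dx, the Hellinger integral form.
          integrand = lambda x: np.sqrt(stats.norm.pdf(x, mu1, s1) * stats.norm.pdf(x, mu2, s2))
          bc, _ = integrate.quad(integrand, -np.inf, np.inf)
          return np.sqrt(1 - bc)

      print(hellinger_normal(0.0, 1.0, 1.0, 2.0), hellinger_numeric(0.0, 1.0, 1.0, 2.0))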