Search results

  1. Statistical distance - Wikipedia

    en.wikipedia.org/wiki/Statistical_distance

    In statistics, probability theory, and information theory, a statistical distance quantifies the distance between two statistical objects, which can be two random variables, two probability distributions or samples, or an individual sample point and a population or a wider sample of points.

  2. Bhattacharyya distance - Wikipedia

    en.wikipedia.org/wiki/Bhattacharyya_distance

    In statistics, the Bhattacharyya distance is a quantity that represents a notion of similarity between two probability distributions. [1] It is closely related to the Bhattacharyya coefficient, which is a measure of the amount of overlap between two statistical samples or populations.
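
    A minimal Python sketch of how the two quantities relate, assuming the distributions are given as discrete probability vectors; it uses the usual textbook definitions BC(p, q) = Σ √(pᵢqᵢ) and D_B(p, q) = −ln BC(p, q):

        import numpy as np

        def bhattacharyya(p, q):
            """Bhattacharyya coefficient and distance for two discrete distributions."""
            p = np.asarray(p, dtype=float)
            q = np.asarray(q, dtype=float)
            bc = np.sum(np.sqrt(p * q))   # amount of overlap, in [0, 1]
            return bc, -np.log(bc)        # distance D_B = -ln(BC)

        # Example: two distributions over the same three outcomes
        bc, d_b = bhattacharyya([0.2, 0.5, 0.3], [0.1, 0.4, 0.5])
        print(bc, d_b)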

  3. Total variation distance of probability measures - Wikipedia

    en.wikipedia.org/wiki/Total_variation_distance...

    Total variation distance is half the absolute area between the two curves (the shaded area in the article's figure). In probability theory, the total variation distance is a statistical distance between probability distributions, and is sometimes called the statistical distance, statistical difference or variational distance.
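
    For discrete distributions, the "half the absolute area" description above reduces to half the L1 distance between the two probability vectors; a minimal sketch under that assumption:

        import numpy as np

        def total_variation(p, q):
            """Total variation distance: half the L1 distance between two discrete distributions."""
            p = np.asarray(p, dtype=float)
            q = np.asarray(q, dtype=float)
            return 0.5 * np.abs(p - q).sum()

        print(total_variation([0.2, 0.5, 0.3], [0.1, 0.4, 0.5]))  # 0.2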

  4. Continuous uniform distribution - Wikipedia

    en.wikipedia.org/wiki/Continuous_uniform...

    The sum of two independent, identically distributed uniform random variables U₁(a,b) + U₂(a,b) has a symmetric triangular distribution on the support [2a, 2b]. The distance |U₁(a,b) − U₂(a,b)| between two i.i.d. uniform random variables also has a triangular distribution, although not a symmetric one, on the support [0, b−a].
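
    A quick simulation sketch of the claim about |U₁ − U₂|, using the illustrative (assumed) choice a = 0, b = 1; the estimated density should fall roughly linearly from about 2 near 0 to about 0 near b − a:

        import numpy as np

        rng = np.random.default_rng(0)
        a, b = 0.0, 1.0                        # assumed values, for illustration only
        u1 = rng.uniform(a, b, size=100_000)
        u2 = rng.uniform(a, b, size=100_000)

        diff = np.abs(u1 - u2)                 # supported on [0, b - a]
        density, edges = np.histogram(diff, bins=10, density=True)
        print(density)  # roughly 1.9, 1.7, ..., 0.1: the non-symmetric triangular shape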

  5. Mahalanobis distance - Wikipedia

    en.wikipedia.org/wiki/Mahalanobis_distance

    The Mahalanobis distance is a measure of the distance between a point and a distribution, introduced by P. C. Mahalanobis in 1936. [1] The mathematical details of the Mahalanobis distance first appeared in the Journal of the Asiatic Society of Bengal in 1936. [2]
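
    A minimal sketch using the standard formula d(x) = √((x − μ)ᵀ S⁻¹ (x − μ)), where the distribution is summarized by its mean vector μ and covariance matrix S (the example numbers are made up for illustration):

        import numpy as np

        def mahalanobis(x, mean, cov):
            """Mahalanobis distance of point x from a distribution with the given mean and covariance."""
            delta = np.asarray(x, dtype=float) - np.asarray(mean, dtype=float)
            return float(np.sqrt(delta @ np.linalg.solve(cov, delta)))

        # Example: distance of a point from a 2-D distribution
        mean = np.array([0.0, 0.0])
        cov = np.array([[2.0, 0.3],
                        [0.3, 1.0]])
        print(mahalanobis([1.0, 1.0], mean, cov))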

  6. Cohen's h - Wikipedia

    en.wikipedia.org/wiki/Cohen's_h

    In statistics, Cohen's h, popularized by Jacob Cohen, is a measure of distance between two proportions or probabilities. Cohen's h has several related uses: It can be used to describe the difference between two proportions as "small", "medium", or "large". It can be used to determine if the difference between two proportions is "meaningful".
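
    The snippet does not quote the formula; the usual definition is h = |2 arcsin √p₁ − 2 arcsin √p₂|, with 0.2 / 0.5 / 0.8 as conventional rule-of-thumb cut-offs for "small" / "medium" / "large". A sketch on that basis:

        import math

        def cohens_h(p1, p2):
            """Cohen's h: absolute difference of arcsine-transformed proportions."""
            return abs(2 * math.asin(math.sqrt(p1)) - 2 * math.asin(math.sqrt(p2)))

        h = cohens_h(0.30, 0.45)
        print(round(h, 3))  # ~0.31, between the conventional "small" and "medium" cut-offs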

  7. Bhattacharyya angle - Wikipedia

    en.wikipedia.org/wiki/Bhattacharyya_angle

    In statistics, the Bhattacharyya angle, also called the statistical angle, is a measure of distance between two probability measures defined on a finite probability space. It is defined as Δ(p, q) = arccos BC(p, q), where BC(p, q) is the Bhattacharyya coefficient.
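
    A minimal sketch of the definition above for finite probability vectors, reusing the Bhattacharyya coefficient BC from the Bhattacharyya distance result above:

        import numpy as np

        def bhattacharyya_angle(p, q):
            """Statistical angle between two distributions on a finite probability space."""
            bc = np.sum(np.sqrt(np.asarray(p, float) * np.asarray(q, float)))
            return np.arccos(np.clip(bc, 0.0, 1.0))   # clip guards against rounding just above 1

        print(bhattacharyya_angle([0.2, 0.5, 0.3], [0.1, 0.4, 0.5]))  # in radians; 0 when p == q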

  8. Similarity measure - Wikipedia

    en.wikipedia.org/wiki/Similarity_measure

    The Euclidean distance formula is used to find the distance between two points on a plane. Manhattan distance is commonly used in GPS applications, as it can be used to find the shortest route between two addresses. When you generalize the Euclidean distance formula and Manhattan ...
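
    A minimal sketch of the two distances mentioned, for points given as coordinate tuples:

        import math

        def euclidean(a, b):
            """Straight-line distance between two points."""
            return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

        def manhattan(a, b):
            """Sum of absolute coordinate differences (city-block distance)."""
            return sum(abs(x - y) for x, y in zip(a, b))

        p, q = (1.0, 2.0), (4.0, 6.0)
        print(euclidean(p, q))  # 5.0
        print(manhattan(p, q))  # 7.0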