When.com Web Search

Search results

  1. Calibration - Wikipedia

    en.wikipedia.org/wiki/Calibration

    The formal definition of calibration by the International Bureau of Weights and Measures (BIPM) is the following: "Operation that, under specified conditions, in a first step, establishes a relation between the quantity values with measurement uncertainties provided by measurement standards and corresponding indications with associated measurement uncertainties (of the calibrated instrument or ...

  2. Calibration curve - Wikipedia

    en.wikipedia.org/wiki/Calibration_curve

    A calibration curve plot showing limit of detection (LOD), limit of quantification (LOQ), dynamic range, and limit of linearity (LOL). In analytical chemistry, a calibration curve, also known as a standard curve, is a general method for determining the concentration of a substance in an unknown sample by comparing the unknown to a set of standard samples of known concentration. [1]
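
    As an illustration of the comparison step described in the snippet above, here is a minimal sketch, in Python with entirely hypothetical concentration and signal values, of fitting a straight-line standard curve and inverting it to estimate an unknown concentration:

    ```python
    # Minimal sketch of a linear calibration (standard) curve; all values are made up.
    import numpy as np

    # Standards: known concentrations and the instrument response they produced.
    conc_std   = np.array([0.0, 2.0, 4.0, 6.0, 8.0])      # e.g. mg/L
    signal_std = np.array([0.01, 0.20, 0.41, 0.59, 0.80])

    # Least-squares fit of signal = slope * concentration + intercept.
    slope, intercept = np.polyfit(conc_std, signal_std, 1)

    # Invert the fitted line to estimate the concentration of an unknown sample.
    signal_unknown = 0.47
    conc_unknown = (signal_unknown - intercept) / slope
    print(f"Estimated concentration: {conc_unknown:.2f} mg/L")
    ```

    The estimate is only meaningful inside the calibrated range, i.e. between the LOQ and the limit of linearity mentioned in the caption above.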

  3. List of measuring instruments - Wikipedia

    en.wikipedia.org/wiki/List_of_measuring_instruments

    This mostly includes instruments that measure macroscopic properties of matter: in the fields of solid-state physics; in condensed matter physics, which considers solids, liquids, and in-betweens exhibiting for example viscoelastic behavior; and furthermore, in fluid mechanics, where liquids, gases, plasmas, and in-betweens like supercritical ...

  4. Gauge fixing - Wikipedia

    en.wikipedia.org/wiki/Gauge_fixing

    In the physics of gauge theories, gauge fixing (also called choosing a gauge) denotes a mathematical procedure for coping with redundant degrees of freedom in field variables. By definition, a gauge theory represents each physically distinct configuration of the system as an equivalence class of detailed local field configurations.
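
    As a standard worked example of this redundancy (electromagnetism, chosen here for illustration rather than taken from the snippet): the potentials A_mu and A_mu + d_mu chi give the same field strength, and a gauge-fixing condition such as the Lorenz gauge selects one representative from each equivalence class.

    ```latex
    % Gauge redundancy of the electromagnetic potential and one common gauge choice.
    F_{\mu\nu} = \partial_\mu A_\nu - \partial_\nu A_\mu ,
    \qquad
    A_\mu \longrightarrow A_\mu + \partial_\mu \chi
    \quad\Longrightarrow\quad
    F_{\mu\nu} \longrightarrow F_{\mu\nu} .
    % Lorenz gauge: keep only the representative satisfying
    \partial^\mu A_\mu = 0 .
    ```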

  5. Observational error - Wikipedia

    en.wikipedia.org/wiki/Observational_error

    For example, a spectrometer fitted with a diffraction grating may be checked by using it to measure the wavelength of the D-lines of the sodium emission spectrum, which are at 589.0 nm and 589.6 nm. The measurements may be used to determine the number of lines per millimetre of the diffraction grating, which can then be used to measure the ...
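
    As a rough sketch of the check described above, the Python snippet below applies the plane grating equation d·sin(θ) = m·λ to recover a grating's groove density from a measured first-order diffraction angle for the sodium D2 line; the angle is a hypothetical reading, not a value from the article.

    ```python
    # Sketch: infer a grating's lines per millimetre from a measured first-order
    # diffraction angle of the sodium D2 line, using d * sin(theta) = m * lambda.
    import math

    wavelength_m = 589.0e-9      # sodium D2 line
    order = 1                    # diffraction order m
    theta_deg = 20.6             # measured diffraction angle (hypothetical)

    d = order * wavelength_m / math.sin(math.radians(theta_deg))  # groove spacing (m)
    lines_per_mm = 1e-3 / d
    print(f"Groove density: {lines_per_mm:.0f} lines/mm")
    ```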

  6. Internal standard - Wikipedia

    en.wikipedia.org/wiki/Internal_standard

    In NMR spectroscopy, e.g. of the nuclei ¹H, ¹³C and ²⁹Si, frequencies depend on the magnetic field, which is not the same across all experiments. Therefore, frequencies are reported as relative differences to tetramethylsilane (TMS), an internal standard that George Tiers proposed in 1958 and that the International Union of Pure and Applied Chemistry has since endorsed.
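
    Since the absolute frequencies change with the spectrometer's field, shifts are quoted on the dimensionless δ (ppm) scale relative to the TMS resonance. A minimal sketch of that conversion, with hypothetical frequencies on a nominal 400 MHz ¹H instrument:

    ```python
    # Sketch: express a resonance as a chemical shift (ppm) relative to the
    # TMS internal standard; the frequencies below are hypothetical.
    def chemical_shift_ppm(nu_sample_hz: float, nu_tms_hz: float) -> float:
        """delta = 1e6 * (nu_sample - nu_reference) / nu_reference."""
        return 1e6 * (nu_sample_hz - nu_tms_hz) / nu_tms_hz

    nu_tms = 400.000000e6          # TMS resonance on a nominal 400 MHz 1H spectrometer
    nu_peak = nu_tms + 2904.0      # a peak observed 2904 Hz above TMS
    print(f"{chemical_shift_ppm(nu_peak, nu_tms):.2f} ppm")   # ~7.26 ppm
    ```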

  7. Measurement uncertainty - Wikipedia

    en.wikipedia.org/wiki/Measurement_uncertainty

    In metrology, measurement uncertainty is the expression of the statistical dispersion of the values attributed to a quantity measured on an interval or ratio scale. All measurements are subject to uncertainty, and a measurement result is complete only when it is accompanied by a statement of the associated uncertainty, such as the standard deviation.
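
    As a small illustration of the "result plus stated uncertainty" idea, the sketch below reports the mean of a set of repeated readings together with the sample standard deviation and the standard uncertainty of the mean; the readings are invented for the example.

    ```python
    # Sketch: report a measurement result with an associated standard uncertainty.
    import statistics

    readings = [9.81, 9.79, 9.83, 9.80, 9.82]        # hypothetical repeated readings

    mean = statistics.mean(readings)
    s = statistics.stdev(readings)                   # sample standard deviation
    u_mean = s / len(readings) ** 0.5                # standard uncertainty of the mean
    print(f"Result: {mean:.3f} ± {u_mean:.3f}  (sample standard deviation {s:.3f})")
    ```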

  8. Measurement microphone calibration - Wikipedia

    en.wikipedia.org/wiki/Measurement_microphone...

    Laboratory standard microphones calibrated using this method are used in turn to calibrate other microphones using comparison calibration techniques (‘secondary calibration’), referencing the output of the ‘test’ microphone against that of the reference laboratory standard microphone.
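
    A minimal sketch of the arithmetic behind such a comparison (‘secondary’) calibration, assuming both microphones are exposed to the same sound field; the sensitivities and output voltages are hypothetical values:

    ```python
    # Sketch: comparison calibration of a test microphone against a calibrated
    # laboratory standard microphone placed in the same sound field.
    import math

    sens_ref_mv_per_pa = 50.0     # known reference sensitivity (mV/Pa), hypothetical
    v_ref_mv  = 47.3              # reference-microphone output in the shared field
    v_test_mv = 11.8              # test-microphone output in the same field

    # Equal sound pressure at both microphones, so sensitivity scales with output.
    sens_test = sens_ref_mv_per_pa * (v_test_mv / v_ref_mv)
    sens_test_db = 20 * math.log10(sens_test / 1000.0)       # dB re 1 V/Pa
    print(f"Test microphone: {sens_test:.1f} mV/Pa ({sens_test_db:.1f} dB re 1 V/Pa)")
    ```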