Search results

  1. Analytical quality control - Wikipedia

    en.wikipedia.org/wiki/Analytical_quality_control

    Because of the complex interrelationship between analytical method, sample concentration, limits of detection and method precision, the management of analytical quality control is undertaken using a statistical approach to determine whether the results obtained lie within an acceptable statistical envelope. (A small control-limit sketch illustrating such an envelope follows the results list.)

  2. Intra-rater reliability - Wikipedia

    en.wikipedia.org/wiki/Intra-rater_reliability

    In statistics, intra-rater reliability is the degree of agreement among repeated administrations of a diagnostic test performed by a single rater.[1][2] Intra-rater reliability and inter-rater reliability are aspects of test validity.

  3. Cohen's kappa - Wikipedia

    en.wikipedia.org/wiki/Cohen's_kappa

    Cohen's kappa coefficient (κ, lowercase Greek kappa) is a statistic that is used to measure inter-rater reliability (and also intra-rater reliability) for qualitative (categorical) items.[1] It is generally thought to be a more robust measure than simple percent agreement calculation, as κ takes into account the possibility of the agreement occurring by chance. (The kappa formula and a small computation sketch follow the results list.)

  4. Laboratory quality control - Wikipedia

    en.wikipedia.org/wiki/Laboratory_quality_control

    Quality control (QC) is a measure of precision, or how well the measurement system reproduces the same result over time and under varying operating conditions. Laboratory quality control material is usually run at the beginning of each shift, after an instrument is serviced, when reagent lots are changed, after equipment calibration, and ...

  5. Inter-rater reliability - Wikipedia

    en.wikipedia.org/wiki/Inter-rater_reliability

    In statistics, inter-rater reliability (also called by various similar names, such as inter-rater agreement, inter-rater concordance, inter-observer reliability, inter-coder reliability, and so on) is the degree of agreement among independent observers who rate, code, or assess the same phenomenon.

  6. Repeatability - Wikipedia

    en.wikipedia.org/wiki/Repeatability

    The repeatability coefficient is a precision measure which represents the value below which the absolute difference between two repeated test results may be expected to lie with a probability of 95%. The standard deviation measured under repeatability conditions is one of the quantities used to characterize precision and accuracy. (A short formula sketch follows the results list.)

  7. Reproducibility - Wikipedia

    en.wikipedia.org/wiki/Reproducibility

    Reproducibility, closely related to replicability and repeatability, is a major principle underpinning the scientific method. For the findings of a study to be reproducible means that results obtained by an experiment, an observational study, or a statistical analysis of a data set should be achieved again with a high degree of reliability when the study is replicated.

  8. Flow injection analysis - Wikipedia

    en.wikipedia.org/wiki/Flow_injection_analysis

    Flow injection analysis (FIA) was first described by Ruzicka and Hansen in Denmark in 1974 and by Stewart and coworkers in the United States in 1979. FIA is a popular, simple, rapid, and versatile technique that holds a well-established position in modern analytical chemistry and has widespread application in quantitative chemical analysis.
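
As a rough illustration of the "acceptable statistical envelope" mentioned in the analytical quality control entry above: QC programs commonly compare control measurements against limits derived from the control material's established mean and standard deviation, for example flagging values that fall outside ±2 or ±3 standard deviations. The sketch below is a minimal, hypothetical example of such a check; the ±k·SD rule, the data, and the function names are illustrative assumptions, not the procedure of any particular laboratory or of the cited article.

```python
# Minimal sketch of a statistical-envelope check for QC results.
# Assumption: acceptance limits are mean +/- k standard deviations,
# estimated from historical measurements of the control material.

from statistics import mean, stdev

def control_limits(historical, k=3.0):
    """Return (lower, upper) acceptance limits from historical QC values."""
    m, s = mean(historical), stdev(historical)
    return m - k * s, m + k * s

def within_envelope(value, limits):
    """True if a new QC result falls inside the acceptance envelope."""
    lower, upper = limits
    return lower <= value <= upper

if __name__ == "__main__":
    history = [10.1, 9.8, 10.3, 10.0, 9.9, 10.2, 10.1, 9.7]
    limits = control_limits(history, k=2.0)  # a common "warning" limit
    print(limits)
    print(within_envelope(10.4, limits))  # inside the envelope
    print(within_envelope(11.5, limits))  # flagged as out of control
```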
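
For the Cohen's kappa entry above: the coefficient is defined as κ = (p_o − p_e) / (1 − p_e), where p_o is the observed proportion of agreement between the two raters and p_e is the agreement expected by chance from each rater's marginal category frequencies. The sketch below computes κ for two raters labelling the same items; the example data and the function name are illustrative assumptions.

```python
# Minimal sketch: Cohen's kappa for two raters over the same items.
# kappa = (p_o - p_e) / (1 - p_e), where p_o is observed agreement and
# p_e is chance agreement from each rater's marginal label frequencies.

from collections import Counter

def cohens_kappa(rater_a, rater_b):
    n = len(rater_a)
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    labels = set(freq_a) | set(freq_b)
    p_e = sum((freq_a[c] / n) * (freq_b[c] / n) for c in labels)
    return (p_o - p_e) / (1 - p_e)  # undefined if p_e == 1 (no variation)

if __name__ == "__main__":
    a = ["yes", "yes", "no", "yes", "no", "no", "yes", "no"]
    b = ["yes", "no", "no", "yes", "no", "yes", "yes", "no"]
    print(round(cohens_kappa(a, b), 3))  # 0.5 for this toy example
```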
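
For the repeatability entry above: the repeatability coefficient it describes is conventionally computed as 1.96 × √2 × s_w ≈ 2.77 × s_w, where s_w is the within-subject standard deviation under repeatability conditions, so the absolute difference between two repeated measurements is expected to fall below this value about 95% of the time. The sketch below assumes each subject is measured exactly twice; the data and function name are illustrative assumptions.

```python
# Minimal sketch: repeatability coefficient from paired repeat measurements.
# Assumption: every subject is measured twice under repeatability conditions.
# s_w (within-subject SD) is estimated from the paired differences, and
# RC = 1.96 * sqrt(2) * s_w, the value the absolute difference between two
# repeats should stay below roughly 95% of the time.

import math

def repeatability_coefficient(first_run, second_run):
    diffs = [a - b for a, b in zip(first_run, second_run)]
    n = len(diffs)
    s_w = math.sqrt(sum(d * d for d in diffs) / (2 * n))
    return 1.96 * math.sqrt(2) * s_w

if __name__ == "__main__":
    run1 = [5.1, 4.8, 5.3, 5.0, 4.9]
    run2 = [5.0, 4.9, 5.1, 5.2, 4.8]
    print(round(repeatability_coefficient(run1, run2), 3))  # ~0.29
```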