When.com Web Search

Search results

  1. Laboratory quality control - Wikipedia

    en.wikipedia.org/wiki/Laboratory_quality_control

    Quality control (QC) is a measure of precision, or how well the measurement system reproduces the same result over time and under varying operating conditions. Laboratory quality control material is usually run at the beginning of each shift, after an instrument is serviced, when reagent lots are changed, after equipment calibration, and ...
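
    A minimal sketch, assuming Python and made-up QC values, of how the reproducibility of a control material might be summarized across repeated runs using the mean, standard deviation, and coefficient of variation:

    import statistics

    # Hypothetical QC results for one control material, one measurement per shift (arbitrary units).
    qc_runs = [4.02, 3.98, 4.05, 4.01, 3.97, 4.04, 4.00, 3.99]

    mean = statistics.mean(qc_runs)
    sd = statistics.stdev(qc_runs)    # sample standard deviation
    cv_percent = 100 * sd / mean      # coefficient of variation, a common precision summary

    print(f"mean={mean:.3f}  SD={sd:.3f}  CV={cv_percent:.2f}%")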

  2. Analytical quality control - Wikipedia

    en.wikipedia.org/wiki/Analytical_quality_control

    Because of the complex inter-relationship between analytical method, sample concentration, limits of detection and method precision, the management of Analytical Quality Control is undertaken using a statistical approach to determine whether the results obtained lie within an acceptable statistical envelope.
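
    As an illustration of the "acceptable statistical envelope" idea, a simple acceptance check compares each result's deviation from an established target against a multiple of the method's standard deviation. The target value, standard deviation, and 2-SD limit below are assumptions for the sketch, not values from the article:

    # Assumed method characteristics established beforehand (hypothetical numbers).
    TARGET_MEAN = 12.5   # expected value for the control sample
    METHOD_SD = 0.4      # long-term standard deviation of the method
    K = 2                # envelope half-width in standard deviations

    def within_envelope(result: float) -> bool:
        """Accept a result if it lies within TARGET_MEAN +/- K * METHOD_SD."""
        return abs(result - TARGET_MEAN) <= K * METHOD_SD

    for x in (12.3, 13.6, 12.7):
        print(x, "accept" if within_envelope(x) else "reject")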

  3. Intra-rater reliability - Wikipedia

    en.wikipedia.org/wiki/Intra-rater_reliability

    In statistics, intra-rater reliability is the degree of agreement among repeated administrations of a diagnostic test performed by a single rater.[1][2] Intra-rater reliability and inter-rater reliability are aspects of test validity.
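
    A small illustration with hypothetical data: the same rater scores the same cases in two sessions, and the simplest intra-rater summary is the fraction of cases assigned the identical category both times.

    # Hypothetical categorical ratings by ONE rater on the same 10 cases, two sessions apart.
    session_1 = ["pos", "neg", "neg", "pos", "pos", "neg", "pos", "neg", "neg", "pos"]
    session_2 = ["pos", "neg", "pos", "pos", "pos", "neg", "pos", "neg", "neg", "neg"]

    agreed = sum(a == b for a, b in zip(session_1, session_2))
    print(f"intra-rater percent agreement: {100 * agreed / len(session_1):.0f}%")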

  4. Inter-rater reliability - Wikipedia

    en.wikipedia.org/wiki/Inter-rater_reliability

    In statistics, inter-rater reliability (also called by various similar names, such as inter-rater agreement, inter-rater concordance, inter-observer reliability, inter-coder reliability, and so on) is the degree of agreement among independent observers who rate, code, or assess the same phenomenon.
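
    With more than two observers, one simple summary is the average pairwise percent agreement; the ratings below are hypothetical. Chance-corrected statistics such as Cohen's kappa (next result) are generally preferred over raw percent agreement.

    from itertools import combinations

    # Hypothetical ratings of the same 8 cases by three independent observers.
    ratings = {
        "rater_A": ["yes", "no", "yes", "yes", "no", "no", "yes", "no"],
        "rater_B": ["yes", "no", "yes", "no", "no", "no", "yes", "yes"],
        "rater_C": ["yes", "yes", "yes", "yes", "no", "no", "yes", "no"],
    }

    def percent_agreement(x, y):
        return sum(a == b for a, b in zip(x, y)) / len(x)

    pairs = list(combinations(ratings.values(), 2))
    avg = sum(percent_agreement(x, y) for x, y in pairs) / len(pairs)
    print(f"average pairwise agreement: {100 * avg:.0f}%")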

  5. Cohen's kappa - Wikipedia

    en.wikipedia.org/wiki/Cohen's_kappa

    Cohen's kappa coefficient (κ, lowercase Greek kappa) is a statistic that is used to measure inter-rater reliability (and also intra-rater reliability) for qualitative (categorical) items. [1] It is generally thought to be a more robust measure than simple percent agreement calculation, as κ takes into account the possibility of the agreement ...
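
    A minimal sketch of the computation for two raters and hypothetical labels, using κ = (p_o − p_e) / (1 − p_e), where p_o is the observed agreement and p_e the agreement expected by chance from each rater's marginal proportions:

    from collections import Counter

    # Hypothetical categorical labels assigned by two raters to the same 10 items.
    rater_1 = ["a", "a", "b", "b", "a", "b", "a", "a", "b", "a"]
    rater_2 = ["a", "b", "b", "b", "a", "b", "a", "a", "a", "a"]

    n = len(rater_1)
    p_o = sum(x == y for x, y in zip(rater_1, rater_2)) / n   # observed agreement

    # Chance agreement: product of the raters' marginal proportions, summed over categories.
    c1, c2 = Counter(rater_1), Counter(rater_2)
    p_e = sum((c1[k] / n) * (c2[k] / n) for k in set(rater_1) | set(rater_2))

    kappa = (p_o - p_e) / (1 - p_e)
    print(f"p_o={p_o:.2f}  p_e={p_e:.2f}  kappa={kappa:.2f}")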

  6. Verification and validation - Wikipedia

    en.wikipedia.org/wiki/Verification_and_validation

    Verification is intended to check that a product, service, or system meets a set of design specifications.[6][7] In the development phase, verification procedures involve performing special tests to model or simulate a portion, or the entirety, of a product, service, or system, then performing a review or analysis of the modeling results.
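
    As a toy illustration of this kind of verification step, a test can simulate one portion of a system and check the simulated behaviour against a design specification; the specification, tolerance, and simulated component below are assumptions made for the sketch:

    import random

    SETPOINT = 100.0        # assumed design setpoint
    SPEC_TOLERANCE = 0.05   # assumed spec: readings must stay within 5% of the setpoint

    def simulate_sensor(setpoint: float) -> float:
        """Stand-in for a model/simulation of one portion of the system."""
        return setpoint * (1 + random.uniform(-0.02, 0.02))

    def verify_against_spec(trials: int = 1000) -> bool:
        """Verification test: every simulated reading must meet the specification."""
        return all(abs(simulate_sensor(SETPOINT) - SETPOINT) / SETPOINT <= SPEC_TOLERANCE
                   for _ in range(trials))

    print("verification passed" if verify_against_spec() else "verification failed")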

  7. Westgard rules - Wikipedia

    en.wikipedia.org/wiki/Westgard_Rules

    The Westgard rules are a set of statistical patterns, each being unlikely to occur by random variability, thereby raising a suspicion of faulty accuracy or precision of the measurement system. They are used for laboratory quality control, in "runs" consisting of measurements of multiple samples.
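
    A partial sketch of two commonly cited rules applied to control measurements expressed as z-scores (deviations from the control mean in SD units): 1_3s (any single control beyond 3 SD) and 2_2s (two consecutive controls beyond 2 SD on the same side of the mean). The data are hypothetical and the rule subset is illustrative, not a complete implementation:

    def rule_1_3s(z_scores):
        """Reject the run if any single control observation exceeds 3 SD from the mean."""
        return any(abs(z) > 3 for z in z_scores)

    def rule_2_2s(z_scores):
        """Reject the run if two consecutive observations exceed 2 SD on the same side of the mean."""
        return any((z1 > 2 and z2 > 2) or (z1 < -2 and z2 < -2)
                   for z1, z2 in zip(z_scores, z_scores[1:]))

    # Hypothetical control results for one run, already converted to z-scores.
    run = [0.4, -1.1, 2.3, 2.5, 0.8, -0.2]

    print("1_3s violated:", rule_1_3s(run))   # False: no value beyond 3 SD
    print("2_2s violated:", rule_2_2s(run))   # True: 2.3 and 2.5 both exceed +2 SD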

  8. Round-robin test - Wikipedia

    en.wikipedia.org/wiki/Round-robin_test

    In experimental methodology, a round-robin test is an interlaboratory test (measurement, analysis, or experiment) performed independently several times. [1] This can involve multiple independent scientists performing the test with the use of the same method in different equipment, or a variety of methods and equipment.
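
    One common way to summarize such an interlaboratory comparison is to score each laboratory's result against the consensus of all participants, for example with a z-score relative to the consensus mean and standard deviation. The laboratory results and the |z| > 2 review threshold below are hypothetical:

    import statistics

    # Hypothetical results reported by independent laboratories for the same test item.
    lab_results = {"lab_1": 9.8, "lab_2": 10.1, "lab_3": 10.0, "lab_4": 11.2, "lab_5": 9.9}

    consensus = statistics.mean(lab_results.values())
    spread = statistics.stdev(lab_results.values())

    for lab, value in lab_results.items():
        z = (value - consensus) / spread
        flag = "review" if abs(z) > 2 else "ok"
        print(f"{lab}: result={value}  z={z:+.2f}  {flag}")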