Laboratory Standards Technical Guide What are Standards, Calibration Curves and Dilutions?

Standards

In a laboratory setting, the term ‘standard’ often refers to an actual chemical or physical material called a metrological standard. A metrological standard is the fundamental example or reference for a unit of measure. Simply stated, a standard is the ‘known’ against which an ‘unknown’ can be measured. Metrological standards fall into different hierarchical levels.

Primary standards: The definitive example of a measurement unit, to which all other standards of that unit are compared and whose property value is accepted without reference to other standards of the same property or quantity (1,2: VIM, ISO Guide 30). Primary standards of measure, such as weight, are created and maintained by metrological agencies and bureaus around the world (e.g., NIST).
Secondary standards: Close representations of primary standards which are measured against primary standards. Many chemical standards companies create chemical standards against a primary weight set to create secondary standards traceable to that primary standard.
Working standards: Created against or with secondary standards to calibrate equipment.

There are also many standards designated as reference materials, reference standards, or certified reference materials: materials manufactured or characterized for a set of properties and traceable to a primary or secondary standard. A certified reference material must be accompanied by a certificate that includes information on the material’s stability, homogeneity, traceability, and uncertainty (2,3: ISO Guide 30 & ISO 17034).

Using standards and reference materials

Certified standards or certified reference materials (CRMs) are materials produced by standards providers that have one or more certified values, with uncertainty established using validated methods, and are accompanied by a certificate. The uncertainty characterizes the dispersion of values arising from the variation of all the components of the process used to create the standard.

CRMs have several uses including validation of methods, standardization or calibration of instruments or materials, and for use in quality control and assurance procedures. A calibration procedure establishes the relationship between a concentration of an analyte and the instrumental or procedural response to that analyte.

A calibration curve is the plotting of multiple points within a dynamic range to establish the analyte response within a system during the collection of data points. One element of the correct interpretation of data from instrumental systems is the effect of a sample matrix upon an instrumental analytical response. The matrix effect can be responsible for either analyte suppression or enhancement. In analysis where matrix can influence the response of an analyte, it is common to match the matrix of analytical standards or reference materials to the matrix of the target sample to compensate for matrix effects.

Different approaches to using calibration standards may need to be employed to compensate for the possible variability within a procedure or analytical system.

  • Internal standards are reference standards that are either similar in character or analogs of the target analytes that have a similar response and are added to the sample prior to analysis. This type of standard allows the variation of instrument response to be compensated for using a relative response ratio established between the internal standard and the target analyte.
    • Examples: deuterated forms of the target analytes, structurally similar analogues, or elements not present in the sample
  • Standard addition (spiking) is used when matrix effects, instrument effects, and the analyte response become indistinguishable from one another as the analyte concentration nears the lower limit of detection or quantitation. A target standard is then added in known concentration to compensate for the matrix or instrument effects and bring the signal of the target analyte into a quantitative range.
  • External standards are multiple calibration points (customarily three or more) containing known concentrations of the target analytes and matrix components, prepared and measured separately from the test samples. Depending on the analytical technique, linear calibration curves can be generated between response and concentration and evaluated for their degree of linearity via the correlation coefficient (r). An r value approaching 1 reflects a higher degree of linearity; most analysts accept values of 0.999 or better as acceptable correlation.

Calibration Curves

Calibration curves are often affected by the limitations of the instrumentation. Data can become biased by the choice of calibration points, by an instrument’s limits of detection, quantitation, and linearity, and by the response of the system versus its baseline (signal-to-noise).

Figure 1. Limits of detection and quantitation

  • Limit of detection (LOD): lower limit of a method or system at which the target can be detected as different from a blank with a high level of confidence (usually more than three standard deviations above the blank response).
  • Limit of quantitation (LOQ): lower limit of a method or system at which the target can be reliably quantified, i.e., where the target and blank give two clearly distinct values (usually more than ten standard deviations above the blank response) (6-AOAC) (Figure 1).
  • Signal-to-noise (S/N): response of an analyte measured on an instrument as a ratio of that response to the baseline variation (noise) of the system. Limits of detection are often recognized as target responses with three times the baseline noise, or S/N ≥ 3. Limits of quantitation are recognized as target responses with ten times the baseline noise, or S/N ≥ 10.
  • Limits of linearity (LOL): upper limits of a system or calibration curve where the linearity of the calibration curve starts to be skewed creating a loss of linearity (Figure 2). This loss of linearity can be a sign that the instrumental detection source is approaching saturation.
  • Dynamic range: array of data values between the LOQ and the LOL is where the greatest potential for accurate measurements will occur.

    Figure 2. Calibration curve limits and range
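As a quick numerical sketch of the 3-sigma / 10-sigma conventions described above (the blank data here are hypothetical, not from this guide):

```python
import statistics

# Hypothetical replicate blank measurements (instrument response units)
blank = [0.41, 0.39, 0.44, 0.40, 0.42, 0.38, 0.43]
mean_blank = statistics.mean(blank)
sd_blank = statistics.stdev(blank)   # sample standard deviation

lod = mean_blank + 3 * sd_blank      # limit of detection: ~3 SD above the blank
loq = mean_blank + 10 * sd_blank     # limit of quantitation: ~10 SD above the blank

print(f"LOD = {lod:.3f}, LOQ = {loq:.3f}")
```

Responses between the LOD and LOQ can be reported as detected but should not be quantified; quantitative work belongs above the LOQ, within the dynamic range.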

Understanding a system’s dynamic range and accurately bracketing the calibration curve within that range and around the target analyte concentration increases the accuracy of measurements. If a calibration curve does not bracket all of the possible target data points, it can bias results artificially high or low and create error. Creating calibration curves and working standards requires an accurate process of converting units, calculating dilution targets, and preparing dilutions.

    Dynamic range, concentration, and error

The first step in creating standards, working solutions, and dilutions is understanding the dynamic range your analysis is targeting: is it ppb, ppm, or percent? (Table 1).

    Table 1. Analytical target ranges and instrumentation examples

    Unit Analytical target Instruments
    % Macroelements, Nutrients, Active Ingredients, Flavorings, Solvents GC-FID, LC-UV/Vis, AA
    ppm Micronutrients, Pharmaceuticals, Solvents GC-FID, GC/MS, LC-UV/Vis, LC/MS, AA, ICP
    ppb Contaminants, Pesticides, Aflatoxins, Heavy Metals GC/MS, GC/MS/MS, LC/MS, LC/MS/MS, ICP-MS

    If you are looking for a major component of the sample, then your standards must be in percent levels, or the samples must be diluted down to the correct concentrations. If target analytes are trace elements or trace contaminants, then standards and calibration curves must be diluted down to match the target within the dynamic range of the instrument technique.

    Tables 2 to 4 show the basic conversions between different concentrations based on mass or volume.

    Table 2. Weight to weight concentrations

    Name Symbol Equivalence
    Parts per thousand* ppt* g/kg mg/g μg/mg ng/μg
    Parts per million ppm mg/kg μg/g ng/mg pg/μg
Parts per billion ppb μg/kg ng/g pg/mg fg/μg
    Parts per trillion** ppt** ng/kg pg/g fg/mg ag/μg

    Table 3. Weight to volume concentrations

    Name Symbol Equivalence
    Parts per thousand* ppt* g/L mg/mL μg/μL ng/nL
    Parts per million ppm mg/L μg/mL ng/μL pg/nL
    Parts per billion ppb μg/L ng/mL pg/μL fg/nL
    Parts per trillion** ppt** ng/L pg/mL fg/μL ag/nL

    Table 4. Concentration conversions

Units Symbol ppt* ppm ppb ppt**
1 part per thousand ppt* 1 1 x 10^3 1 x 10^6 1 x 10^9
1 part per million ppm 1 x 10^-3 1 1 x 10^3 1 x 10^6
1 part per billion ppb 1 x 10^-6 1 x 10^-3 1 1 x 10^3
1 part per trillion ppt** 1 x 10^-9 1 x 10^-6 1 x 10^-3 1
    * ppt = parts per thousand
    ** ppt = parts per trillion
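The ladder in Tables 2 to 4 is simply successive factors of 1,000, which makes a small conversion helper easy to sketch (the function and unit names below are my own shorthand, not from the guide):

```python
# Exponent of 1,000 for each unit on the parts-per ladder
LEVEL = {"ppt_thousand": 1, "ppm": 2, "ppb": 3, "ppt_trillion": 4}

def convert(value, from_unit, to_unit):
    """Convert a concentration between parts-per units: each level is 1,000x."""
    return value * 1000.0 ** (LEVEL[to_unit] - LEVEL[from_unit])

print(convert(1, "ppm", "ppb"))   # 1 ppm = 1,000 ppb
```

This mirrors Table 4: moving down one row multiplies by 10^3, moving up one row divides by 10^3.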

Calibration curves are created by diluting standards into several target points along the dynamic range to cover the possible target results. Proper dilution of standards and samples is based on an understanding of basic dilution, volumetric procedures, and dilution factors. Volumetric measurement is a common, repeated daily activity in most analytical laboratories. Many processes in the laboratory, from sample preparation to standards calculation, depend on accurate and contamination-free volumetric measurements. Unfortunately, laboratory volumetric labware, syringes, and pipettes are among the most common sources of contamination, carryover, and error in the laboratory.

    The root of these errors is based on the four “I” errors of volumetrics:

    1. Improper use: measuring tool is not used correctly.
    2. Incorrect choice: measurement tool is not appropriate for the volume or type of measurement.
    3. Inadequate cleaning: carryover causes contamination.
    4. Infrequent calibration: measuring tool is not calibrated for use.

    These four “I’s” can lead to error and contamination which negate all intent and careful measurement processes.

    Many errors can be avoided by understanding the markings displayed on the volumetrics and choosing the proper tool for the job. There is a lot of information displayed on volumetric labware. Most labware, especially glassware, is designated as either Class A (analytical or quantitative) or Class B labware (general use).

    A second type of improper use and incorrect choice can be seen in the selection of pipettes and syringes for analytical measurements. Many syringe manufacturers recommend a minimum dispensing volume of approximately 10% of the total volume of the syringe or pipette.

A study by SPEX® CertiPrep showed that dispensing such a small percentage of the syringe’s total volume created a large amount of error. The largest error rates were seen in the smaller syringes of 10 and 25 μL: dispensing 20% of the 10 μL syringe’s volume created 23% error, and error only dropped below 5% as the dispensed volume approached 100%. In the larger syringes, dispensed volumes above 25% of capacity kept error at around 1%. The larger syringes could approach the manufacturer’s 10% dispensing minimum without a large amount of error, but error still dropped as the dispensed volume approached 100% (4).

    Understanding Dilutions and Dilution Factors

    The first dilution many labs make is a stock solution or starting solution. This type of working standard is made to create a higher concentration stock from raw materials or concentrated material from which other standards will be made. For this calculation, one needs the concentration of the target stock, the final weight or volume of the total stock and the purity of the raw material (or concentration of the concentrate being used).

    Starting material or stock starting calculation

X (mass or volume units of target) / final mass or volume * purity (concentration) of material * 10^6 = target concentration

Example: you have a 90% pure compound from which you want to make 10 mL of a 10,000 ppm stock standard.

    Set up your equation as:

X/10 mL * 0.90 * 10^6 = 10,000 ppm (μg/mL)

Solve for X: 10,000 * 10/(0.9 * 10^6) = 0.111 g starting material
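The worked example can be checked in a few lines. This is just a sketch of the arithmetic above (the function name is mine), assuming mass in grams, volume in mL, and ppm expressed as μg/mL:

```python
def stock_mass_g(target_ppm, final_volume_ml, purity):
    """Grams of raw material X such that X / V * purity * 10^6 = ppm (ug/mL)."""
    return target_ppm * final_volume_ml / (purity * 1e6)

# 10 mL of a 10,000 ppm stock from a 90% pure compound
mass = stock_mass_g(10_000, 10, 0.90)
print(f"{mass:.3f} g of starting material")   # ~0.111 g
```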

    Another type of dilution is a simple dilution using a dilution factor as seen in the equations below:

    Simple dilutions

    Dilution = volume or mass of sample / total volume or mass of (sample + diluent)

    Dilution factor = total V of (sample + diluent) / V of sample
** or, simply, the reciprocal of the dilution

Example: You have a 1,000 ppm stock solution and need to make 10 mL of a 10 ppm standard. How much stock do you use?

X/10 mL x 1,000 ppm = 10 ppm (μg/mL)
    This would be a 100x dilution using 0.1 mL
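The simple-dilution arithmetic (stock concentration x aliquot volume = final concentration x final volume) can be sketched as follows; the function name is mine, not from the guide:

```python
def aliquot_volume(stock_conc, final_conc, final_volume):
    """Volume of stock needed so that stock_conc * V = final_conc * final_volume."""
    return final_conc * final_volume / stock_conc

# 1,000 ppm stock diluted to 10 mL of a 10 ppm working standard
v = aliquot_volume(1_000, 10, 10)
print(v)                      # 0.1 mL of stock
print(10 / v)                 # dilution factor: total volume / sample volume
```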

To make calibration curves, most people employ a set of simple dilutions made separately from each other, or use a serial dilution: a series of dilutions in which each step is cumulative.

(1/10) x (1/10) x (1/10) x (1/10) x (1/10) = 1/100,000 = 10^-5 dilution

    A typical series is shown below:

    Serial Dilution - Series of Dilutions
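As a sketch of the series above (with a hypothetical 1,000 ppm starting stock of my own choosing), five successive 1:10 dilutions reproduce the 10^-5 cumulative factor:

```python
# Hypothetical starting stock; each tube is a 1:10 dilution of the previous one
conc = 1_000.0                # ppm
series = []
for _ in range(5):
    conc = conc / 10          # one 1:10 dilution step
    series.append(conc)

print(series)                 # five points spanning a 10^-5 cumulative dilution
```

Each tube becomes the "stock" for the next, so any pipetting error in an early step propagates through the rest of the series.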

    Tips for dilutions

    • Make sure you keep track of units (see tables provided)
• Unify units when possible - e.g., convert everything to grams, or to μg, uniformly
    • Don’t forget purity when making a stock solution
    • Don’t forget to account for the weight or volume of internal standards or spiking solutions
    • Use the Spex® Dilut-U-lator® if needed
    • Make sure your calibration points are within range of your analytical targets and within the dynamic range of the instrument; if not, either the samples or standards will need to be diluted

    References

    1. BIPM, IEC, IFCC, ILAC, IUPAC, IUPAP, ISO, OIML (2012) The international vocabulary of metrology - basic and general concepts and associated terms (VIM), 3rd edn. JCGM 200:2012. http://www.bipm.org/vim.
2. ISO Guide 30, Terms and definitions used in connection with reference materials.
3. ISO 17034, General requirements for the competence of reference material producers.
    4. Spex CertiPrep Application Note “Understanding Measurement: A Guide to Error, Contamination and Carryover in Volumetric Labware, Syringes and Pipettes” in publication.