Calibration Curve


In analytical chemistry, a calibration curve is a general method for determining the concentration of a substance in an unknown sample by comparing the unknown to a set of standard samples of known concentration. A calibration curve is one approach to the problem of instrument calibration.
The calibration curve is a plot of how the instrumental response, the so-called analytical signal, changes with the concentration of the analyte (the substance to be measured). The operator prepares a series of standards across a range of concentrations near the expected concentration of analyte in the unknown. The concentrations of the standards must lie within the working range of the technique (instrumentation) being used. Analyzing each of these standards with the chosen technique produces a series of measurements. For most analyses, a plot of instrument response vs. analyte concentration shows a linear relationship. The operator can then measure the response of the unknown and, using the calibration curve, interpolate to find the concentration of analyte.

The data (the concentration of the analyte and the instrument response for each standard) can be fitted to a straight line using linear regression analysis. This yields a model described by the equation y = mx + y0, where y is the instrument response, m represents the sensitivity, and y0 is a constant that describes the background. The analyte concentration (x) of an unknown sample can then be calculated from this equation by rearranging it to x = (y − y0)/m.
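As a minimal sketch of this procedure, the following Python snippet fits the line y = mx + y0 to a set of standards by least squares and then inverts the fit to estimate the concentration of an unknown. The concentration and response values are hypothetical, chosen only to illustrate the calculation.

```python
import numpy as np

# Hypothetical standards: concentrations (e.g., mg/L) and instrument responses
concentrations = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0])
responses = np.array([0.05, 0.45, 0.84, 1.25, 1.64, 2.05])

# Least-squares fit of y = m*x + y0 (degree-1 polynomial)
m, y0 = np.polyfit(concentrations, responses, 1)

# Measure the unknown, then invert the calibration: x = (y - y0) / m
unknown_response = 1.10
unknown_concentration = (unknown_response - y0) / m

print(f"sensitivity m = {m:.4f}, background y0 = {y0:.4f}")
print(f"estimated concentration of unknown = {unknown_concentration:.2f}")
```

Note that the inversion is only valid when the unknown's response falls within the calibrated (linear) range of the standards; extrapolating beyond the highest standard is unreliable.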

[Figure: a calibration curve plot showing the limit of detection (LOD), limit of quantification (LOQ), dynamic range, and limit of linearity (LOL)]
