# 8: Calibrating Data

A calibration curve is one of the most important tools in analytical chemistry, as it allows us to determine the concentration of an analyte in a sample by measuring the signal it generates when placed in an instrument, such as a spectrophotometer. To determine the analyte's concentration we must know the relationship between the signal we measure, \(S\), and the analyte's concentration, \(C_A\), which we can write as

\[S = k_A C_A + S_{blank} \nonumber\]

where \(k_A\) is the calibration curve's sensitivity and \(S_{blank}\) is the signal in the absence of analyte.
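Once \(k_A\) and \(S_{blank}\) are known, the equation above is easily rearranged to \(C_A = (S - S_{blank})/k_A\). The short sketch below illustrates this rearrangement; the numerical values for the sensitivity, the blank, and the measured signal are hypothetical, chosen only to make the arithmetic easy to follow.

```python
# A minimal sketch of solving S = k_A * C_A + S_blank for C_A.
# All numbers below (k_A, S_blank, S) are hypothetical illustrations.

def concentration_from_signal(S, k_A, S_blank):
    """Return the analyte concentration C_A implied by a measured signal S,
    using C_A = (S - S_blank) / k_A."""
    return (S - S_blank) / k_A

# Example: a sensitivity of 0.50 signal units per ppm and a blank signal of 0.02
C_A = concentration_from_signal(S=0.27, k_A=0.50, S_blank=0.02)
print(C_A)  # 0.5 (ppm)
```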

How do we find the best estimate for this relationship between the signal and the concentration of analyte? When a calibration curve is a straight-line, we represent it using the following mathematical model

\[y = \beta_0 + \beta_1 x \nonumber \]

where \(y\) is the analyte’s measured signal, \(S\), and \(x\) is the analyte’s known concentration, \(C_A\), in a series of standard solutions. The constants \(\beta_0\) and \(\beta_1\) are, respectively, the calibration curve’s expected *y*-intercept and its expected slope. Because of uncertainty in our measurements, the best we can do is to estimate values for \(\beta_0\) and \(\beta_1\), which we represent as \(b_0\) and \(b_1\). The goal of a linear regression analysis is to determine the best estimates for \(b_0\) and \(b_1\).
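For an unweighted straight-line fit, the least-squares estimates have closed forms: \(b_1 = \sum_i (x_i - \bar{x})(y_i - \bar{y}) / \sum_i (x_i - \bar{x})^2\) and \(b_0 = \bar{y} - b_1 \bar{x}\). A minimal sketch of that calculation is shown below; the standard concentrations and signals are made-up values used only to demonstrate the computation, not data from a real calibration.

```python
# A minimal sketch of ordinary least-squares estimates b0 (intercept)
# and b1 (slope) for the model y = b0 + b1*x.
# The standards below are hypothetical, for illustration only.

def linear_fit(x, y):
    """Return (b0, b1) minimizing the sum of squared residuals
    sum((y_i - b0 - b1*x_i)**2) over the paired data."""
    n = len(x)
    x_bar = sum(x) / n
    y_bar = sum(y) / n
    # Slope: covariance of x and y divided by variance of x
    b1 = (sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
          / sum((xi - x_bar) ** 2 for xi in x))
    # Intercept: the fitted line passes through (x_bar, y_bar)
    b0 = y_bar - b1 * x_bar
    return b0, b1

# Hypothetical standards: concentrations C_A (ppm) and measured signals S
C_std = [0.0, 1.0, 2.0, 3.0, 4.0]
S_std = [0.02, 0.52, 1.01, 1.53, 2.02]
b0, b1 = linear_fit(C_std, S_std)
print(b0, b1)  # roughly 0.018 and 0.501
```

Here \(b_0\) estimates the blank signal \(S_{blank}\) and \(b_1\) estimates the sensitivity \(k_A\).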