Principles and Practice of Chromatography - The Basic Column Chromatograph

The Detector Output

Most practical detectors must have a linear output, i.e.,

y = A c

where
(y) is the output of the detector in appropriate units,
(c) is the concentration of solute in the mobile phase,
and (A) is a constant.

All are designed to provide a response that is as close as possible to linear, as this is essential for accurate quantitative analysis. However, the output from some detector sensors is not linearly related to the solute concentration, and appropriate signal-modifying circuits must be introduced into the detector electronics to provide a linear output. For example, the output from a light absorption sensor is exponential with solute concentration, and so it must be used with a logarithmic amplifier to produce an output that is linearly related to solute concentration.
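The linearization step can be sketched numerically. The snippet below assumes a Beer-Lambert-type light absorption sensor whose transmitted intensity falls off exponentially with solute concentration; the parameter values (I0, EPS_L) are illustrative assumptions, not figures from the text:

```python
import math

# Assumed, illustrative parameters for a Beer-Lambert-type sensor
I0 = 100.0     # incident light intensity (arbitrary units)
EPS_L = 250.0  # molar absorptivity x path length (L/mol)

def sensor_output(c):
    """Transmitted intensity: exponential in solute concentration c (mol/L)."""
    return I0 * 10 ** (-EPS_L * c)

def linearized_output(intensity):
    """Logarithmic 'amplifier': recovers absorbance, which is linear in c."""
    return math.log10(I0 / intensity)

# Doubling the concentration doubles the linearized output,
# even though the raw sensor output is exponential.
a1 = linearized_output(sensor_output(0.001))  # 0.25
a2 = linearized_output(sensor_output(0.002))  # 0.50
```

The raw intensity for these two concentrations differs by a factor of 10^0.25, not 2; only after the logarithmic stage is the output proportional to concentration.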

The most important detector specification is sensitivity, as it defines the minimum concentration of solute that can be detected. It is best defined as a function of the detector response and the noise level. The detector response (Rc) can be defined as the voltage output for unit change in solute concentration, or as the voltage output that would result from unit change in the physical property that the detector measures, e.g., refractive index or carbon content. Detector noise is the term given to any perturbation on the detector output that is not related to an eluted solute. It is a fundamental property of the detecting system and determines the ultimate sensitivity, or minimum detectable concentration, that can be achieved. Detector noise has been arbitrarily divided into three types: 'short-term noise', 'long-term noise' and 'drift', all three of which are depicted in figure 26.
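As a worked illustration of how response and noise together fix the detection limit, the sketch below uses a common convention (assumed here, not stated on this page) that the minimum detectable concentration is the concentration producing a signal equal to twice the noise level; the numerical values are hypothetical:

```python
def minimum_detectable_concentration(response_rc, noise_nd):
    """
    Estimate the minimum detectable concentration of a detector.

    response_rc : detector response Rc, volts per unit concentration (V L/g)
    noise_nd    : noise level on the detector output (V)

    Uses the common convention that the detection limit is the
    concentration giving a signal-to-noise ratio of 2.
    """
    return 2.0 * noise_nd / response_rc

# Hypothetical detector: response 0.5 V per g/L, noise 1e-5 V
xd = minimum_detectable_concentration(0.5, 1e-5)  # 4e-5 g/L
```

The example makes the trade-off explicit: halving the noise, or doubling the response, halves the minimum detectable concentration.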