Chapter
The Critical Factors that Govern a Successful Quantitative Chromatographic Analysis
ABSTRACT
The early work in liquid chromatography, from Tswett onward, involved almost no quantitative assays; the technique was used largely for preparative purposes, with quantitative evaluations carried out offline by separate analytical methods. Quantitative assays made directly by monitoring the column eluent began with gas chromatography (GC), first used in this way by the inventors of the technique, James and Martin, in 1952 [1] for the analysis of fatty acid mixtures. In fact, the need to determine the composition of mixed fatty acids extracted from plant tissue, to help elucidate their synthetic pathways, was the incentive that prompted the development of the technique in the first place.

In the first instrument, the column eluent was bubbled through a suitable aqueous liquid to absorb the eluted acids. The solution contained an indicator, the color of which changed as each solute was eluted, and the solution was then titrated manually. The inventors later automated the titration process (probably the first automatic titration apparatus to be made, and certainly the only one available at the time), and an integral chromatogram was formed by plotting the volume of base solution added against time. The resulting trace displayed each substance as a step, the height of which was proportional to the amount of fatty acid eluted.

Subsequently, Martin developed the gas density balance detector [2], the first inline detector, which had a very useful linear response. The gas density balance was an extremely ingenious, if complicated, device consisting of a Wheatstone network of capillary conduits drilled into a block of high-conductivity copper, which made the sensing device fairly compact. The block contained two columns of gas, one carrying the column eluent and the other a reference gas. When solute vapor was present in the column eluent, the pressure difference across the two columns, arising from the differing densities of the gases, was arranged to cause a flow of gas over two heated thermocouples, cooling one and heating the other. The output from the thermocouples was fed to a recording milliammeter. The detector was linear over about three orders of magnitude of concentration and had a sensitivity (minimum detectable concentration) of about 5 × 10⁻⁷ g/ml (n-heptane). It provided a differential output that displayed solute peaks in the conventional manner and could be assessed quantitatively using peak areas or peak heights (methods that will be discussed later).
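Since the chapter goes on to discuss quantitation by peak area and peak height, the step-height relationship described above lends itself to a short numerical illustration. The following Python sketch uses synthetic Gaussian peaks (all peak positions, widths, and integration windows are hypothetical, chosen only for the demonstration) to show that the height of each step on an integral chromatogram equals the area of the corresponding peak in the differential trace.

```python
import numpy as np

# Minimal sketch (synthetic data) relating an integral chromatogram, of
# the kind produced by James and Martin's automatic titration apparatus,
# to the familiar differential trace: each step height on the integral
# curve equals the area of the corresponding peak.

def trapezoid(y, x):
    """Trapezoidal integration, written out to avoid NumPy version differences."""
    return float(np.sum((y[1:] + y[:-1]) / 2.0 * np.diff(x)))

t = np.linspace(0.0, 10.0, 2001)                 # elution time, min
# Two Gaussian peaks standing in for the differential detector signal
signal = (1.0 * np.exp(-((t - 3.0) / 0.15) ** 2)
          + 0.5 * np.exp(-((t - 6.0) / 0.20) ** 2))

# Integral chromatogram: running integral of the signal (a step curve)
integral = np.cumsum(signal) * (t[1] - t[0])

def peak_area(t_lo, t_hi):
    """Area of the differential signal over a window around one peak."""
    m = (t >= t_lo) & (t <= t_hi)
    return trapezoid(signal[m], t[m])

def step_height(t_lo, t_hi):
    """Rise of the integral trace across the same window."""
    return integral[t.searchsorted(t_hi)] - integral[t.searchsorted(t_lo)]

for lo, hi in [(2.0, 4.0), (5.0, 7.0)]:
    print(f"window {lo}-{hi} min: peak area {peak_area(lo, hi):.4f}, "
          f"step height {step_height(lo, hi):.4f}")
```

Either readout, the integrated area of a peak or the height of the corresponding step, recovers the same quantity (to within discretization error), which is why the step-type integral record and the conventional differential record are interchangeable for quantitation.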