HPLC Theory: System Suitability Parameters
High performance liquid chromatography (HPLC) is a separation of mixtures of compounds based on differences in their distribution equilibrium between two phases: the stationary phase packed inside the column, and the mobile phase delivered through the column by high-pressure pumps. Components with higher distribution into the stationary phase are retained longer and are thereby separated from those with lower distribution into the stationary phase. The theoretical and practical foundations of this method were laid down at the end of the 1960s and the beginning of the 1970s. The theory of chromatography serves as the basis for System Suitability tests, a set of quantitative criteria that verify that the chromatographic system is suitable to identify and quantify drug-related samples by HPLC at any step of the pharmaceutical analysis.
Retention Time (tR), Capacity Factor k' and Relative Retention Time (RRT)
The time elapsed between the injection of the sample components onto the column and their detection is known as the Retention Time (tR). The retention time is longer when the solute has a higher affinity for the stationary phase due to its chemical nature; for example, in reversed-phase chromatography, more lipophilic compounds are retained longer. The retention time is therefore a property of the analyte that can be used for its identification.
A non-retained substance passes through the column in a time t0, called the Void Time. The Retention Factor or Capacity Factor k' of an analyte is measured experimentally as shown in Figure 3 and Eqn 1a:

Eqn 1a: k' = (tR - t0) / t0
The Capacity Factor describes the thermodynamic basis of the separation: it is defined as the ratio of the amounts of the solute in the stationary and mobile phases within the analyte band inside the chromatographic column:

Eqn 1b: k' = (Cs / Cm) × φ

where Cs is the concentration of the solute in the stationary phase, Cm is its concentration in the mobile phase, and φ is the ratio of the stationary and mobile phase volumes, all within the chromatographic band.
The Retention Factor (Eqn 1a) is used to compare the retention of a solute between two chromatographic systems, since it normalizes retention to the column's geometry and the system's flow rate. Determining the void time can be tricky, however, because the elution time of the void time marker, t0, may be unstable. When the chromatogram is complex and one known component is always present at a certain retention time, that component can instead be used as a retention marker for the other peaks. In such cases, the ratio of the retention time of any peak to the retention time of the marker, tR(Peak) / tR(Marker), is used and is referred to as the Relative Retention Time (RRT). RRT is used in place of the capacity factor both for identification of the analyte and for comparing its extent of retention between two chromatographic systems.
Figure 3: Example of capacity factor calculation in LC. In this case: tR = 0.739, t0 = 0.176, therefore k' = (0.739 - 0.176) / 0.176 = 3.20
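The calculations above can be sketched in a few lines of code. This is an illustrative example only; the function names are my own, and the numeric values are taken from Figure 3.

```python
def capacity_factor(t_r: float, t_0: float) -> float:
    """Capacity factor, Eqn 1a: k' = (tR - t0) / t0."""
    return (t_r - t_0) / t_0

def relative_retention_time(t_r_peak: float, t_r_marker: float) -> float:
    """Relative retention time: RRT = tR(Peak) / tR(Marker)."""
    return t_r_peak / t_r_marker

# Values from Figure 3:
k = capacity_factor(0.739, 0.176)
print(round(k, 2))  # 3.2
```

Because k' and RRT are ratios of times, both are dimensionless, which is what makes them comparable across columns and flow rates.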
Efficiency: Plate Count N and Peak Capacity Pc
Figure 4 describes a chromatogram with 4 peaks and a detected void peak; the System Suitability parameters are displayed in the inserted table. The sharpness of a peak relative to its retention time is a measure of the system's efficiency, expressed as the plate count N. Band-broadening phenomena in the column, such as eddy diffusion, molecular diffusion, mass-transfer kinetics, and extra-column effects, reduce the efficiency of the separation (33). The sharpness of a peak is relevant to the limit of detection and limit of quantification of the chromatographic system: the sharper the peak for a given area, the better its signal-to-noise ratio, and hence the lower the concentrations the system is capable of detecting.