Section Outline
- Basic Principles of Laboratory Medicine
- Examination of Urine
- Kidney Function Tests
- Laboratory Tests in Diabetes Mellitus
- Liver Function Tests
- Laboratory Tests in Disorders of Lipids
- Biochemical Cardiac Markers
- Examination of Cerebrospinal Fluid
- Examination of Serous Body Fluids and Synovial Fluid
- Examination of Sputum
- Examination of Feces
- Gastric Analysis
- Tests for Malabsorption and Pancreatic Function Tests
- Thyroid Function Tests
- Laboratory Tests in Pregnancy
- Infertility
- Semen Analysis
INTRODUCTION
A clinical pathology or medical laboratory is a place where specimens from the human body are collected, processed, and examined or analyzed. Clinical laboratory specimens include whole blood, plasma, serum, urine, cerebrospinal fluid, feces, body fluids, etc. Various clinical laboratory departments include hematology, clinical chemistry, immunohematology, urinalysis, microbiology, parasitology, coagulation, cytology and anatomic pathology. Laboratory testing plays a major role in clinical decision-making by the physician and in overall patient management. The main functions of laboratory testing are:
- Screening for disease: This refers to examination or testing for presence or absence of a subclinical disease. Types of screening include, (a) Population (or mass) screening or screening done on a large population, (b) Selective (or targeted) screening or screening done in a high-risk population, (c) Individual screening, and (d) Opportunistic screening, i.e., screening of patients who consult the health practitioner for some other purpose.
- Confirmation or rejection of clinical diagnosis
- Monitoring course of disease and response to therapy
- Assessment of severity of disease
There is increasing dependence on clinical laboratory services by the healthcare system. However, results of these laboratory tests are affected by various preanalytical, analytical, and postanalytical factors. Awareness about these factors can minimize their interference and reduce or remove the likelihood of errors. Evidence indicates that most of the laboratory errors occur in preanalytical (62%) and postanalytical (23%) phases as compared to the analytical (15%) phase. About 25% of such results can have consequences for the patient care. An accurate laboratory result not only involves analytic accuracy but also other factors listed in Box 1.1.
The total testing process of a specimen comprises three distinct phases:
- Preanalytical phase: All procedures or processes occurring before the actual testing of the specimen.
- Analytical phase: All procedures or processes related to actual testing of the specimen.
- Postanalytical phase: All procedures or processes following test performance.
All laboratory tests pass through these three phases.
PREANALYTICAL PHASE
Before the specimen is analyzed, certain factors can affect the test result, i.e., these errors are introduced before the analysis of the sample. The preanalytic phase consists of physiological or biological factors as well as specimen collection, handling, storage and transport. Preanalytic variables are listed in Table 1.1. The majority of problems associated with laboratory test results are due to errors in preanalytic phase.
Patient Identification Procedure
This is the most important step before specimen collection. Incorrect patient identification before specimen collection can result in misdiagnosis or even fatality due to subsequent incorrect treatment based on laboratory report. Ideally, two methods should be used for patient identification. Acceptable patient identification methods are: (1) patient's name (if patient is not able to communicate verbally, verification from armband or from nursing staff), (2) date of birth, (3) patient's unique registration number in a hospital setting, and (4) driver's license or picture identification. If there is a suspicion of wrong laboratory results and if there is another patient with a corresponding wrong set of results, crossover of labels between patients may be the cause and the problem should be investigated.
Patient Preparation
Timing of collection of sample is important in some cases, e.g., therapeutic drug monitoring, measurement of cortisol, glucose, etc. Test for blood glucose and lipid profile must be done after patient has fasted overnight. Diurnal variation and posture should be taken into consideration for certain analytes (see below). Oral glucose tolerance test needs patient preparation as outlined in Chapter 4 “Laboratory Tests in Diabetes Mellitus.”
Smoking before collection of specimen can cause increase in WBC count, glucose, cortisol, growth hormone, cholesterol and triglycerides. Chronic smoking may be associated with raised red cell count and hemoglobin levels; decreased arterial pO2 and increased carbon monoxide stimulate erythropoietin release and cause secondary polycythemia.
Selection of Appropriate Specimen for the Test that is Ordered
Correct specimen, specimen container and anticoagulant should be chosen for the test that is ordered (Table 1.2 and Fig. 1.1).
Specimen Collection
Types of blood specimen include serum, plasma and whole blood.
Serum: After removal from the body, serum is the fluid portion remaining after blood has clotted (after about 30–60 minutes). Most chemistry, immunology and serology tests are performed on the serum. In a fasting state, serum appears clear and pale yellow. In contrast to plasma, it is devoid of fibrinogen, and therefore it contains less protein than plasma. Potassium in serum is also slightly higher than in plasma because some potassium is released from platelets during clotting. Serum should be obtained by centrifugation only after the sample is clotted completely.
Plasma: The liquid portion of anticoagulated blood sample is called as plasma. Many laboratory tests can be done on either plasma or serum. Coagulation tests cannot be performed on serum (since coagulation factors are consumed during clotting) and are done on plasma. Plasma is preferred sample for estimation of potassium and ammonia (since these substances are released by cells during clotting). Also, if immediate report is required, plasma sample is preferred since blood sample can be immediately centrifuged to separate plasma.
Whole blood: Most hematology tests are performed on anticoagulated whole blood sample.
Sources of blood specimen include arterial, venous, or capillary blood.
Arterial blood: This is used for measurement of blood gases (partial pressure of oxygen and carbon dioxide) and pH. For collection of arterial blood, syringes are used instead of evacuated tubes because of the pressure in an artery. The usual arteries are radial, brachial, and femoral.
Venous blood: Most chemistry and hematological investigations are done on venous blood.
Capillary blood: Capillary blood is obtained from infants and small children.
Venepuncture:
- Proper antiseptic must be used for cleaning and disinfecting the venepuncture site. Isopropyl alcohol wipes can contaminate the sample for blood alcohol determination. If proper disinfection is not done or if infection is present at the site of puncture, contamination of blood cultures can occur.
- Intravenous (IV) line: Blood should never be collected from the IV line, especially above the IV access site, since IV fluids can dilute or contaminate the sample and affect test results. Dilution of blood sample with intravenous fluids will cause spuriously low blood cell counts. Depending on the type of intravenous fluid, increased levels of glucose, potassium, sodium, and chloride and a decrease in other analytes like urea and creatinine will occur. If intravenous line is in place, blood is drawn from the opposite arm or blood is collected from the fingerstick. If not possible, then IV line is turned off for 2 minutes, tourniquet is applied below IV line insertion site, and a different vein below the IV line is used for collection of sample. Location of IV line, type of fluid being infused, and site of venepuncture should be documented.
- Presence of sclerosed veins, edema, hematoma, scars, burns, and tattoos. Another site should be selected for venepuncture.
- Mastectomy: Drawing blood from the arm on the same side as a mastectomy should be avoided. Mastectomy involves lymph node removal and can cause lymphostasis or stoppage of lymph flow, thus making the arm vulnerable for swelling and infection.
- Heparin or saline lock: These are catheters connected to a stopcock through which medication is given or blood is drawn. These are often flushed with heparin to keep them from clotting. If blood is drawn from such a site, then contamination with heparin can occur. If blood is to be collected, first 5 mL of blood should be discarded or blood may be collected below heparin lock if nothing is being infused.
- Tourniquet: Prolonged application of tourniquet should be avoided before collection of venous blood since it causes stasis of blood flow, hemoconcentration, and increased concentration of analytes bound to cells or proteins. If tourniquet is applied for too long, if patient excessively clenches his fist, or if he exercises his arm before venepuncture, potassium may be released from cells and cause artefactual hyperkalemia. If tourniquet needs to be applied, then it should be applied for less than one minute. Prolonged tourniquet application can also increase levels of hematocrit, proteins and lactic acid. Fist pumping during venepuncture will increase potassium, lactic acid, calcium and phosphorous. Clenching or pumping of fist during venepuncture is of no use and should be avoided.
- Evacuated tube system or syringe method of collection: Blood is collected from a vein either using a needle attached to a syringe or a stoppered evacuated tube. If veins are small, fragile, or hard to find, a winged or butterfly infusion set is used usually in infants and small children. Evacuated tubes are sample collection tubes with a premeasured vacuum that automatically draws the volume of blood depicted on the label. The evacuated tube system is better since blood is collected directly from the vein into the tube, thus reducing contamination of specimen and minimizing hazard of exposure of blood to the collector. If syringe method is used, needle should be detached before transferring blood from the syringe into the tube to avoid hemolysis.
- Tube additives and order of draw: Specimen collection tubes may contain additives or are additive-free. If the additive contains an anticoagulant, clotting is inhibited; all other additives and additive-free tubes are used to obtain serum. These are outlined in Chapter 20 “Collection of Blood”. Incorrect anticoagulation or contamination from an incorrect order of draw can produce incorrect results. For example, (a) if a K2 EDTA sample is collected before serum or heparin tubes, calcium and magnesium levels are reduced and potassium levels are increased, and (b) contamination of a citrate tube with clot activator will produce incorrect coagulation test results. Additive contamination is possible if the bottom of the tube is not held lower than the top during collection. The collection tube should be filled to within ±10% of the recommended volume.
- Mixing of specimen: After collection, blood should be completely mixed with anticoagulant by 3–8 inversions of tube depending on the additive in the tube. Inversion refers to gently inverting a specimen upside down and then back right side up. If not immediately and thoroughly mixed, microclots may form in the tube with an anticoagulant, while clotting may be incomplete in the tube containing clot activator. Microclots, fibrin and platelet clumping will induce erroneous results. Shaking or vigorous mixing should be avoided as it may lead to hemolysis or foaming making the sample unsuitable for testing.
Specimen Labeling and Transport
After collection, specimen tube should be promptly labeled in front of the patient and then sent to the laboratory with the request form. If the specimen is collected in a hospital and analyzed in a laboratory within that hospital, time for transportation may not be a factor. However, if the specimen is to be sent to a distant laboratory for analysis, care must be taken while shipping the specimen. Tubes without anticoagulant should be kept in a vertical position to allow complete clotting and reduce stopper contamination.
Agitation or rough handling during transport can induce hemolysis (making specimen unsuitable for tests like potassium and lactate dehydrogenase) or cause platelet activation (making specimen unsuitable for coagulation tests like prothrombin time and activated partial thromboplastin time). Excess vibration of syringe specimen for arterial blood gas analysis can increase pO2 values.
Special Precautions
- Chilling: Samples for adrenocorticotropic hormone (ACTH), catecholamines, lactic acid, ammonia, pyruvate, gastrin, parathyroid hormone, and blood gases should be transported on slurry of crushed ice and water; use of ice cubes alone may cause hemolysis.
- Warming: Samples for cold agglutinins and cryoglobulins should be kept warm during transport (e.g., by holding the tube in hand or using a 37°C heat block).
- Protection from light: Samples for bilirubin, carotene, erythrocyte protoporphyrin, vitamin A and vitamin B12 should be kept protected from light by wrapping them in an aluminum foil.
- Chain of custody: When any laboratory sample is linked to a crime investigation or is used as evidence in legal proceedings (e.g., blood alcohol, drug testing, DNA testing), it should have a special documentation protocol called the chain of custody. A special form accompanies the specimen from collection to reporting of result. This form bears the name and signature of the person being investigated, a witness if necessary, and all those who handle it.
- Blood gas and ionized calcium analysis: Due to sensitivity of pH, pCO2, and pO2 to time, temperature, and handling, all blood gas specimens are immediately measured and not stored. Collection in a heparinized syringe is recommended for blood gas analysis. Plastic syringe is recommended if testing can be done within 30 minutes of collection; if more time is anticipated, then a glass syringe should be used and specimen is stored in ice.
Identification of Specimen
Specimen and requisition form should be checked for proper identification information before receiving them in the laboratory.
Handling and Processing of Specimen at the Site of Testing
Some analytes are unstable in unprocessed serum and plasma. If laboratory tests are to be performed on serum or plasma, serum or plasma should be separated from blood as soon as possible or within two hours of blood collection. Prolonged contact of serum/plasma with blood cells can cause exchange of compounds between them, affecting concentrations of analytes like potassium, glucose, lactic acid, etc. Before centrifugation, blood should be allowed to clot for sufficient time to separate serum (to avoid fibrin strands in the serum). For separation of plasma, tubes should be centrifuged within 2 hours of collection. During centrifugation, tubes should be kept capped to avoid loss of CO2, increase in pH (causing inaccurate results for pH and CO2), evaporation, or aerosol formation.
Primary tubes should not be recentrifuged, since this will cause hemolysis; if recentrifugation is needed, then serum or plasma should be transferred to another tube and then centrifuged. Sample from primary tube is transferred to secondary containers by aspiration and not by pouring.
Serum or plasma which is separated can be kept at room temperature for 8 hours, at 2–8°C for 48 hours, or can be kept frozen at −20°C for longer duration. Repeated freezing and thawing should be avoided. Whole blood sample should never be frozen.
Reasons for Specimen Rejection
- Specimen not labeled or incorrectly labeled: Error in patient identification can lead to administration of a wrong blood product with fatal transfusion reaction.
- Inadequate amount of sample, e.g., partially filled anticoagulated sample tube will cause inaccurate result due to excess amount of anticoagulant.
- Hemolyzed sample: Hemolysis is often due to collection of blood with a small bore needle and forcing the blood into the container with the needle attached. It causes false elevation of potassium (since potassium is the major intracellular cation) and of lactate dehydrogenase (present in red cells). Plasma potassium and lactate dehydrogenase are the most sensitive indicators of hemolysis. Hemolysis interferes with results of almost all laboratory tests, especially colorimetric assays.
- Lipemic sample: A lipemic sample is often obtained after a fatty meal and significantly alters serum triglyceride levels. To avoid interference from diet-derived triglycerides, patient must be fasting.
- Clots in sample cause spuriously low blood cell counts.
- Specimen collected in a wrong tube.
- Specimen collected in an outdated tube.
- Specimen collected at the wrong time.
- Exposure of specimen to light can affect levels of bilirubin, porphyrins, beta-carotene, and vitamins A and B6; such specimens should be protected from light by aluminum foil or an amber container.
- Exposure of sample to extremes of temperature.
- Delay in transit of sample.
Physiological Variables
Diet
- Ingestion of food with high fat content (e.g., butter, cheese): This increases triglycerides. Increased lipid levels in blood cause serum or plasma to turn milky or turbid which will interfere with other tests.
- High protein diet increases blood urea and ammonia.
- Chronic alcoholism: Alcohol abuse is associated with fasting hypoglycemia, hypertriglyceridemia, increased aspartate aminotransferase, increased gamma glutamyl transferase, sideroblastic anemia, folate deficiency and macrocytosis of alcoholism.
- Ingestion of meat, fish, iron, and horseradish produce false positive occult blood test on stools.
- Ingestion of beverages containing caffeine increases cortisol and ACTH levels.
- Drinking excess water and fluids decreases hemoglobin and alters electrolytes.
Specimens for lipids (triglycerides, cholesterol), glucose, gastrin, and insulin should be collected in a basal state after an overnight 8–12 hour fast (fasting sample).
Diurnal Variation
Some analytes show diurnal (occurring daily) variation in their concentration.
- Higher levels in the morning: Cortisol, adrenocorticotropic hormone, aldosterone, renin, thyroid stimulating hormone, potassium, bilirubin, hemoglobin, red blood cells, insulin, iron
- Lower levels in the morning: Creatinine, glucose, growth hormone, triglycerides, phosphates.
Treatment errors can occur if samples for analytes that exhibit diurnal variation are not drawn at the appropriate time.
Timing of Collection
Apart from diurnal variation, timing of collection is also important, especially for monitoring of therapy, e.g., for monitoring of heparin therapy, sample for activated partial thromboplastin time should be collected 6 hours after the last dose. Activity of glucose-6-phosphate dehydrogenase should not be measured immediately following an acute hemolytic episode.
Exercise
Moderate amount of exercise can increase blood levels of lactate, creatine phosphokinase, aspartate aminotransferase, and lactate dehydrogenase. Exercise also activates coagulation, fibrinolysis and platelets.
Gender
After sexual maturity, women generally have lower levels of hemoglobin, iron, ferritin, serum creatinine, albumin and calcium as compared to men. Differences in iron, ferritin, and hemoglobin are due to menstrual losses in women and stimulating effect of testosterone on erythropoiesis in men.
Age
- Newborns have higher mean corpuscular volume (MCV) and hemoglobin (mostly HbF). Increased hemoglobin in newborns is due to high concentration of HbF which shifts oxygen dissociation curve to the left with less release of oxygen to the tissues; this stimulates erythropoietin production with consequent increase in erythropoiesis and hemoglobin synthesis.
- Serum bilirubin rises after birth and peaks at about 5 days (physiologic jaundice).
- Newborns do not have naturally-occurring IgM ABO antibodies and blood grouping is done by forward grouping.
- IgG antibodies in the newborns are maternally derived.
- Infants have lower glucose values.
- Due to skeletal and muscle development, children have higher serum alkaline phosphatase and creatinine levels. They also have higher serum phosphate levels than adults because phosphate is required for internalization of calcium into bones for mineralization.
- Lower hemoglobin concentration in children is because the increased concentration of phosphates enhances synthesis of 2,3-bisphosphoglycerate (2,3-BPG); this in turn causes a right shift of the oxygen dissociation curve and increased delivery of oxygen to the tissues.
- In adults, total cholesterol and triglycerides increase by 2 mg/dL per year until middle age.
- Creatinine clearance gradually decreases with advancing age. As glomerular filtration rate and creatinine clearance are lower in the elderly, there is a risk of renal damage with certain drugs if usual doses are used.
- Naturally-occurring ABO IgM antibodies are often reduced in the elderly.
- Plasma urea concentration is higher in the elderly.
Altitude
At higher altitude, due to lower oxygen tension, red cell count, hemoglobin and hematocrit are higher.
Stress
Emotional or physical stress can cause elevation of adrenocorticotropic hormone, cortisol, catecholamines, and WBC count.
Posture
During phlebotomy, posture of the patient can influence the laboratory test result. Upright posture increases concentrations of total proteins, albumin, calcium, bilirubin, cholesterol, renin, aldosterone, catecholamines, and triglycerides.
Pregnancy
Pregnancy causes a number of changes in body systems.
- Due to expansion of plasma volume relative to red cell mass, hemoglobin levels are lower (dilutional effect).
- Glomerular filtration rate and creatinine clearance are higher due to increased plasma volume.
- Serum alkaline phosphatase is higher due to synthesis by placenta.
- High level of serum human placental lactogen reduces sensitivity of peripheral tissue to insulin (insulin antagonist) with subsequent glucose intolerance.
- Serum thyroxine and cortisol are increased due to increased synthesis of corresponding binding proteins in liver.
- High progesterone level stimulates respiratory center and causes respiratory alkalosis of pregnancy.
ANALYTICAL PHASE
Analytical factors affect the actual test procedure and include:
- Maintenance and calibration of instruments
- Use of standards and controls during the test procedure to validate the test reagents/kits
- Quality control to ensure proper working of test methods
- Techniques and other factors during the test procedure like aliquoting, pipetting, dilution, timing, stability of reagents, laboratory water
- Interference by substances like dirt or buildup of proteins in the sampling probe.
POSTANALYTICAL PHASE
Postanalytical errors occur during calculations to derive the result, recording and reporting of results, and notification of the test result to the clinician and patient. An example is a correctly performed laboratory test whose result is recorded and reported for the wrong patient.
GAUSSIAN DISTRIBUTION AND REFERENCE RANGES
Values of many tests follow a normal or Gaussian distribution (i.e., there are equal numbers of results above and below the mean value) to produce a bell-shaped curve. In a normal distribution, values of mean, median, and mode are the same. Mean refers to the average of all values, median refers to the value at the center or mid-point of distribution, while the mode is the most frequently observed value. If the distribution of values is skewed or non-Gaussian, values of mean, median and mode are different. Other features of Gaussian distribution are: (1) mean ± 1SD (standard deviation) incorporates 68% of all values, (2) mean ± 2SD incorporates 95% of all values, and (3) mean ± 3SD incorporates 99.7% of all values (Fig. 1.2). Standard deviation is the average deviation of an individual value from the mean value in a normal population. Tests with high precision (e.g., serum electrolytes) have a low SD, while tests with low precision (e.g., serum enzymes) have a high SD. Thus, the smaller the SD, the better the precision of the test. Standard deviation is calculated from the following formula:
SD = √[Σ(x₁ − x̄)² / (n − 1)]
Where, Σ = summation; x₁ = individual value; x̄ = mean; and n = total number of values.
The reference range is established by measuring the value of an analyte in a large number of normal subjects (at least 100) and the reference range is derived as mean ± 2SD. Addition of 2SD to the mean establishes the upper normal level or cut-off, while subtraction of 2SD from the mean establishes the lower normal level or cut-off. The normal reference range comprises 95% of all the values obtained in the population studied. (There is overlap of values at the higher end of normal and the lower end of abnormal.)
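A minimal Python sketch, using hypothetical values from healthy subjects (a real reference range would require at least 100 subjects, as noted above), illustrates this calculation:

```python
from statistics import mean, stdev

# Hypothetical fasting glucose values (mg/dL) from healthy subjects;
# only 20 values are shown to keep the example short.
values = [88, 92, 85, 90, 95, 78, 99, 84, 91, 87,
          93, 82, 96, 89, 86, 94, 81, 90, 88, 92]

m = mean(values)
sd = stdev(values)                      # sample standard deviation (n - 1)
lower, upper = m - 2 * sd, m + 2 * sd   # mean ± 2SD covers ~95% of values

print(f"mean = {m:.1f} mg/dL, SD = {sd:.1f} mg/dL")
print(f"reference range (mean ± 2SD): {lower:.1f}-{upper:.1f} mg/dL")
```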
Reference range may vary according to age, sex, and ethnicity of an individual.
Most of the analytes follow Gaussian or normal distribution pattern; some analytes like serum cholesterol and serum triglycerides have a non-Gaussian or skewed pattern.
INTERPRETATION OF LABORATORY TESTS
Laboratory medicine is diverse and includes an enormous number of chemical, hematological, immunological, genetic and other tests; however, principles that guide the choice and interpretation of laboratory tests are similar in all subspecialties. Correct selection of clinical laboratory tests and their proper interpretation require knowledge about uses, advantages, and shortcomings of clinical laboratory tests. Incorrect interpretation of a test result due to lack of this knowledge can lead to serious mistakes, increase in healthcare costs, and patient morbidity and sometimes mortality. Certain basic principles about interpretation of laboratory tests are outlined below.
- Laboratory tests are used for screening, diagnosis, monitoring of disease activity or of therapy, and assessment of risk factors or of prognosis. Therefore, not all laboratory tests are advised for the same reason. Common examples are given below.
- Tests used for diagnosis: Blood glucose for diagnosis of diabetes, blood smear for malaria parasite, factor VIII assay for diagnosis of hemophilia A, D-dimer test for diagnosis of disseminated intravascular coagulation.
- Tests used for monitoring of disease activity or therapy: Glycosylated hemoglobin in diabetes mellitus to assess long-term control, blood glucose to monitor insulin therapy in diabetes mellitus, prothrombin time to monitor oral anticoagulant therapy, tumor markers for monitoring of cancer treatment.
- Tests used for assessment of risk: Lipid levels in cardiovascular disease.
- Sensitivity and specificity of laboratory tests should be considered before ordering them. To increase the possibility of detection, a laboratory test with the highest sensitivity should be used for screening a population. If the screening test is positive, a laboratory test with a high specificity should be used for confirmation.
- Various preanalytical variables affect the results of laboratory tests (see earlier).
- Laboratory test result must be interpreted in the light of the clinical condition. An abnormal test result may not mean the same diagnosis in two different patients. For example, a prolonged APTT and a normal PT result in a female with recurrent abortions suggests the diagnosis of antiphospholipid antibody syndrome, while the same result in a male with bleeding tendencies suggests the possibility of hemophilia.
- Reference ranges of laboratory results are established by each laboratory and may vary according to age, sex, and race. Values of some analytes, like von Willebrand factor, depend on blood group (the value in blood group AB is higher than in blood group O). The established reference range refers to 95% of the values obtained in the population studied; therefore, 2.5% of the results obtained will be below and 2.5% will be above the reference range. Most reference ranges are established in adults and may not apply to children; in such cases, specialized textbooks should be consulted for expected results. Laboratory tests that directly measure an analyte (e.g., blood glucose, serum creatinine) have more consistent reference ranges across laboratories. Reference ranges of indirect or functional tests (e.g., prothrombin time) vary widely among laboratories.
- Highly abnormal and unexpected results should be interpreted with caution. Although highly abnormal result may prove to be correct after reviewing clinical data and results of other investigations, possibility of a spurious laboratory result due to clerical errors, improper sample collection, and various preanalytical variables must be considered before taking management decisions.
- Laboratory test results are often reported by different laboratories in different units (conventional or SI) and may appear abnormal, e.g., normal serum calcium in conventional units is 8.5–10.5 mg/dL, while in SI units it is 2.1–2.6 mmol/L; therefore, before interpreting, the concerned clinician must determine the unit of reporting (a unit-conversion sketch is given after this list).
- The laboratory test ordered should be clear (as many tests have similar short forms or similar names) and self-explanatory and should not create confusion or misinterpretation in the laboratory. Examples include C-reactive protein (an acute phase protein) vs. protein C assay (a natural anticoagulant), and GTT (glucose tolerance test) vs. GGT (gamma-glutamyl transferase).
- In neonates, infants and children, as the blood volume is small, repeated blood collections for laboratory testing can induce iatrogenic anemia.
- Critical values (also called as action values or callback values) are those laboratory results which indicate potentially life-threatening situations and are immediately notified by the laboratory to the concerned clinician since they need urgent clinical intervention. Each institute should have its own set of critical values and physician notification policy. Appendix III lists some critical values.
- Diagnosis should not be based on a single abnormal test result, since it may be spurious from improper blood collection or laboratory variability; a particular trend in test results from successive samples is more important.
- It should be attempted to attribute all the abnormal laboratory test results to a single cause, especially in patients younger than 60 years (Osler's rule). Only if this is not possible should multiple diagnoses be considered.
- If serial results performed on different occasions on the same patient differ by more than 2.8 times the analytical standard deviation, change in the patient's condition is more likely than laboratory imprecision.
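The following Python sketch illustrates the unit conversion mentioned in the list above; it assumes the molar mass of calcium (about 40.08 g/mol) as the conversion factor:

```python
# Converting serum calcium from conventional units (mg/dL) to SI units (mmol/L):
# mmol/L = (mg/dL x 10) / molar mass, using ~40.08 g/mol for calcium.
def calcium_mg_dl_to_mmol_l(value_mg_dl: float) -> float:
    return value_mg_dl * 10 / 40.08

for mg_dl in (8.5, 10.5):
    print(f"{mg_dl} mg/dL = {calcium_mg_dl_to_mmol_l(mg_dl):.2f} mmol/L")
# 8.5 mg/dL ≈ 2.12 mmol/L and 10.5 mg/dL ≈ 2.62 mmol/L, matching the
# SI reference range quoted above (2.1-2.6 mmol/L).
```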
TEST PERFORMANCE SPECIFICATIONS
Before a laboratory test is used routinely, its reliability needs to be established. Four indicators are commonly used for this purpose: accuracy, precision, sensitivity, and specificity.
Accuracy (trueness): This is the ability of the test to measure what it claims to measure; it is also defined as how close a measurement is to the true value. Accuracy of a particular test can be determined from the control sample (often provided by the manufacturer of the test kit), the value of which is known.
Precision (reproducibility): This is the ability of the test to reproduce the same result (or results that are close to one another) when the same sample is tested multiple times. The results may not be close to the true value. An ideal test is both precise and accurate, but a precise test may not always be accurate.
Accuracy and precision of a laboratory test are represented graphically on a dartboard configuration with bull's eye in the center in Figure 1.3.
Sensitivity: This is the ability of the test to identify correctly the presence of a disease, i.e., it produces a true positive result. A highly sensitive test produces few false-negative results. High sensitivity is desirable in screening tests since a test with a high sensitivity can detect a low concentration of an analyte. Sensitivity of a test can be calculated by the following formula:
Sensitivity (%) = [True positives / (True positives + False negatives)] × 100
A positive result with a 100% sensitive test (i.e., a test with no false negative results) includes all individuals with disease, while a normal test result excludes disease. However, a positive result can be a true positive or a false positive and does not confirm presence of disease.
Specificity: This is the ability of the test to identify correctly the absence of a disease, i.e., it produces a true negative result. A highly specific test produces few false-positive results. High specificity is desirable in confirmatory tests because a test with high specificity measures only the analyte it is supposed to measure and not other related substances (i.e., there is no cross-reactivity). Specificity of a test can be calculated by the following formula:
Specificity (%) = [True negatives / (True negatives + False positives)] × 100
A positive result with a 100% specific test (i.e., a test with no false positive results) confirms disease, while a normal or negative test result does not exclude the disease since it can be a true negative or a false negative.
The ideal test is 100% specific (negative in all individuals without disease) and 100% sensitive (i.e., positive in all individuals with disease). However, a test cannot be 100% sensitive or specific due to some overlap of values obtained in normal individuals and in individuals with disease. Therefore, in any test, few abnormal values are generated in healthy individuals (false positives) and few normal values are generated in individuals with disease (false negatives). False positive result may mislead the clinician and lead to unnecessary investigations and treatment. False negative result may lead to missing or delaying diagnosis.
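A minimal Python sketch of these two formulas, using hypothetical counts from a 2 × 2 table in which disease status is assumed to be known from an independent reference method:

```python
# Hypothetical counts: 100 diseased and 200 healthy subjects.
TP, FN = 90, 10    # diseased subjects: test positive / test negative (missed)
TN, FP = 180, 20   # healthy subjects: test negative / test positive (false alarm)

sensitivity = TP / (TP + FN) * 100   # ability to detect presence of disease
specificity = TN / (TN + FP) * 100   # ability to detect absence of disease

print(f"sensitivity = {sensitivity:.0f}%")   # 90%
print(f"specificity = {specificity:.0f}%")   # 90%
```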
Relationship between sensitivity and specificity of a test: Sensitivity and specificity move in opposite direction with change in test parameters; increase in sensitivity is associated with decrease in specificity and vice versa. This is because if sensitivity of a test is to be improved (to detect more number of people with the disease), the limits of positivity must be made less stringent. Highly sensitive test will then also yield positive result in people without the disease.
DIAGNOSTIC VALUE OF A TEST
True positive (TP): This is positive test result in a patient who has the disease or condition, i.e., patients with a given disease or condition who are correctly classified by a test to have the disease or condition.
False positive (FP): This is positive test result in a patient who does not have the disease or condition, i.e., patients without a given disease or condition that are classified by a test to have that disease or condition.
True negative (TN): This is negative test result in a patient who does not have the disease or condition, i.e., patients without a disease or condition who are correctly classified by a test as not having that disease or condition.
False negative (FN): This is negative test result in a patient who has the disease or condition, i.e., patients with a disease or condition that are classified by a test as not having that disease or condition.
Diagnostic sensitivity: This is the percentage of population with disease that test positive. It is the ability of a test to detect a disease or condition.
Diagnostic specificity: This is the percentage of population without the disease that test negative. It is the ability of a test to detect the absence of a disease or condition.
Positive predictive value: This is the percentage of time a positive result is correct, i.e., the chance of a person having a particular disease or condition if the test is positive.
Positive predictive value (%) = [True positives / (True positives + False positives)] × 100
Tests with 100% specificity have a positive predictive value of 100%.
Negative predictive value: This is the percentage of time a negative result is correct, i.e., the chance of a person not having a particular disease or condition if the test is negative or within the reference range.
Negative predictive value (%) = [True negatives / (True negatives + False negatives)] × 100
Figs. 1.4A to D: (A) Ideal test: Test with 100% sensitivity (100% negative predictive value) and 100% specificity (100% positive predictive value); (B) Due to overlap of test result values in health and disease, some individuals with disease have result within the reference range (false negative), while some individuals without disease will have result outside the reference range (false positive); (C) If the diagnostic cutoff point is set too high, it will lead to no false positives, but many false negatives. Therefore, specificity and positive predictive value are increased, but sensitivity and negative predictive value are decreased; (D) If the diagnostic cutoff point is set too low, it will lead to no false negatives but many false positives. Therefore, sensitivity and negative predictive value are increased, but specificity and positive predictive value are decreased.

Tests with 100% sensitivity have a negative predictive value of 100%.
A test with 100% sensitivity and 100% negative predictive value is excellent for screening for disease. A test with 100% specificity and 100% positive predictive value is excellent for confirmation of disease. Increasing sensitivity of a test decreases specificity and positive predictive value. Increasing specificity of a test decreases sensitivity and negative predictive value (Figs. 1.4A to D).
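The predictive-value formulas can be illustrated with a short Python sketch; the sensitivity, specificity, and prevalence figures below are hypothetical and simply show why a positive screening result is usually followed by a confirmatory test:

```python
def predictive_values(sensitivity, specificity, prevalence, n=10_000):
    """Compute PPV and NPV (%) for a test applied to a population of size n."""
    diseased = n * prevalence
    healthy = n - diseased
    tp, fn = diseased * sensitivity, diseased * (1 - sensitivity)
    tn, fp = healthy * specificity, healthy * (1 - specificity)
    ppv = tp / (tp + fp) * 100   # chance a positive result is a true positive
    npv = tn / (tn + fn) * 100   # chance a negative result is a true negative
    return ppv, npv

# Same hypothetical test (90% sensitivity, 90% specificity) applied to
# populations with different disease prevalence.
for prevalence in (0.01, 0.10, 0.50):
    ppv, npv = predictive_values(0.90, 0.90, prevalence)
    print(f"prevalence {prevalence:.0%}: PPV = {ppv:.1f}%, NPV = {npv:.1f}%")
# At low prevalence most positive results are false positives (low PPV),
# even though the NPV remains high.
```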
Receiver operating characteristic (ROC) curve: ROC curve analysis is useful for: (1) Identification of optimal cut-off for a diagnostic test, and (2) Comparing two or more laboratory tests to identify the test with the highest discriminatory ability. In ROC curve, sensitivity (or true positives) of a test is plotted on y-axis and false positive rate (1−specificity) is plotted on x-axis, across the entire range of cutoffs for the laboratory test being evaluated (Fig. 1.5). The area under the curve is the efficiency of the test or how frequently the test is able to detect the condition (i.e., discriminatory ability of a test). The higher the area under the curve, higher is the efficiency of the test. An area under the ROC curve of 1.0 indicates a perfect diagnostic test, while a test with area under the curve of 0.5 is of no value (equal to diagnosis by a coin toss). In general, area under the curve of 0.8 indicates a good discriminatory test.
Comparison of two or more diagnostic tests is based on calculating area under the curve. If one ROC curve shows higher area than that of the comparison test, that test has higher sensitivity and specificity at all cutoffs.
Fig. 1.5: Receiver operating characteristic curves for laboratory tests: (A) The dotted line represents test of no value; (B) Test with moderate value; (C) Test with a good discriminatory power.
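A rough Python sketch of how an ROC curve and its area can be computed by sweeping cutoffs over hypothetical test values (real analyses normally use statistical software):

```python
def roc_points(values_diseased, values_healthy):
    """Sweep cutoffs from strictest to most lenient; a result >= cutoff is 'positive'."""
    cutoffs = sorted(set(values_diseased) | set(values_healthy))
    points = [(0.0, 0.0)]
    for c in reversed(cutoffs):
        tpr = sum(v >= c for v in values_diseased) / len(values_diseased)  # sensitivity
        fpr = sum(v >= c for v in values_healthy) / len(values_healthy)    # 1 - specificity
        points.append((fpr, tpr))
    points.append((1.0, 1.0))
    return points

def auc(points):
    """Area under the ROC curve by the trapezoidal rule."""
    area = 0.0
    for (x1, y1), (x2, y2) in zip(points, points[1:]):
        area += (x2 - x1) * (y1 + y2) / 2
    return area

diseased = [7.1, 8.4, 6.9, 9.2, 7.8, 8.8]   # hypothetical analyte values
healthy = [5.2, 6.1, 6.8, 5.9, 7.0, 6.4]

print(f"area under ROC curve ≈ {auc(roc_points(diseased, healthy)):.2f}")
# Closer to 1.0 = better discriminatory ability; 0.5 = no value.
```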
QUALITY ASSURANCE
Quality assessment or quality assurance is a process that assures the highest quality of results by a laboratory by closely monitoring the preanalytical (preceding test performance), analytical (related to the test proper), and postanalytical (after test analysis) phases of testing. Through quality assurance, the quality of laboratory reports is guaranteed so that the right result is produced at the right time, on the right specimen, from the right patient, at the right price, and the interpretation of the result is based on the right reference ranges. Quality assurance encompasses the following:
- Development, implementation, and regular monitoring of Standard Operating Procedures (SOPs) for all the tests and activities of a laboratory.
- Employing appropriately trained and experienced staff.
- Procurement of validated instruments and their proper and regular maintenance and calibration.
- Correct practices in preanalytical phase: Requisition form, patient preparation, patient identification, sample identification, sample collection technique, storage and transport of specimen, storage of testing kits and quality control materials, training of personnel, retention of current package inserts.
- Correct practices in analytical phase: Validated and correctly stored test kit, validated and calibrated instruments, preparation of control and patient samples, exactly following manufacturer's directions, documentation of quality control results, implementation of internal quality control.
- Correct practices in postanalytical phase: Proper documentation of patient results in appropriate units, issue of results in a timely manner, immediate calling of critical values, use of correct reference ranges, reporting of certain results to authorities as per law.
- Monitoring whether test results reach the patient early enough to affect the decision-making process about diagnosis and treatment.
- Participation in an external quality control program.
QUALITY CONTROL
Quality control (QC) is a process of systematic monitoring of analytical processes through the use of control samples to verify accuracy and reproducibility of patient's results. It forms a part of analytical phase of quality assurance. Since the value of the concentration of an analyte in the patient's sample is unknown, reliance is placed on control samples to generate accurate results. A control is a sample that contains the analyte of interest with a known concentration, is physically and chemically similar to patient's sample, and is tested in exactly the same manner. Controls can be obtained commercially or can be prepared in the laboratory. At least two controls (high and low) should be used for quantitative estimation of an analyte; they must be run along with patient samples or at least thrice in a 24-hour period (once in each shift). For a qualitative test, positive and negative controls are included in each run of the test.
Laboratory quality control may be external or internal.
External quality control: This consists of testing a control material not built into the system. An example is participation in a proficiency testing program. In this, a specimen is sent by a government agency or a commercial company to a group of participating laboratories. The specimen is analyzed by each laboratory and the result is reported to the agency, which then evaluates, grades, and compares the result with other laboratories. The result is considered acceptable if it is, (a) within ±2SD of the mean (mean is derived from values submitted by all participating laboratories), (b) within 10% of the target value, or (c) within a fixed deviation from a target value. Quality control between laboratories can thus be monitored.
If proficiency testing program is not available, the laboratory must validate the test result every 6 months by comparing the result with that of a reference laboratory or of another laboratory offering the test (by dividing the sample and sending it for testing).
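A minimal Python sketch of the acceptability criteria listed above, using hypothetical numbers (the criterion actually applied depends on the proficiency testing provider):

```python
def proficiency_result_acceptable(result, group_mean, group_sd, target):
    """Acceptable if within ±2SD of the group mean or within 10% of the target value."""
    within_2sd = abs(result - group_mean) <= 2 * group_sd
    within_10_percent = abs(result - target) <= 0.10 * target
    return within_2sd or within_10_percent

# Hypothetical glucose proficiency specimen (mg/dL)
print(proficiency_result_acceptable(result=104, group_mean=100, group_sd=3, target=100))  # True
print(proficiency_result_acceptable(result=115, group_mean=100, group_sd=3, target=100))  # False
```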
Internal quality control: Results of controls are recorded during daily operation of a laboratory in a Levey-Jennings chart (Fig. 1.6). A Levey-Jennings chart is a graph in which all control values for a test are plotted with respect to the calculated mean and standard deviation for an extended period. If all control values are within the acceptable limit of mean ± 2SD, then all test runs during that period are acceptable.
Each control (low and high) for each test should have a Levey-Jennings chart prepared for each month. Westgard rules (Table 1.3) are used for interpretation of Levey-Jennings chart. If any violation of rules is detected, then the test run is rejected and the problem resolved before resuming testing of patient samples.
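A simplified Python sketch of two of the Westgard rules (the full multirule scheme in Table 1.3 contains more rules); the mean, SD, and control values are hypothetical:

```python
def violates_1_3s(controls, mean, sd):
    """1-3s rule: any single control value beyond mean ± 3SD (suggests random error)."""
    return any(abs(v - mean) > 3 * sd for v in controls)

def violates_2_2s(controls, mean, sd):
    """2-2s rule: two consecutive values beyond mean ± 2SD on the same side (systematic error)."""
    for a, b in zip(controls, controls[1:]):
        if (a - mean > 2 * sd and b - mean > 2 * sd) or \
           (mean - a > 2 * sd and mean - b > 2 * sd):
            return True
    return False

mean, sd = 100.0, 2.0                        # established from previous control runs
controls = [99, 101, 104.5, 104.8, 100, 98]  # hypothetical daily control results

print("1-3s violated:", violates_1_3s(controls, mean, sd))  # False
print("2-2s violated:", violates_2_2s(controls, mean, sd))  # True -> reject the run
```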
Another internal quality control measure is the delta check, which can be incorporated into the computer of the automated analyzer or the laboratory information system. Delta check is the process of comparing a patient's current laboratory values with previous results over a period of time. The basis of the delta check is that a laboratory result in a patient should not deviate significantly from the previous value. If the pre-established limit is exceeded, then it is flagged by the computer (failed delta check); in such a case, it is necessary to determine whether it is due to an alteration in medical condition or a laboratory error. It may also indicate a specimen mix-up.
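A minimal Python sketch of the delta check logic described above; the percentage limit is purely illustrative, since actual limits are analyte-specific and pre-established by each laboratory:

```python
def failed_delta_check(previous, current, limit_percent=20.0):
    """Flag the current result if it differs from the previous result by more
    than the pre-established limit (expressed here as a percentage)."""
    change_percent = abs(current - previous) / previous * 100
    return change_percent > limit_percent

print(failed_delta_check(previous=140, current=145))  # False: within limit
print(failed_delta_check(previous=140, current=90))   # True: review for clinical change,
                                                      # laboratory error, or specimen mix-up
```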
Fig. 1.6: Levey-Jennings chart constructed using mean and standard deviations obtained from control values of a laboratory test. Plot shows results of control values for 15 consecutive days. Normally control results randomly fluctuate above and below the mean (as shown in this plot). Abnormalities that can occur and not shown in the figure include: (1) a single control result lying outside the established limit (mean ±2SD), indicating a random error; (2) five consecutive values on five consecutive days fall on the same side of the mean but remain at a constant level (‘shift’); and (3) five consecutive values on five consecutive days show consistent increase or decrease (‘trend’). A shift or a trend on Levey-Jennings chart indicates a systematic error.
Random and systematic errors: Errors should be detected before reports are issued by the laboratory. Two types of analytical errors can occur during the test procedure: random or systematic.
Random error is an error that does not recur in a regular pattern when a large number of measurements of the same quantity are made under identical conditions. The source of this type of error cannot be definitely detected. Causes of random error include voltage fluctuation, temperature changes, dirty glassware, or interfering substances present in patient sample due to diet or drugs. A random error is indicated by a control result that is significantly different from others on Levey-Jennings chart or violation of 1-3s or R-4s Westgard rules. Random errors are unavoidable.
A systematic error is an error that recurs or remains constant during the course of a number of measurements of the same quantity under identical conditions. This type of error is inherent in the test system or is introduced into the test system. This error can make results consistently higher or lower than their actual value. Causes of systematic error include contamination or deterioration of reagents and mechanical fault in the instrument. A systematic error is indicated by a trend on Levey-Jennings chart or violation of 2-2s, 4-1s or 10-x Westgard rules.
CREDENTIALS OF A LABORATORY
Accreditation refers to the recognition granted by a non-governmental authoritative agency to institutions that meet certain quality requirements. Although voluntary, most healthcare institutions seek accreditation because it enhances the reputation of that institution and also provides the public a way to assess the quality of that institution. In India, accreditation of laboratories is voluntary and the accreditation body is the National Accreditation Board for Testing and Calibration Laboratories (NABL). This agency has signed Mutual Recognition Agreement with the regional body Asia Pacific Laboratory Accreditation Cooperation and with the apex body the International Laboratory Accreditation Cooperation. For accreditation, the institution or a clinical laboratory desiring accreditation invites the accrediting body to inspect its facilities to determine whether the established standards are being met.
The internationally accepted standard for clinical laboratories is ISO 15189.
The term licensure refers to permission granted by state to individuals or organizations to carry out certain professions or businesses. In India, license is required for blood banks, which is issued by Food and Drugs Administration (FDA). A license is mandatory and it is illegal to practice or operate in that state without a license.
POINT OF CARE TESTING
One of the major advances in laboratory science has been point of care testing. Point of care testing (POCT) refers to performing laboratory tests at the site of patient's care, e.g., bedside, in the emergency department, clinics, nursing homes, intensive care units, operation theaters, health fairs, or in the home setting. POCT is also called near-patient testing, bedside testing, off-site testing, and alternate site testing. With POCT, the laboratory test is brought to the patient rather than collecting the patient's sample and sending it to the laboratory for testing. Rapid advances in technology have made implementation and increasing use of POCT possible. The basic aim of POCT is to provide rapid test results where immediate medical therapy is required. These tests are minimally invasive, relatively simple to perform, convenient, require only a small amount of sample, and the results are immediately available. Real-time measurement of the patient's sample can be done in a short time so that immediate treatment can begin if required. Although less technical skill is required to perform POC tests, the relevant personnel (e.g., staff from nursing service, surgery, emergency department) are required to be trained by the laboratory staff and their performance needs to be monitored. Regular maintenance and calibration of the instrument, and quality control cross-checks with the main laboratory are necessary.
Although many POC tests are available, the commonly performed POC tests include measurement of blood glucose, glycated hemoglobin, cholesterol, electrolytes, creatinine, cardiac markers, blood gas analysis, pregnancy tests, alcohol, blood hemoglobin, activated clotting time, platelet function testing (e.g., VerifyNow® System that measures response to antiplatelet agent aspirin), thromboelastography, prothrombin time, international normalized ratio (INR) and activated partial thromboplastin time. POC analyzers are small, portable, hand-held devices that are easy-to-use and test can be done on whole blood without anticoagulation (i.e., through skin puncture or syringe venepuncture).
SAFETY IN THE CLINICAL LABORATORY
The environment of clinical laboratory is unique because it poses various types of hazards (biological, chemical, and physical) to the laboratory staff, patients, and to the environment. To avoid these, implementation of a relevant and affordable safety program is essential.
Biological Hazards
Infection can be acquired in the laboratory through skin (from contaminated benches and equipment through cuts and scratches, needlestick injuries), mouth (during mouthpipetting, eating food in the laboratory), respiratory tract (aerosols), or eyes (sample splashing).
To limit occupational exposure to blood-borne pathogens (like human immunodeficiency virus, hepatitis B virus, and hepatitis C virus), following preventive measures are recommended:
- Practicing universal precautions: Universal precautions are a set of precautions released by the Centers for Disease Control and Prevention (CDC), USA to prevent transmission of human immunodeficiency virus, hepatitis B virus, and other blood-borne pathogens when providing first aid or healthcare. The principle behind universal precautions is that blood and certain body fluids (Box 1.3) of all patients should be considered potentially infectious for blood-borne pathogens. Universal precautions include use of protective barrier devices and precautions to prevent injuries caused by needles and other sharp devices. Universal precautions are not meant to replace other measures for infection control like handwashing and other disease-specific isolation precautions.
- Work practice procedures: (i) Use of personal protective equipment like gloves, gowns, laboratory coats, face shields or masks, and safety goggles, (ii) handwashing (with soap and water or alcohol-based handrub) should be performed after contact with patients and laboratory specimens, after completing laboratory work and before leaving laboratory, after removing gloves, before eating and drinking, and before all activities that involve hand contact with mucous membranes or breaks in skin, (iii) use of safety signs and symbols like prohibition signs crossed with red lines (e.g., no smoking, no drinking, no eating, no mouthpipetting), displaying international biohazard sign (Fig. 1.7) on the laboratory door, (iv) use of biological safety cabinets, and (v) no recapping of contaminated needles.
- Housekeeping procedures: (i) Decontamination of infectious material (reusable equipment and work surfaces) by autoclaving, boiling, or use of chemical disinfectants as appropriate, (ii) Proper disposal of laboratory waste according to local regulations. Laboratory waste and contaminated materials need proper separation of waste into color-coded discard bags/containers according to method of disposal and associated hazard.
- HBV vaccination program for laboratory staff
- Training of laboratory staff.
Chemical Hazards
Hazardous chemicals in the laboratory can be categorized as follows:
- Corrosive or caustic chemicals: Can cause destruction of human tissue on inhalation or contact; examples are strong acids and bases like sulphuric acid, nitric acid, glacial acetic acid, hydrochloric acid, potassium hydroxide, and sodium hydroxide; phenol; sodium hypochlorite
- Toxic chemicals: Interfere with metabolic reactions in the body when inhaled, ingested, or absorbed through skin; examples are chemicals containing heavy metals like potassium cyanide, mercury
- Carcinogenic chemicals: Cancer-causing chemicals; examples are benzidine, formaldehyde (probable)
- Mutagenic and teratogenic chemicals: Mutagens induce mutations while teratogens cause defect in embryo development; examples are benzene, lead, mercury, toluene, radioactive substances
- Flammable chemicals: Can cause fire; examples are acetone, diethyl ether, alcohols, xylene
- Oxidizing chemicals: Can cause explosion; examples are perchloric acid and hydrogen peroxide.
Storage information and safety instructions about the chemical are displayed on the label of the container by the manufacturer. The safety diamond (National Fire Protection Association) is one method of labeling chemicals in which a diamond indicates the severity of a particular hazard from least danger (0) to the greatest hazard (4). General safety precautions include proper storage and handling, protection of eyes, skin, clothing, and equipment, avoiding inhalation of fumes, washing affected area immediately, and keeping flammables away from sources of ignition and oxidizers.
Physical Hazards
Electrical equipment, laboratory instruments, and glassware can pose a hazard if incorrectly used.
BIBLIOGRAPHY
- Cheesbrough M. District Laboratory Practice in Tropical Countries. 2nd edition. Cambridge: Cambridge University Press; 2009.
- Gaw A, Murphy MJ, Srivastava R, Cowan RA, O'Reilly DSJ. Clinical Biochemistry: An Illustrated Colour Text. 5th edition. Edinburgh: Churchill Livingstone Elsevier; 2013.
- Goljan EF. Rapid Review Pathology. 3rd edition. Philadelphia: Elsevier Saunders; 2011.
- McPherson RA, Pincus MR. Henry's Clinical Diagnosis and Management by Laboratory Methods. 22nd edition. Philadelphia: Elsevier Saunders; 2011.
- Park YA, Marques MB. Teaching medical students basic principles of laboratory medicine. Clin Lab Med. 2007;27:411-24.
- Westgard JO, Groth T. A multirule Shewhart chart for quality control in clinical chemistry. Clin Chem. 1981;27(3):493-501.