Accurate medical diagnosis is a critical first step for determining individual treatments, as well as for tracking the spread of disease and establishing effective public health strategies. Beginning in the late 19th century, scientists investigating infectious disease developed new diagnostic techniques along with new therapies. The new tools emerged from a growing understanding of the immune system and the role played by antitoxins (antibodies) that the body produced in response to invading organisms or toxins (antigens).
Scientists devised tests to detect the presence of antibodies in blood and employed antigens to provoke an immune response. These tests revealed information about the patient’s disease history, including both ongoing infections and prior exposure to disease. Testing could reveal infections before outward symptoms appeared and helped identify disease “carriers”—individuals who remained symptomless but nonetheless could spread disease. Diagnostic tests continue to influence our understanding of disease and how we define the borders between sickness and health.
Even the most accurate tests are imperfect. Repeat testing and different techniques can give conflicting results. Scientists, doctors, and public health workers employ tests for different purposes. Factors such as cost, speed, and ease of use influence the design of testing methods and devices. Rapid screening tests that can be used in the field are often backed up by more extensive laboratory-based tests. In recent years, the proliferation of point-of-care diagnostic devices has allowed more testing to move out of the laboratory and into the clinic or home.
Each of the following tests has had an enormous influence on how individuals and communities have chosen to control, prevent, and treat disease.
Serological (Blood) Tests for Syphilis
Testing blood serum for the presence of antibodies required specialized tools and techniques for collecting blood samples without introducing contaminants. The Keidel Vacuum Bleeding Tube, introduced around 1915, provided one solution. Each sterile package contained a needle attached via a short rubber tube to a sealed glass vacuum tube. After the needle was inserted into the vein, the seal was broken, allowing blood to be drawn quickly into the glass tube. The sample could then be resealed and sent to the laboratory for testing. The Keidel device was marketed particularly for the Wassermann test—a serological test for syphilis developed by August Paul von Wassermann in 1906. The diagnostic test aided public health departments in their efforts to control the spread of sexually transmitted diseases.
Widal Test for Typhoid
In 1896, French physician and bacteriologist Georges Fernand Isidore Widal introduced a blood test for typhoid that still bears his name. Scientists had observed that cholera bacteria would clump together when injected into animals that had been immunized against the disease. This clumping, called agglutination, resulted from the binding of antibodies in the blood serum to the bacterial antigens. The clumps were easily observed through a microscope. Widal devised a practical diagnostic technique for typhoid based on this phenomenon. He mixed a small sample of the patient’s blood serum with a suspension of typhoid bacilli, then used a microscope to examine a drop of the solution. If the cells clumped together, the patient had typhoid antibodies, which indicated either current infection or prior exposure to the disease.
In 1905, bacteriologist John Borden modified the Widal test in a way that freed practicing physicians from their reliance on the services of bacteriological laboratories. By 1912, the Mulford company was advertising a complete test outfit based on his modifications. The kit included a needle and glass capillary tubes for collecting blood from the earlobe, a bottle of salt solution for diluting the sample, a bottle of killed typhoid bacilli suspension, dropper bottles, test tubes, and a rack. The test required no microscope, as the bacterial clumping was visible to the naked eye, appearing as a small white mass at the bottom of the test tube.
Skin Test for Tuberculosis
In 1890, German bacteriologist Robert Koch introduced tuberculin as a treatment for tuberculosis. Tuberculin was essentially a broth of the tubercle bacillus (the bacterium that causes tuberculosis) that was heated and filtered to remove the infectious organism. Through his investigations, Koch discovered that the substance also had value as a diagnostic. When injected into an infected individual, it provoked a visible local allergic reaction. This reaction could be used as a marker for the disease.
However, tuberculin, as originally produced by Koch, was too impure to make a reliable diagnostic. In the 1930s, American biochemist Florence Barbara Seibert succeeded in isolating the protein (antigen) in tuberculin that elicited the distinctive reaction. By the early 1940s, Seibert’s Purified Protein Derivative (P.P.D.) became the accepted world standard for the tuberculin skin test.
Schick Test for Susceptibility to Diphtheria
The Schick test, developed by Hungarian-born pediatrician Béla Schick in the 1910s, was designed to detect the absence of antibodies. Schick needed a simple technique to determine an individual’s susceptibility to diphtheria before deciding whether the individual would benefit from a dose of preventative serum or vaccine.
To perform the test, a tiny dose of diphtheria toxin was injected into the skin of one forearm, and a dose of inactivated toxin was injected into the other arm to serve as a control. If an individual had no immunity (no antibodies), redness and swelling would develop around the site of the toxin injection. No significant reaction would occur if the individual was immune to diphtheria from prior exposure to the disease.
The Schick test became an important screening tool in the diphtheria vaccination campaigns launched in the 1920s. Public health officials tested whole classrooms of students to determine who had already been exposed to the disease. Those who had not been exposed to diphtheria, and were therefore susceptible, received the new vaccine.
Testing for HIV
In March 1985, the Food and Drug Administration (FDA) approved the first blood test for the detection of antibodies to the human immunodeficiency virus (HIV), the virus that causes AIDS. The test was initially designed for screening the blood supply to stop the spread of AIDS through transfusions. At the time, an estimated 2% of known cases of AIDS had been linked to transfusions of contaminated blood. Blood-collecting centers around the country immediately began using the new test.
Testing individuals for HIV exposure proved much more contentious. Although health officials promised confidentiality, patients feared disclosure and the misuse of test information. The presence of HIV antibodies was not in itself a diagnosis of AIDS; however, in popular understanding the antibody test became the “AIDS test.” Testing led to divisive debates over the best way to protect individual rights and stop the spread of infection.
Diagnostic options have changed dramatically since 1985. In 2012, the FDA approved the first in-home HIV antibody test. The test uses saliva rather than blood, and results are provided in about 20 minutes. Additional laboratory tests are required to confirm diagnosis.