Rates of death from tuberculosis in the United States decreased from 194 per 100 000 persons in 1900 to 40 per 100 000 persons in 1945, in part because the epidemic of tuberculosis in the western world was running its course and in part because of public health initiatives and improved socioeconomic conditions. In 1945, 63 000 persons died of tuberculosis and 115 000 new cases of the disease were reported. Streptomycin and para-aminosalicylic acid had just been discovered; the discovery of isoniazid followed, in 1952. Sanitarium care, nonsurgical and surgical collapse therapy, and resectional surgery were in widespread use. By the middle of the 1950s, it was evident that bed rest did not add to the benefit produced by effective chemotherapy, and sanitariums began to close, a process that was completed by the 1970s. As mortality and morbidity due to tuberculosis rapidly decreased, the U.S. government decreased funding for tuberculosis, and many states and cities downgraded their tuberculosis control programs.
After 1984, the rate of new cases of tuberculosis, which had decreased to 9.4 per 100 000, began to increase, and focal outbreaks of multidrug-resistant tuberculosis were reported. Noncompliance with drug therapy, homelessness, immigration to the United States from developing countries, and human immunodeficiency virus (HIV) infection were invoked as explanations. With the reinstitution of federal funding, improved case finding and surveillance, and the practice of having patients receive therapy under direct observation, the rate of new cases of tuberculosis decreased to 8.7 per 100 000 in 1995, the lowest rate since national surveillance began in 1953. At the end of the 20th century, however, the worldwide burden of tuberculosis, which is engrafted onto the pandemic of HIV infection, remains enormous: an estimated 7.6 million new cases in developing countries and 400 000 new cases in industrialized nations.