History of public health in the United States
The history of public health in the United States examines the public health roles of the medical and nursing professions; scientific research; municipal sanitation; the agencies of local, state and federal governments; and private philanthropy. It looks at pandemics and epidemics and the responses to them, with special attention to age, gender and race. It covers the main developments from the colonial era to the early 21st century.
At critical points in American history the public health movement focused on different priorities. When epidemics or pandemics took place the movement focused on minimizing the disaster, as well as sponsoring long-term statistical and scientific research into ways to cure or prevent such dangerous diseases as smallpox, malaria, cholera, typhoid fever, hookworm, Spanish flu, polio, HIV/AIDS, and COVID-19. The acceptance of the germ theory of disease in the late 19th century caused a shift in perspective, described by Charles-Edward Amory Winslow as "the great sanitary awakening".[1] Instead of attributing disease to personal failings or God's will, reformers focused on removing threats in the environment. Special emphasis was given to expensive sanitation programs to remove masses of dirt, dung and outhouse waste from the fast-growing cities or (after 1900) mosquitos in rural areas. Public health reformers before 1900 took the lead in expanding the scope, powers and financing of local governments, with New York City and Boston providing the models.
Since the 1880s there has been an emphasis on laboratory science, on training professional medical and nursing personnel to handle public health roles, and on setting up city, state and federal agencies. The 20th century saw efforts to reach out widely to convince citizens to support public health initiatives and replace old folk remedies. Starting in the 1960s popular environmentalism led to an urgency about removing pollutants such as DDT and other harmful chemicals from the water and the air, and from cigarettes.[2][3][4][5] A high priority for social reformers was to obtain federal health insurance despite the strong opposition of the American Medical Association and the insurance industry. After 1970 public health causes were no longer deeply rooted in liberal political movements. Leadership came more from scientists than from social reformers. Activists now focused less on the government and less on infectious disease. They concentrated on chronic illness and the need for individuals to reform their personal behavior, especially to stop smoking and watch their diet, in order to avoid cancer and heart problems.[6][7][8]
Colonial era to 1789
The healthcare system began in the colonial era. Localistic, community-oriented care was typical, with families and neighbors providing assistance to the sick using traditional remedies and herbs. New immigrants to the colonies had high death rates from their exposure to a new disease environment. However, by the second generation death rates were lower than in England because there was much more food and less crowding. Becoming a regular doctor was difficult. Finally, in 1765 the first medical school opened at the College of Philadelphia. That city opened a hospital in 1751; the second one opened in New York City in 1791. By 1775 the 13 colonies had 3,500 to 4,000 regular doctors. About one in ten was formally trained, usually in England or Scotland. They had a clientele among the wealthier classes, but the popular image was one of distrust.[9][10][11]
Colonial death rates and family patterns
The Chesapeake region (Maryland and Virginia) experienced high mortality rates, particularly among new arrivals from England. This high death rate significantly impacted family structures and slowed population growth. Men immigrated far more than women, so there was a persistent shortage of females in the Chesapeake colonies, which further stifled natural population increase and lowered marriage rates for men. Due to the high mortality rates and unbalanced sex ratio, traditional family structures were difficult to maintain. Many families consisted of step-parents, step-children, and half-siblings, creating complex family networks. By contrast, New England had lower death rates and much more family stability, which enabled the patriarchal New Englanders to make long-term plans for acquiring enough land to provide farms for the next generation.[12][13][14]
Smallpox
Smallpox was a recurring epidemic threat, although inoculation was in use by the 1750s. Estimates based on remnant settlements indicate that at least 130,000 people died in the North American smallpox epidemic of 1775–1782.[15][16]
During the Revolution, General George Washington insisted that his soldiers be inoculated, fearing that smallpox might decimate his forces or that the British would use it as a weapon.[17][18]
The New Nation to 1900
Statistics and sanitation
Lemuel Shattuck (1793–1859) of Boston promoted legislation that required a better statewide system for the local registration of vital information on births and deaths. He specified the need for precise details on age, sex, race, and occupation, as well as standard terminology for diseases and causes of death. This law was passed in 1842 and was soon copied by most other states.[19] His proposals greatly expanded the questionnaires used in the Massachusetts state census of 1845. He was a key consultant for the 1850 United States census. He helped convince Congress to fund a much more complex census, and he designed most of the interview forms used by door-to-door canvassers. His 1850 Report on the Sanitary Condition of Massachusetts was farsighted.[20] It explained how to remove the giant mounds of dirt, horse dung, and outhouse waste that were overwhelming the neighborhoods of fast-growing cities.[21] It inspired reforms in many cities that faced the same public health crisis.[22]
Metropolitan Board of Health in New York City
The Metropolitan Board of Health was established in 1866 by the Radical Republicans who controlled the New York state legislature. It became a model for many major cities due to its innovative approach and effectiveness in addressing public health issues. The state government gave the city's Board extensive powers to create, execute, and judge ordinances related to public health. This comprehensive authority allowed for swift and effective action in addressing health crises. The Board's leadership consisted of four police commissioners, the health officer of the Port of New York, and four commissioners appointed by the governor, three of whom were required to be physicians. This diverse makeup ensured a balance of expertise and perspectives. Within weeks of its formation, the Board secured agreements with city butchers to clean up and relocate slaughterhouses, imposed health standards on the milk industry, improved the water supply, and began cleaning city streets. When a cholera epidemic broke out in the spring of 1866, the Board successfully fought it with a stringent health code, house-to-house inspections, disinfectants, and quarantines. This resulted in a significantly lower death toll in New York City compared to other major cities. The Board's formation was preceded by a comprehensive sanitary inspection of New York City, which revealed widespread poor living conditions in the slum districts. This data-driven approach to identifying and addressing public health issues was modeled on Shattuck's statewide work in Massachusetts. It became a standard practice in other cities. Furthermore, the Board recognized the connection between housing, politics, morals, and health, setting a precedent for addressing the social determinants of health.[23][24]
The success of New York City's Metropolitan Board of Health in improving public health conditions and managing disease outbreaks demonstrated the effectiveness of a centralized, empowered health authority. This model was subsequently adopted by other cities and states, shaping the future of public health administration in America.[25][26][27]
Medical education

Many of the early medical schools in the United States were founded by alumni of the University of Edinburgh Medical School in Scotland. The nation's first medical school was opened in 1765 at the College of Philadelphia by John Morgan and William Shippen Jr. It evolved into the University of Pennsylvania's Perelman School of Medicine. In New York City in 1767, Dr. Samuel Bard opened a medical school; in 1814 it became Columbia University's Vagelos College of Physicians and Surgeons. Harvard Medical School opened in 1782; Dartmouth in 1797; Yale in 1810.[28]
According to Kenneth Ludmerer and William G. Rothstein, American medical schools before 1880 had far more weaknesses than strengths. There were no entrance requirements: any young man could sign up, and many schools did not even require a high school diploma. The curriculum was narrow, consisting of only seven courses, and instruction consisted entirely of didactic lectures with little or no practical experience, no laboratories, and no work with patients. Physical facilities were meager, often just a single amphitheater or a rented second floor. Most schools were proprietary, operated for profit by their faculty, who gave most of their attention to their private practices. The standard course consisted of only two four-month terms of lectures. Graduation requirements were minimal, with brief and superficial examinations. The strengths were that the many proprietary schools made a professional career more widely available than the colonial apprenticeship system they replaced, and that the lectures provided more systematic teaching than the apprenticeship model. After 1880 German medical influences modernized the system, with leading schools such as Johns Hopkins, Harvard, Pennsylvania, Columbia and Michigan extending their courses, adding new scientific subjects, and hiring full-time medical scientists with laboratories.[29][30][31]
Hospitals
Hospitals in the 19th century were largely designed for poor people in the larger cities. There were no paying patients. Very small proprietary hospitals were operated by practicing physicians and surgeons to take care of their own paying patients in better facilities than the charity hospitals offered. By the 1840s, the major religious denominations, especially the Catholics and Methodists, began opening hospitals in major cities. The South had small hospitals in its few cities. In the rich plantation areas, slave owners hired physicians to keep their slaves in working shape. In the poor white areas there were few doctors and very few hospitals.[32]
In the 1840s–1880s era, Catholics in Philadelphia founded two hospitals for the Irish and German Catholic immigrants. They depended on revenues from the paying sick, and became important health and welfare institutions in the Catholic community.[33] By 1900 the Catholics had set up hospitals in most major cities. In New York the Dominicans, Franciscans, Sisters of Charity, and other orders set up hospitals to care primarily for their own ethnic group; by the 1920s they were serving everyone in the neighborhood.[34] In smaller cities too the Catholics set up hospitals, such as St. Patrick Hospital in Missoula, Montana. The Sisters of Providence opened it in 1873. It was funded in part by a county contract to care for the poor, and it also operated a day school and a boarding school. The nuns provided nursing care, especially for infectious diseases and traumatic injuries. They also proselytized the patients to attract converts and restore lapsed Catholics to the Church. They built a larger hospital in 1890.[35] Catholic hospitals were largely owned and staffed by orders of nuns (who took vows of poverty), as well as unpaid nursing students. When the population of nuns dropped sharply after the 1960s, the hospitals were sold. The Catholic Hospital Association formed in 1915.[36][37]
The Methodists made medical services a priority from the 1850s. They began opening charitable institutions such as orphanages and old people's homes. In the 1880s, Methodists began opening hospitals which served people of all religious beliefs. By 1895, 13 hospitals were in operation in major cities.[38]
The South
Compared to the North and West, the South always had a warmer climate that fostered disease. It had far fewer cities, and they lagged behind the North in innovation.[39] After the Civil War it was a much more sickly region, lacking in doctors, hospitals, medicine, and all aspects of public health. When a threat of yellow fever appeared, Southern cities imposed temporary quarantines to stop travel from infected areas. The rest of the time there was inaction, and a reluctance to spend on sanitation.[40] Most Southerners were too poor to buy the patent medicines that were so popular elsewhere. Instead there was a heavy reliance on cheap herbal and folk remedies, especially among African Americans and Appalachians.[41][42][43]
Hookworm
The urban–rural dichotomy has a medical dimension. Two major diseases, malaria and hookworm, were historically rural phenomena in warm areas of the South. They were stamped out by large-scale efforts to clean up the environment. Malaria is spread by the bite of particular species of mosquito, and was eradicated by systematically draining pools of stagnant water or spraying with DDT.[44][45]
The Rockefeller Sanitary Commission in 1910 discovered that nearly half the farm people, white and Black, in the poorest parts of the South were infected with hookworms. In the typical victim, hundreds of the worms live hooked to the wall of the small intestine, eat the best food, and leave the victim weak and listless. Hookworm was called the "germ of laziness." Victims were infected by walking barefoot in grassy areas where people had defecated. In the long run outhouses and shoes solved the problem, but the Commission developed a quick cure: the victim drank a special medicine that loosened the worms' grip, then took a strong laxative. Once most residents had done so, the local hookworms were gone. The Commission, headed by Wickliffe Rose, helped state health departments set up eradication crusades that treated 440,000 people in 578 counties in all 11 Southern states, and ended the epidemic.[46][47][48]
The Black South
In the Southern states 1890s to 1930s, Jim Crow virtually dictated inferior medical care for the large, very poor African American minority. There was neglect and racism on the part of white physicians. Black physicians were too few and too poorly trained at their two small schools, Howard University and Meharry Medical College. Likewise nursing standards were subpar, and there were very few all-Black hospitals. The southern progressive movement did initiate reforms that helped somewhat, as did Northern philanthropies, but the whites benefitted more.[49][50][51][52]
The Tuskegee study

The most infamous American episode of bad medical ethics was the Tuskegee syphilis study. It was conducted between 1932 and 1972 by two federal agencies, the United States Public Health Service (PHS) and the Centers for Disease Control and Prevention (CDC), on a group of 399 African American men with syphilis. They were not asked to give permission and were not told their medical condition, and when penicillin became available in the mid 1940s it was deliberately withheld so the researchers could discover what happens to untreated men. As a result, the lives of 100 of the 399 men were cut short; they died of syphilis.[53][54]
In retrospect the Tuskegee experiment caused deep distrust on the part of the African American community, and apparently reduced Black reliance on public health agencies.[55][56][57] One research study in 2018 estimated that the angry negative response caused the average life expectancy at age 45 for all Black men to fall by up to 1.5 years.[58]
Since 1900
Hospitals
In the U.S., the number of hospitals reached 4,400 in 1910, when they provided 420,000 beds.[59] These were operated by city, state and federal agencies, by churches, by stand-alone non-profits, and by for-profit enterprises. All the major denominations built hospitals; the 541 Catholic ones (in 1915) were staffed primarily by unpaid nuns, while the others sometimes had a small cadre of deaconesses as staff.[60] Non-profit hospitals were supplemented by large public hospitals in major cities and research hospitals often affiliated with a medical school. The largest public hospital system in America is the New York City Health and Hospitals Corporation, which includes Bellevue Hospital, the oldest U.S. hospital, affiliated with New York University Medical School.[61][62]
Measles and vaccines
According to the Centers for Disease Control and Prevention:[63]
Before the measles vaccination program started in 1963, an estimated 3 to 4 million people got measles each year in the United States, of which 500,000 were reported. Among reported cases, 400 to 500 died, 48,000 were hospitalized, and 1,000 developed encephalitis (brain swelling) from measles.
Measles affected approximately 3,000 Americans per million until the 1960s. The first effective vaccine appeared in 1963 and was quickly adopted with little controversy. The rate plunged to 13 cases per million by the 1980s, and to about 1 case per million by 2000.[64] In the 21st century occasional measles outbreaks occur locally, usually started by a person returning from a foreign visit. The disease is highly contagious, but with a community vaccination rate of 95% or higher, a local outbreak will quickly end. With lower rates of vaccination, however, measles can continue to spread. There are low vaccination rates in some traditionalistic religious groups, such as some Orthodox Jewish, Amish, Mennonite and Jehovah’s Witnesses communities.[65]
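As a rough, simplified illustration (not drawn from the cited sources), the 95% figure is consistent with the standard herd immunity threshold formula, assuming the commonly cited basic reproduction number for measles of roughly 12 to 18:

$$p_c = 1 - \frac{1}{R_0}, \qquad R_0 \approx 12\text{–}18 \;\Rightarrow\; p_c \approx 92\%\text{–}94\%$$

At vaccination coverage above this threshold, each introduced case infects fewer than one susceptible person on average, so a local outbreak dies out quickly.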
According to a March 2021 poll conducted by The Associated Press/NORC, vaccine skepticism and vaccine hesitancy are more widespread among white evangelicals than among most other blocs of Americans. Among white evangelical Protestants, 40% said they were not likely to get vaccinated against COVID-19. That compares with 25% of all Americans, 28% of white mainline Protestants and 27% of nonwhite Protestants.[66]
1917 Measles in the Army
When the U.S. Army began drafting 4 million soldiers in 1917–1918, 95,000 men who had never been exposed to measles before caught the disease. Of these, 23,000 were hospitalized and 3,206 died. Most of the victims came from rural areas where measles was uncommon. There was simultaneously a parallel epidemic of primary streptococcal pneumonia in soldiers without measles.[67]
Spanish flu pandemic of 1918
The worldwide Spanish flu pandemic of 1918 probably originated in the United States, and had a major impact on all parts of the country, as well as on the US Army's American Expeditionary Forces in France.[68]
In the U.S., about 20 million out of a population of 105 million became infected in the 1918–1919 season, and an estimated 500,000 to 850,000 died (0.5 to 0.8 percent of the U.S. population).[69][70][71] Native American tribes were particularly hard hit. In the Four Corners area, there were 3,293 registered deaths among Native Americans.[72] Entire Inuit and Alaskan Native village communities died in Alaska.[73]
Lack of rural medical care
The Flexner Report of 1910 brought a radical change in medical education. It emphasized the importance of high-quality, university-based, research-oriented medical education. It had the result of closing down most of the small proprietary local schools that had produced doctors for rural America. In 1938, rural counties without a city of 2,500 people had 69 doctors per 100,000 population, while urban counties with cities of 50,000 or more had 174.[74] The result was a growing shortage of physicians in rural areas, especially in the South.[75][76]
Public health nursing
Public health nursing after 1900 offered a new career for professional nurses in addition to private duty work. The role of public health nurse began in Los Angeles in 1898, and by 1924 there were 12,000 public health nurses, half of them in America's 100 largest cities. The average annual salary of public health nurses in larger cities was $1,390. In addition, there were thousands of nurses employed by private agencies handling similar work. Public health nurses supervised health issues in the public and parochial schools, attended to prenatal and infant care, handled communicable diseases such as tuberculosis, and dealt with venereal diseases.[77][78]

Historian Nancy Bristow has argued that the great 1918 flu pandemic contributed to the success of women in the field of nursing. This was due in part to the failure of medical doctors, who were nearly all men, to contain and prevent the illness. Nursing staff, who were nearly all women, celebrated the success of their patients and were less inclined to identify the spread of the disease with their own work.[79]
During the Great Depression in the 1930s, federal relief agencies funded many large-scale public health programs in every state, some of which became permanent. The programs expanded job opportunities for nurses, especially the private duty RNs who had suffered high unemployment rates.[80][81]

A leader was Dr. Sara Josephine Baker, who established many programs to help the poor in New York City keep their infants healthy, leading teams of nurses into the crowded neighborhoods of Hell's Kitchen and teaching mothers how to dress, feed, and bathe their babies.[82]
Native Americans
The federal Office of Indian Affairs (OIA) operated a large-scale field nursing program. Field nurses targeted native women for health education, emphasizing personal hygiene, and infant care and nutrition.[83]
Tuberculosis

In the United States there was a dramatic reduction in what had been the greatest killer, tuberculosis (often called "consumption").[84] Starting in the 1900s, public health campaigns were launched to educate people about the contagion.[85] In later decades, posters, pamphlets and newspapers continued to inform people about the risk of contagion and methods to avoid it, including increasing public awareness about the importance of good hygiene and the avoidance of spitting in public.[86] Improved awareness of good hygiene practices reduced the number of cases, especially in middle-class neighborhoods. Public clinics were set up to improve awareness and provide screenings. This resulted in sharp declines from the 1920s through the 1940s. Thanks to the public health campaigns, as well as the antibiotic streptomycin, available as a powerful cure from 1947, tuberculosis was downgraded to a minor disease in the U.S. by 1960.[87]
Children
Public health programs have significantly improved children's health over the past century through various initiatives and interventions. These programs have addressed key issues such as infant mortality, disease prevention, and access to local healthcare for mothers and their babies. By 1915 child health had become a priority. Progressive Era reformers, state by state, focused on rescuing children under age 10 or 12 from low-wage employment in factories (see Child labor in the United States).[88]
At the national level, the United States Children's Bureau, founded in 1912, played a crucial role in improving children's health. Congress originally gave it a very broad mandate:[89]
The said bureau shall investigate and report ...upon all matters pertaining to the welfare of children and child life among all classes of our people, and shall especially investigate the questions of infant mortality, the birth-rate, orphanage, juvenile courts, desertion, dangerous occupations, accidents and diseases of children, employment, legislation affecting children in the several states and territories.
Its actual work was much more limited. Its major initiatives included the Campaign for Better Babies (1915), to educate mothers, reduce infant mortality, and identify threats to children's health. The Children's Year (1918–1919) promoted child health and welfare, focusing on reducing infant mortality, improving nutrition, and promoting safe recreation.[90]
The Sheppard–Towner Act of 1921 had a significant influence on children's health policies, marking a turning point in public health initiatives for mothers and infants. It set the stage for future federal involvement in maternal and child health care. It set up 3,000 child and maternal health care centers, many in rural areas, and funded millions of home visits by nurses to mothers and their infants. One result was that the infant mortality rate dropped from 76 deaths per 1,000 live births to 68 by 1929.[91]
Title V of the Social Security Act (1935) established a federal-state partnership for maternal and child health services, providing funding for state health departments to implement children's health programs.[92] The 1950s and 1960s saw major efforts to vaccinate children against various diseases, especially polio.[93] In 1971 the measles vaccine (approved in 1963) was combined with new vaccines against mumps (1967) and rubella (1969) into the single MMR vaccine by Dr. Maurice Hilleman.[94]
March of Dimes and eradication of polio
March of Dimes is a nonprofit organization that works to improve the health of mothers and babies.[95] It reaches a mass audience of contributors to fund health care for victims of polio and other diseases, and is a major source of medical research funding.[96] It was founded in 1938, as the National Foundation for Infantile Paralysis, by businessman Basil O'Connor and President Franklin D. Roosevelt, himself a polio victim who used a wheelchair, to combat polio.[97] In the 1940s there were 40,000 new cases every year, and summer programs for children were restricted, especially swimming pools. From 1938 through the approval of the Salk vaccine in 1955, the foundation spent $233 million on polio patient care, which led to more than 80 percent of U.S. polio patients receiving significant foundation aid.[98] After Salk's polio vaccine virtually ended the polio epidemic by 1959, the organization needed a new mission for its 3,100 chapters nationwide and the 80,000 volunteers who had collected billions of dimes. It expanded its focus under Virginia Apgar to the prevention of birth defects and infant mortality.[99] In 2005, as preterm birth emerged as the leading cause of death for children worldwide,[100] research and prevention of premature birth became the organization's primary focus.[101]
The Golden Age of powerful new drugs
In the 1940s penicillin, streptomycin and other powerful antibiotics became available. They were quick, cheap cures for many of the most common and deadly bacterial infections, including tuberculosis and pneumonia. They extended the average human lifespan by 23 years and marked a "Golden Age" of public health.[102] The 1950s and 1960s saw the advent of other powerful drugs: medicines to prevent inflammation in the joints and kidneys; to dilate arteries in the battle against high blood pressure, or constrict blood vessels to combat shock; to regulate heartbeats; and to thin the blood. Professional medicine was now for the first time armed with drugs to cure major diseases. Diseases caused by viruses, however, were still not curable, and a new problem emerged: variants of bacteria that resisted the new drugs.[103][104]
Health insurance and Medicare
The Committee on the Costs of Medical Care proposed new government insurance programs in its 1932 report. It was strongly opposed by the American Medical Association, which blocked all such proposals from presidents Franklin D. Roosevelt and Harry S. Truman.[105][106] However, New Deal legislation, especially the Wagner Act of 1935, greatly strengthened labor unions. Membership grew in the late 1930s and soared during World War II. One of the high priorities for unions was to negotiate health insurance for workers and their families, and take credit for it.[107][108]
In July 1965, under the leadership of President Lyndon Johnson, Congress enacted Medicare under Title XVIII of the Social Security Act to provide government health insurance to people age 65 and older, regardless of income or medical history. Before Medicare was created, approximately 60% of people over the age of 65 had health insurance (as opposed to about 70% of the population younger than that), with coverage often unavailable or unaffordable to many others, because older adults paid more than three times as much for health insurance as younger people.[109]
In 1997 a compromise was reached with private insurance companies, which were given a major role through Medicare Advantage, a part of the Medicare program for retired people. By 2024, 54% of Medicare recipients were enrolled in Medicare Advantage.[110][8]
In 2010 the Obama administration passed the Affordable Care Act, a program to enable wider health insurance coverage for lower-income families. There was a partisan dimension, with Republicans generally opposed, even though their constituencies were increasingly composed of lower-income voters.[111][112][113]
COVID-19 pandemic
The worldwide COVID-19 pandemic of 2020–2022 led to 1.2 million deaths among the 103 million who got sick in the U.S. There was massive economic damage as people stayed home from school, work and entertainment venues.[114][115]
Mental health
Mental health policies in the United States have experienced four major reforms: the American asylum movement led by Dorothea Dix in 1843; the mental hygiene movement inspired by Clifford Beers in 1908; the deinstitutionalization started by Action for Mental Health in 1961; and the community support movement called for by the CMHC Act Amendments of 1975.[116][117][118][119]
Asylum movement
The efforts of Dorothea Dix (1802–1887) were instrumental in shifting societal perceptions of mental health and advocating for humane care. In three years in the mid-1840s she traveled more than 10,000 miles by stagecoach, visiting over 500 almshouses, 300 county jails, 18 state penitentiaries, and an indeterminate number of hospitals.[120] In 1843, she submitted a "Memorial" to the Legislature of Massachusetts, describing the abusive treatment and horrible conditions endured by mentally ill patients in jails, cages, and almshouses: "I proceed, gentlemen, briefly to call your attention to the present state of insane persons confined within this Commonwealth, in cages, closets, cellars, stalls, pens! Chained, naked, beaten with rods, and lashed into obedience...."[121] She made similar studies for other states and reported to their legislatures. Her activism led to 32 new hospitals and a nationwide reform of the asylum system, funded by state governments. Dix's work helped change public attitudes toward mental illness. She convinced state leaders that individuals with mental health conditions deserved humane treatment and that society had a responsibility to care for its most vulnerable members.[122][123]
Mental hygiene movement
In A Mind That Found Itself (1908) Clifford Whittingham Beers described the humiliating treatment he received and the deplorable conditions in the mental hospital.[124] In 1909, the National Committee for Mental Hygiene (NCMH) was founded by Beers and a small group of reform-minded scholars and scientists. Beers explained, “Its chief concern was to humanize the care of the insane: to eradicate the abuses, brutalities and neglect from which the mentally sick have traditionally suffered.”[125] It marked the beginning of the "mental hygiene" movement. The NCMH (later Mental Health America) played a pivotal role in promoting education, prevention, and scientific approaches to mental health care.[126]
World War I catalyzed this idea with an additional emphasis on the impact of maladjustment, which convinced the hygienists that prevention was the only practical approach to mental health issues.[127] However, prevention was not always successful, especially for chronic illness; deplorable conditions in the hospitals became even more prevalent, especially under the pressure of the growing numbers of chronically ill patients and the strains of the Depression.[116]
Deinstitutionalization
In 1961, the Joint Commission on Mental Health published a report called Action for Mental Health, which called for community clinics to take on the burden of prevention and early intervention of mental illness, leaving space in the hospitals for severe and chronic patients. Courts began to rule in favor of patients' own wishes on whether they should be forced into treatment. By 1977, 650 community mental health centers had been built, covering 43 percent of the population and serving 1.9 million individuals a year, and the average length of treatment decreased from 6 months to only 23 days.[128] However, issues still existed. Due to inflation, especially in the 1970s, the community nursing homes received less money to support the care and treatment provided. Fewer than half of the planned centers were created, and new methods did not fully replace the old approaches, so the system never reached its intended treatment capacity.[128] In addition, the community helping system was not fully established to support the patients' housing, vocational opportunities, income supports, and other benefits.[116] Many patients returned to welfare and criminal justice institutions, and more became homeless. The movement of deinstitutionalization was facing great challenges.[129]
Community support movement
After realizing that simply changing the location of mental health care from the state hospitals to nursing homes was insufficient to implement the idea of deinstitutionalization, the National Institute of Mental Health (NIMH) in 1975 created the Community Support Program (CSP) to provide funds for communities to set up comprehensive mental health services and supports to help mentally ill patients integrate successfully into society. The program stressed the importance of supports in addition to medical care, including housing, living expenses, employment, transportation, and education, and set a new national priority for people with serious mental disorders. In addition, Congress enacted the Mental Health Systems Act of 1980 to prioritize services to the mentally ill and emphasize the expansion of services beyond clinical care alone.[130]
Later in the 1980s, under the influence of Congress and the Supreme Court, many programs started to help patients regain their benefits. A new Medicaid service was also established to serve people diagnosed with a "chronic mental illness". People who were temporarily hospitalized were also provided aid and care, and a pre-release program was created to enable people to apply for reinstatement of benefits prior to discharge.[128] Not until 1990, around 35 years after the start of deinstitutionalization, did the first state hospital begin to close. The number of state hospitals dropped by over 40 from around 300 during the 1990s, and finally a Report on Mental Health showed the efficacy of mental health treatment, documenting a range of treatments available for patients to choose from.[130][131]
However, several critics maintain that deinstitutionalization has, from a mental health point of view, been a thoroughgoing failure. The seriously mentally ill are either homeless, or in prison; in either case (especially the latter), they are getting little or no mental health care. This failure is attributed to a number of reasons over which there is some degree of contention, although there is general agreement that community support programs have been ineffective at best, due to a lack of funding.[129]
The 2011 National Prevention Strategy included mental and emotional well-being, with recommendations including better parenting and early intervention programs, which increase the likelihood of prevention programs being included in future US mental health policies.[132][page needed] The NIMH is researching only suicide and HIV/AIDS prevention, but the National Prevention Strategy could lead to it focusing more broadly on longitudinal prevention studies.[133][failed verification]
In 2013, United States Representative Tim Murphy introduced the Helping Families in Mental Health Crisis Act, HR2646. The bipartisan bill went through substantial revision and was reintroduced in 2015 by Murphy and Congresswoman Eddie Bernice Johnson. In November 2015, it passed the Health Subcommittee by an 18–12 vote.[134]
Life expectancy
Life expectancy in the United States has shown a remarkable increase over the past century, with a few small fluctuations. In 1900, life expectancy at birth was approximately 47 years. This figure rose steadily, reaching about 69 years by 1950, 72 in 1975, and 77 in 2000. In 2023 it reached 78.4 years: 75.8 years for males and 81.1 years for females.[135]
Causes and cures
Nationwide, multiple factors influenced life expectancy at birth:[136]
- Infant mortality: Early 20th century rates were largely shaped by high infant mortality. In 1900 about 10% of newborns died; in some cities as many as 30% did.[137][138][139]
- Infectious diseases: The death rate from infectious diseases (especially tuberculosis, influenza and pneumonia) fell by 90% from 1900 to 1950. By the late 1940s, penicillin was the major drug in use.[140]
- Chronic diseases: As infectious disease mortality declined, cardiovascular disease and cancer became leading causes of death.[141]
- Race: In 1900 life expectancy at birth was 47.6 years for white babies and 33.0 years for Black babies; in 1970 the figures were 71.7 and 65.3.[142][143] As of 2021, life expectancy at birth varies significantly by race and ethnicity:[144]
- Asian Americans: 84 years
- Hispanic Americans: 78 years
- White Americans: 76 years
- Black Americans: 71 years
- Native Americans: 65 years

- Public health measures: Improvements in sanitation, nutrition, medical care, drugs, technology, and awareness of risk have contributed to the overall increase in life expectancy.[145]
Mortality rates in major cities fell sharply from 1900 to the 1930s. According to Cutler and Miller, the main factor was cleaning up the water supply through filtration and chlorination. They estimate that this accounted for nearly half of the total mortality reduction. The greatest impact was on young people: clean water accounted for about three-quarters of the reduction in infant mortality and nearly two-thirds of the reduction in child mortality.[146][147]
Tobacco
- Smoking: Smokers typically start in their teenage years, and the effects on their death rates appear decades later. After 1920 the dramatic rise in cigarette smoking contributed to increased mortality from lung cancer and from strokes and heart attacks caused by cardiovascular disease. By the late 20th century smoking had sharply declined among better educated individuals.[148] However, in the early 21st century vaping with electronic cigarettes became popular among teenagers.[149][150]
See also
- Healthcare in the United States
- Mental health
- History of hospitals
- United States Public Health Service
- United States Department of Health and Human Services
- Health department#United States Health Departments, by state
- History of public health in New York City
- History of public health in Chicago
- Baltimore City Health Department, includes history
- American Public Health Association, for professionals
- History of public health in the United Kingdom
- History of public health in Canada
- Demographic history of the United States
- Race and health in the United States
- List of epidemics and pandemics
- History of medicine in the United States
- History of nursing in the United States
- Leaders
- Dorothea Dix (1802–1887), upgraded insane asylums
- Sara Josephine Baker (1873–1945), public health physician
- Thomas Parran (surgeon general) (1892–1968)
- Basil O'Connor (1892–1972), head of the American Red Cross and founder of the March of Dimes
- Virginia Apgar (1909–1974), infant mortality
- Jonas Salk (1914–1995), polio vaccine
Notes
Further reading
External links