Health in America


Administering medicine, Montrose Camp first aid station, Pittsburgh, PA, 1921.

Modern nursing began in England when Florence Nightingale organized the nursing corps during the Crimean War in the 1850s. In the U.S., Dorothea Dix organized a corps of female nurses during the Civil War; Walt Whitman also volunteered as a nurse during the war.

U.S. Cadet Army Nurse Corps – World War II recruitment poster.

Today we associate nursing with women, but that has not always been the case. Before medicine emerged as the more prestigious and better-paying profession in the 20th century, both men and women worked as nurses. That changed as men took over the medical profession and women were relegated to nursing.

Nursing gradually became professionalized, and by 1900 hospitals had established more than 400 nursing schools for women. Nurses provided hospitals with a low-wage workforce. Women went into nursing because, like elementary school teaching, it was one of the few professional jobs available to them in the late 19th and early 20th centuries. Even with their nursing degrees, nurses were often treated as subordinates by the male doctors who ran the hospitals. Sarah Dock, an R.N., wrote in the American Journal of Nursing in 1917, "No matter how gifted she may be, she will never become a reliable nurse until she can obey without question."

Extraordinary women, such as Lillian Wald, placed nurses at the forefront of home medical care and social services by establishing the Visiting Nurse Service and the Henry Street Settlement in 1893. These public health nurses had much greater autonomy than private-duty nurses or nurses in hospitals.

American Indian nurse graduates from the Native American Higher Education Initiative program sponsored by the W.K. Kellogg Foundation.

With the rise of modern hospitals after World War II, nurses acquired more support, greater authority and a less hierarchical relationship with doctors. Attitudes about nursing also changed, as nurses joined unions, attained higher levels of education, and demand for their services grew. Feminism had a profound impact on the nursing profession, encouraging women and men to question the gender roles inherent in the doctor-nurse relationship.

La Guardia Community College nursing students celebrate graduation, 2010.
Volunteer nurses’ aides at Freedmen’s Hospital, Washington, DC, 1943.
Three young nurses on horseback, part of the Frontier Nursing Service, provide health care services to rural and remote areas in Kentucky, 1931.
Eastern Kentucky University