The Home Front

The impact of the war on the United States was nowhere near as devastating as it was in Europe and the Pacific, where the battles were waged, but it still profoundly changed everyday life for all Americans. On the positive side, the war effort finally and definitively ended the economic depression that had plagued the country since 1929. It also called upon Americans to unite behind the war effort and give of their money, their time, and their energy, as they sacrificed at home to ensure success abroad. The upheaval caused by white men leaving for war meant that for many disenfranchised groups, such as women and African Americans, there were new opportunities in employment and wage earning. Still, fear and racism drove cracks in the nation's unified facade.