Women and the War: How WWII Changed American Society
WWII changed the lives of Depression-weary Americans across the country. After the nation had suffered through the Great Depression, entering the war was the last thing on Americans' minds. But as the war brewed overseas and drew closer and closer, it became inevitable that America would have to get involved. As with most wars, WWII opened new opportunities for women. Their duties and responsibilities at home expanded while their husbands were at war, their roles in the workforce greatly increased, and for the first time women were accepted into the military in an official capacity. WWII was one of the most important steps toward our present-day state of gender equality because it gave women the opportunity to challenge stereotypes by taking on untraditional roles, such as in factories or the military, and ultimately to show America what they were capable of and why they were more than deserving of equal rights.
The Great Depression had already changed American women’s lives before WWII was even thought of. Women had to learn to live economically on far more meager incomes than they were used to. Most women of the time had husbands, but those husbands were either unemployed or had undergone severe pay cuts. Women learned how to make living in these conditions as comfortable for their families as possible. Eleanor Roosevelt encouraged women to be the force that brought America out of the Depression. In the first book she wrote as First Lady, she stated, “The women know that life must go on and that the needs of life must be met and it is their courage and determination which time and again have pulled us through worse crises than the present one.” Eleanor played a key role in promoting equality for all Americans. Some women had already been working, but only in clerical and domestic service jobs. When the Depression hit, these jobs remained “women’s duties,” which was an advantage, as most...