American Women in the 1920s
The 1920s was a decade full of challenges, opportunities, and new outlooks on the world for American women. They experienced drastic changes in politics, education, and even within their own homes. The "new women" were independent, confident, and no longer afraid to fight for their rights. Being isolated at home, marrying, and raising children was no longer the only option, and many women chose different life paths, whether that meant pursuing a career, getting involved in politics, or joining the feminist movement. The twenties was also a period of carefree fun and casual relationships for many women, as society's view of what was appropriate slowly changed. However you look at it, the 1920s was a decade that improved women's lives in terms of both political and personal freedom.

Throughout American history, women had been excluded from politics and from decision-making in society, but that began to change slowly as the United States approached the Roaring Twenties. Women in the nineteenth century were regarded as nothing more than housewives and mothers. They did not have the right to vote, they could not express their opinions publicly, and they were largely dependent on men. Isabel Conesa wrote, "Women were expected to have children, keep house, provide emotional support for their husbands, and in other ways, contribute to American society. However, during the twenties, those demands came to seem less and less compatible" (Role of Women in the Roaring Twenties, p. 1). The only authority women possessed in the decades before the twenties was over their children and matters connected to the home. From adolescence on, young girls were told that the desires of their fathers, sons, and husbands were far more important than their own. They were taught an ethic of service in which family should always be their first priority. A girl in the nineteenth century would spend her childhood under a...