American Literature – 1860 to Present
Changing gender roles refer to the evolving expectations and behaviors associated with masculinity and femininity in society. During major historical events such as World War I, these roles shifted dramatically as women took on responsibilities traditionally held by men, working in factories, serving as nurses, and otherwise supporting the war effort. These changes prompted a broad re-evaluation of gender norms.