AP US History
Women's role refers to the social, political, and economic responsibilities and expectations assigned to women throughout history. These roles have evolved significantly across eras, reflecting shifts in societal norms, economic needs, and political movements, and they are often tied to broader developments in American society.