History of American Business
Gender role theory explains how society shapes the expectations and behaviors associated with being male or female. The theory highlights how traditional roles historically assigned women to domestic duties and men to professional careers, shaping labor force dynamics and women's participation in the workplace.