AP US History
Post-WWI refers to the period following World War I, which ended in 1918, marked by sweeping political, social, and economic change around the world. The era was shaped by the Treaty of Versailles, the rise of new ideologies, and shifting power dynamics that set the stage for later conflicts and social transformations.