AP US History
U.S. imperialism refers to the policy and practice of extending the influence and control of the United States over foreign territories and peoples, particularly during the late 19th and early 20th centuries. This era marked a significant transformation in America's role on the global stage, as the United States shifted from a nation focused primarily on continental expansion to one seeking overseas colonies and influence, driven by economic interests, strategic considerations, and a belief in American superiority.