What does the term "Manifest Destiny" refer to in American history?
The term "Manifest Destiny" refers to the 19th-century belief that the expansion of the United States across the North American continent was both justified and inevitable. This ideology emerged in the 1840s and was used to justify westward expansion into territories inhabited by Native Americans and claimed by other nations, most notably Mexico. Proponents of Manifest Destiny believed it was America's divine right to spread democracy and civilization. The concept played a key role in events such as the annexation of Texas and the Mexican-American War, significantly shaping the geographical and political landscape of the United States while contributing to conflicts over slavery and indigenous rights.