Definition from Wiktionary, the free dictionary
The phrase was first used primarily by Jacksonian Democrats in the 1840s to promote the annexation of much of what is now the Western United States.
- (US) The political doctrine or belief held by the United States of America, particularly during its period of expansion, that the nation was destined to expand westward.
- (US) The political doctrine or belief held by many citizens of the United States of America that their system is superior and that all people would, if given the choice, wish to become Americans.
- The belief that God supports the expansion of the United States of America throughout the entire North American continent.