American Century
From Wiktionary, the free dictionary
English
Proper noun
the American Century
- The period since the middle of the 20th century, regarded as one largely dominated by the United States in political, economic, and cultural terms.