South

Definition from Wiktionary, the free dictionary
See also: south

English

Proper noun

the South

  1. (US) Those states which formed the Confederacy during the American Civil War.
  2. (US) The southeastern states of the United States, comprising roughly the same states that formed the Confederacy.
  3. The southern part of any region.
