Definition from Wiktionary, the free dictionary
See also: south
- (US) The states that formed the Confederacy during the American Civil War.
- (US) The southeastern states of the United States, comprising roughly the same states as the former Confederacy.
- The southern part of any region.