Wild West


English

Proper noun

Wild West

  1. The western United States during the 19th-century era of settlement, commonly believed to be lawless and unruly.
  2. (by extension) A place or situation in which disorderly behavior prevails, especially due to a lack of regulatory oversight or an inadequate legal system.
    The CEO commented that the Russian business environment of the 1990s was the Wild West.
