West Germany

Definition from Wiktionary, the free dictionary

English

Proper noun

West Germany

  1. A former country in Europe (1949–1990), now part of Germany; officially called the Federal Republic of Germany (FRG).

Translations

See also