
Wild West

Wild West (noun)
Britannica Dictionary definition of WILD WEST

the Wild West

: the western United States in the past, when there were many cowboys, outlaws, etc.

often used before another noun