West Indies

West Indies /ˈɪndiz/ proper noun
Learner's definition of WEST INDIES

the West Indies

: islands between southeastern North America and northern South America

— West Indian adjective or noun