Western medicine

Western medicine noun
Learner's definition of WESTERN MEDICINE
[noncount]
: the typical methods of healing or treating disease that are taught in Western medical schools