United States of America

United States of America proper noun
or United States
Learner's definition of UNITED STATES OF AMERICA

the United States of America

or the United States
: country in North America

— American adjective or noun