United States of America

proper noun
or United States
Learner's definition of UNITED STATES OF AMERICA

the United States of America

or the United States
: country in North America

— American adjective or noun