United States of America
proper noun
or United States

Britannica Dictionary definition of UNITED STATES OF AMERICA

the United States of America
or the United States
: country in North America

— American
adjective or noun