The United States has acted as an empire since it became independent, because it has consistently sought to expand its territory. An empire is a nation that controls other territories and works to enlarge the holdings it already possesses.
Since independence, the U.S. has steadily added territory by going to war with Mexico and Spain and by purchasing land from France and Russia. These acquisitions brought the country to the size it is today.
In conclusion, the U.S. has behaved as an empire for most of its history.