If the United States took colonies, what would philosophically separate it from the powers that had colonized it, such as Britain? Sure, it was not an empire but rather a democratic federation, yet a nation that colonizes others still practices one form of imperialism or another. The US was also a major proponent of anti-colonial ideas, so if it suddenly started colonizing others, the hypocrisy would be evident.