Answer:

I'd say it was when other countries distanced themselves from Germany because of its role in WWI: the United States blamed Germany for what had happened and made a peace treaty with the other countries that excluded Germany.