The United States was impacted by World War I in the following way:
World War I was a major conflict that drew many nations into fighting to defend their allies and to avoid invasion. The resulting demand for weapons and other war materiel spurred heavy production in the United States, which created new employment opportunities for Americans.