Generally speaking, imperialism was the practice by which European powers, along with some Asian ones such as Japan, took control of foreign territories in regions such as Africa and Latin America in order to extract natural resources and labor and so enrich their home countries. The resulting competition for colonies fueled nationalism and militarism among the imperial powers, which in turn helped set the stage for World War I.