France began colonizing the Americas in the 16th century, building a colonial empire in the Western Hemisphere. It founded colonies across much of eastern North America, on a number of Caribbean islands, and in South America.
France's primary interest in colonizing North America was the abundant fur trade it found there; it also sought to exploit the fisheries and to spread Christianity.