What has happened to Christianity in Europe since World War II?

I need a fast, correct answer because it's for a final.

Answer:

Since World War II, Europe has secularized significantly: church attendance and religious affiliation have declined sharply, and developed countries with modern, secular education systems have shifted toward post-Christian, globalized, multicultural, and multifaith societies. Even so, Christianity remains the nominally predominant religion in Western Europe, as well as in Latin America, Canada, and the United States.

I'm not totally sure this is right, but I hope it helps and that you pass (please mark as Brainliest answer).