Answer:
Now that WWII is over, nations should no longer hold colonies and territories; they should return the land to the people who live there. Keeping colonies is simply unfair: with the war over, the colonial powers have no wartime justification for holding this land and are doing nothing with it. There is no point in keeping these territories for themselves.
Explanation: