Ok. Down to business.

I've heard people say on the news things like "Keep our borders open" and "America is a country that lets people in freely, even without papers," and so on.

Now, this is coming from the news, the media, etc.

But many Americans who live in the places where illegal immigrants go are tired of them coming in illegally, and they feel frightened, while the people who want open borders often don't even live close to the illegal immigrants.

So, my question is,

Do you think it's right for America to close its borders and keep our citizens safe?
Or do you think we shouldn't close our borders, leaving our country open to people who don't come in legally and could possibly bring something new with them, like a parasitic insect or a disease?


NOTE: I DON'T WANT TO START ANY ARGUMENTS, I'M JUST ASKING BECAUSE I NEVER GET TO HEAR PEOPLE'S THOUGHTS ABOUT OPEN OR CLOSED BORDERS.

Thank you.