Immigration plays a huge role in the population of the United States. The U.S. is seen as a place for a fresh start and a chance to begin a new life. This country gives people the opportunity to make their own choices and enjoy their own freedom, so who are we to decide who can and cannot have those rights? Should Americans really have the right to deny another human being the chance to live in this country? America is a wonderful place to live, so the question becomes: should immigration be legalized, meaning that as soon as a foreigner steps on U.S. soil they are considered a U.S. citizen, or should it remain illegal for those who do not go through the naturalization process?
Many people look down on immigration because they see immigrants coming to the United States and "taking" jobs from less fortunate Americans, even though, in all honesty, many of those Americans are not willing to do those jobs anyway; this remains a huge debate. Americans also tend to frown upon immigration because they feel that if immigrants take the time to come to this country, they should take the time to become legal citizens of the U.S., but even if immigrants did so, would Americans' opinions really change? Another problem people see with immigration is the increase in population. Some view this as a serious dilemma, and in some eyes it is, but at the same time immigrants come to this country wanting to start a new life. When you understand that this is largely what immigration means, it becomes clearer why immigrants deserve a fair chance at starting a new life in the U.S. Immigration affects more countries than just one, and the citizens of those countries are surely just as concerned about its effects on their homes. Immigrants may come to this country and take a few American jobs or even some of America's homes, but if those positions are open and no one is willing to fill them, why should someone be denied the right to all of these things just because they...