Is the United States of America a Christian Nation?
This question has generated numerous opinions and controversies over the decades. Because the nation was founded on principles that allow the free practice of any religion, without boundaries or interference from the government, the United States can be viewed either as a Christian nation or not. Religious conservatives and much of the American Christian population uphold the idea that this is a Christian nation, raised on Christian values and ideas: values that guarantee equal rights for every citizen, and ideas that maintain fairness and unity among people. The more liberal side of the population takes the contrary view. Over the last few decades, large waves of immigration to the United States have brought recognition of many different religions and beliefs, and as society has grown more liberal and secular, many people have come to believe that this is not a Christian nation.

Beyond the controversy over whether the United States is a Christian nation today, the real question is whether the United States was truly built on Christian beliefs and values. The First Amendment clearly establishes that new religions are acceptable and that no citizen is required to be a Christian. The Constitution's preamble declares the nation's foundation to be "We the People," without invoking any religious belief or creed; in fact, the entire Constitution never once mentions the name of God, Jesus, or any particular deity. The Treaty of Tripoli, negotiated during George Washington's administration and signed under John Adams, is a piece of evidence that many have yet to acknowledge: it lucidly states that the United States is not a Christian nation. Looking at our past history, we can also find many examples that cast doubt on these supposed Christian values and beliefs. One such example concerns the American Indians, who were victims of...