Apologetics Forum: Ask questions about Christianity / Debate doctrines
Re: America is a Christian country. Are all there good?

Says who? America was first colonized by Christians, but that does not make it a Christian country. The United States of America has no official religion. In fact, the First Amendment to the US Constitution prohibits Congress from establishing a religion or forbidding its free exercise. Jesus gave us this commandment: "love your neighbor as yourself." Therefore, we best demonstrate our relationship with the Lord by loving our neighbors regardless of their faith or lack of faith.
🌈Pride🌈 goeth before Destruction
When 🌈Pride🌈 cometh, then cometh Shame