Religion has shaped America since its beginning, as is evident in the countless traditions of American society. “Unarguably [it] has been a significant aspect of our society and culture since the beginning of American civilisation.” The Pilgrims traveled to America for freedom of religion, and that sanctuary for free worship is part of what makes America special. Religion teaches nobility and justice, which is how it has contributed to good morals in America. This is why “it is important to recognize that religion and religious movements have a massive impact on our society.” If we do not recognize the importance of religion, it may disappear before we notice, and America would suffer the loss of that guidance: our morals would be skewed, our people lost, and our country corrupt. “When religion is not influential in a society or has ceased to be, the state inherits the entire burden of public morality, crime and intolerance.”
Religion stands against crime, so the more religious a society is, the less immorality tends to exist. Religion is the base of America, its identity. In the beginning, “the founding fathers founded the U.S. on the principles of religion.” It is what makes America the successful country it is. Religion is essential to the
There is proof that “religious influence on politics has [increased] over the years” and has improved politics since the early 1900s. Religious ways were fading during that era, which brought the Temperance Movement into play; the movement spoke of religious revival and of how drunkenness affected families. These religious women helped improve America once they realized the importance of religion in America. It is important for the people of the U.S. to realize this as the Protestant women did 100 years ago. They saw the repercussions and made a move to stop them; we have known about the problem for a while and still make no large move in advocating the importance of religion.