The American Civil War, fought from 1861 to 1865, profoundly changed American society. Its impact was felt in every aspect of American life, from politics and economics to culture and social norms. Following the war, the nation entered a period of reconstruction and reconciliation during which several notable contradictions began to emerge. Although the Civil War ended slavery, it did not end racism or discrimination against African Americans. In the aftermath of the war, many white Americans continued to treat African Americans as second-class citizens, and segregation persisted in many parts of the country. Segregation would continue in the United States for many decades, and racism remains prevalent today, though in different forms. Legislation is still being passed to prevent discrimination against African Americans, especially in the South, where racism has historically been more openly expressed. America has made real progress toward becoming a more diverse and accepting place, but discrimination remains a problem in society that war alone cannot solve. The Civil War also had a profound impact on the American political landscape.
The war entrenched the divide between the two major political parties, the Republicans and the Democrats, who held fundamentally different views on issues such as race, economics, and the power of government. George Washington had warned that a two-party system would tear America apart; he believed that such a divisive way of living would make politics about power rather than the good of the people. This split between the, at the time, largely Democratic South and the more Republican North quickly pushed Americans to become firm in whichever side they chose, often accompanied by ignorance and an unwillingness to hear others out. This is another effect that is still clearly reflected in modern American politics.