Religion in the United States

Religion and Social Change

Religion has historically been an impetus for social change. The translation of sacred texts into everyday, nonscholarly language empowered people to shape their religions. Disagreements between religious groups and instances of religious persecution have led to wars and genocides. The United States is no stranger to religion as an agent of social change. In fact, many of the first European arrivals to what became the United States were compelled to settle there largely by their religious convictions.