What is Christian Nationalism?

Updated: June 09, 2023
Christian nationalism is the belief that the United States is, and should remain, a Christian nation. This belief rests on the premise that the country was founded by Christians and built on biblical principles.
Detailed answer:

Christian nationalism is a political and social movement in the United States. It is a form of religious nationalism, which seeks to place religion at the center of civic life, and it is sometimes conflated with the related concept of civil religion.

Christian nationalists believe that America was founded as a Christian nation by its Founding Fathers, who were overwhelmingly Protestant Christians. They believe this history has been marginalized or overlooked by secularists and progressives.

Many Christians see themselves as patriots who love their country above all else, whatever religious differences they may have with other Americans or even with those who share their faith.

They often seek to promote Christian values through legislation and public policy. This can include advocating for blasphemy laws, school prayer, anti-abortion legislation, and restrictions on homosexuality, as well as other policies framed as keeping America “safe” from “the threat of Islam.” Many Christian nationalists also support Israel for theological reasons, and some seek to establish Christianity as the official religion of the United States.

Christian nationalism is not a new phenomenon in American politics or society. However, it has become more prominent since the 1980s, driven by factors such as the rise of the Religious Right and increased immigration from non-European countries. Many scholars have criticized the movement for fueling xenophobia and Islamophobia in American politics.
