Is America a “Christian nation?”

By Free Speech Center, published on March 11, 2024


Most Americans believe that the nation’s founders established the United States as a “Christian nation,” and a slight majority say it should be a Christian nation today, according to a 2022 study by the Pew Research Center.

The characterization of the nation as Christian often comes up when some Americans bristle at the separation of church and state and assert that their Christian faith should be paramount in public schools and government.

As Pew noted in its report, “This raises the question: What do people mean when they say the U.S. should be a ‘Christian nation’? While some people define the concept as one where a nation’s laws are based on Christian tenets and the nation’s leaders are Christian, it is much more common for people in this category to see a Christian nation as one where people are more broadly guided by Christian values or a belief in God, even if its laws are not explicitly Christian and its leaders can have a variety of faiths or no faith at all.”

In this Free Speech Center video, First Amendment scholar Dr. John Vile examines the history of America’s founding and explains why the U.S. should not be designated as a strictly Christian country.
