The United States Was Not Founded As A Christian Nation
You may have seen claims that the Founding Fathers created the United States to be a "Christian nation." This Christian nationalist narrative rewrites history and ignores the Founders' mission to create a secular government in which religious freedom is protected but religion is not imposed on citizens. Read our guide below to explore the true role of […]

