Was the United States founded as a Christian nation?
The short answer: no. Scholarly consensus and founding-era documents show the United States was not legally established as a Christian nation, even though Christian belief and rhetoric heavily shaped early American ...