There are those who say, "We need to take America back for God!" So at some point, then, America was, as the pledge puts it, "under God."
Exactly when was America under God? In the eighteenth century when it was legal to own another human being? Did God not view black people as being in his own image? After all, what's a little slavery among friends?
How about the nineteenth century, with its many atrocities committed against indigenous people-- the Battle of Tippecanoe (1811), the Trail of Tears (1830s), the Sand Creek Massacre (1864)? Just three examples out of many. What's a little murder, mayhem, and terrorism in the name of divine manifest destiny?
How about the twentieth century? Jim Crow throughout the first half of the 1900s and beyond? The illegality of interracial marriage? Race riots in which whites murdered black Americans and destroyed their property-- Atlanta in 1906, Tulsa in 1921? How about repeatedly using the law to suppress the minority vote? And I can't forget lynching. After all, what's a little racism among friends?
Please clarify for me: what has happened in the past fifty to seventy years that has been so disastrous that America has rejected God and needs to be welcomed back? What needs to be done politically to take America back for God? What kind of myopic view of American history and spiritually truncated understanding of Christianity sees American history from 1619 to 1950 as somehow God-blessed, yet holds that in the past fifty years we have so fallen morally that we "have turned our backs on Him"?
"Tis a puzzlement.