The Death Of Truth In America Is A Direct Result Of Our Loss Of Faith
No matter where one looks, Americans can't trust that what they are being told is the truth. All of Western civilization is in decline because the Christian church is in decline.
By Jim DeMint – SEPTEMBER 2, 2021
The American people are growing rightly skeptical of what they are told. For the past two years, they’ve faced an onslaught of lies and misinformation that has left them with little faith in our institutions — from Congress to the White House and even within the Christian church. A sermon I heard years ago should have opened my eyes to what would eventually become clear — we have seen the death of truth in America.
Several decades ago and with fewer gray hairs, my wife and I visited a church on Easter Sunday while on vacation. The sermon stunned me, but not for its incredible insights and biblical perspectives. The pastor, shortly after describing the miraculous and incredible resurrection of Jesus, concluded it didn’t matter if it was actually true — only that we believed it was true.
I thought that would be the one and only time I heard a pastor claim that believing in the death and resurrection of our Lord was not a fundamental principle of being a Christian, but I was wrong.
Many years later, Sen. Raphael Warnock from Georgia, also a pastor, tweeted, “The meaning of Easter is more transcendent than the resurrection of Jesus Christ. Whether you are Christian or not, through a commitment to helping others we are able to save ourselves.”
There it was again. The same trope that could not be more antithetical to Christianity. As the apostle Paul wrote in 1 Corinthians 15:17-19, "And if Christ has not been raised, your faith is futile; you are still in your sins. Then those also who have fallen asleep in Christ are lost. If only for this life we have hope in Christ, we are of all people most to be pitied."
The question: How could we have gotten so far from the fundamental truth of Christianity, that Jesus Christ died for our sins and was resurrected to join the Father in eternal life?
The answer: We’ve deluded ourselves into believing that truth is relative. That same delusion I should have seen cropping up in the Christian church has now made its way into the mainstream, leading to the death of truth in America. Today, we see it manifesting itself in our norms, education system, media, and politics.
Men identify as women, and we accept that as truth. Does it matter that, if we examined their chromosomes, they would be biologically male? Not in today's world. What has been true since the dawn of time is suddenly out the window.
It should come as no surprise that the very same people who refuse to acknowledge the incontrovertible genetic evidence that such a woman is a biological male will listen to whatever armchair experts believe about COVID-19 and deem it "science." But if you so much as question the efficacy of mask mandates, lockdowns, vaccines not yet approved by the Food and Drug Administration, or, heaven forbid, the origins of the COVID-19 virus, you are dubbed "anti-science."
Politicians misrepresent election integrity legislation, and big corporations genuflect with boycotts of those states, justifying their actions with deliberate lies. No matter where one looks, Americans can't trust that what they are being told is the truth. If you question the new leftist orthodoxy of "it's all relative," you face the wrath of "cancel culture."
Truth is the core principle of Judeo-Christian values, which are the foundation of Western civilization. Judeo-Christian values are derived from the Bible. If you want to know the reason for the death of truth in America, look no further than the decline of Americans who choose to live their lives based on the values laid out in the Bible.
It is not simply America. All of Western civilization is in decline because the Christian church is in decline. The fall of the church is due to the fact that it has become unmoored from truth — following the culture instead of leading it.
While politicians and the courts have stripped America of its foundations, mainstream Christian denominations have been largely silent. The Bible is not even allowed in our schools, the biblical creation story has been replaced, Christians are arrested for trying to practice their faith, and biblical morality is considered hate speech.
Winston Churchill is often credited with saying that America will do the right thing after it has tried everything else. We've tried everything else. The Christian church in America must muster up some courage, and repentance, and lead.