In 1857, two decades before Thomas Edison’s work on the phonograph, a Frenchman named Édouard-Léon Scott de Martinville conceived the idea of permanently etching sound waves on a durable medium. He succeeded in producing a “phonautograph,” which recorded the human voice the way a polygraph traces the varying lines registering heart rate and respiration. The machine duly registered in lampblack the undulating forms of audible sound as human subjects spoke into the device. But the resulting squiggles were no more decipherable to the human eye than binary code. It seemed the project had failed.

A century and a half later, French researchers discovered several of Scott’s lampblack recordings in the Academy of Sciences in Paris. One of them dated to April 1860, three years before Abraham Lincoln delivered his address at Gettysburg. The researchers converted the waveforms into digital code and played them back through computer speakers. At first they thought they were hearing a woman’s voice singing the French folk song “Au Clair de la Lune,” but later they realized they had been playing back the audio at double its recorded speed. When they slowed it to the right tempo, a man’s voice emerged from the crackle and hiss: Édouard-Léon Scott de Martinville serenading us from the grave.

A voice sang its way into the oblivion of a century and a half. And then, like snatching a firefly from the evening sky, other humans in another century reached back into time, bridged the stretch of years and heard the melody. Perhaps what we lack in today’s world is a more strenuous kind of listening.

What are the challenges to faith, and are they particular to this modern moment? In an age of growing religious disaffiliation and the rise of the “nones,” it has become commonplace to equate unbelief with the rise of secularism. As Charles Taylor points out, ours is the first era in which belief is merely one option among many: disbelief, indifference, agnosticism or atheism. If you had lived centuries ago in the West, everyone around you would have been, in Robinson Jeffers’ words, “taking the stars and the gods for granted.”

The simplest expression of the secularization hypothesis is the idea that societies outgrow religion as they modernize: science and technology develop sufficiently to supplant it. Why hope for some better world if this world can be made sufficiently comfortable? Why seek miraculous healing when modern medicine offers reliable cures? At first glance, the secularization hypothesis is borne out in the data. Numerous polls show that religiosity is declining by a variety of measures, including religious preference, importance attributed to religion, church attendance, church membership and belief in God. The youngest Americans are, by these measures, the least religious, with Gallup finding that just 45 percent of 18- to 29-year-olds “report that religion is very important in their daily lives.”

The secularization hypothesis entails the belief that moderns are more sophisticated, less naïve and credulous, than pre-Enlightenment folks. That’s a bit simplistic, actually. Premoderns may not have had much of the knowledge we have, but they weren’t any less intelligent. Most of our vaunted achievements are the product of a combined cultural inheritance, not individual superiority. And one thing that premoderns had ample experience with — more so than we moderns — was death.

So the fundamental Christian claim — that Christ rose from the dead — was met with entrenched skepticism even among early disciples. The seeds of belief in the Resurrection only took initial root in the women of the movement, and who was going to take them seriously? The apostles did not. “He appeared first to Mary Magdalene, from whom he had cast out seven demons. She went out and told those who had been with him, while they were mourning and weeping. But when they heard that he was alive and had been seen by her, they would not believe it” (Mark 16:9–11).

We should not be surprised, therefore, that some of the earliest critics of Christianity pointed out its raw implausibility. Apparently familiar with the Resurrection narrative, Celsus asked, “And who beheld this? A half-frantic woman, as you state, and some other one, perhaps, of those who were engaged in the same system of delusion.” No wonder one of the oldest surviving depictions of Jesus Christ is the Alexamenos graffito, a mocking image of a Christian worshiping a crucified figure drawn with the head of a donkey. “Christ crucified (was) foolishness to Gentiles,” observed Paul (1 Corinthians 1:23).

According to the secularization hypothesis, religion flourished before science because conditions were so deplorable. For those living lives of unremitting hardship, religion offered an appealing consolation: suffering was given meaning, and a distant future promised better days and heavenly rewards. Religion was a collection of stories that people adopted to make life tolerable. In this light, religion seems superfluous once technology has eased so much of the pain that religion merely anesthetized. But such a focus on religion as a palliative neglects the tremendous costs that Christianity imposed on its disciples.

We need to revisit the careless assumption that faith was easier, more natural and less “against the grain” in a more naïve, premodern age. Consider the rise of Christianity under the Roman Empire. The defining and most conspicuous attributes of early Christianity — as judged by its non-Christian contemporaries — were strict regulation of sexual behavior for both sexes and a total disregard for class distinctions. In contrast, most religions of that time ignored or even endorsed sexual promiscuity (at least for men) and took rigid social hierarchy for granted. One could hardly imagine an outlook worse suited to making life tolerable than one that forbade many basic pleasures and offended the ruling elite.

It was extremely difficult for early Christian converts to accept claims that contradicted their everyday experience (such as people rising from the dead), and it was also difficult for them to embrace a system of belief that was extraordinarily demanding. Christianity set itself in opposition to self-interest and the dominant social paradigms of the day, insisting on self-denial, sacrifice and a total disregard for social, economic and political hierarchies. After all, Christianity requires us to love our enemies, to seek perfection, to live by strict moral codes and to give up everything — including our time, our resources and even our lives. 

The task of Christianity, as with all moral systems, is to provoke us to bridge the enormous distance between what we are now and what we sense we can — and should — be. That is not an easy or pleasant journey. Christianity promises a lot, but it also asks a lot in return. Until the late fourth century, being a Christian required conspicuous courage — social, moral and often physical. Discipleship was more than marginally demanding, and its cost went beyond mere social marginalization: Christians who accepted these claims often paid with their lives.

In fact, we suggest that one crucial circumstance behind the rise of the “nones” is not that belief has become too hard in a secular age, but that it has become too easy. After the fourth century, the costs of being a Christian were turned inside out. Christian affiliation was now an asset, not a liability. From the late Roman era to the 21st-century American political arena, being Christian has been a virtual prerequisite for office and advancement in the regions of the world where Christianity flourished.

While the core message of sacrifice and loving one’s enemies never changed, the practical requirements for becoming a Christian shifted dramatically when it became the official state religion of the empire. Christianity as a doctrine remained as austere as ever, but persons bearing its name went from holding tickets to the lion’s den to brandishing passes to high government positions.

The political and social advantages, rather than costs, continue to this day — most prominently in American political life. An overwhelming majority of voters — 80 percent to 95 percent — would support a Catholic or evangelical candidate for president. Advertise your atheism, and the number drops to 60 percent.

Clearly, in many places and seasons, Christian affiliation has been easy — even desirable. If anything, such comfortable contexts make casual belief a perennial temptation. The consequence, we are arguing, has been a more fragile Christianity, a more vulnerable discipleship: one that asks much but requires little. This is the sense in which genuine, costly, investment-laden faith — even in the “Age of Faith” — has always been difficult. Only when Christianity is not the default religion, when the cost of membership is high, do those costs make discipleship the product of a highly deliberate, willful choice. 

Returning to the current religious environment, we find one other data point that invites us to reevaluate the nature of these trends. Complicating the popular picture of religious decline is the fact that the youngest cohort of Americans, while highest in religious disaffiliation, is also highest in belief in most superstitions (like knocking on wood or throwing salt over your shoulder). An Insider poll also found that 44 percent of 20-something Americans believe in astrology “a lot” or “somewhat.” We should not be hasty to chalk up declining religiosity to the secularization hypothesis if the people fleeing the pews are holding onto, or even increasing, their belief in good luck charms and star charts. What is happening? One erstwhile believer may give us a clue.

Elna Baker, a onetime Latter-day Saint, shared on an episode of “This American Life” a conventional conversion story from her youth in Snowflake, Arizona. She was on a church youth trip, was encouraged to pray for a personal witness, and did so. “Then,” she recalled, “the sun came through the clouds and warm light hit my face. I felt like someone was wrapping their arms around me and hugging me. My body rocked back and forth, and I knew it wasn’t me who was doing it.”

Later in the episode, she described her last spiritual experience before abandoning her childhood faith:

“Three years ago, maybe four years ago, ... I felt like it was getting so hard to believe for me. And I just was like, ‘You know, I want a sign again like the one I had when I was young. And I just want you to tell me that you’re there, God.’

“And I knelt down and I prayed and I asked this. And then I looked up at the sky and I was like, ‘The sky? That’s the sign?’ Like, anyone can see this. This isn’t a sign. You just see a few stars. It’s New York — you see, like, maybe five stars. And just as I was saying, ‘This isn’t anything, this is just what’s always there,’ one of the stars shot across the sky. And it was the biggest shooting star I’d ever seen.”

Almost immediately thereafter, however, she reflected on the nature of that evidence. “As soon as it happened, I did the thing I do now — I started questioning. Was that meant for me? Or did I just happen to look up at the exact moment when a star shot across the sky? ... And that’s when I realized I don’t just want a sign, I want to be myself at 14 again — the kind of person who believes in signs.”

This story perfectly illustrates that what seems plausible to us is not just a consequence of the evidence and reasoning available to us, but of our assumptions about what evidence and reasoning we are willing to consider. This may be our most important claim about faith. Faith is not an escape from, or even a bracketing of, rationality or evidentiary claims. Faith involves an expansion of the domain of rationality.

We do not believe a Divine Maker would ask us to diminish or discredit the reason that God embodies. We do believe a healthier way to envision faith, and its moral value, is as a suite of dispositions toward openness: a willing, even passionate embrace of eternity as a multilayered adventure, accessible to us through a more expansive epistemology than our narrowly conceived methodologies allow. Faith is a response to the “intuition,” in Marilynne Robinson’s words, “that reality is rooted in a profounder matrix of Being than sense and experience make known to us in the ordinary course of things.” It is a stance of humility, an unflagging willingness to take correction, an imaginative resistance to our own prejudices.

All of which takes us back to that French inventor, Scott de Martinville, and that voice from the void. Mere church affiliation is not necessarily a faith-filled stance toward the divine. At the same time, the manifold ways in which persons of good intent strive to make sense of the universe, from crystals to nature immersion to meditative practices, even while abandoning traditional institutional forms, are a good sign that faith — as a suite of dispositions toward openness — may be alive and well. Strenuous listening seems to be the key.

Some of the most daunting recent challenges to faith have appeared on the horizon of the social sciences, and they can cloud our confidence that we can make meaningful sense of the voices within and without. Many of these challenges relate to a reshaped understanding of free will, of how emotions are constructed, and of the subtle ways in which intellect is shaped and conditioned. Some of these developments throw into question the reliability of our beliefs and the grounds on which we construct a durable faith. Our response to this predicament has been to argue that certainty of any sort is a more complicated affair than we may have assumed, and that the most fruitful engagement with these developments is to recognize the indispensability of faith, of intuition and of epistemological openness to any quest for understanding. Belief is hard — but there is no way around the headwinds.

Terryl Givens is a senior research fellow at the Neal A. Maxwell Institute for Religious Scholarship at Brigham Young University. Nathaniel Givens is a writer living in Ashland, Virginia, and a frequent contributor to Public Square Magazine. This excerpt is from their forthcoming book, “Into the Headwinds: Why Belief Has Always Been Hard—and Still Is” (Eerdmans, October 2022).

This story appears in the October issue of Deseret Magazine.
