Let us begin with a story from Odysseus’ journey. In Book 12 of “The Odyssey,” our hero is about to depart the island of the goddess Circe when she gives him some crucial advice about how to navigate the perils of the next leg of his voyage.

“Pay attention,” she instructs him sternly:

First, you will come to the Sirens who enchant all who come near them. If any one unwarily draws in too close and hears the singing of the Sirens, his wife and children will never welcome him home again, for they sit in a green field and warble him to death with the sweetness of their song. There is a great heap of dead men’s bones lying all around, with the flesh still rotting off them.

Odysseus listens as Circe provides him with a plan: Stuff wax in the ears of your crew, she says, so they cannot hear the Sirens, and have them bind you to the mast of the ship until you have sailed safely past.

Odysseus follows the plan to a tee. Sure enough, when the Sirens’ song hits his ears, he motions to his men to loosen him so that he can follow it. But as instructed, his crew ignores him until the ship is out of earshot.

This image is one of the most potent in the Western canon: Odysseus lashed to the mast, struggling against the bonds that he himself submitted to, knowing this was all in store.

It has come down to us through the centuries as a metaphor for many things. Sin and virtue. The temptations of the flesh and the willpower to resist them. The addict who throws his pills down the toilet in preparation for the cravings to come, then begs for more drugs.

It’s an image that illustrates the Freudian struggle between the ego and the id: what we want and what we know we should not, cannot have.

Whenever I’ve encountered a visual representation of the Sirens, they are always, for lack of a better word, hot. Seductive. From Shakespeare to Ralph Ellison and down through literature, the Sirens are most often a metaphor for female sexual allure.

In James Joyce’s “Ulysses,” Bloom describes the man who has taken up with Bloom’s wife as “falling a victim to her siren charms and forgetting home ties.”

Given this, it is a bit odd to reconcile the original meaning of the word with how we use it today, to describe the intrusive wail of the device atop ambulances and cop cars. But there’s a connection there, a profound one, and it’s the guiding insight to understanding life in the 21st century.

Attention: the substance of life

Stand on a street corner in any city on Earth long enough, and you will hear an emergency vehicle whiz past. When you travel to a foreign land, that sound stands out as part of the sensory texture of the foreignness you’re experiencing. Because no matter where you are, its call is at once familiar and foreign.

The foreignness comes from the fact that in different countries the siren sounds slightly different — elongated, or two-toned, or distinctly pitched. But even if you’ve never encountered it before, you instantly understand its purpose. Amid a language you may not speak and food you’ve never tried, the siren is universal. It exists to grab our attention, and it succeeds.

The Sirens of lore and the sirens of the urban streetscape both compel our attention against our will. And that experience, having our mind captured by that intrusive wail, is now our permanent state, our lot in life. We are never free of the sirens’ call.

Attention is the substance of life. Every moment we are awake, we are paying attention to something, whether through our affirmative choice or because something or someone has compelled it. Ultimately, these instants of attention accrue into a life.

“My experience,” as William James wrote in “The Principles of Psychology” in 1890, “is what I agree to attend to.” Increasingly, it feels as if our experience is something we don’t fully agree to, and the ubiquity of that sensation represents a kind of rupture. Our dominion over our own minds has been punctured. Our inner lives have been transformed in utterly unprecedented fashion. That’s true in just about every country and culture on Earth.

In the morning, I sit on the couch with my precious younger daughter. She is 6 years old, and her sweet, soft breath is on my cheek as she cuddles up with a book, asking me to read to her before we walk to school.

Her attention is uncorrupted and pure. There is nothing in this life that is better. And yet I feel the instinct, almost physical, to look at the little attention box sitting in my pocket. I let it pass with a small amount of effort. But it pulses there like Gollum’s ring.

My ability to reject its little tug means I’m still alive, a whole human self. In the shame-ridden moments when I succumb, though, I wonder what exactly I am or have become. I keep coming back to James’ phrase “what I agree to attend to” because that word “agree” in his formulation carries enormous weight.

Even if the demand for our attention comes from outside us, James believed that we ultimately controlled where we put it, that in “agreeing” to attend to something we offered our consent. James was rather obsessed with the question of free will, whether we in fact had it and how it worked.

To him, “effort of attention” — deciding where to direct our thoughts — was “the essential phenomenon of will.” It was one and the same. No wonder I feel alienated from myself when the attention box in my pocket compels me seemingly against my own volition.

The ambulance siren can be a nuisance in a loud, crowded city streetscape, but at least it compels our attention for a socially useful purpose.

The Sirens of Greek myth compel our attention to speed our own death. What Odysseus was doing with the wax and the mast was actively trying to manage his own attention. As dramatic as that Homeric passage is, it’s also, for us in the attention age, almost mundane.

Because to live at this moment in the world, both online and off, is to find oneself endlessly wriggling on the mast, fighting for control of our very being against the ceaseless siren calls of the people and devices and corporations and malevolent actors trying to trap it.

Divided and distracted

My professional life requires me to be particularly consumed by these questions, but I think we all feel this to some degree, don’t we? The alienating experience of being divided and distracted in spite of ourselves, of being here but not present.

I bet you could spend day and night in any city or town canvassing strangers and not find a single one who told you they felt like their attention span was too long, that they were too focused, who wished they had more distractions, or spent more time looking at screens.

The social effects of the attention age are well documented, with nearly all of the data we have pointing in the same direction. There has been a sharp rise in depression and suicide among not just teenagers, but children as well.

Self-reported happiness has been in secular decline since the mass adoption of smartphones across all ages in the U.S., but this is particularly acute among teenagers. People are spending less and less time interacting in the flesh and more time interacting through their devices, and along with that, we’ve seen people having fewer and fewer friends.

The presence of phones also seems to be having deleterious effects on students across the world. Globally, standardized test scores have been declining significantly since about 2012, across different societies and educational approaches. The culprit, according to the Program for International Student Assessment, which collects the data, is pretty clear.

As The Atlantic characterized the findings, “In sum, students who spend more time staring at their phone do worse in school, distract other students around them, and feel worse about their life.”

Like traffic, our phones are now the source of universal complaint, a way to strike up a conversation in a barber shop or grocery line. What began as small voices at the margins warning us that the tech titans were offering us a Faustian bargain has coalesced into something approaching an emerging consensus: Things are bad, and the technologies we all use every day are the cause. The phones are warbling us to death.

But before we simply accept this at face value and move on with our inquiry, it’s worth poking a bit at this quickly forming conventional wisdom.

I mean, don’t we always go through this cycle? Don’t people always feel that things are wrong and that it’s because of kids these days? Or the new technology (printing press, steam engine, etc.) has been our ruin?

In Plato’s “Phaedrus,” Socrates goes on a long rant — half persuasive and half ludicrous — about the peril posed by the new technology of ... writing: “If men learn (the art of writing),” Socrates warns, “it will implant forgetfulness in their souls: They will cease to exercise memory because they rely on that which is written, calling things to remembrance no longer from within themselves, but by means of external marks. What you have discovered is a recipe not for memory, but for reminder.”

It seems safe to say in hindsight that writing was a pretty big net positive for human development, even if one of the greatest thinkers of all time worried about it the same way contemporaries fret over video games.

Indeed, it often feels that for all the legitimate criticism of social media and the experience of ubiquitous screens and connectivity, a kind of familiar neurotic hysteria undergirds the dire warnings.

Screen time

An entire subgenre of parenting advice books and blocking software now exists to manage “screen time” and the mortal peril introduced by our devices into the brain development of children; the broader cultural conversation has taken on all the overdetermined ferocity of a moral panic.

In 2009, the Daily Mail alerted its readers to “How using Facebook could raise your risk of cancer.” The New York Post warned that screens are “digital heroin” that turn kids into “psychotic junkies.” “Teens on social media go from dumb to dangerous,” CBS cautioned.

And The Atlantic was just one of many to ask the question: “Have smartphones destroyed a generation?”

In 2024, social psychologist Jonathan Haidt published “The Anxious Generation,” which argues that ubiquitous access to smartphones has consigned an entire generation of teens and children to unprecedented levels of depression, anxiety and self-harm. While some scholars who studied the issue criticized Haidt’s polemic for being overcooked, it was a runaway bestseller, and parents and schools across the country organized efforts to keep phones out of schools, as the book urged.

Some of the most grave and chilling descriptions of the effects of the attention age come from the workers who have engineered it. The hit Netflix documentary “The Social Dilemma” relies heavily on former Silicon Valley figures like whistleblower and former Google employee Tristan Harris to warn of the insidious nature of the apps mining our attention.

Sean Parker, the creator of Napster and one of Facebook’s earliest investors, describes himself as a “conscientious objector” when it comes to social media: “God only knows what it’s doing to our children’s brains,” he has said.

He is very much not alone.

A New York Times Magazine article from 2018 tracks what the author calls the “dark consensus about screens and kids” among the Silicon Valley workers who themselves helped engineer the very products they now bar their own children from using. “I am convinced,” one former Facebook employee told The New York Times in 2018, that “the devil lives in our phones and is wreaking havoc on our children.”

I’m inclined to agree, but also find myself shrinking more than a little at how much the conversation around the evils of our phones sounds like a classic moral panic. Sociologist Stanley Cohen coined the term “moral panic” in his 1972 book “Folk Devils and Moral Panics,” a study of the hysteria that surrounded different kinds of youth culture, particularly the Mods and Rockers in the U.K. in the 1960s.

We can also see this familiar pattern when the target is a new technology rather than a cultural trend or group: excitement and wonder that quickly turn to dread and panic.

In 1929, as radio rose to become a dominant form of media in the country, The New York Times asked, “Do Radio Noises Cause Illness?” and informed its readers that there was “general agreement among doctors and scientific men that the coming of the radio has produced a great many illnesses, particularly caused by nervous troubles. The human system requires repose and cannot be kept up at the jazz rate forever.”

Modernity

The brilliant illustrator Randall Munroe, creator of the webcomic “xkcd,” captures much of this in a timeline called “The Pace of Modern Life,” chronicling the anxiety of contemporary critics about the development of industrial modernity, particularly the speed of communication and proliferation of easily accessible information and its impact on our minds. He starts with the Sunday Magazine in 1871 mourning the fact that the “art of letter-writing is fast dying out. ... We fire off a multitude of rapid and short notes, instead of sitting down to have a good talk over a real sheet of paper.”

He then quotes an 1894 politician decrying the shrinking attention spans: Instead of reading, people were content with a “summary of the summary” and were “dipping into ... many subjects and gathering information in a ... superficial form” and thus losing “the habit of settling down to great works.” And my personal favorite, a 1907 note in the Journal of Education that laments the new “modern family gathering, silent around the fire, each individual with his head buried in his favorite magazine.”

All of this now seems amusingly hyperbolic, but there are two different ways to think about these consistent warnings and bouts of mourning for what modernity has taken from us. One way is to view it all as quaint: There will always be some set of people who will freak out about the effects of any new technology or media, and over time those people will find out that everything is fine; that the rise of, say, magazines, of all things, doesn’t rot children’s brains or destroy the fabric of family life.

But I don’t think that’s right. Rather, I think these complaints and concerns about accelerating technology and media are broadly correct.

When writing was new, it really did pose a threat to all kinds of cherished older forms of thinking and communicating. So too with the printing press and mass literacy, and then radio and television. And it is when a technology is newest, when it’s hottest to the touch, that it burns most intensely.

The very experience of what we call modernity is the experience of a world whose pace of life, scope of information and sources of stimulus with a claim on our attention are always increasing. At each point up this curve, the ascent induces vertigo.

When Henry David Thoreau escaped to Walden Pond in the summer of 1845, it was as a refuge from this precise experience, the invasive omnipresence of modernity and the way it can cloud a person’s faculties. Of our so-called modern improvements, he writes, “There is an illusion about them; there is not always a positive advance. ... Our inventions are wont to be pretty toys, which distract our attention from serious things.”

To achieve clarity about what it means to be human in this specific era, it’s necessary at each moment to ask what’s new and what’s not, what’s being driven by some novel technology or innovation and what’s inherent in human society itself. For example, it’s not a new phenomenon for masses of people to believe things that aren’t true.

People didn’t need Facebook “disinformation” for witch trials and pogroms, but there’s also no question that frictionless, instant global communication acts as an accelerant. Also not new: our desires to occupy our minds when idle. Look at pictures of streetcar commuters of the early 20th century and you’ll see cars packed with men in suits and hats, every last one reading the newspaper, their noses buried in them as surely as modern commuters are buried in their phones.

But there’s also no question that the relationship we have to our phones is fundamentally different in kind than the relationship those streetcar commuters had to their newspapers.

Attention economy

In his book on the attention economy, “Stolen Focus,” writer Johann Hari gets into a bit of this debate with Nir Eyal (author of “Hooked: How to Build Habit-Forming Products”). Eyal makes the case that the freak-outs about social media are today’s version of the mid-20th-century moral panic over comic books, which got so heated there were a series of high-profile Senate hearings into what comic books were doing to America’s youth.

All the grave warnings about phones and social media are, he contends, “literally verbatim, from the 1950s about the comic book debate,” when people “went to the Senate and told the senators that comic books are turning children into addicted, hijacked (zombies) — literally, it’s the same stuff. ... Today, we think of comic books as so innocuous.”

In the end, it turned out comic books weren’t worth the worry, which is why the panic looks silly in retrospect. But that’s another key question, isn’t it? Along with the question of what is and is not new, there’s also the deeper question of what is and is not harmful.

It is easy to conflate the two. When tobacco use first exploded in Europe, there were those who rang the alarm bells. As early as 1604, England’s King James decried the new habit as “lothsome to the eye, hatefull to the Nose, harmeful to the braine, daungerous to the Lungs, and in the blacke stinking fume thereof, neerest resembling the horrible Stigian smoke of the pit that is bottomelesse.”

As hysterical and prudish as that must have sounded at that time, it was 100 percent correct. When I recently watched the incredible Peter Jackson documentary about the Beatles’ “Let It Be” sessions, the sheer number of cigarettes being inhaled in every recording session was both distracting and unsettling.

In 1969, when the Beatles were recording what would become their final released album, there was already substantial research demonstrating that cigarettes were dangerous. It would be another 30 years until culture and law and regulation turned decisively against smoking and the practice started to decline and disappear from most public spaces.

One wonders sometimes if 50 years from now, people will look at footage from our age, with everyone constantly thumbing through their phones, the way I look at Ringo Starr chain-smoking. Stop doing that! It’s gonna kill you!

In fact, the surgeon general of the United States has called for social media to come with a mandatory mental health warning label like the ones on cigarette packs. In response, researchers who study teen mental health have pushed back, saying the research just doesn’t justify such a drastic step.

The debate over our digital lives, at least as it’s been reflected in the discourse, basically comes down to this: Is the development of a global, ubiquitous, chronically connected social media world more like comic books or cigarettes?

What I want to argue here is that the scale of transformation we’re experiencing is far more vast and more intimate than even the most panicked critics have understood. In other words: The problem with the main thrust of the current critiques of the attention economy and the scourge of social media is that (with some notable exceptions) they don’t actually go far enough. The rhetoric of moral condemnation undersells the level of transformation we’re experiencing.

As tempting as it is to say the problem is the phones, they are as much symptom as cause, the natural conclusion of a set of forces transforming the texture of our lives.

Most important resource

The attention economy isn’t like a bad new drug being pushed onto the populace, an addictive intoxicant with massive negative effects or even a disruptive new form of media with broad social implications. It’s something more profound and different altogether.

My contention is that the defining feature of this age is that the most important resource — our attention — is also the very thing that makes us human. Unlike land, coal or capital, which exist outside of us, the chief resource of this age is embedded in our psyches. Extracting it requires cracking into our minds.

We all intuitively grasp the value of attention, at least internally, because what we pay attention to constitutes our inner lives. When it is taken from us, we feel the loss. But attention is also supremely valuable externally, out in the world. It is the foundation for nearly all we do, from the relationships we build to the way we act as workers, consumers and citizens.

Attention is a kind of resource: It has value, and if you can seize it, you seize that value. This has been true for a very long time. What has changed is attention’s relative importance. Those who successfully extract it command fortunes, win elections and topple regimes.

The battle to control what we pay attention to at any given instant structures everything from our inner life (who and what we listen to, how and when we are present to those we love) to our collective public lives (which pressing matters of social concern are debated and legislated, which are neglected; which deaths are loudly mourned, which ones are quietly forgotten).

Every single aspect of human life across the broadest categories of human organization is being reoriented around the pursuit of attention.

How did it get this way? Toward the end of the 20th century, many wealthy nations began moving from an industrial, manufacturing economy to a digital one.

In 1961, six of the 10 largest U.S. companies by assets were oil companies. The assets these companies controlled — fossil fuels — were the single most valuable resource in the postwar global order. Alongside fossil fuel companies were car companies like Ford Motor and industrial behemoths like DuPont.

Today, Forbes’ list of the largest U.S. companies is dominated by banks and tech firms: Microsoft; Apple; Google’s parent, Alphabet; Meta; and Amazon. The central locus of economic activity has moved from those firms that manipulate atoms to those that manipulate bits.

We tend to think of the rise of this new form of economic production as being dependent on information and data. “Data is the new oil” has become a kind of mantra of the age; those who control large stores of information are the power brokers of our time.

This view is not completely wrong; information is vitally important. But it crucially misstates what’s both so distinct and so alienating about the era we’ve entered. Information is the opposite of a scarce resource: It is everywhere and there is always more of it. It is generative. It is copyable. Multiple entities can have the same information.

Think for a moment about your personal data, information about who you are and what you like. Maybe there are half a dozen firms that have it or maybe there are a hundred, or maybe a thousand, and while it might have some effect on you in terms of which advertising you get, you don’t really know and functionally it doesn’t really matter. But if someone has your attention, you know it. It can’t be in multiple places at once, the way information can.

If we return to the largest corporations of our times, they are dominated not by information companies, but more accurately by finance and attention companies. Apple is the company most singularly responsible for inaugurating the attention age with its 2007 introduction of the iPhone.

Microsoft runs the operating system that hundreds of millions of people spend their attention on all day long, along with another attention magnet, the Xbox gaming console. Alphabet runs YouTube, as well as the internet’s largest advertising network, which profits from our attention. Meta and the Chinese social media company Tencent (which makes WeChat, the largest social network in China) similarly convert eyeballs into cash.

Amazon is also on the list of largest companies and is the world’s largest online retailer outside China, but even to call Amazon a “retailer” misstates the source of its market power. Amazon is an attention and logistics company, and the products it sells are an afterthought.

You see this anytime you search for a product on Amazon and are confronted with dozens of nearly identical versions, all produced by companies you’ve often never heard of, in places you couldn’t name, primarily competing for the attentional space at the top of the search results, attentional space that Amazon owns. In many cases, Amazon has seen which products dominate that attentional space and then started producing them itself, cutting out the middleman.

Amazon is the most extreme example of how in the attention age, even the sale of stuff to consumers has more to do with getting their attention than making the stuff itself.

The basic model of industrial age advertising was that a firm developed a product or service and then sought to advertise and market it, to capture people’s attention as a means of introducing them to the firm’s wares. But there’s another model, also present from the early days of the industrial age, which is the snake oil and supplements model.

In the snake oil model, the attention and marketing are the most important part of the enterprise — capturing the imagination of customers — and the product is an afterthought, in fact often outright fraudulent.

As global incomes rise, and the variety of consumer choices expands accordingly, attentional competition becomes ever more ferocious. We’re seeing the relative emphasis between these two models shift rapidly. In so many instances, the ability to grab the attention of the consumer is more important than the actual product or service offered.

It is not just commercial life that is driven by the extraction of attention. Increasingly, social life, public life and political life are dominated by it as well. In the 19th and 20th centuries, wage labor and urbanization utterly transformed the contest for attention in politics.

Mass media

As democracy spread across rapidly industrializing Europe, a recognizably modern mass public took shape. Public opinion mattered more than ever, and “what the public thought” was largely determined by which issues people paid attention to and which they didn’t, which candidates they recognized and which remained strangers.

On top of that, as society grew orders of magnitude more complex, the sheer number of issues presenting themselves with a claim to a citizen’s attention exploded as well.

In 1925, the critic Walter Lippmann pointed out that the duties citizens inherited in the 20th century were overwhelming even for the most educated and informed people like the author himself. “My sympathies are with (the citizen),” Lippmann wrote, “for I believe that he has been saddled with an impossible task and that he is asked to practice an unattainable ideal. I find it so myself for, although public business is my main interest and I give most of my time to watching it, I cannot find time to do what is expected of me in the theory of democracy; that is, to know what is going on and to have an opinion worth expressing on every question which confronts a self-governing community.”

It was the same year Lippmann published these words in his book “The Phantom Public” that Europe watched the rise of charismatic fascist dictator Benito Mussolini, who unburdened the Italian citizenry from the onerous labor they’d been tasked with by offering instead a cult of personality.

“Under ... Fascism there appears for the first time in Europe a type of man who does not want to give reasons or to be right, but simply shows himself resolved to impose his opinions,” wrote Spanish intellectual José Ortega y Gasset in “The Revolt of the Masses.” “Here I see the most palpable manifestation of the new mentality of the masses, due to their having decided to rule society without the capacity for doing so.”

The experience of charismatic demagogues and genocidal world war in the 20th century left an entire generation of intellectuals to wonder how compatible mass media and mass democracy truly were.

Though they didn’t necessarily conceptualize it in these terms, they were wrestling with the ability of mass media — sometimes, but not always, in the hands of tyrants — to successfully monopolize attention, and therefore control of a nation: Did the presence of mass media itself extinguish the individual conscience that made human decency possible?

“It is not an exaggeration,” wrote Pope Pius XII in 1950, “to say that the future of modern society and the stability of its inner life depend in large part on the maintenance of an equilibrium between the strength of the techniques of communication and the capacity of the individual’s own reaction.”

The TV age spawned dire warnings, from Marshall McLuhan to Neil Postman, that the broad narcotic effect of the new device was making the public stupider, duller and less capable of self-governance.

“Americans no longer talk to each other, they entertain each other,” Postman wrote. “They do not exchange ideas; they exchange images. They do not argue with propositions; they argue with good looks, celebrities and commercials.”

But all of that was a prologue for the attention age. Attention has never been more in demand, more contested and more important than it is now.

Unlike, say, oil, a chemical compound buried in the earth, attention cannot be separated from who we are and what it means to be alive.

Fundamental human need

In fact, attention is the most fundamental human need. The newborn of our species is utterly helpless. It can survive only with attention — that is, if some other human attends to it. That attention will not itself sustain an infant, but it is the necessary precondition to all care. If you neglect a child, it will perish.

We are built and formed by attention; destroyed by neglect. This is our shared and inescapable human fate. Now our deepest neurological structures and social impulses are in a habitat designed to prey upon, to cultivate, distort or destroy that which most fundamentally makes us human.

We don’t have to accept this. It does not need to be this way. There are already bills in state legislatures as well as in Congress that would create age minimums by law for social media platforms. While the details vary, as a general matter this seems obvious and sensible.

We as a society and government can say that the attention of children should not be sold and commodified in the aggressive and alienating fashion that social media networks currently do it. Just as 12-year-olds can’t really consent to a wage contract, we could say they can’t really consent to expropriation of their attention in the way that, say, Instagram exploits it.

We need to use every tool and strategy imaginable to wrest back our will, to create a world where we point our attention where we, the willful, conscious “we,” want it to go.

A world where we can function and flourish as full human beings, as liberated souls, unlashed from the mast, our ears unplugged and open, listening to the lapping of the waves, making our way back home to the people we love, the sound of the sirens safely in the distance.

From “The Siren’s Call: How Attention Became the World’s Most Endangered Resource” by Chris Hayes, published by Penguin Press, an imprint of Penguin Publishing Group, a division of Penguin Random House, LLC. Copyright (c) 2025 by Christopher Hayes.

This story appears in the May 2025 issue of Deseret Magazine.
