
Game theory

The surprising case for video games as literature


Harry Campbell for the Deseret News

The windows are dark and my alarm clock, tucked behind a skateboard rack on the wall of my bedroom, reads 4:47 a.m. I ignore it, swiping unkempt hair from my eyes to focus on the screen. As my fingers flick at the controller, my character — an armored cyborg — moves through the interior of a massive spaceship. A headset envelops me in a cloud of sound and chatter from the friends who are with me on this raid, a challenge far more engaging than school, which starts in three hours. But before we can make it out safely, my father opens the door. 

He’s naturally upset, but not surprised. It’s 2018. I’m 16 years old, a junior at Hightstown High School near Princeton, New Jersey, and this has been our routine for years. Diagnosed with ADHD and hearing loss when I was a small child, I never fit into the classroom environment. If I could hear what the teacher was saying, I struggled to care. If I applied myself, I could do well at most subjects, but it all felt like a waste of effort, and I couldn’t see it leading to anything. So why try? 

Not that I didn’t wish things were different. My parents prioritized working with passion, and in our family, that generally revolved around storytelling. My father is a painter and filmmaker, my sister started acting and writing plays as soon as she could, and even my mother uses narrative as a motivational tool in her personal training business. Conversations about which art form could best tell stories were common fare at the dinner table. I loved stories too, but I never had anything to argue for because I hadn’t found my “thing.”

I felt adrift, and video games were my refuge. Their virtual worlds felt more real to me, the accomplishment of beating a final boss more palpable than anything school could offer. Like so many other nights, my father interrogates me. Why am I awake? Why haven’t I done my homework? Sleepless, I struggle to follow. All I know is I can’t put the games down. Years later, with a bachelor’s degree in game development under my belt, I still don’t have the answers, but I think I know where to find them: the English department. Hear me out.

Video games have become unavoidable. More than three billion people play them worldwide, and 74 percent of American households include at least one player. Almost anyone I meet had at least one gaming console at home growing up. Games are not a universal good; they can become dark places for some people. But innocuous fun like dancing or playing tennis on consoles like the Wii, which reads body movements to guide on-screen actions, was a formative experience for Generation Z. So was building worlds in Minecraft, playing army with friends in Halo, or engaging with the robust narrative of The Legend of Zelda. 

Of course, consoles weren’t always that capable. One of the first video games ever developed was Tennis for Two, in 1958. Built with a physics simulator for a science exhibit at the Brookhaven National Laboratory in Upton, New York, the game consisted of a two-dimensional “court,” with lines as rackets and a block-shaped “ball” sent back and forth by clicking a single button. When a California company called Atari converted this concept into an arcade game named Pong in 1972, it was exciting enough to sell more than 8,000 units and eventually found its way into our living rooms, along with the first console.

Released in 1977, the Atari 2600 was a simple, 8-bit device that could play games stored on cartridges — something like 8-track tapes. Soon kids across the country were gathering at that one friend’s house to play games like Combat or Space Invaders. The corresponding financial boom gave designers room to experiment with more complex projects. Atari’s Adventure introduced a narrative: good vs. evil. And Pac-Man added cutscenes, animated segments that advanced the storyline between the game’s iconic maze levels. 

Still, the industry faced pitfalls of its own. By the early 1980s, new consoles were flooding the market. Gameplay became repetitive while quality was inconsistent. Looking to stand out, companies started adapting blockbuster movies like “E.T.,” the story of a boy who bonds with a benign alien. But that adaptation was rushed to market despite major problems, becoming a frustrating disaster emblematic of an industry on the brink. There didn’t seem to be a way back until the Nintendo Entertainment System was released in 1985. 

The Japanese parent company behind the NES soon reshaped gaming. By limiting who was allowed to produce cartridges for its console, Nintendo ensured fewer games and higher quality. It focused on a younger audience with series like Super Mario Bros. and Zelda, incorporating stories with greater emotional depth. As that approach took hold, developers began to study other media, like film, to learn how to tell stories better. Cutscenes started generating suspense or laying out complex plots that gameplay could not yet cover — though that would change as technology caught up to creators’ visions. 

Development evolved, too, with independent studios taking more creative risks. In 2013, Naughty Dog released The Last of Us, which wove action sequences into narrative progression and player engagement, telling the story of an antihero and a plucky teenage girl navigating a zombie apocalypse. That tale resurfaced this year as HBO’s hit “The Last of Us,” starring Pedro Pascal, a landmark for this powerful new medium. Academics today are exploring the finer details that separate games from theater, poetry or film. But games don’t just tell stories; they tell them through us, a unique way to find something within ourselves.

A few months later, a notice lies on my desk among the clutter of unfinished assignments: I’m failing English, so I’ve been kicked off the Hightstown High robotics team. I know that if I hand in my late work along with a sufficient apology, I’ll be fine. But I’ve stopped taking my ADHD meds, and my motivation is inert. Nearby, the Xbox monitor flickers. So I scroll through the titles, landing on Dark Souls III, famous for its difficulty. Even so, I’m shocked when the first boss pummels my avatar into the dirt without giving me time to react. Frustrated but a little excited, I try again. And again. 

The game takes place in a dying, hopeless world, where my character is simply trying to stay alive. I’m caught up in the general sense of aimlessness. The game resets every time my character dies or I take a break. No matter how well I do, nothing changes; any progress is strictly internal, and I can’t change the world around me. It was like real life, where I was just going through the motions without believing I could make a difference. I didn’t realize it was teaching me something.

To make the game, lead developer Hidetaka Miyazaki defied his bosses at FromSoftware, a Japanese video game company, and ignored industry norms that typically let players prevail with minimal or manageable friction. Instead, he built a game about pain for pain’s sake — and a genre-defining cult classic. It provides a rigid set of challenges and doesn’t extend a hand to make them easier. Even the narrative is frustrating, as the player’s actions mean nothing to the broader story. It’s pure adversity, and that’s the charm. 

Writing for CNET, critic Andrew Gebhart argued that Dark Souls “served as a genuine catalyst for entering adulthood.” Its strict, consistent difficulty helped him learn personal responsibility. The player had to find meaning in his own struggles. In a perhaps inadvertent echo of “Night,” the classic memoir by Holocaust survivor Elie Wiesel, the player’s only power was how he faced that struggle. The outcome never changed. All that mattered was the effort. That lesson filtered into Gebhart’s life, as he pivoted toward “the pleasure of pursuing the experience and knowing the most memorable moments were ones that I made for myself.” 

After slamming my head into that first boss enough times to finally beat it, I feel oddly proud. I didn’t win in a skillful way, and I was an inch from death, but I white-knuckled my way through the encounter, and that feels great. Glancing away from the screen, my eyes fall on the pile of homework. I put down the controller and pick up a pencil. By semester’s end, an A in English gets me back on robotics. 

My hair is short now and my vision is clear. It’s 2022 and my desk is still messy, but the assignments are done, my grades solid. On the screen, I move across a glistening desert — or my character does. A mountain looms in the distance, not unlike the ridgelines of the Wasatch Front outside my apartment window. I’m a senior at the University of Utah majoring in games, so this is homework now. Everything I encounter seems to be nudging me toward that mountain, even a second character that appears next to mine. So we team up and head that way.

I came here to study game design, a newer field that incorporates classic design and psychology with a focus on the rules at work in video games. But I’m most interested in how these things tell stories, so I’m also taking an English class that compares games to movies and literature. Tonight’s assignment: Journey, released in 2012 for the PlayStation 3. The title references the hero’s journey, an archetypal plot structure derived from common themes in world mythologies as compiled by Joseph Campbell, a prominent thinker and literature professor at Sarah Lawrence College who died in 1987. 

Many beloved stories, as varied as “The Odyssey” by Homer and the original “Star Wars” movie, follow this arc of a central hero who must leave his ordinary world in order to resolve some problem or challenge. From the Call to Adventure (Luke Skywalker is given his father’s lightsaber) to the Belly of the Beast (Luke helps infiltrate the Death Star) and Apotheosis (using the Force, Luke destroys the Death Star and saves the galaxy), this classic framework lets the hero grow physically, mentally, even spiritually. Journey follows that arc almost exactly.

The game also taught me something unique. This was a story of movement, told through the player’s ability to walk and fly. Character and plot were only implicit, baked into the experience of traversing this world. It was an experimental project that pushed the bounds of the medium. Today, indie developers can access the same tools as major studios and distribute their work through storefronts like itch.io and Steam. Anyone can be a “game dev” now — like Salt Lake City resident David Payne, who builds craft cabinet games for his Gallery of Fine Hyper-Art, or “GOFHA,” a specialized arcade housed in a converted school bus. And games can now be anything they want to be, including literature.

But games can do things books cannot, like bringing two strangers together. As I struggle toward the snowy peak in Journey, my ad-hoc partner, who can only communicate through chirps, moves next to me and warms me up, and we find that the game lets us move faster when we’re together. “The players exist to enable each other’s journey to the top of the mountain,” explains my professor, Alf Seegert, “and the mountain exists to bring the two players together.” We know that reading novels fosters empathy, but this is another level. When our avatars finally reach the summit and the screen fades to white, I take off my headphones and realize I’ve been crying. The feeling is profound and familiar, but also something only a video game could have evoked for me this way. Immediately I need to understand why and share it with others. I feel motivated in a way I’d never felt before. 

During a recent visit home to New Jersey, my father and I get to talking about art. I’m in a gap year, hoping to get accepted to my school’s graduate English program to study video games as literature. Utah is at the vanguard, hiring people to research and teach these new ideas within the context of classic fiction. So I remind him of our old dinner table conversations, and make a case that experiencing narrative through game play is as potent as any form of storytelling. And he agrees. I finally found my “thing.” Turns out it was with me all along. 

This story appears in the December issue of Deseret Magazine.