
Video Games Are Better Than Real Life

On the evening of November 9, having barely been awake to see the day, I took the subway to Sunset Park. My objective was to meet a friend at the arcade Next Level.

In size, Next Level resembles a hole-in-the-wall Chinese restaurant. It does indeed serve food — free fried chicken and shrimp were provided that night, and candy, soda, and energy drinks were available at a reasonable markup — but the sustenance it provides is mostly of a different nature. Much of Next Level’s space was devoted to brilliant banks of monitors hooked up to video-game consoles, and much of the remaining space was occupied by men in their 20s avidly facing them. It cost us $10 each to enter.

I had bonded with Leon, a graphic designer, musician, and Twitter magnate, over our shared viewership of online broadcasts of the Street Fighter tournaments held every Wednesday night at Next Level. It was his first time attending the venue in person and his first time entering the tournament. I wasn’t playing, but I wanted to see how he’d do, in part because I had taken to wondering more about video games lately — the nature of their appeal, their central logic, perhaps what they might illuminate about what had happened the night before. Like so many others, I played video games, often to excess, and had done so eagerly since childhood, to the point where the games we played became, necessarily, reflections of our being.

To the uninitiated, the figures are nothing if not staggering: 155 million Americans play video games, more than the number who voted in November’s presidential election. And they play them a lot: According to a variety of recent studies, more than 40 percent of Americans play at least three hours a week, 34 million play on average 22 hours each week, 5 million hit 40 hours, and by the time he or she turns 21, the average young American will have spent as many hours playing (roughly 10,000) as in middle- and high-school classrooms combined. Which means that a niche activity confined a few decades ago to preadolescents and adolescents has become, increasingly, a cultural juggernaut for all races, genders, and ages. How had video games, over that time, ascended within American and world culture to a scale rivaling sports, film, and television? Like those other entertainments, video games offered an escape, of course. But what kind?

In 1993, the psychologist Peter D. Kramer published Listening to Prozac, asking what we could learn from the sudden mania for antidepressants in America. A few months before the election, an acquaintance had put the same question to me about video games: What do they give gamers that the real world doesn’t?

The first of the expert witnesses I had come to Next Level to speak with was the establishment’s co-owner. I didn’t know him personally, but I knew his name and face from online research, and I waited for an opportune moment to approach him. Eventually, it came. I haltingly asked if he’d be willing, sometime later that night, to talk about video games: what they were, what they meant, what their future might be — what they said, perhaps, about the larger world.

“Yes,” he replied. “But nothing about politics.”

In June, Erik Hurst, a professor at the University of Chicago’s Booth School of Business, delivered a graduation address and later wrote an essay in which he publicized statistics showing that, compared with the beginning of the millennium, working-class men in their 20s were on average working four fewer hours per week and playing video games three more. As a demographic, they had replaced the lost work time with playtime spent gaming. How had this happened? Technology, through automation, had reduced the employment rate of these men by reducing demand for what Hurst referred to as “lower-skilled” labor. He proposed that by creating more vivid and engrossing gaming experiences, technology also increased the subjective value of leisure relative to labor. He was alarmed by what this meant for those who chose to play video games and were not working: He cited the dire long-term prospects of these less-employed men, pointed to comparatively high levels of financial instability, drug use, and suicide among this cohort, and connected them, speculatively, to “voting patterns for certain candidates in recent periods,” by which one doubts he meant Hillary Clinton.

But the most striking fact was not the grim futures of this presently unemployed group. It was their happy present — which he neglected to emphasize. The men whose experiences he described were not in any meaningful way despairing. In fact, the opposite. “If we go to surveys that track subjective well-being,” he wrote, “lower-skilled young men in 2014 reported being much happier on average than did lower-skilled men in the early 2000s. This increase in happiness is despite their employment rate falling by 10 percentage points and the increased propensity to be living in their parents’ basement.” The games were obviously a comforting distraction for those playing them. But they were also, it follows, giving players something, or some things, their lives could not.

The professor was nevertheless concerned: If young men were working less and playing video games, they were losing access to valuable on-the-job skills that would help them stay employed into middle age and beyond. At the commencement, Hurst was not just speaking abstractly — and warning not just of the risk to the struggling working classes. In fact, his argument was most convincing when it returned to his home, and his son, who almost seemed to have inspired the whole inquiry. “He is allowed a couple of hours of video-game time on the weekend, when homework is done,” Hurst wrote. “However, if it were up to him, I have no doubt he would play video games 23 and a half hours per day. He told me so. If we didn’t ration video games, I am not sure he would ever eat. I am positive he wouldn’t shower.”

My freshman year, I lived next door to Y, a senior majoring in management science and engineering whose capacity to immerse himself in the logic of any game and master it could only be described as exceptional. (This skill wasn’t restricted to electronic games, either: He also played chess competitively.) Y was far and away the most intrepid gamer I’d ever met; he was also an unfailingly kind person. He schooled me in Starcraft, let me fiddle around on the PlayStation 2 he kept in his room while he worked or played on his PC. An older brother and oldest child, I had always wanted an older brother of my own, and in this regard, Y, tolerant and wise, was more or less ideal.

Then, two days before Thanksgiving, a game called World of Warcraft was released. The game didn’t inaugurate the genre of massively multiplayer online role-playing games (MMORPGs), but given its enormous and sustained success — augmented by various expansions, it continues to this day — it might as well have. Situated on the sprawling plains of cyberspace, the world of World of Warcraft was immense, colorful, and virtually unlimited. Today’s WoW has countless quests to complete, items to collect, weapons and supplies to purchase. It was only natural that Y would dive in headfirst.

This he did, but he didn’t come out. There was too much to absorb. He started skipping classes, staying up later and later. Before, I’d leave when it was time for him to sleep. Now, it seemed, the lights in his room were on at all hours. Soon he stopped attending class altogether, and soon after that he left campus without graduating. A year later, I learned from M, his friend who’d lived next door to me on the other side, that he was apparently working in a big-box store because his parents had made him; aside from that, he spent every waking hour in-game. Despite having begun my freshman year as he began his senior one, and despite my being delayed by a yearlong leave of absence, I ended up graduating two years ahead of him.

Y’s fine now, I think. He did finally graduate, and today he works as a data scientist. No doubt he’s earning what economists would term a higher-skilled salary. But for several years he was lost to the World, given over totally and willingly to a domain of meanings legible only to other players and valid only for him. Given his temperament and dedication, I feel comfortable saying that he wasn’t depressed. Depression feels like an absence of meaning, but as long as he was immersed in the game, I believe that his life was saturated with meaning. He definitely knew what to do, and I would bet that he was happy. The truth is, as odd as it might sound, considering his complete commitment to that game, I envy this experience as much as I fear it. For half a decade, it seems to me, he set a higher value on his in-game life than on his “real” life.

What did the game offer that the rest of the world could not? To begin with, games make sense, unlike life: As with all sports, digital or analog, there are ground rules that determine success (rules that, unlike those in society, are clear to all). Within a game, unlike in society, its purpose is directly recognized and never discounted. You are always a protagonist: Unlike film and television, where one can only watch the acts of others, a game makes one an agent within its world. And unlike someone playing sports, one no longer has to leave the house to compete, explore, commune, exercise agency, or be happy, and the game possesses the potential to let one do all of these at once. The environment of the game might be challenging, but in another sense it is literally designed for a player to succeed — or, in the case of multiplayer games, to have a fair chance at success. In those games, too, players typically begin in the same place, and in public agreement about what counts for status and how to get it. In other words, games look like the perfect meritocracies we are taught to expect for ourselves from childhood but never actually find in adulthood.

And then there is the drug effect. In converting achievement into a reliable drug, games allow one to turn the rest of the world off to an unprecedented degree; gaming’s opiate-like trance can be delivered with greater immediacy only by, well, actual opiates. It’s probably no accident that, so far, the most lucid writing on the consciousness of gaming comes from Michael Clune, an academic and author best known for White Out, a memoir about his former heroin addiction. Clune is alert to the rhetoric and logic of the binge; he recognizes the difference between prosaic activities, where experience is readily rendered in words, and activities like gaming and drugs, where the intensity eclipses language. Games possess narratives that have the power to seal themselves off from the narratives in the world beyond them. The gamer is driven by an array of hermetic incentives only partially and intermittently accessible from without, like the view over a nose-high wall.

In Tony Tulathimutte’s novel Private Citizens, the narrator describes the feeling near a porn binge’s end, when one has “killed a week and didn’t know what to do with its corpse.” An equally memorable portrait of the binge comes from the singer Lana Del Rey, who rose to stardom in 2011 on the strength of a single titled “Video Games.” In the song, Del Rey’s lover plays video games; he watches her undress for him; later, she ends up gaming. Pairing plush orchestration with a languid, serpentine delivery, the song evokes an atmosphere of calm, luxurious delight where fulfillment and artifice conspire to pacify and charm. The song doesn’t just cite video games; it sounds the way playing video games feels, at least at the dawn of the binge — a rapturous caving in.

Images from Javier Laspiur’s “Controllers” series, in which he photographed himself with each video-game system he played over the years, beginning with Teletenis in 1983 and ending with the PlayStation Vita in 2013. The composite image that opens this story was built by Laspiur from these images. Photo: Javier Laspiur

Of course, it was not video games generally that removed Y from school but, allegedly, one specific and extraordinary game. In much the same way that video gaming subsumes most of the appeals of other leisure activities into itself, World of Warcraft fuses the attractions of most video games into a single package. It’s not just a game; in many ways, it’s the game of games. Set in a fantasy universe influenced by Tolkien and designed to support Tolkienesque role-playing, the game, digitally rendered, is immeasurably more colorful and elaborate than anything the Oxford don ever wrote: If The Lord of the Rings books are focused on a single, all-important quest, World of Warcraft is structured around thousands of quests (raids, explorations) that the player, alone or teaming with others, may choose to complete.

The successful completion of these quests, whether great or small, leads to the acquisition of in-game currency, equipment, and experience points. Created by the Irvine-based developer Blizzard (in many ways the Apple of game developers), WoW is rooted in an ethos of self-advancement entirely alien to that of Tolkien’s Middle-earth, where smallness and humility are the paramount virtues. There is little to be gained by remaining at a low level in WoW, and a great deal to be lost. The marginal social status of the gamer IRL has been a commonplace for some time — even for those who are, or whose families are, relatively well-off. What a game as maximalist and exemplary as WoW is best suited to reveal is the degree to which status is in the eye of the beholder: There are gamers who view themselves in the light of the game, and once there are enough of them, they constitute a self-sufficient context in which they become the central figures, the successes, by playing. At its peak, WoW counted 12.5 million subscribers, each of them paying about $15 monthly for the privilege (after the initial purchase). When you consider how tightly rationed status is outside the game, how unclear the rules are, how loosely achievement is tied to recognition, how many credentials and connections and how much unpleasantness are required to level up there, it seems like a bargain.

Of course, there are other games, and other reasons to play beyond achieving status. Richard Bartle, a British game-design researcher and professor, constructed a much-cited taxonomy of gamers based on his observations of MUD, an early text-based multiplayer game he co-created in 1978. These gamers, according to Bartle, can be subdivided into four classes: achievers, competing with one another to reap rewards from the game engine; explorers, seeking out the novelties and kinks of the system; socializers, for whom the game serves merely as a pretext for conversations with one another; and killers, who kill. It isn’t hard to extend the fourfold division from gamers to games: Just as there are video games, WoW chief among them, that are geared toward achievers, there are games suited to the other three branches of gamers.

In many major games of exploration, like Grand Theft Auto or Minecraft, the “objectives” of the game can be almost beside the point. Other times, the player explores by pursuing a novel-like narrative. The main character of the tactical espionage game Metal Gear Solid 3 is a well-toned Cold War–era CIA operative who finds himself suddenly in the forests of the USSR; the hero of the choose-your-own-adventure game Life Is Strange is a contemporary high-school student in Oregon, and her estrangement results from her discovery that she can, to a limited extent, reverse time. These games are all fundamentally single-player: Solitude is the condition for exploring within games in much the same way that it is for reading a novel.

While explorers commune with a story or storyteller, socializers communicate with one another: The games that serve as the best catalysts for conversation are their natural preference. Virtually any game can act as a bonding agent, but perhaps the best examples are party games like Nintendo’s Mario Party series, which are just board games in electronic form, or the Super Smash Bros. series, in which four players in the same room each select a character from a Nintendo game with which to cheerfully clobber the others. The story, in these games, isn’t inside the game. It’s between the players as they build up camaraderie through opposition.

The ultimate games for killers aren’t fighting games so much as first-person shooters: Counter-Strike, when played in competitive mode, obliges you to play as one member of a team of five whose task is to eliminate an enemy quintet. The teams take turns being terrorists, who must plant and detonate a bomb, and counterterrorists, who must deny them. What beauty exists is found only in feats of split-second execution: improbable headshots, inspired ambushes, precisely coordinated spot rushes.

What’s odd is that across these groups of games there’s perhaps as much unity as difference. Many of the themes blend together. Achievement can be seen as a mode of exploration and seems as viable a basis for socializing as any other. Socializing can be grouped with achievement as a sign of self-actualization. And killing? Few things are more ubiquitous in gaming than killing. Each of the novel-like games cited above forces the player-protagonist to kill one or more of his or her closest friends. Even a game as rudimentary as Tetris can be framed as an unending spree of eliminations.

Perhaps psychological types are a less useful rubric than, say, geological strata. Just as games themselves are divided into distinct stages, levels divide the game experience as a whole.

The first, most superficial level is the most attractive: the simple draw of a glowing screen on which some compelling activity unfolds. There will always be a tawdry, malformed aspect to gaming — surely human beings were made for something more than this? — but games become more than games when displayed vividly and electronically. Freed from the pettiness of cardboard and tokens, video games, like the rest of screen culture, conjure the specter of a different, better world by contrasting a colorful, radiant display with the dim materials of the dusty world surrounding them.

Second: narrative. Like film and television, many video games rely heavily on narrative and character to sustain interest, but just as those mediums separated themselves from theater by taking full advantage of the camera’s capacity for different perspectives, video games distinguish themselves from film and television in granting the viewer a measure of control. What fiction writing achieves only rarely — the intimate coordination of reader and character — the video-game system achieves by default. Literary style pulls together character and reader; technology can implant the reader, as controller, within the character.

Third: objectives, pure and simple. Action games and platformers (like Mario) in which the player controls a fighter; strategy games in which the player controls an army; grand strategy games in which the player controls an empire; racing games in which the player controls a vehicle; puzzle games in which the player manipulates geometry; sports games; fighting games; SimCity: These are genres of games where plot is merely a function of competition, character is merely a function of success, and goals take precedence over words. Developing characters statistically by “leveling up” can feel more important, and gratifying, than developing characters psychologically by progressing through the plot. The graphics may or may not be polished, but the transactional protocol of video games — do this and you’ll improve by this much — must remain constant; without it, the game, any game, would be senseless.

Fourth: economics. Since every game is reliant on this addictive incentive system, every gamer harbors a game theorist, a situational logician blindly valorizing the optimization of quantified indices of “growth” — in other words, an economist. Resource management is to video games what African-American English is to rap music or what the visible sex act is to pornography — the element without which all else is unimaginable. In games as in the market, numbers come first. They have to go up. Our job is to keep up with them, and all else can wait or go to hell.

And there is something sublime, though not beautiful, about the whole experience: Video games are rife with those Pythagorean vistas so adored by Americans, made up of numbers all the way down; they solve the question of meaning in a world where transcendent values have vanished. Still, the satisfaction found in gaming can only be a pale reflection of the satisfaction absent from the world beyond. We turn to games when real life fails us — not merely in touristic fashion but closer to the case of emigrants, fleeing a home that has no place for them.

Gamers have their own fantasies of prosperity, fantasies that sometimes come true. For a few, gaming has already become a viable and lucrative profession. Saahil Arora, an American college dropout who plays professional Dota 2 under the name UNiVeRsE, is reportedly the richest competitive gamer: He has earned $2.7 million in his career so far. But even Arora’s income is dwarfed by the incomes of a handful of YouTube (and Twitch) broadcasters with a fraction of his talent: Just by filming themselves playing through games in a ludicrously excitable state for a young audience of fellow suburbanites, they pull in ad and subscription earnings in the mid seven figures. The prospects for those who had gathered at Next Level that chilly November night were not quite so sunny. The fighting-game community (FGC), which has developed around one-on-one games like Street Fighter, and for which Next Level serves as a training ground, has yet to reach the popularity of multiplayer online battle arenas (MOBAs) like Dota 2, or first-person shooters, such as Counter-Strike. (The scene is taking steps in that direction: 2016 marked the first year that the Street Fighter V world championships were broadcast on ESPN2 as well as the first time that an American FGC player — Du Dang, from Florida — took the title over top players from Japan.)

Still, according to a veteran of the community (16 of his 34 years), Sanford Kelly, the fighting-game scene has a long way to go. Though he personally isn’t fond of Street Fighter V, the latest iteration in the series, his energies are devoted to guiding the New York FGC to become more respectable and therefore more attractive to e-sports organizations that might sponsor its members: “We have to change our image, and we have to be more professional.” Compared with other branches of American e-sports, dominated by white and Asian players, the FGC has a reputation that’s always been more colorful: It’s composed primarily of black players like Kelly, Asian players like his longtime Marvel rivals Justin Wong and Duc Do, and Latino gamers, and its brash self-presentation is influenced by the street culture that gave rise to hip-hop. With a typical mixture of resignation and determination, Kelly has internalized the fact that, locally and nationally, his scene would have to move away from its roots to move to a larger stage. But the competitive gaming economy has already reached the point where, as the streamer, commentator, and player Arturo Sanchez told me, an FGC player can make a viable living. “So long as you don’t have unrealistic ambitions.” Between the money gleaned from subscriptions to his Twitch channel, payments for streaming larger tournaments, sponsor fees from businesses that pay for advertising in the breaks between matches, crowdfunding, merchandise, and YouTube revenue, Sanchez is able to scratch out a living, comfortably if not prosperously, as a full-time gamer.

Next Level itself is not financially self-sufficient: Without additional income, including from its co-owner and co-founder Henry Cen (a former day trader), it couldn’t pay the rent. “Only rich countries can have places like this,” says the bespectacled and crane-thin Cen. “You wouldn’t see this in Third World countries.” He describes the people who make up the majority of New York’s FGC as coming from blue-collar families: “They’re not the richest of people. There are some individuals that are, but most people that do have money, they want to do something more with their money.” He’s relatively pessimistic about the possibility of becoming a professional gamer: Considering the economic pressures on FGC members and the still small size (roughly 100,000 viewers at most) of the viewing audience, it’s a career that’s available only to the top “0.01 percent” of players. Family pressures to pull back from gaming are strong: Even Justin Wong, one of the happy few who succeeded in becoming a professional, reportedly hid the fact from his family for a long time. “His family did not accept him as a gamer, but recently, they have changed their opinion,” says Cen.

“Because he started bringing in money,” I speculated.

“Yes. If you’re doing gaming, especially if you’re an Asian, your progress in life is measured by only one thing: money.”

Like Professor Hurst, I was interested in the political valence of gaming: Was there something fundamental to the pastime that inevitably promoted a dangerous politics? I was intrigued by the data Hurst cited, and during the recent campaign and immediately after, a number of writers noted the connection between Trump supporters and the world of militant gamer-trolls determined to make gaming great again through harassment and expulsion. But as a gamer myself, I found this ominous vision incomplete at best: Most gamers weren’t Trump-adjacent, and if Trumpism corresponded to any game, I thought, it was one that, in its disastrous physicality, could never become a video game: not Final Fantasy but Jenga. (Jenga is now on Nintendo Wii, I’m told.) On the other hand, I’ve never found it easy to trust my own perceptions, so I reached out to friends and acquaintances who were also gamers to learn from their experiences.

Though none of us is a Trumpist, no discourse could unite us. We were trading dispatches atop the Tower of Babel. We got different things out of gaming because we were looking for different things. Some of us greatly preferred single-player games, and some could barely stand to play games alone. Some of us held that writing about games was no more difficult than writing about any other subject; some of us found, and find, the task insanely difficult. Some of us just played more than others — Tony Tulathimutte listed 28 games as personal favorites. He and Bijan Stephen, also a writer, both had a fondness for secondary characters. (Stephen: “I love the weird helpers like Toad and the wizards in Gauntlet — not because they’re necessarily support characters but because they’ve got these defined roles that only work in relation to the other players.”) Meanwhile, Emma Janaskie, an associate editor at Ecco Books, spoke about her favorite games’ main characters, especially Lara Croft. Janaskie’s longest run of gaming lasted ten hours, compared with Stephen’s record of six hours and Tulathimutte’s of 16. When likewise queried, the art critic and gaming writer Nora Khan laughingly asked if she could go off the record, then recalled: “I’ve gotten up to take breaks and stuff, but I’ve played through all of Skyrim once,” adding parenthetically that “Skyrim is a 60-to-80-hour game.”

Janaskie and Tulathimutte made strong avowals that gaming fell squarely within the literary field (Tulathimutte: “Gaming can be literary the same way books can be. DOS for Dummies and Tetris aren’t literary, but Middlemarch and The Last of Us are, and each has its purpose”); I found the proposition more dubious.

“It seems to me that writers get into games precisely because it’s almost the antithesis of writing,” I said to Khan.

“Absolutely,” she said.

“When you’re writing, you don’t know what the stakes are. The question of what victory or defeat is — those questions are very hard to pin down. Whereas with a game, you know exactly what the parameters are.”

“Yes. I wouldn’t say that for everyone. Completing a quest or completing the mission was never really very interesting to me personally. For me, it’s more meditative. When I play Grand Theft Auto V, it’s just a way to shut off all the noise and for once be in a space where I don’t need to be critical or intellectualize something. Because I’m doing that all the time. I just go off and drive — honestly, that’s what I do in real life, too. When I just want to drop out of the situation, I’ll go and drive outside of the city.”

I wouldn’t trade my life or my past for any other, but there have been times when I’ve wanted to swap the writing life and the frigid self-consciousness it compels for the gamer’s striving and satisfaction, the infinite sense of passing back and forth (being an “ambiguous conduit,” in Janaskie’s poignant phrase) between number and body. The appeal can’t be that much different for nonwriters subjected to similar social or economic pressures, or for those with other ambitions, maybe especially those whose ambitions have become more dream state than plausible, actionable future. True, there are other ways to depress mental turnout. But I don’t trust my body with intoxicants; so far as music goes, I’ve found few listening experiences more gratifying or revealing than hearing an album on repeat while performing some repetitive in-game task. Gaming offers the solitude of writing without the strain of performance, the certitude of drug addiction minus its permanent physical damage, the elation of sports divorced from the body’s mortality. And, perhaps, the ritual of religion without the dogma. For all the real and purported novelty of video games, they offer nothing so much as the promise of repetition. Life is terrifying; why not, then, live through what you already know — a fundamental pulse, speechless and without thought?

After college graduation, once I’d been living back home unemployed with my father for a few months, he confronted me over the dinner table with a question. Given the vast sums of time he’d witnessed me expend on video games both recently and in my youth, wouldn’t it be right to say that gaming, not writing, was what I really wanted to do with my life?

I responded that my goal was to become a writer, and I meant it. But first I had to pause a few seconds to be sure. It’s true that the postgraduate years I spent jobless with my father laid the foundation for what I can do as a writer. I read literature, read history, studied maps, watched films and television, listened to music. I lifted weights in the basement. I survived my final episode of clinical depression and finished translating a mid-19th-century French poet who laid the foundation for literary modernism. But when I was too weak to do these things, and I often was, that so-called writer (zero pitches, zero publications) was, in Baudelaire’s phrase, a “drunkard of his own blood” obsessively replaying the video games of his adolescence — so as to re-create a sense, tawdry and malformed but also quantifiable, of status advancement in an existence that was, by any worldly standard, I knew, stagnant and decrepit. It didn’t matter that the world, by its own standards of economic growth, was itself worn down and running on fumes. Regardless of the rightness of the world, one cannot help but feel great individual guilt for failing to find a meaningful activity and position within it. And regardless of whether it benefits one in the long run, video games can ease that guilt tremendously in the immediate present.

The strange thing is that that guilt should be gone now. I have made a name as a writer. Yet I can’t say that I’ve left the game. In the weeks after writing my first major piece, a long book review, I fired lasers at robots in space for 200 hours. Two summers ago, I played a zombie game in survival mode alone for a week; eventually the zombies, which one must take down precisely and rapidly lest one be swarmed, started to remind me of emails. A few months back, I used a glitch to amass, over the course of several hours, a billion dollars in a game where there’s nothing to buy besides weapons, which you can get for free anyhow. I reinstalled the zombie game on Election Night.

Is it an addiction? Of course. But one’s addiction is always more than a private affair: It speaks to the health and the logic of society at large. Gaming didn’t impact the election, but electing to secede from reality is political, too. I suspect that the total intensity of the passion with which gamers throughout society surrender themselves to their pastime is an implicit register of how awful, grim, and forbidding the world outside them has become — the world that is gaming’s ultimate level, a space determined by finance and labor, food and housing, race and education, gender and art, with so many tests and so many bosses. Just as a wrong life cannot be lived rightly, a bad game cannot be played well. But for lack of an alternative, we live within one, and suffer from its scarcity.

*This article appears in the February 20, 2017, issue of New York Magazine.
