What Were the 2010s?

2010s Recap, Part 1: Dancin’ Till the World Ends

On season two of The New Millennium, EDM, Four Loko, and a little place called the Jersey Shore.

Clockwise from top left: Thor’s hammer from Avengers; Ned Stark holding his sword; zombies from The Walking Dead; Pauly D from Jersey Shore; Kim Kardashian using a selfie stick. Illustration: Ari Liloan

The first installment in a three-part recap of the 2010s. Read more about this series here.

Part One: 2010-2013

Time magazine dubbed the aughts “the Decade from Hell,” and as the calendar shifted, optimism for the future was muted at best. On New Year’s Day, the United States was not officially in a recession, but the hangover was long and painful. Unemployment had peaked at 10 percent in October 2009, and would not return to precrash levels until 2017. Those lucky enough to keep their jobs had to navigate an employment landscape where efficiency and productivity overrode all other concerns. Bowing to unspoken pressure, white-collar workers began refraining from taking vacations or lunch breaks, while hourly workers often had to contend with schedules that were at best unstable or at worst literally backbreaking. In the wake of the crash, a new breed of job emerged in what became known as the “gig economy.” Companies like Uber, Amazon, and TaskRabbit promised newfound freedom and flexibility to the formerly jobless. Alongside apps such as Seamless and Airbnb, these businesses would offer the urban haute bourgeoisie a new frictionless convenience, though their main innovation turned out to be classifying workers as independent contractors.

For the Americans just coming of age, these years of privation created habits that would last into adulthood. The later years of the decade would be filled with obituaries for the items “killed” by millennials, and a scan of the victims read like a survey of the shifting consumer patterns of an indebted generation. Among the departed were paper napkins (replaced by paper towels, which did the same job); fabric softener (an easily forsworn luxury); credit cards (scarring memories of the crash); and casual dining (too expensive, and who had the time?). In this atmosphere of thrift, the companies that survived were those positioned around values like simplicity, humility, and asceticism, which is another way of saying it was a big time for Greek yogurt and activities like Tough Mudder. Even the great American tradition of flaunting your material possessions was looked down upon, as Jay-Z and Kanye West learned from the minor backlash that greeted their album Watch the Throne.

From left: Occupy Wall Street and Tough Mudders: protests against precarity, and a fantasy of the same. Photo: Emmanuel Dunand/AFP via Getty Images; Michael Nagle/Bloomberg via Getty Images

Also thriving were what became known in the parlance of the time as the “one percent.” The phrase was popularized by the Occupy movement, which in the fall of 2011 took over public spaces around the world in a series of wide-ranging protests against income inequality, corporate greed, and basically everything else. Critics mocked these self-described representatives of the 99 percent for their iPhones and Apple laptops, though in this case, that critique proved misguided: Over the course of the decade, high-tech gadgets like smartphones got cheap enough to become commonplace; it was necessities like health care and education that skyrocketed in price.

As in Weimar-era Germany, economic disaster brought a new hedonism into the air. (The haircuts, too, were similar.) Only now, the vibe was a little brawnier than Brecht. The exemplars of the age were the bronzed, shirtless partiers of Jersey Shore, whose litany of “gym, tan, laundry” infused a lifestyle built around sex and alcohol with a sweetly childlike fussiness. Their fellow travelers’ fist-bumping efforts were aided by cans of Four Loko, an ultrapotent malt beverage that combined four beers’ worth of alcohol with the caffeine equivalent of a cup and a half of coffee. The health risks of such an elixir spurred statewide bans and an FDA warning, leading the beverage’s manufacturers to recalibrate the recipe in the autumn of 2010, though there was enough lead time for fans to stockpile the original version.

Music of this period reflected the apocalyptic air. R&B developed a woozy, 4 a.m. vibe in the work of artists like Drake and the Weeknd, while mainstream pop absorbed the aesthetic of electronic dance music and its throbbing 4/4 beats. Hit singles emphasized the need to disregard the future: “Till the World Ends,” “Die Young,” “One More Night,” “Live While We’re Young.” With no tomorrow worth living for, the watchword was tonight: Party rock is in the house tonight; give me everything tonight; tonight, we are young. This sentiment was given its purest expression in Drake’s “The Motto,” which repopularized the adage “You only live once,” distilling it down to the acronym YOLO, which soon became a rallying cry for the nation’s youth.

From left: LMFAO and Mumford & Sons: Opposing ends of the authenticity spectrum. Photo: Kevin Winter/DCNYRE2012/Getty Images for DCP; Kevin Mazur/WireImage

EDM’s influence gave the pop of this era a shiny, metallic vibe, though some musicians had fun deconstructing this mechanical sound, imagining it all collapsing into a shrieking wreck. This was called dubstep. On the other side of the spectrum, the artificiality of Top 40 spurred another folk revival, made up of self-consciously authentic groups like Mumford & Sons, the Lumineers, and Of Monsters and Men. In wardrobe, these groups picked from the closet of 20th-century Americana, equipping themselves with beards, banjos, and collarless shirts worn with vests. In sound, they attempted to mimic the effect of 40 people all stomping in unison in one tiny room. This roots relaunch proved to have nearly as much mainstream appeal as EDM pop, though as Nitsuh Abebe observed, much of the feeling the music aroused was “inspiring on roughly the same level that your bank would like to inspire you to enjoy the freedom of no-fee checking.” In 2013, thesis and antithesis finally came together in the form of Avicii’s “Wake Me Up,” at which point both genres reached their natural states of exhaustion. Later that year Beyoncé would rap, “Radio say ‘Speed it up,’ I just go slower,” and just like that, the rest of the music industry did, too.

Social media was a creation of the aughts. As the decade turned, Facebook was so well-established that The Social Network was already months into production; Twitter had blown up the preceding summer as users closely monitored the health-care-reform negotiations and the Iranian election protests. However, those companies had been created in a world where most people were accessing new tech through desktop computers. The new generation of social networks would be native to smartphones. The trio of Instagram (which debuted in 2010), Snapchat (2011), and Tinder (2012) exemplified the shift: All three were built around the primacy of the image, often the cell-phone photo: blurry and indecipherable at the beginning of the decade, but gradually clearer and more colorful as time went on. The summer of 2010 brought the first iPhone with a front-facing camera, and the selfie soon became the decade’s defining social currency. This rise in digital self-portraiture sparked endless public debate: Were selfies a symbol of the inherent narcissism of the age, or a daring form of self-expression? Whatever the case, it was too simplistic to paint them as merely a pastime for the young and dewy; once introduced to the concept, older generations were equally transfixed.

Each of these networks sought to occupy as much of its users’ time as possible, and hours spent in what technologists called “the stream” could not help but have a psychological effect. Facebook allowed friends and family to stay connected in ways that were previously unimaginable; it also made its users feel ever more lonely — and Instagram was even worse. Twitter helped voices often ignored by mainstream media come to the fore; it also rewarded snap judgments and uninhibited cruelty. Tinder and its older cousin Grindr held the promise of instant casual sex, though for many users the expectation of such far outstripped reality. This writer was too old for Snapchat, but a brief observation of Generation Z suggests the app imbued those raised on it with a penchant for surrealist humor totally impenetrable to outsiders.

Taken together, these channels transformed the experience of mass media from a series of messages sent out by powerful central authorities to a back-and-forth conversation between near-infinite points of view. This shift brought forth new business incentives. The most important quality in what was now called “content” was shareability, which proved to be a moving target thanks to the capriciousness of the Facebook algorithm, but in this period mostly meant uplift and relatability. In this environment the base unit of journalism became the headline, the only part of an article that most people saw. Thus from its earliest stages social media was rife with misinformation and context collapse, a development milked by right-wing tricksters like Andrew Breitbart and James O’Keefe, whose selectively edited videos created frequent political firestorms.

Kim Kardashian weds Kris Humphries in 2011. They are married for 72 days. Photo: Albert Michael/StarTraks Photo

The celebrities who captured public attention during this time were those best able to capitalize on the breakdown of barriers between the physical world and the digital world. A key figure was Kim Kardashian, a California socialite who had come up the old-fashioned way, starring alongside her alliterative siblings in the reality show Keeping Up With the Kardashians. But it was the family’s savvy use of social media — masterfully planned, it was said, by “momager” Kris Jenner — that kept them in the spotlight through a time when contemporaries like Paris Hilton had long since faded. Countless more found celebrity on a smaller scale. On YouTube, vloggers amassed large followings by speaking to viewers as if they were trusted friends. On Instagram, those who were not yet known as influencers were able to support themselves by posing in dramatic outfits in front of memorable backdrops, as long as they shilled for products on the side. Twitch, launched in 2011, introduced the concept of the celebrity gamer. Beauty and charm had always mattered in society, but now those qualities were monetizable like never before.

Even the upper heights of the A-list felt the shift. Glamorous, remote figures like Angelina Jolie were supplanted by endearingly relatable stars next door like Jennifer Lawrence, Chris Pratt, and Emma Stone. Classically restrained stars like Ryan Gosling saw their profiles rise after being turned into memes, while Charlie Sheen pioneered a new breed of fame by essentially transforming himself into one in the midst of a public breakdown. (As Sheen, the musical The Book of Mormon, and the rap group Odd Future demonstrated, this was still an age where purposeful offensiveness could enthrall.) The decade’s new idols would embrace a messy fallibility — in part, perhaps, because the development of fan culture on Tumblr and Twitter meant they were widely praised no matter what they did.

The literary world was not immune to these changes. When Jonathan Franzen’s Freedom arrived in the summer of 2010 it already felt like the last of its breed, a self-conscious attempt to write the great American novel that would summarize the era. Shortly thereafter, the citadel of masculine literary genius was stormed. Instead of zooming out, the era’s new writers zoomed in. Autofiction was back in vogue, as writers like Sheila Heti, Ben Lerner, and Karl Ove Knausgård mined their own personal histories to great acclaim. Thanks to social media, the lines between an artist and their work were blurring, particularly in the works of Tao Lin, who developed a purposeful anti-style focused on a Spartan accounting of minute interpersonal interactions. Micro-famous though Lin and his ilk were, at least they had book deals; scores more now-anonymous authors contributed to the boom of the personal essay, which was becoming an appealingly economical form of journalism for cash-strapped news outlets. It didn’t work out for everyone: In this era many aspiring writers had the regrettable experience of being defined by the parts of themselves they had sold to the likes of xoJane.

Commercially, though, Franzen’s true successors were explicitly female-focused books like Elena Ferrante’s Neapolitan novels, E.L. James’s steamy Fifty Shades trilogy (whose success was perhaps aided by the invention of the Kindle), and Gillian Flynn’s Gone Girl, which kicked off a wave of similarly feminine-titled thrillers. By decade’s end, it would be male authors like Dan Mallory (who wrote The Woman in the Window under the pen name A.J. Finn) who were resorting to gender-neutral pseudonyms.

The shift from universal to personalized experiences was also being felt in the world of screen entertainment. In November 2010, Netflix began offering a separate unlimited-streaming option for less than half the price of its DVD-by-mail subscription, a boon for budget-conscious consumers. In the space of six months, the company’s streaming audience would represent nearly 30 percent of U.S. internet traffic in the evenings. The ripple effects of this decision would be felt in Hollywood for the rest of the decade. In the short term, the DVD and Blu-ray market, already wounded by the recession, was dealt a fatal blow. In the longer term, the move put streaming services in direct competition with the networks. But not just yet. For the still-ongoing golden age of television, early results of the streaming revolution were mostly beneficial, as shows like Breaking Bad used Netflix to attract new fans, who then tuned in when fresh batches of episodes premiered on cable. Spotify, which launched in the U.S. in 2011, would have a similarly revolutionary effect on music, ushering in an era of hip-hop dominance. Alongside social media, streaming would usher in a world of endless content niches, with recommendation algorithms providing experiences tailored to each individual viewer’s taste.

Old forms of mass culture struggled to retain their place in the cultural firmament, though they did not die out entirely. Among young people, traditional television viewership declined precipitously. And with so many options available at home, getting audiences to go out to the movies proved an increasingly difficult proposition. But there were exceptions. Vampires had been the “It” creatures of the late aughts; as the boom times subsided, those glamorous bloodsuckers were eclipsed by zombies, which in The Walking Dead embodied fears of societal collapse, resource scarcity, and constant existential precarity. This grim vision of zero-sum competition was made even more explicit in the Hunger Games franchise, which imagined a dystopia where the only means of social advancement for poor Americans was slaughtering their fellow citizens as entertainment for the rich. Both were wildly popular, with social metaphors mutable enough to be claimed by left and right alike. But over the long run, their impact was eclipsed by two lumbering behemoths that debuted at roughly the same time.

From left: The things we couldn’t stop talking about, all decade long. Photo: HBO; Walt Disney Studios Motion Pictures

The Marvel Cinematic Universe and Game of Thrones were, in tone, not much alike. The former, which brought Thor and Captain America to the screen in 2011 in advance of the following year’s team-up in The Avengers, presented superhero spectacle as workplace sitcom, with cheery quips and constant assurances that, no matter what sort of destruction seemed in store, things would never get too serious. (Even when half the living beings in the universe disappeared in an instant, the wisecracks remained.) Meanwhile the latter, which debuted on HBO three weeks before Thor hit theaters, was almost too serious for its own good, a quasi-medieval fantasy series filled with numerous rapes, beheadings, and dragon-charred corpses. But in a world where innumerable tiny dramas were playing out on phones, the gigantic canvases of GOT and the MCU offered recession-weary audiences an escape from the real world, as well as the opportunity to dive into intricate mythologies and scrutinize tantalizing hints about what was to come. (In both cases, it helped to have read the books.) Together they embodied nerd culture’s new hegemony over mass entertainment, and for the rest of the decade, projects that aimed at broad mainstream success would operate in their shadow. As Mark Harris wrote a few years into their reign, entertainment was now “no longer about the thing; [it’s] about the next thing, the tease, the Easter egg, the post-credit sequence, the promise of a future at which the moment we’re in can only hint.”

When did this era end?

Sometime around 2013, when the colors of my memories shift from sparkling and metallic to muted and desaturated. (Compare the music videos for “Scream & Shout” and “Wrecking Ball” to see what I’m talking about.) Matthew Perpetua, too, pinpointed in that year’s hits “Thrift Shop” and “Royals” a backlash to early-’10s hedonism. But of course, history rarely gives us clean borders; I’ve omitted from this section certain debates around, say, Girls, that seem in retrospect to have more in common with the cultural tides of Obama’s second term. Those conversations will have to wait until Part Two. Everybody get up.

Stray Observations

• Overwhelming audiences with sheer spectacle was not limited to film and TV — the devastating vocal power of Adele briefly staunched the record industry’s bleeding.

• Despite the dour mood of the postrecession period, hipster culture of these years retained its aughts tweeness, an aesthetic simultaneously parodied and celebrated in the Fox sitcom New Girl. For a brief period of time, I recall women being very into drawing fake mustaches on their fingers.

• Speaking of New Girl, in this period there was much fretting about the state of the contemporary male, though for the moment it centered on whether downwardly mobile millennial men could be adequate economic matches for their female counterparts. Events to come would prove that those anxieties were only the tip of the iceberg.

• Remember the six months in 2012 when everything had spikes on it? Weird.

Read Part Two, “We’ve Come Too Far to Give Up Who We Are,” here.

Footnotes

• As my colleague Max Read has noted, just as their ancestors’ lifestyles had been propped up by government intervention, the 21st-century middle class was subsidized both by precarious gig workers at the bottom and money-losing venture capitalists at the top.

• The percentage of American adults who owned a smartphone would cross 50 percent by January 2013.

• With their dark middle-parted hair, rigorous plastic-surgery regimens, and semi-problematic racial ambiguity, the sisters would also become the Gibson Girls of their age.

• Later in the decade, this technique would be perfected by the novelist Sally Rooney, whose characters spoke in a flat digital-age dialect that made it hard to remember they were supposed to be Irish.

• On the business side, though, the decade would see greater consolidation around a few big winners.