How TikTok Holds Our Attention

On the popular short-video app, young people are churning through images and sounds at warp speed, repurposing reality into ironic, bite-size content.
TikTok doesn’t ask you to pretend that you’re on the Internet for a good reason. Illustration by Nick Little

Marcella is eighteen and lives in a Texas suburb so quiet that it sometimes seems like a ghost town. She downloaded TikTok last fall, after seeing TikTok videos that had been posted on YouTube and Instagram. They were strange and hilarious and reminded her of Vine, the discontinued platform that teen-agers once used for uploading anarchic six-second videos that played on a loop. She opened TikTok, and it began showing her an endless scroll of videos, most of them fifteen seconds or less. She watched the ones she liked a few times before moving on, and double-tapped her favorites, to “like” them. TikTok was learning what she wanted. It showed her more absurd comic sketches and supercuts of people painting murals, and fewer videos in which girls made fun of other girls for their looks.

When you watch a video on TikTok, you can tap a button on the screen to respond with your own video, scored to the same soundtrack. Another tap calls up a suite of editing tools, including a timer that makes it easy to film yourself. Videos become memes that you can imitate, or riff on, rapidly multiplying much the way the Ice Bucket Challenge proliferated on Facebook five years ago.

Marcella was lying on her bed looking at TikTok on a Thursday evening when she began seeing video after video set to a clip of the song “Pretty Boy Swag,” by Soulja Boy. In each one, a person would look into the camera as if it were a mirror, and then, just as the song’s beat dropped, the camera would cut to a shot of the person’s doppelgänger. It worked like a punch line. A guy with packing tape over his nose became Voldemort. A girl smeared gold paint on her face, put on a yellow hoodie, and turned into an Oscar statue. Marcella propped her phone on her desk and set the TikTok timer. Her video took around twenty minutes to make, and is thirteen seconds long. She enters the frame in a white button-down, her hair dark and wavy. She adjusts her collar, checks her reflection, looks upward, and—the beat drops—she’s Anne Frank.

Marcella’s friends knew about TikTok, but almost none of them were on it. She didn’t think that anyone would see what she’d made. Pretty quickly, though, her video began getting hundreds of likes, thousands, tens of thousands. People started sharing it on Instagram. On YouTube, the Swedish vlogger PewDiePie, who has more than a hundred million subscribers, posted a video mocking the media for suggesting that TikTok had a “Nazi problem”—Vice had found various accounts promoting white-supremacist slogans—then showed Marcella’s video, laughed, and said, “Never mind, actually, this does not help the case I was trying to make.” (PewDiePie has been criticized for employing anti-Semitic imagery in his videos, though his fans insist that his work is satire.) Marcella started to get direct messages on TikTok and Instagram, some of which called her anti-Semitic. One accused her of promoting Nazism. She deleted the video.

In February, a friend texted me a YouTube rip of Marcella’s TikTok. I was alone with my phone at my desk on a weeknight, and when I watched the video I screamed. It was terrifyingly funny, like a well-timed electric shock. It also made me feel very old. I’d seen other TikToks, mostly on Twitter, and my primary impression was that young people were churning through images and sounds at warp speed, repurposing reality into ironic, bite-size content. Kids were clearly better than adults at whatever it was TikTok was for—“I haven’t seen one piece of content on there made by an adult that’s normal and good,” Jack Wagner, a “popular Instagram memer,” told The Atlantic last fall—though they weren’t the only ones using the platform. Arnold Schwarzenegger was on TikTok, riding a minibike and chasing a miniature pony. Drag queens were on TikTok, opera singers were on TikTok, the Washington Post was on TikTok, dogs I follow on Instagram were on TikTok. Most important, the self-made celebrities of Generation Z were on TikTok, a cohort of people in their teens and early twenties who have spent a decade filming themselves through a front-facing camera and meticulously honing their understanding of what their peers will respond to and what they will ignore.

I sent an e-mail to Marcella. (That’s her middle name.) She’s from a military family, and likes to stay up late listening to music and writing. Marcella is Jewish, and she and her brothers were homeschooled. Not long before she made her video, her family had stopped at a base to renew their military I.D.s. One of her brothers glanced at her new I.D. and joked, accurately, that she looked like Anne Frank.

In correspondence, Marcella was as earnest and thoughtful as her video had seemed flip. She understood that it could seem offensive out of context—a context that was invisible to nearly everyone who saw it—and she was sanguine about the angry messages that she’d received. TikTok, like the rest of the world, was a mixed bag, she thought, with bad ideas, and cruelty, and embarrassment, but also with so much creative potential. Its ironic sensibility was perfectly suited for people her age, and so was its industrial-strength ability to turn non-famous people into famous ones—even if only temporarily, even if only in a minor way. Marcella had accepted her brush with Internet fame as an odd thrill, and not an entirely foreign one: her generation had grown up on YouTube, she noted, watching ordinary kids become millionaires by turning on laptop cameras in their bedrooms and talking about stuff they like. The videos that I’d been seeing, chaotic and sincere and nihilistic and very short, were the natural expressions of kids who’d had smartphones since they were in middle school, or elementary school. TikTok, Marcella explained, was a simple reaction to, and an absurdist escape from, “the mass amounts of media we are exposed to every living day.”

TikTok has been downloaded more than a billion times since its launch, in 2017, and reportedly has more monthly users than Twitter or Snapchat. Like those apps, it’s free, and peppered with advertising. I downloaded TikTok in May, adding its neon-shaded music-note logo to the array of app icons on my phone. TikTok’s parent company, ByteDance, is based in China, which, in recent years, has invested heavily and made major advances in artificial intelligence. After a three-billion-dollar investment from the Japanese conglomerate SoftBank, last fall, ByteDance was valued at more than seventy-five billion dollars, the highest valuation for any startup in the world.

I opened the app, and saw a three-foot-tall woman making her microwave door squeak to the melody of “Yeah!,” by Usher, and then a dental hygienist and her patient dancing to “Baby Shark.” A teen-age girl blew up a bunch of balloons that spelled “PUSSY” to the tune of a jazz song from the beloved soundtrack of the anime series “Cowboy Bebop.” Young white people lip-synched to audio of nonwhite people in ways that ranged from innocently racist to overtly racist. A kid sprayed shaving cream into a Croc and stepped into it so that shaving cream squirted out of the holes in the Croc. In five minutes, the app had sandblasted my cognitive matter with twenty TikToks that had the legibility and logic of a narcoleptic dream.

TikTok is available in a hundred and fifty markets. Its videos are typically built around music, so language tends not to pose a significant barrier, and few of the videos have anything to do with the news, so they don’t easily become dated. The company is reportedly focussing its growth efforts on the U.S., Japan, and India, which is its biggest market—smartphone use in the country has swelled, and TikTok now has two hundred million users there. ByteDance often hacks its way into a market, aggressively courting influencers on other social-media networks and spending huge amounts on advertising, much of which runs on competing platforms. Connie Chan, a general partner at Andreessen Horowitz, told me that investors normally look for “organic growth” in social apps; ByteDance has been innovative, she said, in its ability and willingness to spend its way to big numbers. One former TikTok employee I spoke to was troubled by the company’s methods: “On Instagram, they’d run ads with clickbaity images—an open, gashed wound, or an overtly sexy image of a young teen girl—and it wouldn’t matter if Instagram users flagged the images as long as the ad got a lot of engagement first.”

In April, the Indian government briefly banned new downloads of the app, citing concerns that it was exposing minors to pornography and sexual predation. (At least three people in India have died from injuries sustained while creating TikToks: posing with a pistol, hanging out on train tracks, trying to fit three people on a moving bike.) In court, ByteDance insisted that it was losing five hundred thousand dollars a day from the ban. The company announced plans to hire more local content moderators and to invest a billion dollars in India during the next three years. The ban was lifted, and the company launched a campaign: every day, three randomly selected users who promoted TikTok on other platforms with the hashtag #ReturnOfTikTok would receive the equivalent of fourteen hundred dollars.

TikTok is a social network that has nothing to do with one’s social network. It doesn’t ask you to tell it who you know—in the future according to ByteDance, “large-scale AI models” will determine our “personalized information flows,” as the Web site for the company’s research lab declares. The app provides a “Discover” page, with an index of trending hashtags, and a “For You” feed, which is personalized—if that’s the right word—by a machine-learning system that analyzes each video and tracks user behavior so that it can serve up a continually refined, never-ending stream of TikToks optimized to hold your attention. In the teleology of TikTok, humans were put on Earth to make good content, and “good content” is anything that is shared, replicated, and built upon. In essence, the platform is an enormous meme factory, compressing the world into pellets of virality and dispensing those pellets until you get full or fall asleep.

ByteDance has more than a dozen products, a number of which depend on A.I. recommendation engines. These platforms collect data that the company aggregates and uses to refine its algorithms, which the company then uses to refine its platforms; rinse, repeat. This feedback loop, called the “virtuous cycle of A.I.,” is what each TikTok user experiences in miniature. The company would not comment on the details of its recommendation algorithm, but ByteDance has touted its research into computer vision, a process that involves extracting and classifying visual information; on the Web site of its research lab, the company lists “short video recommendation system” among the applications of the computer-vision technology that it’s developing. Although TikTok’s algorithm likely relies in part, as other systems do, on user history and video-engagement patterns, the app seems remarkably attuned to a person’s unarticulated interests. Some social algorithms are like bossy waiters: they solicit your preferences and then recommend a menu. TikTok orders you dinner by watching you look at food.
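
Stripped to its essentials, that loop can be sketched in a few dozen lines of Python. The toy script below is a thought experiment rather than a reconstruction: the videos, the topic tags, the imaginary viewer, and the update rule are all invented for illustration, and the real system presumably runs deep neural networks over billions of interactions rather than a handful of hand-labelled clips. The dynamic, though, is the same: serve whatever currently scores highest, measure how long the viewer stays, and fold that measurement back into the scores.

```python
import random
from collections import defaultdict

# A purely illustrative sketch of the feedback loop described above: serve
# whatever the model currently scores highest, watch how long the viewer stays,
# and fold that signal back into the model. The videos, topics, and update rule
# are invented for this example; nothing here is drawn from ByteDance.

VIDEOS = [
    {"id": 1, "topics": ["dogs"]},
    {"id": 2, "topics": ["pranks"]},
    {"id": 3, "topics": ["murals", "art"]},
    {"id": 4, "topics": ["dogs", "comedy"]},
]

def score(video, weights):
    # Rank a candidate by the viewer's learned topic affinities.
    return sum(weights[topic] for topic in video["topics"])

def simulated_watch(video):
    # Stand-in for real behavior: this imaginary viewer finishes dog videos
    # and skims everything else.
    return 1.0 if "dogs" in video["topics"] else random.random() * 0.3

def feed_loop(rounds=50, learning_rate=0.1, explore=0.2):
    weights = defaultdict(float)  # the model starts knowing nothing about the viewer
    for _ in range(rounds):
        if random.random() < explore:
            video = random.choice(VIDEOS)  # occasionally try something new
        else:
            video = max(VIDEOS, key=lambda v: score(v, weights))
        watched = simulated_watch(video)   # the tracked behavior
        for topic in video["topics"]:
            # Engagement above the halfway mark pushes the topic up; below, down.
            weights[topic] += learning_rate * (watched - 0.5)
    return dict(weights)

if __name__ == "__main__":
    print(feed_loop())
```

Run it and the “dogs” weight climbs while every other topic sinks: the closed loop, in miniature.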

After I had watched TikTok on and off for a couple of days, the racist lip-synchs disappeared from my feed. I started to see a lot of videos of fat dogs, teen-agers playing pranks on their teachers, retail workers making lemonade from the lemons of being bored and underpaid. I still sometimes saw things I didn’t like: people in horror masks popping into the frame, or fourteen-year-old girls trying to be sexy, or rich kids showing off the McMansions where they lived. But I often found myself barking with laughter, in thrall to the unhinged cadences of the app. The over-all effect called to mind both silent-movie slapstick and the sort of exaggerated, knowing stupidity one finds on the popular Netflix sketch show “I Think You Should Leave.” Some videos displayed new forms of digital artistry: a Polish teen-ager with braces and slate-blue eyes, who goes by @jeleniewska, makes videos in which she appears to be popping in and out of mirrors, phones, and picture frames. Others drew on surprising sources: an audio clip from Cecelia Condit’s art piece “Possibly in Michigan,” from 1983, went viral under the track label “oh no no no no no no no no silly” after a sixteen-year-old found the film on a list of “creepy videos” that had been posted on YouTube.

I found it both freeing and disturbing to spend time on a platform that didn’t ask me to pretend that I was on the Internet for a good reason. I was not giving TikTok my attention because I wanted to keep up with the news, or because I was trying to soothe and irritate myself by looking at photos of my friends on vacation. I was giving TikTok my attention because it was serving me what would retain my attention, and it could do that because it had been designed to perform algorithmic pyrotechnics that were capable of making a half hour pass before I remembered to look away.

We have been inadvertently preparing for this experience for years. On YouTube and Twitter and Instagram, recommendation algorithms have been making us feel individually catered to while bending our selfhood into profitable shapes. TikTok favors whatever will hold people’s eyeballs, and it provides the incentives and the tools for people to copy that content with ease. The platform then adjusts its predilections based on the closed loop of data that it has created. This pattern seems relatively trivial when the underlying material concerns shaving cream and Crocs, but it could determine much of our cultural future. The algorithm gives us whatever pleases us, and we, in turn, give the algorithm whatever pleases it. As the circle tightens, we become less and less able to separate algorithmic interests from our own.

One of TikTok’s early competitors was Musical.ly, a lip-synching app based in Shanghai that had a large music library and had become extremely popular with American children. In 2016, an executive at an ad agency focussed on social media told the Times that Musical.ly was “the youngest social network we’ve ever seen,” adding, “You’re talking about first, second, third grade.” ByteDance bought Musical.ly the following year, for an amount reportedly in the vicinity of a billion dollars, and merged the app with TikTok in August, 2018. In February, the Federal Trade Commission levied a $5.7-million fine against the company: the agency found that a large percentage of Musical.ly users, who were now TikTok users, were under the age of thirteen, and the app did not ask for their ages or seek parental consent, as is required by federal law. The F.T.C. “uncovered disturbing practices, including collecting and exposing the location” of these children, according to an agency statement. TikTok handled this in a blunt, makeshift fashion: it added an age gate that asked for your birthday but which defaulted to the current date, meaning that users who failed to enter their age were instantly kicked off the app, and their videos were deleted. TikTok did not seem terribly worried about the complaints that followed these deletions. It was now big enough not to care.

A few months after TikTok arrived in the U.S., a nineteen-year-old rapper and singer from Georgia named Montero Lamar Hill uploaded a song that he had been trying for weeks to promote as the basis of a meme. Hill, who goes by the stage name Lil Nas X, had spent much of his teens attempting to go viral on Twitter and elsewhere. There is a sweetness to his self-presentation, which seems optimized for digital interaction; he wears ten-gallon hats and fringe and glitter, a laugh-crying-cowboy emoji come to life. “The Internet is basically, like, my parents in a way,” he told the Times this spring, after people began making videos featuring a snippet of his song “Old Town Road,” in which they would drink “yee yee juice” and turn into cowboys and cowgirls. The song went to No. 1 on the Billboard Hot 100 in April, and stayed there longer than any song ever had.

Certain musical elements serve as TikTok catnip: bass-heavy transitions that can be used as punch lines; rap songs that are easy to lip-synch or include a narrative-friendly call and response. A twenty-six-year-old Australian producer named Adam Friedman, half of the duo Cookie Cutters, told me that he was now concentrating on lyrics that you could act out with your hands. “I write hooks, and I try it in the mirror—how many hand movements can I fit into fifteen seconds?” he said. “You know, goodbye, call me back, peace out, F you.”

TikTok employs an artist-relations team that contacts musicians whose songs are going viral and coaches them on how to use the platform. Some videos include links to Apple Music, which pays artists per stream, though not very much. Virality can thus pay off elsewhere, relieving the pressure for TikTok to compensate artists directly. It is, these days, a standard arrangement: you will be “paid” in exposure, giving your labor to a social platform in part because a lot of other people are doing it and in part because you might be one of the people whom the platform sends, however briefly, to the top.

If you are one of those people, TikTok can be a godsend. Sub Urban, a nineteen-year-old artist from New Jersey, got a deal with Warner Records after millions of TikTokers started doing a dance from the video game Fortnite to his song “Cradles.” In August, a twenty-one-year-old rapper from Sacramento who goes by the name Stunna Girl learned that a song of hers had gone viral on the app, and soon signed a record deal with Capitol. TikTok also offers artists the uniquely moving experience of watching total strangers freely and enthusiastically produce music videos for them. Jonathan Visger, an electronic artist known as Absofacto, told me that it had changed his entire outlook on his career to see nearly two million TikToks all set to his 2015 single “Dissolve,” a heady pop song that inspired a meme in which people appeared to be falling through a series of portals.

“I think the song worked well for the platform because the lyrics are ‘I just wanted you to watch me dissolve, slowly, in a pool full of your love,’ ” Visger told me recently. “Which is a lot like ‘I’m on the Internet, I want to be seen, and I want you to like it.’ ” I asked him if he’d been thinking about the Internet when he was writing it. “No!” he said, laughing. “I was thinking about unrequited love.”

ByteDance is developing a music-streaming service—which will likely launch first in emerging markets, such as India—and it is currently negotiating the renewal of old Musical.ly licensing agreements with the three companies that control roughly eighty per cent of music globally. ByteDance also has acquired a London-based startup called Jukedeck, which has been developing A.I. music-creation tools, including a program that can interpret video and compose music that suits it. Incorporating such technology into TikTok could give ByteDance total ownership of content created within the app. Multiple people at TikTok and ByteDance told me that they were not aware of any plans to add this sort of tool, but TikTok’s plans have a way of abruptly changing.

In some respects, what’s sonically valuable on TikTok isn’t any different from what has long succeeded on radio; no pop-songwriting practice is more established than crafting a good hook. But the app could begin to influence composition in other ways. Digital platforms and digital attention spans may make hit songs shorter, for instance. (“Old Town Road” clocks in at under two minutes.) Adam Friedman has begun producing music directly for influencers, and engineering it for maximum TikTok success. “We start with the snippet, and if it does well on TikTok we’ll produce the full song,” he told me. I suggested that some people might think there was a kind of artistic integrity missing from this process. “The influencer is playing a central role in our culture, and it’s not new,” he said. “There’ve always been socialites, people of influence, the Paris World’s Fair. Whatever mecca that people go to for culture is where they go to for culture, and in this moment it’s TikTok.”

TikTok’s U.S. operations are currently based at a co-working space in a generic four-story building on a busy thoroughfare in Culver City, in Los Angeles. I visited the office twice this summer, after an extensive e-mail correspondence with a company spokesperson. The first person TikTok offered for an on-the-record chat was a twenty-year-old TikToker named Ben De Almeida, who lives in Alberta and, on the app, goes by @benoftheweek. De Almeida first went viral on TikTok with a video that noted his resemblance to the actor Noah Centineo, best known for his roles on “The Fosters” and in teen movies on Netflix. De Almeida wore red striped pants and a yellow shirt and was accompanied by a handler; he radiated good-natured charisma. When I extended my hand, he immediately went in for a hug. “I’m excited to share what it’s like to be a TikToker,” he said.

De Almeida was in L.A. for the summer, “collabing,” he told me. He said that he’d “always wanted to be a creator,” using the term that has become a catchall identity for people who make money by producing content for social platforms. He’d grown up admiring YouTubers, “people like Shane Dawson and iJustine,” and had begun making online videos when he was twelve. He used to post videos on Snapchat, but he got on TikTok in November and now has two million followers. In conversation, De Almeida, like other TikTok teens I talked to, mixed the ecstatically strange dialect of people who love memes—a language in which every word sets off a chain of incomprehensible referents—with the sort of anodyne corporate jargon I associate with marketing professionals. “In this generation, you get steeped in the culture of online video,” he said. “You naturally pick up on what can be a trend.” He pulled out his phone and showed me one of his early TikTok hits, in which he pretended to put a can of beans in the microwave and burn his mom’s house down.

Later that day, in West Hollywood, at an outpost of Joe & the Juice, I met with Jacob Pace, the ebullient twenty-one-year-old C.E.O. of a content-production company called Flighthouse. Pace wore a charcoal T-shirt and had the erratic energy of a champion sled dog on break. Flighthouse has more than nineteen million followers on TikTok, and its videos reflect an intuitive understanding of its audience: Pikachu in a baseball cap, dancing; a girl eating Flamin’ Hot Cheetos in a bowl full of milk. Pace has fifteen employees working under him to make TikToks, some of which serve as back-end marketing for record labels that have paid Flighthouse to promote particular songs. He was about to travel to New York to present to ad agencies. “What gets me out of bed in the morning is creating and impacting culture,” he said. Figuring out how to make TikToks that people liked and related to was, he said, like “helping to perfect a machine that will one day start running perfectly.”

Many of the people whose professional lives are dependent on or tied to TikTok were eager to talk to me, but that eagerness was not shared by people who actually work for the company. A former TikTok employee told me, in a direct message, “As strategic as it appears from the outside it’s a complete chaos on the inside.” After my first visit to the L.A. office, I sent a TikTok representative a list of questions asking for basic information, including the number of employees at the company, the number of moderators, the demographics of its users, and the number of hours of video uploaded to the platform daily. The representative informed me, weeks later, that there were “a couple hundred people working on TikTok in the US” and “thousands of moderators” across all of TikTok’s markets, and she said that she couldn’t answer any of the other questions.

TikTok’s primary selling point is that it feels unusually fun, like it’s the last sunny corner on the Internet. I asked multiple TikTok employees whether the company did anything to insure that this mood prevailed in the videos that the app served its users. Speaking with an executive, in August, about the app’s “Discover” page, I asked, “What if the most trending thing was something that you didn’t want to be the most trending thing? Would you put something else in its place?” The executive said that doing so would run counter to TikTok’s ethos. A few weeks later, the online trade magazine Digiday reported that TikTok had begun sending select media companies a weekly newsletter that previewed “the trending hashtags that the platform plans to promote.” A copy of the newsletter that I obtained lists such hashtags as #BeachDay and #AlwaysHustling, and it instructs, “If you’re interested in participating, make sure to upload your video no earlier than one day before the hashtag launch.” Later, a representative told me that the company might choose not to include certain hashtags on the “Discover” page, and that TikTok was interested in highlighting positive trends, like #TikTokDogs.

TikTok employees in Los Angeles declined to talk in any detail about their relationship to ByteDance headquarters, in Beijing, and everyone I spoke to emphasized that the U.S. operation was fairly independent. But one former employee, who left the company in 2018, described this as a “total fabrication.” (A ByteDance spokesperson, in response, said that the markets were becoming more independent and that much of that process had happened within the past year.) TikTok’s technology was developed in China, and it is refined in China. Another ex-employee, who had worked in the Shanghai office, said that nearly all product features are shipped out from Shanghai and Beijing, where most of ByteDance’s engineers are based. “At a tech company, where the engineers are is what matters,” the writer and former Facebook product manager Antonio Garcia-Martinez told me. “Everyone else is a puppet paid to lie to you.”

The direct predecessor of TikTok is Douyin, a short-video platform that ByteDance launched in China in 2016. Douyin is headquartered in Shanghai, and ByteDance says that it has more than five hundred million monthly active users. Zhou Rongrong, a twenty-nine-year-old Ph.D. candidate at the Central Academy of Fine Arts, in Beijing, who has studied Internet art in China, said that most young people in the country are on Douyin. In particular, she said, the app has opened up new kinds of economic potential for people outside the country’s traditional centers of power. “For example, I had no way before to see these ways that rural people can cook their dishes,” Zhou said. Douyin has given rise to influencers like Yeshi Xiaoge—the name means “brother who cooks in the wilderness”—who films himself preparing elaborate meals, and who has released his own line of beef sauce. Rural administrations have begun advertising their regions’ produce and tourist attractions on the app.

Though it remains broadly similar to TikTok, Douyin has become more advanced than its global counterpart, particularly with respect to e-commerce. With three taps on Douyin, you can buy a product featured in a video; you can book a stay at a hotel after watching a video shot there; you can take virtual tours of a city’s stores and restaurants, get coupons for those establishments, and later post geo-tagged video reviews. Fabian Bern, the head of a marketing company that works closely with Douyin influencers, told me that some power users can make “fifteen to twenty thousand U.S. dollars” on a shopping holiday like Singles’ Day.

So far, TikTok has concentrated more on expanding its user base than on offering opportunities for e-commerce. If TikTok wants to keep growing, it will need to attract more people who are no longer in their teens, and it will need to hold their attention. Many people are not terribly interested in even the choicest memes the world has to offer; in August, the Verge reported that a “significant majority” of new TikTok users give up on the app after thirty days. Bern thinks that TikTok content will soon become more mature, as has already happened with Douyin, which now contains micro-vlogs, life-style content, business advice, and videos from local police. Selected users on Douyin can upload videos as long as five minutes. Fictional mini-dramas have begun to appear.

“This meme content, people will get bored with it,” Bern said. “And companies are, like, ‘We cannot make this type of content or we’ll damage our brands.’ ”

ByteDance’s founder, Zhang Yiming, was twenty-nine when he started the company, in 2012. Zhang, who rarely gives interviews, was raised in Fujian Province, the son of a civil servant and a nurse, and attended university in the northern port city of Tianjin. He briefly worked at Microsoft in China, and bounced between startups for a while. He then pitched Chinese investors on the idea of a news-aggregation app that would use machine learning to provide people with whatever they wished to read. The app, called Jinri Toutiao, was launched within the year. Its name means “today’s top headlines.” It’s a bit like Reddit, if Reddit were guided by A.I. rather than by the upvotes and downvotes of its readers.

Like TikTok, Toutiao starts feeding you content as soon as you open it, and it adjusts the mix by tracking and analyzing your scrolling behavior, the time of day, and your location. It can deduce how its users read while commuting, and what they like to look at before bed. It reportedly has around a hundred and twenty million daily active users, most of whom are under thirty. On average, they read their tailored feeds for more than an hour each day. The app has a reputation for promoting lowbrow clickbait.

In China, daily life has become even more tech-driven than it is in the U.S. People can pay for things by letting cameras scan their faces; last year, a high school in Hangzhou installed scanners that recorded classrooms every thirty seconds and classified students’ facial expressions as neutral, happy, sad, angry, upset, or surprised. The Chinese government has been assembling what it calls the Social Credit System, a network of overlapping assessments of citizen trustworthiness, with opaque calculations that integrate information from public records and private databases. The government has also set benchmarks for progress in artificial-intelligence development at five-year intervals. Last year, Tianjin announced plans to put sixteen billion dollars toward A.I. funding; Shanghai announced a plan to raise fifteen billion.

There are two principal approaches to artificial intelligence. In symbolic A.I., humans give computers a set of elaborate rules that guide them through a task. This works well for things like chess, but everyday tasks—identifying faces, interpreting language—tend to be governed by human instinct as much as by rules. And so another approach, known as neural networks, or machine learning, has predominated in the past two decades or so. Under this model, computers learn by recognizing patterns in data and continually adjusting until the desired output—a correctly labelled face, a properly translated phrase—is consistently achieved. In this sort of system, the quantity of data is, broadly speaking, more important than the sophistication of the program interpreting it. The sheer number of users that Chinese companies have, and the types of data that come from the integration of tech with daily life, give those companies a crucial advantage.
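
That adjustment can be shown in miniature. The example below is a single artificial neuron, the simplest possible stand-in for a neural network, and everything in it, from the clusters of points to the learning rate, is fabricated for the demonstration; production systems differ in scale and architecture, not in spirit.

```python
import random

# A toy version of the "continually adjusting" idea above: a single artificial
# neuron learns to separate two clusters of points by nudging its weights a
# little every time it labels an example wrong. The data, learning rate, and
# epoch count are made up for the demonstration.

def train(examples, epochs=100, learning_rate=0.1):
    w0, w1, bias = 0.0, 0.0, 0.0
    for _ in range(epochs):
        for (x0, x1), label in examples:
            prediction = 1 if w0 * x0 + w1 * x1 + bias > 0 else 0
            error = label - prediction  # -1, 0, or +1
            # Adjust in proportion to the error: learning from data, in miniature.
            w0 += learning_rate * error * x0
            w1 += learning_rate * error * x1
            bias += learning_rate * error
    return w0, w1, bias

if __name__ == "__main__":
    # Two fabricated clusters: label 1 sits up and to the right, label 0 down and to the left.
    data = [((random.uniform(2, 4), random.uniform(2, 4)), 1) for _ in range(50)]
    data += [((random.uniform(-4, -2), random.uniform(-4, -2)), 0) for _ in range(50)]
    w0, w1, bias = train(data)
    correct = sum(
        (1 if w0 * x0 + w1 * x1 + bias > 0 else 0) == label
        for (x0, x1), label in data
    )
    print(f"{correct}/{len(data)} training points labelled correctly")
```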

Chinese tech companies are often partly funded by the government, and they openly defer to its requests, turning over user messages and purchase data, for instance. Tencent, which owns WeChat, has a “Follow Our Party” sign on a statue in front of its headquarters. The Wall Street Journal has reported that a ByteDance office in Beijing includes a room for a cybersecurity team of the Chinese police, which the company informs when it “finds criminal content like terrorism or pedophilia” on its apps. Last year, ByteDance was ordered to suspend Toutiao and to shut down a meme-centric social app called Neihan Duanzi—the name means something like “implied jokes”—because the content had become too vulgar, too disorderly, for the state. Zhang issued an apology, written in the language of government control. ByteDance had allowed content to appear that was “incommensurate with socialist core values, that did not properly implement public opinion guidance,” he said.

Earlier this year, the Times reported that the Chinese government had deployed facial-recognition technology to identify Uighurs, a Muslim minority in the country, through its nationwide network of surveillance cameras. China has imprisoned more than a million Uighurs in reëducation camps, in Xinjiang, and has subjected them to a surge in arrests, trials, and prison sentences. In August, I asked a ByteDance spokesperson about the fear that the massive trove of facial close-ups accumulated on its various products could be misused. Even if people trusted ByteDance not to do anything sinister, I said, what if a third party got hold of the company’s data? The spokesperson told me that the data of American users was stored in-country—TikTok’s data is now kept in the U.S. and Singapore, the rep said—and noted, nonchalantly, that people made their faces available to other platforms, too. Of course, U.S. tech companies often don’t seem answerable enough to the government—or, rather, to the public. The American system has its own weaknesses.

Dinesh Raman, an A.I.-alignment researcher in Tokyo, who has studied ByteDance as a consultant for some of its investors, spoke with a mixture of alarm and admiration about the company’s A.I. capabilities. “The system is doing billions of calculations per second,” he said. “It’s data being transmitted at a scale I’ve never seen before.” Raman insisted that TikTok had kept its platform tightly policed in part through its algorithm, which, he said, is able to identify videos with dangerous content. (TikTok’s moderators are trained to apply different standards to every market, the company told me.) He pointed me to the “Gaga Dance” challenge, a meme on Indonesian TikTok that asked users to mirror the poses of cheerful yellow stick figures that floated across the screen. The A.I., he suggested, was training itself in pose estimation, a deep-learning capability with major surveillance implications. OpenPose, a program developed at Carnegie Mellon, has been used by a Japanese telecom company to alert shopkeepers to customers whose movements supposedly signal that they are likely to steal something.
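
Pose estimation itself is far beyond a snippet, but the data it yields, named joints with coördinates, and the kind of comparison a mirroring challenge implies can be sketched simply. In the hypothetical example below, the joint positions are invented stand-ins for what a trained model such as OpenPose might extract from a video frame.

```python
import math

# A rough sketch of what pose-estimation output looks like and how one pose can
# be compared to another: each pose is a set of named joint coordinates, the
# kind of keypoints a model such as OpenPose produces, and "similarity" is the
# average distance between corresponding joints after both poses are centered.
# The joint positions below are invented; a real pipeline would extract them
# from video frames with a trained model.

def centered(pose):
    # Shift a pose so its centroid sits at the origin, removing position offsets.
    cx = sum(x for x, _ in pose.values()) / len(pose)
    cy = sum(y for _, y in pose.values()) / len(pose)
    return {joint: (x - cx, y - cy) for joint, (x, y) in pose.items()}

def pose_distance(pose_a, pose_b):
    a, b = centered(pose_a), centered(pose_b)
    shared = a.keys() & b.keys()
    return sum(math.dist(a[j], b[j]) for j in shared) / len(shared)

# A target stick-figure pose and a user's estimated pose (both hypothetical).
TARGET = {"head": (0, 4), "left_hand": (-3, 2), "right_hand": (3, 2),
          "left_foot": (-1, -4), "right_foot": (1, -4)}
USER = {"head": (10, 14), "left_hand": (7, 12), "right_hand": (13, 12.5),
        "left_foot": (9, 6), "right_foot": (11, 6)}

if __name__ == "__main__":
    # A small number: the user matches the pose even though they stand
    # elsewhere in the frame.
    print(f"mean joint distance: {pose_distance(TARGET, USER):.2f}")
```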

The Chinese government is more interested in surveilling and controlling its own citizens than it is in monitoring foreign nationals; one of the reasons that ByteDance launched TikTok as a separate entity from Douyin was to establish a firewall between the Chinese state and users outside China. But state interference can cross borders. In August, Facebook and Twitter revealed that they’d found evidence of a Chinese-government campaign to spread disinformation about the protests in Hong Kong, which began in June by calling for the withdrawal of an extradition bill and have since widened in scope, demanding democratic reforms.

If you pull up the hashtag #HongKong on TikTok, you’ll find plenty of videos, but few, if any, about the protests. The hashtag #protest elicits demonstrations from around the globe—London, Melbourne, South Africa, and, especially, India—but almost none from Hong Kong. (On Instagram, both #HongKong and #protest call up plenty of such images.) Meanwhile, a search for one of the primary Chinese-language hashtags that Hong Kong protesters have used on other platforms yields a small handful of videos, with a total of a hundred and ten thousand views. (As the Washington Post noted, in a piece investigating the relative absence of the Hong Kong protests on TikTok, videos hashtagged #snails have more than six and a half million views.) It’s true that the Hong Kong user base is not large, relatively speaking—TikTok told me that the app had fewer than a hundred and fifty thousand daily active users there—though that is the case for Twitter, too, and videos from the protests have gone viral on that platform. TikTok is generally thought of as a place for goofing off rather than for engaging in political discourse, and a TikTok executive dismissed the idea that the company was manually or algorithmically suppressing Hong Kong-related content. But one of the risks of giving our attention to entertainment governed by privately controlled algorithms is that those who own the algorithms will always be able to say that they are merely delivering what we want to see.

A platform designed for viral communication will never naturally be politics-free. In August, a new sort of video started appearing on Douyin. Uighurs in China were using the app’s editing suite to place themselves against a backdrop of loved ones who have disappeared, as sad string music plays. In one, a tearful young woman wearing a yellow shirt holds up four fingers, one for each person in the photo behind her. It may be a double signal: “four” and “death” are pronounced similarly in Mandarin. Douyin has deleted many of these videos, although, like everything that goes viral on TikTok, they have found an audience on Instagram and Twitter.

TikTok is not the first social-media app to begin its life with an air of freewheeling fun. The darker and more complicated parts of life never stay away forever. A college student from Philadelphia recently went viral with a multipart video account of her relationship with the rapper and onetime Vine star Riff Raff, which began when she was seventeen. A Miami student was arrested after his videos were interpreted as threats to shoot up schools. TikTok may figure out how to maintain or enforce a jovial vibe more effectively than its predecessors have—but, even if it does, the kids who made it popular may get bored and move on to the next thing.

Whatever comes along will likely owe something to TikTok. Facebook has already released a TikTok clone, called Lasso, which flopped, and the app researcher Jane Manchun Wong recently discovered that Instagram has been testing TikTok-like features. A.I.-powered algorithms are becoming central to the ways that we process our everyday existence. Someday, other companies could use ByteDance’s A.I. systems the way they now use Google’s cloud-computing services: like a utility—gas or electricity for the new A.I.-driven world.

“People say TikTok will run out of money, that it’s going to end up like Vine,” Bern, the marketer, said. “But TikTok has one of the biggest companies in China behind them. ByteDance is way ahead of everyone else already, in terms of the way they use A.I. They know everything about a person. They can give that person everything they want.”

In August, I took the train from Atlantic Terminal, in Brooklyn, to Patchogue, on the South Shore of Long Island, where the eighteen-city Boys of Summer teen-influencer tour was stopping for the day. It was sultry and cloudless, and as I walked up to the designated venue on Ocean Avenue I saw a pack of girls, who looked to be thirteen or fourteen, in jean shorts and braces and tube tops, and a few floppy-haired boys who looked slightly older—TikTok-famous heartthrobs named Sam and Josh and Payton, who were hugging their fans, taking selfies, accepting scrunchies as offerings and stacking them on their arms. “I love you,” the girls yelled. “I love you, too,” the guys said back. Video-making had been incorporated into this ritual in a startlingly seamless way: before one girl could finish asking a TikToker to make a video saying hi to her friend Adrianne, the TikToker was halfway through a video saying hi to her friend Adrianne.

Inside the venue, parents were drinking Michelob Ultra and staring into the middle distance. Kids were making TikToks everywhere, phones propped up on bar railings; they were moving on and off the Internet, dead serious about getting their content. In the meet-and-greet line, I talked to Dylan Hartman, a blond fourteen-year-old in a white bucket hat, who has more than half a million followers, and whose videos often feature him shirtless, brushing his hair back, lip-synching to rap. “That’s the one they all want to marry,” a mom who was chaperoning her daughter and a friend whispered to me. Another TikToker, Grasyn Hull, was wearing a “Virginity Rocks” shirt that a fan had given him. “I make memes and stuff, and I just blew up,” Hull said.

The crowd was almost entirely female, and about three-quarters of the TikTokers were male; occasionally, a sharp hormonal whiff of agony and longing would enter the air. Nearly everyone was white, and nearly everyone was mouthing along to hip-hop and doing viral dances, making sinuous, jerky movements. This is the way people learn to move, perhaps, when the ruling idea is that your physical presence should pop when viewed on a smartphone. I watched Zoe Laverne, a blond social-media star, make content on outstretched phones as reflexively and smoothly as a President shakes hands along a receiving line.

Then the lights went down and the children started screaming. The m.c. asked us to raise our left hands and promise, in unison, to have a “lit time.” Later, in line at the merch table, I talked to a thirteen-year-old girl named Beau, from New Jersey, who told me that a good TikToker was someone who “did things that made you want to watch them.” She’d been on short-form-video platforms since the third grade, when she downloaded Musical.ly. Many of the kids I talked to said that TikTok made them feel connected to other people their age. The memes surfaced glancing sensations that might otherwise be forgotten, or stay private: what it was like to sit in the back seat while your mom drove around listening to Calvin Harris; what it was like to be little, and sleepless, standing nervously outside your parents’ bedroom door at 3 A.M.

I had stopped impulsively checking TikTok after a month—I already have enough digital tools to insure that I never need to sit alone with the simple fact of being alive. But I could understand being thirteen and feeling like the world would be better if as many people as possible could be seen by as many people as possible all the time. I could imagine experiencing a social platform as a vast, warm ocean of affection and excitement, even if that ocean needed money that it could generate only by persuading you not to leave. I wondered how many baby siblings of these TikTok fanatics were at home, sitting in front of iPads, adrift in an endless stream of YouTube videos. Perhaps the time had come to let the algorithm treat the rest of us like babies, too. Maybe it knows more about what we like than we do. Maybe it knows that if it can capture our attention for long enough it won’t have to ask us what we like anymore. It will have already decided. ♦