Tomorrow, the U.S. Supreme Court will decide whether to step in to block or delay the implementation of a law that would ban TikTok from operating in the U.S. If not blocked, the law will force TikTok to cease operations in the U.S. on January 19, unless its Chinese corporate owner (ByteDance) sells it to a buyer not controlled by a foreign adversary. The case hinges entirely on constitutional arguments about national security and free speech. The Justices will hear no evidence about addiction, depression, sexual exploitation, or any of the many other harms to children that, in separate lawsuits filed by 14 state Attorneys General, are alleged to be widespread on TikTok.
The upcoming ban will also be adjudicated in the court of public opinion as Americans try to decide whether the loss of access to TikTok would be a reason to protest or celebrate. In this post we argue that Americans should welcome the disappearance of TikTok because the company is causing harm to children, adolescents, and young adults at an industrial scale.
Our evidence comes mostly from research done by those 14 Attorneys General. Some of their briefs have been posted online for the world to see. The briefs include hundreds of quotations from internal reports, memos, Slack conversations, and public statements in which executives and employees of TikTok acknowledge and discuss the harms that their company is causing to children. We organize the evidence into five clusters of harms:
- Addictive, compulsive, and problematic use
- Depression, anxiety, body dysmorphia, self-harm, and suicide
- Porn, violence, and drugs
- Sextortion, CSAM, and sexual exploitation
- TikTok knows about underage use and takes little action
We show that company insiders were aware of multiple widespread and serious harms, and that they were often acting under the orders of company leadership to maximize engagement regardless of the harm to children. As one internal report put it:
“Compulsive usage correlates with a slew of negative mental health effects like loss of analytical skills, memory formation, contextual thinking, conversational depth, empathy, and increased anxiety,” in addition to “interfer[ing] with essential personal responsibilities like sufficient sleep, work/school responsibilities, and connecting with loved ones.”
Although these harms are known, the company often chooses not to act. For example, one TikTok employee explained,
“[w]hen we make changes, we make sure core metrics aren’t affected.” This is because “[l]eaders don’t buy into problems” with unhealthy and compulsive usage, and work to address it is “not a priority for any other team.”
Although the evidence below is all publicly available, no one we know of has compiled and combined direct quotations from company insiders and internal reports across multiple alleged harms. We think this compilation gives vital information to parents, who might want some insight into the character and business practices of a company that owns much of their children’s attention and influences their social development. Parents might want to know that TikTok knows that its parental controls are ineffective and rarely used:
In another internal document, TikTok admitted that “user research” shows that “[f]amilies do not use Family Pairing” and that “Family Pairing doesn’t address parents’ top concerns,” including “inappropriate content, offensive interactions, and lack of privacy.”
And even if the parental controls worked and parents chose to shield their kids from bad stuff, they couldn’t, because TikTok’s content moderation is poor. An internal study found the following “leakage rates” (the share of bad stuff getting past moderators): 35.71% of “Normalization of Pedophilia” content; 33.33% of “Minor Sexual Solicitation” content; 39.13% of “Minor Physical Abuse” content; 30.36% of “leading minors off platform”; 50% of “Glorification of Minor Sexual Assault”; and 100% of “Fetishizing Minors.”
If you think that social media is relatively harmless, we urge you to read the quotations and internal studies described below, in which employees of TikTok discuss the vast and varied harms that the company is causing to literally millions of American children each year.
The inspiration for this post was a legal brief filed by the Kentucky Attorney General that was improperly redacted. Redaction is the process by which the AG’s office blacks out some of the most damning revelations and quotations before releasing its brief to the public. The redacted sections often contain trade secrets and other text that the company has a legitimate reason to keep private.
But when the Kentucky AG’s office prepared to post its brief against TikTok, whoever handled the redaction simply drew black rectangles over the relevant text rather than removing it. Even though you can’t see the text while reading the PDF, the words are still in the file: just use your cursor to select each black section, copy it, and then paste it into another file to read the hidden text. It is great fun to do this — try it yourself! Or just read our version of the brief in which we have done this for you.
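For readers who would rather do this programmatically, here is a minimal sketch in Python using the pypdf library. It assumes you have saved the improperly redacted brief locally under the hypothetical filename `redacted_brief.pdf`; because the black rectangles are drawn on top of the text rather than replacing it, a plain text extraction recovers the hidden passages.

```python
# Minimal sketch: extract the text layer from an improperly redacted PDF.
# Assumes the pypdf library (pip install pypdf) and a hypothetical local
# file named "redacted_brief.pdf". The drawn black rectangles do not remove
# the underlying text, so ordinary text extraction returns it.
from pypdf import PdfReader

reader = PdfReader("redacted_brief.pdf")
for page_number, page in enumerate(reader.pages, start=1):
    text = page.extract_text() or ""
    print(f"--- page {page_number} ---")
    print(text)
```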
In the rest of this post we organize the direct evidence of harm now available to us, taken straight from employees and leaders at TikTok. We give only some highlights here; you can see our more comprehensive listing of the relevant quotations in a separate Google doc.
We draw on four briefs filed by state AGs in their suits against TikTok: Kentucky v. TikTok, Utah v. TikTok, Nebraska v. TikTok, and New York v. TikTok. You can learn more about each in Footnote 9.
[Note that in harm clusters 1 through 5, below, bold text indicates direct quotations from company employees and internal memos. Non-bold text is quoted from the indicated portion of the indicated AG brief, which sets up the relevant quotation from company insiders. Italicized text in brackets consists of annotations from us — Jon and Zach. For each harm, we draw from the four briefs, and we supplement some sections with reports from journalists at major outlets who discovered relevant information or ran their own experiments by setting up fake accounts for minors on TikTok.]
[Among the most widely reported harms from TikTok is its ability to pull young people in and not let them go, for hours at a time. TikTok’s algorithm is widely regarded as best-in-class for keeping users scrolling. A 2024 report from Pew finds that 33% of American teens (ages 13 to 17) say they are on a social media platform “almost constantly,” with 16% saying this about TikTok specifically. (We estimate that in 2023 there were roughly 21.8 million teens ages 13-17 in the U.S., which translates to about 3.4 million American teens saying they are on TikTok almost constantly.) Below you can see that TikTok aims to create just such compulsive use, which in turn can lead to problematic use disorders and behavioral addictions, which then compound the harms in the other four clusters. The company does this even though many of its employees believe their product is bad for children’s development.]
- KY P. 7, PARA 18
- KY P. 8, PARA 19 [REDACTED BUT RETRIEVABLE TEXT]
- KY P. 20, PARA 64 [REDACTED BUT RETRIEVABLE TEXT]

  An internal presentation on the 2021 strategy for TikTok describes the company as being in an “arms race for attention[.]”

- [Below is a redacted graph from para 67 of the KY brief. It shows that TikTok has reached saturation among the 29.7 million US users under the age of 17 who own a smartphone. This means that they can’t get more young users, but they can get more time out of each user, especially if they pull them away from competing platforms.]

- KY P. 40, PARA 121 [REDACTED BUT RETRIEVABLE TEXT]

  In an unnamed internal TikTok Defendants document from 2019 summarizing use by age, the author concluded: “As expected, across most engagement metrics, the younger the user the better the performance.”

- KY P. 40, PARA 125 [REDACTED BUT RETRIEVABLE TEXT]
- KY P. 55, PARA 181 [REDACTED BUT RETRIEVABLE TEXT]

  As an internal guide on push notifications explained, a key goal of TikTok’s push notifications is to “Activate & Engage users with the right content at the right time, to encourage users to open the App more and stay longer.” TikTok uses different kinds of push notifications to achieve this goal. For example, TikTok’s “Interest Push” aims to “activate users so they will return to the app.”

- KY P. 67, PARA 223 [REDACTED BUT RETRIEVABLE TEXT]
- UT P. 4, PARA 11

  Despite admitting internally that LIVE poses “cruel[]” risks to minors—encouraging “addiction and impulsive purchasing of virtual items,” leading to “financial harm,” and putting minors at “developmental risk”—TikTok continues to use manipulative features to increase the time and money users spend on the app. [This quote is referencing TikTok’s LIVE feature]

- NE P. 14, PARA 52

  According to Defendants, TikTok’s incredible advertising success is attributable to the fact that its users are “fully leaned in and immersed compared to other platforms.” Defendants describe TikTok as “the leading platform for Information Density” because of its “algorithm and shorter video formats” that “create continuous cycles of engagement.”

- NE P. 20, PARA 72

  As Defendants have explained, TikTok’s success “can largely be attributed to strong . . . personalization and automation, which limits user agency” and a “product experience utiliz[ing] many coercive design tactics,” including “numerous features”—like “[i]nfinite scroll, auto-play, constant notifications,” and “the ‘slot machine’ effect”—that “can be considered manipulative.”

- NE P. 21, PARA 76

  Defendants admit that teens are especially susceptible to compulsive usage of the TikTok platform. Internal documents highlight the fact that minor users are “particularly sensitive to reinforcement in the form of social award,” have “minimal ability to self-regulate effectively,” and “do not have executive function to control their screen time.”

- NE P. 27, PARA 97

  In a “TikTok Strategy” presentation, Defendants celebrated the fact that users spend inordinate amounts of time on the platform. “TikTok is in most people’s lives like this,” Defendants explained, referring to online posts that read, “go on tiktok for 5 mins and 3 hours have passed” and “my night routine: watch 3 hours of tiktok videos, try to follow the dance steps, realise u suck at dancing n cry about it, continue watching tiktok videos, sleep.”

- NE P. 27, PARA 99

  As one internal report noted, after surveying academic literature on the effects of social media on adolescents, “TikTok is particularly popular with younger users, who are seen as more vulnerable to online harms and the negative impacts of compulsive use.”

- NE P. 28, PARA 102

  Another internal report based on in-depth interviews with TikTok users found that overuse of TikTok caused “negative emotions,” “interfered with [users’] obligations and productivity,” and led to “negative impacts . . . on their lives,” including “lost sleep, missed deadlines, poor school performance, running late, etc.” It reported that “many participants described their use of TikTok disturbing their sleep, which limited their productivity and performance the following day,” and that “[e]very participant indicated that time management on TikTok was especially difficult compared to other social media platforms.”

- NE P. 33, PARA 115

  But internally, Defendants admit the truth, that real users report “feeling like they are trapped in a rabbit hole of what our algorithm thinks they like.”

- NY P. 16, PARA 88

  Alexandra Evans, again prior to becoming a TikTok executive, co-authored a report explaining how coercive design impacts teenagers: “Persuasive design strategies exploit the natural human desire to be social and popular, by taking advantage of an individual’s fear of not being social and popular in order to extend their online use. For young people, identity requires constant attention, curation and renewal. At key development stages it can be overwhelmingly important to be accepted by your peer group.”
[These are the main harms we focused on in The Anxious Generation, although as you can see in the other four clusters, the harms caused by TikTok go far beyond mental health problems.]

- KY P. 60, PARA 196 [REDACTED BUT RETRIEVABLE TEXT]
- KY P. 65, PARA 213 [REDACTED BUT RETRIEVABLE TEXT]

  The TikTank [internal TikTok group studying issues affecting TikTok] Report also found that “compulsive usage correlates with a slew of negative mental health effects like loss of analytical skills, memory formation, contextual thinking, conversational depth, empathy, and increased anxiety.” Additionally, “compulsive usage also interferes with essential personal responsibilities like sufficient sleep, work/school responsibilities, and connecting with loved ones.”

- KY P. 84, PARA 260 [REDACTED BUT RETRIEVABLE TEXT]

  In one experiment, Defendants’ employees created test accounts and observed their descent into negative filter bubbles. One employee wrote, “After following several ‘painhub’ and ‘sadnotes’ accounts, it took me 20 mins to drop into ‘negative’ filter bubble. The intensive density of negative content makes me lower down mood and increase my sadness feelings though I am in a high spirit in my recent life.” Another employee observed, “there are a lot of videos mentioning suicide,” including one asking, “If you could kill yourself without hurting anybody would you?”

- [No direct quotes from TikTok employees, but see pages 36-43 for a section of the brief that describes the videos that were sent to fictitious accounts created by the AG’s office, pretending to be 13-, 15-, and 17-year-old Nebraska residents. “Within minutes of scrolling through TikTok’s “For You” feed—before the accounts had searched for any videos or “followed” any users—TikTok’s algorithm repeatedly exposed each Nebraska teen account to overtly mature and otherwise inappropriate content.” Some of the videos sent to young girls—just on the basis of their age and gender—clearly encouraged young girls to starve themselves.]

- [Some of the videos also clearly celebrate suicide as the way to escape from psychological pain.]
[There is widespread exposure to pornographic, violent, and drug-related content on TikTok. This content is often encountered in users’ feeds and through TikTok’s LIVE feature. Although nudity, pornography, sexually explicit content, non-consensual sexual acts, the sharing of non-consensual intimate imagery, and sexual solicitation all violate TikTok’s guidelines, such content is easily accessed and recommended to users.]
- KY P. 38, PARA 115 [REDACTED BUT RETRIEVABLE TEXT]

  In an internal document discussing how to respond to the [Wall Street Journal] series, TikTok employees acknowledged material failures in their process, including but not limited to the fact that “46.5% sexualized and drug content shared by WSJ is not covered by [the existing moderation] policy (ANSA 55%, Drug 24%).” Similarly, “[t]he moderation leakage rate of sexualized and drug content is 73.5% (ANSA 58%, Drug 90%).” The reason for this moderation failure is that “most prevalent policy titles are sexually explicit language and mention of drugs,” whereas “implicit language [e.g., coded language] is often used in videos and failed to be captured [sic] by moderators.”

- KY P. 53, PARA 168 [REDACTED BUT RETRIEVABLE TEXT]

  Horrifyingly, the report (TT Live & US Safety Summit, “Project Meramec”) also confirms that “Minors Easily Access Livestream Feed” and that there is “[n]o age-related feed strategy.” Further, the report acknowledges that “[o]ne of our key discoveries during this project that has turned into a major challenge with Live business is that the content that gets the highest engagement may not be the content we want on our platform. Transactional sexual content incorporates hundreds of signals that inform the [algorithm] as well as LiveOps metrics of success – # of gifts, frequency of hosts going live, # of comments, etc.”

- KY P. 106, PARA 341 [REDACTED BUT RETRIEVABLE TEXT]

  Although TikTok boasts thorough content review processes, it does not disclose significant “leakage” rates, measuring the percentage of violative content that is not moderated or removed. Internally, TikTok knows the rate at which certain categories of content leak through its moderation processes, including: 35.71% of “Normalization of Pedophilia” content; 33.33% of “Minor Sexual Solicitation” content; 39.13% of “Minor Physical Abuse” content; 30.36% of “leading minors off platform”; 50% of “Glorification of Minor Sexual Assault”; and 100% of “Fetishizing Minors.”

- UT P. 5, PARA 13

  TikTok also knows that LIVE is being used for money laundering and other criminal activities.

  PARA 14: In 2021, TikTok launched “Project Jupiter” to investigate suspicions that organized crime was using LIVE to launder money through TikTok’s gifting feature. TikTok discovered that criminals were selling drugs and running fraud operations on LIVE. [TikTok has a virtual currency system where users can “gift” one another.]

  PARA 15: TikTok admits that sexual exploitation and illegal activities on LIVE are “controversial” and worsened by its own monetization scheme. Despite acknowledging internally that “sexually suggestive LIVE content is on the rise,” TikTok refuses to warn consumers about these dangers. Instead, TikTok plans to “make better use of monetization methods such as gifting and subscription to gain revenue . . . .”

- UT P. 10, PARA 32

  The Division’s presuit investigation also confirmed that TikTok’s platform facilitates the sale of illegal drugs to underage children right here at our doorstep—including easily allowing TikTok users to offer the sale and delivery of drugs like Xanax, Valium, and MDMA to children in Salt Lake City.

- NE P. 32, PARA 114

  When The Journal shared “a sample of 974 videos about drugs, pornography, and other adult content that were served to minor accounts,” a spokesperson for Defendants stated that “the majority didn’t violate guidelines”—though several hundred were subsequently removed—and that “the [TikTok] app doesn’t differentiate between videos it serves to adults and minors.”

- [See pages 35-36 and 43-50 for a section of the brief that describes the videos that were sent almost immediately to fictitious accounts created by the AG’s office, pretending to be 13-, 15-, and 17-year-old Nebraska minors. Some of the videos show adult porn actresses engaging in lewd and obscene behavior on TikTok in order to lure customers over to their OnlyFans pages, sometimes via Instagram.]

- NY P. 45, PARA 215

  On its website, TikTok says that users in Restricted Mode “shouldn’t see mature or complex themes, such as: [p]rofanity[, s]exually suggestive content[, r]ealistic violence or threatening imagery[, f]irearms or weapons in an environment that isn’t appropriate[, i]llegal or controlled substances/drugs[, and e]xplicit references to mature or complex themes that may reflect personal experiences or real-world events that are intended for older audiences.” [But they do, as you can see in the leakage rates found in KY P. 106, PARA 341.]
[Recent reporting from the Wall Street Journal and other outlets has shown that many social media companies and device providers (e.g., Apple) have rampant and rarely addressed cases of sextortion, child sexual abuse material (CSAM), and sexual predation occurring via their platforms and devices. This is also the case with TikTok.]
- UT P. 3, PARA 7

  But TikTok has long known—and hidden—the significant risks of live streaming, especially for children. By TikTok’s own admission: “we’ve created an environment that encourages sexual content.”

- UT P. 4, PARA 9

  In early 2022, TikTok’s internal investigation of LIVE, called “Project Meramec,” revealed shocking findings. Hundreds of thousands of children between 13 and 15 years old were bypassing TikTok’s minimum age restrictions, hosting LIVE sessions, and receiving concerning messages from adults. The project confirmed that LIVE “enable[d the] exploitation of live hosts” and that TikTok profited significantly from “transactional gifting” involving nudity and sexual activity, all facilitated by TikTok’s virtual currency system.

- UT P. 36, PARA 115

  In response to the Forbes article, TikTok also conducted a formal investigation into issues on LIVE called “Project Meramec.” TikTok shared the results of the investigation internally during a May 2022 “Safety Summit”:

  PARA 116: Project Meramec confirmed that young users well under the minimum age requirement could host LIVE sessions on TikTok. The study confirmed that in just the month of January 2022 alone, 112,000 “L1” users (i.e., a metric TikTok uses to categorize users between 13 and 15 years old) hosted LIVE sessions.

  PARA 117: These underage users also received a significant number of direct messages from adult users, raising red flags to TikTok that these minors were likely being groomed by adults. Project Meramec revealed that TikTok received not only “significant revenue” from “transactional gifting”—to the tune of one million Gifts in January 2022 alone—but also that this revenue was in large part generated through transactions for sexual content.

- UT P. 34-35, PARA 109

  An internal study from December 2023, following the Forbes article, documented what TikTok admits is “the cruelty” of maintaining LIVE with its current risks for minors on the app. The study showed its LIVE feature had the following characteristics:

  - “[H]igher proportion[s] of minor users”;
  - “Minor users are more likely to access high severity risk LIVE content than adult users”;
  - For violating content like “[a]dult nudity and sexual activities (ANSA) . . . and minor-hosted LIVE rooms, minor views are likely 2 times higher than other LIVE rooms”; and
  - “Minor users lack self-protection awareness and interact more with risky LIVE content.”

- UT P. 35, PARA 111

  In February 2022, two TikTok leaders discussed the need to remove “egregious content from clearly commercial sexual solicitation accounts,” and were aware of issues with women and minors being sexually solicited through LIVE.

  PARA 112: These leaders knew about agencies that recruited minors to create Child Sexual Abuse Material and commercialized it using LIVE.

  PARA 113: In another example from a March 2022 LIVE safety survey, users reported that “streamer-led sexual engagements (often transactional) [were] commonly associated with TikTok LIVE.” Users also reported “often seeing cam-girls or prostitutes asking viewers for tips/donations to take off their clothes or write their names on their body . . . .” That same month, TikTok employees admitted “cam girls” (or women who do sex work online by streaming videos for money) were on LIVE and that these videos had a “good amount of minors engaging in it.” TikTok leaders have known since at least 2020 that TikTok has “a lot of nudity and soft porn.” An internal document from May 2020 also highlighted concerns about “camming” becoming more popular as sex workers turned to online platforms during the COVID-19 pandemic.

  PARA 114: TikTok has long known that virtual gifting is used as a predatory grooming tactic on LIVE. TikTok has internally acknowledged that “perpetrators tend to use tactics such as gift giving, flattery, and gifting money to win the trust of minors.”

- UT P. 38, PARA 125

  In September 2022—five months after the Forbes story—an investigator found that “within minutes of browsing the [LIVE] feed” they were shown underage girls providing sexually suggestive content in exchange for money and young boys using filters to pose as girls to receive Gifts.

  PARA 126: The investigator also found a “never-ending stream” of hosts who openly admitted that they were 14 and 15 years old while also “holding signs” or “standing in front of the camera” with a sign saying “Rose = say daddy,” “ice cream = 360 spin,” or “universe = cut shirt.”
[Although TikTok, like other social media companies, has an age minimum of 13 for account creation (in the U.S.) and higher age limits for certain features (e.g., TikTok LIVE at 18), underage use is common and well known to the company, which does little to enforce those age limits. TikTok also regularly claims that it has effective safety features built in for users. However, the briefs make clear that TikTok’s primary goal is keeping users on and engaged for as long as possible, which often comes at the cost of child safety.]
- KY P. 93, PARA 288 [REDACTED BUT RETRIEVABLE TEXT]

  Similarly, in a chat message discussing features purporting to help users manage their screentime, a TikTok employee confirmed that the company’s “goal is not to reduce the time spent” on the TikTok app, but rather to ultimately “contribute to DAU [daily active users] and retention” of users.

- KY P. 93, PARA 289 [REDACTED BUT RETRIEVABLE TEXT]

  Defendants also promote screen time management tools for Young Users that they know are ineffective. For example, an internal document seeking approval for the screentime dashboard noted that “we don’t expect significant impact to stay time with this feature since it is only improving awareness and is not an intervention.”

  PARA 290: In fact, Defendants found—as expected—that the screen time dashboard did not affect Young Users’ usage because “minors do not have executive function to control their screen time.” The screentime dashboard did not appear to have any impact on the usage of minors.

- UT P. 37-38, PARA 121

  In May 2022, after the Forbes article came out, TikTok took steps to evaluate how ‘valuable’ its underage LIVE hosts were before it would decide to make safety changes to the feature, like increasing the minimum age requirement from 16 to 18. It found 384,833 hosts were 16 to 17—as far as TikTok was aware—and they spent over seven million minutes streaming themselves on LIVE.

  PARA 122: Despite learning that there were a ‘high’ number of underage hosts on the platform and that these minors were receiving problematic messages from adult users, TikTok waited six months before raising the minimum age for a user to host a LIVE session from 16 to 18.

  PARA 123: But raising the minimum age from 16 to 18 did nothing to solve the problem. TikTok’s age-gating is ineffective, and many kids still join LIVE events daily. TikTok also chose to forgo reasonable safety measures, prioritizing profits over safety, allowing unrestrained transactional sexual content and other illicit activities to thrive.

  PARA 124: As a result, these activities have not just continued—they have exploded as LIVE has become even more popular. In 2023, a TikTok senior director was alerted by advocates who had noticed an increase in ‘teens in overtly sexualized situations on live streams controlled by someone older than 18 who is collecting money from viewers while the teen performs sexually suggestive acts.’ The advocates said they reported the streams through TikTok’s internal reporting tools, but TikTok found they did not violate its policies.

- UT P. 40, PARA 132

  TikTok recognizes internally that its age-gating is ineffective and that TikTok’s own moderation efforts on LIVE are ineffective and inconsistently applied, and TikTok hides this information from users and the public. TikTok knows this is particularly true for children, admitting internally: (1) “Minors are more curious and prone to ignore warnings” and (2) “Without meaningful age verification methods, minors would typically just lie about their age.”

- UT P. 37, PARA 119

  Given how lucrative LIVE is for TikTok, the company slow-rolled implementing safety measures, and once it did, these measures proved largely ineffective at keeping pace with the growing popularity of LIVE. This was by design—LIVE was “such a huge part of the strategy for the [TikTok app]” that TikTok employees recognized they “d[id not] know” if they could “reasonably expect to increase limitations for LIVE” even in February 2022, and even after recognizing that “it is irresponsible [of TikTok] to expect that users will use LIVE wisely.”

  PARA 120: In other words, LIVE was too profitable to be interfered with, even to protect children.

- UT P. 44-45, PARA 145

  These policies do not adequately safeguard children and, furthermore, are not consistently applied. In 2020, TikTok unveiled an ‘internal program’ to ‘protect creators and other accounts that [TikTok] deem to be high value.’ The program featured policy shortcuts like ‘delayed enforcement,’ ‘deferred policy decisions,’ or ‘no permanent ban on Elite + Accounts,’ to protect its popular users who violate TikTok’s policies. TikTok deployed this look-the-other-way policy despite knowing that the ‘majority of elite accounts appear to run afoul of [TikTok’s] policies on sexually explicit content,’ among other violations. Approximately 1,400 minors were considered ‘elite creators.’

- NE P. 58, PARA 187

  To start, TikTok has no real age verification system for users. Until 2019, Defendants did not even ask TikTok users for their age when they registered for accounts. When asked why they did not do so, despite the obvious fact that “a lot of the users, especially top users, are under 13,” founder Zhu explained that, “those kids will anyway say they are over 13.”

- NE P. 61, PARA 198

  In another internal document, TikTok admitted that “user research” shows that “[f]amilies do not use Family Pairing” and that “Family Pairing doesn’t address parents’ top concerns,” including “inappropriate content, offensive interactions, and lack of privacy.”

- NE P. 65, PARA 211

  Over the years, other of Defendants’ employees have voiced their frustration that “we don’t want to [make changes] to the For You feed because it’s going to decrease engagement,” even if “it could actually help people with screen time management.”

- NE P. 65, PARA 212

  Or as another employee put it, “[w]hen we make changes, we make sure core metrics aren’t affected.” This is because “[l]eaders don’t buy into problems” with unhealthy and compulsive usage, and work to address it is “not a priority for any other team.”

- NE P. 65, PARA 213

  As TikTok’s [redacted] candidly admitted in 2021, some of TikTok’s so-called “safety” features are little more than “good talking point[s].” Describing the “Take a Break” videos Defendants have promoted, explained that “[w]e found out through some research that they’re not altogether effective” but that “it’s good as a point to share with policymakers, ‘cause they’re kind of impressed that we’re spending time, money, and energy to get people off our platform, at least in theory.”

- NE P. 65-66, PARA 214

  Defendants, who admit internally that “screen time management” tools are “not . . . at expense of retention.” The goal is “not to reduce the time spent” but to “improve user experience and satisfaction” and ultimately “contribute to DAU [Daily Active Users] and retention.” According to internal documents, “[t]his aligns with leadership’s guidance” that there be “no impact to retention.”
How can it be that a product used by more than twenty million children and adolescents in the United States is also causing so much harm to its users? Many teens experience the harms of TikTok and complain about its addictive nature and its “brain rot” effects, so why don’t they just stop using it?
When Jon asks these questions of his students at NYU who are heavy users of TikTok, he commonly gets two related answers: (1) I’ve tried to quit, but I just can’t do it, and (2) I can’t quit, because then I won’t know what everyone else is talking about. In other words, TikTok is both behaviorally addictive and socially addictive, which means that many teens feel trapped. As Gen Z poet Kori James said about social media: “I know it’s poison but I drink anyway.”
A recent study led by the University of Chicago economist Leonardo Bursztyn captured the dynamics of this trap. The researchers recruited more than 1,000 college students and asked them how much they’d need to be paid to deactivate their accounts on either Instagram or TikTok for four weeks. That’s a standard economist’s question to try to compute the net value of a product to society. On average, students said they’d need to be paid roughly $50 ($59 for TikTok, $47 for Instagram) to deactivate whichever platform they were asked about. Then the experimenters told the students that they were going to try to get most of the others in their school to deactivate that same platform, offering to pay them to do so as well, and asked: now how much would you have to be paid to deactivate, if most others did so? The answer, on average, was less than zero. In each case, most students were willing to pay to have that happen.
We (Jon and Zach) teamed up with the Harris Poll to confirm this finding and extend it. We conducted a nationally representative survey of 1,006 Gen Z young adults (ages 18-27). We asked respondents to tell us, for various platforms and products, if they wished that it “was never invented.” For Netflix, YouTube, and the internet itself, relatively few said yes to that question (always under 20%). We found much higher levels of regret for the dominant social media platforms: Instagram (34%), Facebook (37%), Snapchat (43%), and the most regretted platforms of all: TikTok (47%) and X/Twitter (50%).
What, then, is the net value of TikTok to society? The harms are vast and varied, and they are hitting children, teens, and young adults the hardest, which means that TikTok may be altering developmental pathways and causing lasting changes. The net value is likely very negative. We believe that America would be much better off if TikTok were to go dark on January 19th.
No consumer product is 100% safe. We don’t remove a product if a child or two dies from it each year in a freak accident. But the harms documented here are not freak accidents. They are the common effects of the normal use of TikTok by children, many of them younger than the legal age of 13. As currently designed, TikTok is inflicting harm on millions of children—harm at an industrial scale—in America and around the world. These harms may not be presented tomorrow to the Justices of the Supreme Court, but we think they should be decisive in the court of public opinion. TikTok should be removed from American childhood.