Regulating Social Media: Like Cigarettes? Or Like Junk Food?


For this week’s Infinite Scroll column, Cal Newport is filling in for Kyle Chayka.

In 1881, James Bonsack patented an automated cigarette-making machine, and, in the years that followed, smoking became much more prevalent in America. Parents and reformers grew worried about kids picking up the habit. “It is not uncommon, it is said, to see a child not more than five years old smoking contentedly,” the L.A. Times warned in an article from this period. In response to this cultural trend, states began to pass measures that prohibited the sale of tobacco to minors. New Jersey established the first such law in 1883; New York followed suit three years later. By 1890, more than half the states in the union had passed similar restrictions.

Many decades later, in the nineteen-fifties, the Leo Burnett advertising agency helped invent Tony the Tiger, a cartoon mascot who was created to promote Frosted Flakes to children. In 1973, a trailblazing nutritionist named Jean Mayer warned the U.S. Senate Select Committee on Nutrition and Human Needs that the increasingly ubiquitous category of junk foods could be described as empty calories. He questioned why it was legal to apply the term “cereals” to products that were more than fifty-per-cent sugar. “I think they perhaps might more properly be called candy,” he said. Children’s-food advertisements, he claimed, were “nothing short of nutritional disasters.”

Mayer’s warnings, however, did not lead to a string of state bans on junk food. Laws were passed to improve the nutritional quality of school lunches, and informational campaigns, such as the food pyramid, were introduced, along with requirements for clear nutritional labelling. But advertising continued to target children, and consumers of all ages were free to buy and consume any amount of Frosted Flakes they desired. This health issue was ultimately seen as one that families should manage on their own.

In recent years, experts have been warning that social media harms children. Frances Haugen, a former Facebook data scientist who became a whistle-blower, told a Senate subcommittee that her ex-employer’s “profit optimizing machine is generating self-harm and self-hate—especially for vulnerable groups, like teenage girls.” A growing number of parents worry that their children are perpetually distracted and obsessed with their phones, and a mounting body of research supports these concerns. “It is time to require a surgeon general’s warning label on social media platforms, stating that social media is associated with significant mental health harms for adolescents,” Vivek Murthy, whose second term as the U.S. Surgeon General ended on Monday, wrote in an opinion piece last year.

History suggests that our collective approach to social media may be approaching a fork in the road. Haugen’s testimony helped inspire many new social-media laws. California passed the Age-Appropriate Design Code Act, which included provisions to limit data collection on children and reduce persuasive design features that might lead to overuse. Utah’s Social Media Regulation Act added a requirement that parents consent before children can set up an account. New York’s analogous legislation targeted algorithms that recommend content to kids, presumably to reduce the allure of these platforms. These laws are more like junk-food regulations than cigarette bans. They aim to make the product in question safer and give families more tools to manage its harms, but they ultimately keep these platforms available to all ages.

Other authorities, however, are choosing a different path. Last fall, Australia passed a first-of-its-kind national law that requires social-media platforms—including TikTok, Facebook, Snapchat, Reddit, X, and Instagram—to ban users under the age of sixteen. These companies were given twelve months to figure out how to enforce the law or face fines that could exceed thirty million dollars. Industry groups predictably complained, but the bold move has been broadly popular among the public. A YouGov poll revealed that seventy-seven per cent of Australians supported the law.

The question now is not whether these platforms are harmful to kids (they are) or even whether some action should be taken (it should). Instead, we have to decide whether, to children, social media is more like a Big Mac or a Marlboro. If the former, then the U.S. may be on the right track. If the latter, then Australia might provide a better example. To find an answer, it’s useful to understand why we treated cigarettes and junk food differently in the first place.

It wasn’t inevitable that cigarettes would be banned outright and junk food would be merely regulated; the twists and turns of history mattered, too. Cigarettes were first popularized in a more moralistic era. Legislators in New York, for example, assumed the enforcement of their 1886 ban would be aided by the Women’s Christian Temperance Union, the same group that would later help pass the Eighteenth Amendment, prohibiting the sale of alcohol. Jean Mayer’s Senate testimony about junk food, by contrast, took place in the midst of the Vietnam War and the Watergate scandal—hardly a period when the American people were eager for more government intervention in their affairs. The products in question were different, but that wasn’t the whole story; the vibes helped determine how we responded.

Legal standards also matter. I asked Meg Jones, a professor in Georgetown’s Communications, Culture, and Technology program who specializes in technology and law, how the legal system categorizes different kinds of possible harms to children. “We impose outright bans for kids when activities involve permanent or hard-to-reverse consequences (tattoos, contracts), addiction (tobacco, gambling), activities with high potential for exploitation (hazardous and entertainment jobs), and parents unable to assess or manage risks,” Jones told me in an e-mail. Cigarettes warrant a ban because we fear that childhood use could lead to a lifetime of addiction, increasing risks of lung cancer, heart disease, and other health problems. The consumption of junk food, on the other hand, seems like an activity that can be controlled by parents, and which, in moderation, might lead less directly to permanent damage.

How does social media fare against these more formal standards? Jones focussed on two considerations: “whether the platforms have redeeming development value, and whether parents can provide effective guidance.” Early social media was often described in utopian terms, and few people argued that it was harmful or addictive; in that context, a ban would have seemed arbitrary. But, in recent years, the utopian sheen has been stripped away, which casts doubt on the idea that social media benefits children enough to outweigh its harms. Meanwhile, parents increasingly lament their inability to control their children’s online obsessions. All of which suggests that, from a legal standpoint, platforms such as Instagram and TikTok may have key traits in common with tobacco.

Despite these realities, in the U.S., an Australian-style response to social media would be met with considerable headwinds. Many state laws have faced legal challenges, often filed on constitutional grounds by industry groups like NetChoice. In Utah, to name just one example, lawmakers had to repeal their original legislation and pass something less contestable. At the same time, Elon Musk, who paid some forty-four billion dollars to acquire Twitter, may be wielding his current influence with President Trump to prevent a similar ban.

Yet it takes a soothsayer’s confidence to predict how these legal and political obstacles will shake out in the near future. Florida’s conservative-majority legislature recently banned social-media accounts for children under fourteen—one of the strongest actions yet taken at the state level. And, last summer, the U.S. Court of Appeals for the Ninth Circuit vacated many parts of an earlier federal injunction that had been holding up implementation of California’s law.

Jones, for her part, thinks that social media is likely headed for robust restrictions. “I think age verification is going to pass constitutional scrutiny this year, and we’re going to see a wave of state laws restricting social media for kids,” she told me. “Or maybe that is just my wishful thinking.” Regardless of what happens next, what is clear is that, for the first time since kids started using social media, our society seems ready to have a real debate about whether it’s time for the government to act in a serious way. The image of screen-addled kids, while perhaps not as distressing as that of five-year-olds puffing contentedly on cigarettes, is becoming harder to ignore.

This past week, for a few hours, a bipartisan law effectively banned TikTok, rendering the app inaccessible to its hundred and seventy million American users. Then, after the incoming President vowed to work out a deal, TikTok came back, though it’s still unclear for how long. This turmoil is due to national-security concerns about the app, not its effects on children. Still, these developments may have a lasting impact on our civic imagination. If there was once a taboo on social-media bans, it may be gone now. Perhaps the vibe is shifting again. ♦


