But inside Meta, child safety experts have long raised red flags about relying on such features. And their use has been shockingly infrequent.
By the end of 2022, less than 10 percent of teens on Meta’s Instagram had enabled the parental supervision setting, according to people familiar with the matter who spoke on the condition of anonymity to discuss private company matters. And among the teens who did, only a single-digit percentage of parents had adjusted their kids’ settings.
Internal research described extensive barriers for parents trying to supervise their kids’ online activities, including a lack of time and limited understanding of the technology. Child safety experts say these settings are an industry-wide weakness, allowing tech companies to absolve themselves of responsibility while leaving parents to do the heavy lifting.
“The dirty secret about parental controls is that the vast majority of parents don’t use them,” said Zvika Krieger, the former director of Meta’s responsible innovation team who now works as a consultant for technology companies. “So unless the defaults are set to restrictive settings, which most are not, they do little to protect users.”
The efficacy of parental controls is likely to be spotlighted Wednesday at a Senate Judiciary Committee hearing on the rising risk of child sexual exploitation online. Prominent CEOs — including Meta’s Mark Zuckerberg, Snap’s Evan Spiegel, TikTok’s Shou Zi Chew, and Linda Yaccarino of X, formerly Twitter — are expected to testify.
Parental controls have taken off in the technology industry as concern rises about kids becoming targets of predators and being exposed to toxic content. Months after Meta launched a parental supervision tool for Instagram in March 2022, Snapchat followed suit. Though Discord had previously eschewed supervisory tools, it launched its own parental controls last year, after highly classified documents leaked on the platform. TikTok also offers parents a way to restrict their teens’ use of the app.
Wednesday’s Senate hearing comes as federal and state legislators push to expand protections for kids online, including by requiring tech companies to give parents more ways to manage their children’s activity. Sens. Richard Blumenthal (D-Conn.) and Marsha Blackburn (R-Tenn.) are proposing legislation requiring platforms to let parents manage minors’ privacy settings, restrict their digital purchases and limit the amount of time they spend on apps.
And Republican-led state legislatures such as those in Utah and Arkansas have passed laws requiring tech companies to vet users’ ages and get consent from parents before letting teens access their sites. A law in Texas — the Securing Children Online through Parental Empowerment Act — requires that platforms provide parents with the ability to alter their children’s settings, restrict their transactions and limit their screen time, much like the congressional bill.
“By the time kids get to be teenagers and get into high school, the emphasis moves away from parental controls,” said Stephen Balkam, founder and CEO of the Family Online Safety Institute, a child advocacy group that works with tech companies on safety issues. “An increasing number of parents give up at that age, or the kids find ways around them.”
Meta spokesman Andy Stone said in a statement that the company has invested in building default protections to help teens have safe and “age appropriate” experiences online.
“On top of that, we’ve also created easy-to-use parental supervision features because parents told us they wanted even more options to shape their teens’ experiences — and we want to help them,” Stone said.
“Whether it’s via ad campaigns like those in The Washington Post, in-app promotion or events with parents, we’re always working to make sure parents know about and can choose to use these features,” he said.
Stone added that the company believes the best way to support families is for an “industry-wide solution that allows parents to approve all their teen’s app downloads in the app store itself.”
Snapchat spokesman Pete Boogaard said, “We want to empower parents with tools and resources so they can make decisions for their teens based on their family values.”
Meta faced increasing pressure to protect young users in 2021, after Facebook whistleblower Frances Haugen disclosed internal research suggesting Instagram was hurting teen girls’ mental health.
Amid the resulting outcry, Meta released new parent supervision tools, along with a suite of updates to protect kids without involving their parents. One feature nudged teens to take a break after scrolling for a set amount of time. Other measures aimed to prevent adults from finding and messaging teens they didn’t already know.
In March 2022, the company launched an Instagram family center, allowing parents to view how much time their teens spend on Instagram or be notified when their teen reports an account, follows an account or is followed by someone. Meta uses a similar supervision system for its virtual reality headset line, Quest.
But there have been steep barriers to use. To ensure teens are being supervised by their actual parents and not a random adult, Meta requires a lengthy setup process — including an invitation from the teen. Adoption was slow to start, with only hundreds of parents initially opting in, one of the people familiar with the matter said.
To use the tools, parents have to navigate a bevy of settings. Tech companies often default to the lowest possible restrictions, which allow engagement and time on the apps to flourish. For instance, Meta offers teens time limits and scheduled breaks at night. But these settings are turned off and must be added by the parent.
“This is a company that’s not afraid of making decisions on users’ behalf — like there is a reason why your feed is primarily their recommendations,” said Arturo Béjar, a former consultant for Meta who recently testified before Congress about the company’s impact on children. “So I think it’s a matter of where they choose to make these decisions and where they choose not to make them.”
Additionally, supervision tools can lead to tense conversations between parents and children — in some cases exacerbating friction by design. Meta recently launched new tools for parents on Quest headset devices, including a menu of “age-appropriate apps” for preteens. But those preteens can also easily find apps they are not eligible to use, setting up a potential conflict.
The company “could take on that responsibility” and restrict search results, one former employee said. “But instead, you’re saying no, parents, you make that final call.”
Despite the slow uptake, Meta saw the tools as a success. The company regularly evaluated how the features improved perceptions of its products’ safety, according to the people familiar with the matter. One 2020 internal report found that Facebook users with problematic social media habits valued features like time-management tools, parental controls and a temporary no-Facebook mode.
Even before Meta released the tools, researchers questioned whether parents had enough time to police their kids’ online activity. In one 2020 document called “Parents say they want parental controls, but actual use is low,” the authors drew on internal analysis and external research to illustrate some of the barriers facing adults trying to manage their kids’ online activity.
“Parents see digital management as part of parenting, but it is also a lot of WORK! — i.e., it requires effort that people don’t necessarily always want to, can give, or comprehend,” the researchers wrote. “As such, there’s sometimes a disjuncture.”
A separate 2020 Meta report found that parental supervision could be difficult. In homes where parents were more lenient, older teens often limited their younger siblings’ social media use. But when parents actively managed their kids’ digital experience, families clashed over when and how those limits were enforced, according to the report.
“Parents also struggled to effectively enforce limits when many were considered ‘addicted’ to social media/phones themselves by their children,” the researchers wrote.
Still, Meta and other tech companies are unlikely to depart from the parental supervision strategy anytime soon. But Congress can evaluate whether their services are safe enough for younger users, said Vaishnavi J, a technology policy adviser and Meta’s former head of youth policy.
“I would argue that parents are really stretched thin, and this idea that they’re gonna have to turn on parental controls for anywhere from three to seven apps on their kids’ phones — it’s just really impractical,” she said.
Cristiano Lima-Strong and Will Oremus contributed to this report.