Meta wants parents, app stores to keep teens away from dangerous apps

Meta is pushing rival tech giants such as Google and Apple to play a bigger role in keeping teens away from potentially harmful sites, calling for the first time for legislation that would require app stores to obtain parental consent when users between the ages of 13 and 15 download apps.

The proposal, which the parent company of Facebook and Instagram will announce Wednesday, runs counter to growing calls from state and federal policymakers for individual sites to proactively screen for children and limit their use of social media platforms over safety concerns.

Antigone Davis, Meta's global head of safety, argues that “the best way to support parents and young people is with a simple, industry-wide solution where all apps are held to the same, consistent standards,” according to a blog post shared exclusively with The Washington Post.

“With this solution, when a teen wants to download an app, app stores would be required to notify parents, just as parents would be notified if their teen tries to make a purchase,” Davis wrote. “Parents can decide if they want to approve the download.”

Meta's latest stance comes as policymakers debate what responsibility the various Silicon Valley giants should take to protect young people on internet platforms. In recent years, lawmakers and child safety advocates have largely focused on combating the harmful experiences of children on social media apps such as Instagram, Snapchat and TikTok.

While Davis' blog post did not name any specific company, the proposal, if implemented, would shift much of the work of verifying children's ages to Google's Play Store and Apple's App Store.

“We're really trying to create something that's easy and consistent for parents … If parents are approving apps, what you don't want is for parents to be chasing every single app,” Davis said in an interview Tuesday.

Davis' comments come as states take new steps to limit children's access to social media amid renewed concerns that such products could harm the mental health and well-being of young users. States including Arkansas and Utah passed laws this year requiring minors to obtain parental consent to create accounts on platforms including Meta-owned Instagram and TikTok. Some of these proposals mandate that tech companies try to verify users' ages.

Federal lawmakers have proposed similar bills to create an age minimum for social media, amid concerns that the sites contribute to adolescent mental health problems such as anxiety and depression.

But the proposals face significant hurdles to implementation as companies struggle to develop non-intrusive and effective ways to verify users' ages.

Industry groups and digital rights advocates have criticized the effort, arguing that such laws would force companies to collect more information from young users, undermining children's privacy. Several laws have also faced major legal hurdles, with federal judges halting measures in Arkansas and Utah on the grounds that they may be unconstitutional.

In the blog post, Davis argues that allowing parents to “verify a teen's age when they set up their phone” negates “the need for everyone to verify their age multiple times with multiple apps.”

“Teens move interchangeably between many websites and apps, and social media laws that hold different platforms to different standards in different states mean teens are inconsistently protected,” Davis writes.

Both Google and Apple provide optional services that allow parents to manage or block their children's app downloads. Meta's proposal would make it a federal requirement for app stores to obtain parental consent before some teens could download apps.

Meta has faced increasing scrutiny in recent years over its efforts to protect children, scrutiny that reached new heights in 2021 after Facebook whistleblower Frances Haugen disclosed internal research showing that the company's products sometimes worsened body image problems among teenage girls.

Last week, another former Facebook employee testified to Congress that the company ignored internal warnings that it was failing to devote adequate resources and staff to protecting its most vulnerable users, especially children. At the same time, the company has been working to attract younger users as it competes with rival apps like Snapchat and TikTok, both popular among young people.

This isn't the first time Meta has clashed with Apple. In 2021, Meta launched a marketing campaign arguing that targeted online advertising helps small businesses, after the iPhone maker moved to curtail the practice under new privacy rules.

Meta's leadership has suggested since as early as 2021 that companies like Apple and Google should play a bigger role in verifying users' ages, but Wednesday's announcement marks the first time the company has publicly called for federal legislation mandating such a system.

Davis told The Post that the company is “actively engaging” with Silicon Valley peers to “find a solution across the industry,” as well as with government officials.

“We've had discussions at the state level with state legislators as various pieces of legislation have been passed,” as well as with federal policymakers, she said.