F.T.C. Seeks ‘Blanket’ Ban on Meta’s Use of Young Users’ Data
The Federal Trade Commission escalated its fight with the tech industry’s biggest companies on Wednesday as it moved to impose what it called a “blanket prohibition” on the collection of young people’s personal data by Meta, Facebook’s parent company.
The commission wants to significantly expand a record $5 billion consent order with the company from 2020 and said that Meta had failed to fully meet the legal commitments it made to overhaul its privacy practices to better protect its users.
Regulators also said Meta had misled parents about their ability to control whom their children communicated with on its Messenger Kids app and misrepresented the access it gave some app developers to users’ private data.
The proposed changes mark the third time the agency has taken action against the social media giant over privacy issues.
“The company’s recklessness has put young users at risk,” Samuel Levine, the director of the F.T.C.’s Bureau of Consumer Protection, said in a press statement. “Facebook needs to answer for its failures.”
The F.T.C.’s administrative action, an internal agency procedure called an “order to show cause,” serves as a preliminary warning to Meta that regulators believe the company violated the 2020 privacy agreement. The document lays out the commission’s accusations against Meta as well as its proposed restrictions.
Meta, which has 30 days to challenge the filing, was not given advance notice of the action by the F.T.C.
After Facebook responds, the commission said, it would consider the company’s arguments and make a decision. Meta could then appeal the agency’s decision in a federal court of appeals.
The F.T.C.’s proposed changes would bar Meta from profiting from the data it collects from users under the age of 18, and would apply to Meta businesses including Facebook, Instagram and Horizon Worlds, the company’s new virtual reality platform. Regulators want to bar the company from monetizing that data even after those users turn 18.
That means Meta could be prohibited from using details about young people’s activities to show them ads based on their behavior or market digital items to them, like virtual clothes for their avatars.
Whether a court would approve such changes is unknown. In a statement on Wednesday, Alvaro M. Bedoya, a commissioner who voted to issue the administrative order, said he had concerns about whether the agency’s proposal to restrict Meta’s use of young people’s data was sufficiently relevant to the original case.
In a statement, Meta called the F.T.C.’s administrative warning “a political stunt” and said the company had introduced an “industry-leading” privacy program under the agreement with the F.T.C. The company vowed to fight the agency’s action.
“Despite three years of continual engagement with the F.T.C. around our agreement, they provided no opportunity to discuss this new, totally unprecedented theory,” Meta said in a statement.
Meta had already announced limits on targeting ads to users under 18. In 2021, the company said advertisers would be able to customize ads based on minors’ locations, ages and genders but would no longer be able to target ads based on young people’s interests or activities on other websites. And this year, Meta said it would also stop ad targeting based on minors’ gender.
The F.T.C.’s aggressive action is the first time that the commission has proposed such a blanket ban on the use of data to try to protect the online privacy of minors. And it arrives amid the most sweeping government drive to insulate young Americans online since the 1990s, when the commercial internet was still in its infancy.
Fueled by mounting concerns about depression among children and the role that online experiences could play in exacerbating it, lawmakers in at least two dozen states over the past year have introduced bills that would require certain sites, like social networks, to bar or limit young people on their platforms. Regulators are also intensifying their efforts, imposing fines on online services whose use or misuse of data could expose children to risks.
Over the past few years, critics have faulted Meta for recommending content on self-harm and extreme dieting to teenage girls on Instagram as well as failing to sufficiently protect young users from child sexual exploitation.
The F.T.C.’s case against the social media giant dates back more than a decade.
In 2011, the agency accused Facebook of deceiving users on privacy. In a settlement, Facebook agreed to implement a comprehensive privacy program, including agreeing not to misrepresent its privacy practices.
But after news reports in 2018 that a voter-profiling company, Cambridge Analytica, had harvested the data of millions of Facebook users without their knowledge, the F.T.C. cracked down again.
In a consent order finalized in 2020, Facebook agreed to restructure its privacy procedures and practices, and allow an independent assessor to examine the effectiveness of the company’s privacy program. The company also paid a record $5 billion fine to settle the agency’s charges.
The F.T.C. says Facebook has violated that agreement. In its administrative order on Wednesday, the agency cited reports from the privacy assessor, noting it had found “gaps and weaknesses” in Meta’s privacy program that required substantial additional work.
Although much of the report was redacted, it indicated that the assessor found issues with the way Meta assessed privacy risks to users’ data and managed privacy incidents. It also cited Meta’s oversight of its data-sharing arrangements with third parties.
The F.T.C.’s crackdown on Meta is the latest signal that the agency is following through on pledges by Lina M. Khan, its chair, to rein in the power of the tech industry’s dominant companies. In December, the agency moved to halt consolidation among video game makers when it filed a lawsuit to try to block Microsoft’s $69 billion acquisition of Activision Blizzard, the company behind the popular Call of Duty franchise.
The F.T.C. has also become more aggressive about privacy regulation. Rather than simply trying to protect consumers from increasingly powerful surveillance tools, regulators are working to prohibit certain kinds of data collection and use that they consider high-risk.
The F.T.C. in December accused Epic Games, the company behind the popular Fortnite game, of illegally collecting children’s data and of putting them at risk by matching them with strangers and enabling live chat. Epic agreed to pay a $520 million fine to settle those and other charges. The settlement order also required Epic to turn off live voice and text chat by default — the first time regulators had imposed such a remedy.
But the data restrictions the agency now wants to impose on Meta go much further.
The F.T.C.’s proposed changes would bar Meta-owned sites and products from monetizing young people’s data. The company’s platforms, like Horizon Worlds, would be allowed to collect and use minors’ information only to provide services to users and for security purposes.
The F.T.C. also wants to bar Meta from releasing any new products or features until the company can demonstrate, through written confirmation from an independent privacy assessor, that its privacy program fully complies with the 2020 consent order.