A former Meta employee testified before a US Senate subcommittee on Tuesday that the parent company of Facebook and Instagram was aware of the harassment and other harms teens face on its platforms but failed to address them, according to a Reuters report.
Arturo Bejar worked on user well-being at Instagram from 2019 to 2021, and previously served as engineering director of Facebook's Protect and Care team from 2009 to 2015.
Bejar testified before the Senate Judiciary Subcommittee on Privacy, Technology and the Law at a hearing on social media and its impact on teens' mental health.
"It is time for the public and parents to understand the true level of harm these products cause, and it is time for young users to have the tools to report and suppress online abuse," he said in written testimony made public before the hearing.

Bejar's testimony comes amid a bipartisan push in Congress to pass legislation requiring social media platforms to provide parents with tools to protect children online.
Bejar told the hearing that the goal of his work at Meta was to influence the design of Facebook and Instagram in ways that would nudge users toward more positive behavior and give young people tools to manage unpleasant experiences.
Meta said in a statement that it is committed to protecting young people online, noting that it is working on many of the same user concerns Bejar raised in his testimony and has built tools such as anonymous notifications of potentially hurtful content.
The statement added: “Every day countless people inside and outside of Meta are working on how to help keep young people safe online… All of this work continues.”

Bejar told senators that he met regularly with the company's top executives, including CEO Mark Zuckerberg, and that they appeared supportive of the work at the time. He later concluded, however, that the executives had decided "not to address the issue."
In a 2021 email, Bejar flagged to Zuckerberg and other senior executives internal data showing that 51% of Instagram users had reported a bad or harmful experience on the platform in the previous seven days. Within that group, 24.4% of children aged 13 to 15 reported receiving unwanted sexual advances.
A separate document showed that 13% of all Instagram users aged 13 to 15 who were surveyed said they had received unwanted sexual advances.
Bejar also told executives that his 16-year-old daughter had received misogynistic comments and obscene images, and lacked adequate tools to report those experiences to the company.
In his testimony, Bejar recounted that in one meeting, Chris Cox, Meta's chief product officer, was able to cite precise statistics on the harms teens face. "I found it heartbreaking, because it means they knew and they didn't act on it," he said.
Bejar met last week with two senators sponsoring the Kids Online Safety Act, and said he shared evidence that Meta executives had ignored harm suffered by young people on the company's platforms.