The appearance of Adam Mosseri, head of Instagram, before a US Senate subcommittee to answer for the social network's alleged toxic effect on the behavior and psychology of minors came as a bucket of cold water for those expecting a tacit assumption of responsibility by the company, which belongs to Meta (formerly Facebook). In a written statement, Mosseri, one of founder Mark Zuckerberg's lieutenants, passed on to "the industry" the urgent need to create a body that addresses "how to verify age, design appropriate experiences [for minors] and create parental controls." Deflecting responsibility, the executive added that Instagram "would adhere to these standards" in order to benefit from the so-called Section 230, a key US internet law that shields technology platforms from liability for content posted by users.
The hearing held additional interest: Mosseri is the first executive to appear before Congress since the revelations by former employee Frances Haugen about Facebook's security failings and malpractice, which triggered the company's worst reputational crisis. Mosseri's aim was to plug some of the scandal's leaks, although legislators, Democrats and Republicans alike, did not make it easy for him, citing data showing that Instagram and its models of physical perfection and social success have harmed the mental health of one in three American teenagers, one of Haugen's most explosive revelations.
At its fifth hearing (officials from TikTok and YouTube appeared in previous sessions), the Subcommittee on Consumer Protection and Data Protection pressed Mosseri on the additional damage caused by algorithms, which recommend websites and applications harmful to minors' physical and psychological health based on their interests and previous searches, as well as on the lack of controls to keep the youngest users away from the site. In September, Instagram shelved its project to create a version for children. Weeks later, Haugen revealed the results of internal research into the platform's pernicious influence on the behavior of young people, especially teenage girls.
One of those studies, carried out in 2019, found that Instagram gave a third of users under 20 a negative self-image. Another report, from 2020, revealed that 32% of teenagers believed that using the photo-sharing network had worsened how they perceived their bodies, with which they were already dissatisfied. Facebook has defended itself by insisting the data was taken out of context and by playing down the research. In successive appearances before the US Congress, the British Parliament and the European Parliament, Haugen has underlined her concern about the network's perverse effect on children.
In one more example of the company's reactive approach, trying to mitigate the damage (or the spreading stain of the scandal) only after the fact, Instagram announced this Tuesday, on the eve of Mosseri's hearing, new safety tools for minors, such as an automatic pause function to limit time spent online; the announcement was received with suspicion by the senators. Likewise, a day after Rohingya refugees filed coordinated lawsuits against the company in the United States and the United Kingdom for promoting the spread of hateful messages (something the company acknowledged in 2018), the firm on Wednesday blocked the profiles of businesses linked to the military in Myanmar (formerly Burma), the main driver of the attempted genocide of the Muslim minority.
More than the distant Myanmar crisis, or the almost forgotten Cambridge Analytica data-leak scandal, it is Haugen's revelations that reverberate through Facebook's battered image, depicting the moral bankruptcy of a company that tolerated violent content, such as hate speech in countries at war, for the sake of profit. The reaction to Haugen's leaks once again confirmed for the US Congress the need, even the urgency, of regulating giants like Facebook, and not only through antitrust law, the Administration's other open front against Big Tech.
The network's alleged toxicity for minors is the main burden weighing on the company, which has not suffered economically from the scandals and has bet decisively on the metaverse to weather the storm. To justify Mosseri's summons, Democrat Richard Blumenthal, who chairs the Senate subcommittee, cited hundreds of calls and emails from parents about their children's negative experiences on Instagram. One parent recounted how his daughter's interest in physical exercise led the app to recommend accounts promoting draconian diets, eating disorders, and self-harm.
The subcommittee points directly at the algorithms driving such recommendations. The hearing's purpose was "to hear directly from the leadership of the company why it uses algorithms that present harmful content to children, leading them to dark places, and what [Meta] will do to make its platform safer," Blumenthal, the subcommittee's chairman, said before the session. The Senate body has proposed stricter data privacy controls to protect children, stronger enforcement of age restrictions, and the ability for young users to delete their information online.
In mid-November, several US states announced an investigation to determine whether Meta had deliberately allowed children and adolescents to use Instagram despite knowing that the platform could harm their physical and mental health.