(Trends Wide) — You don’t have to be a whistleblower to know that Facebook’s vast platform is used to spread hate and misinformation.
But now we have a whistleblower: former Facebook Product Manager Frances Haugen. She alleges that the scale of the problem is far worse than the company admits or the public understands, that Facebook is hiding evidence that the platform is being used in this way, and that its algorithm accelerated American divisions by feeding users hateful and false content.
The list goes on.
Her interviews with The Wall Street Journal, followed by her appearance on “60 Minutes” last Sunday night, and now her testimony before a Congressional committee this Tuesday, will focus even more attention on the idea that the most democratizing invention of recent generations, a free website that allows strangers and friends to meet, organize, and speak their minds, now poses a threat to democracy by not only allowing but actively promoting false narratives, conspiracy theories, and hatred.
Haugen’s trove of documents, which includes internal Facebook research she copied before leaving her job as Product Manager this year, suggests that Facebook:
- Knows it has captured only a small fraction of the misinformation that spreads on the site
- Abandoned some efforts to reduce misinformation after the presidential election
- Knows about the effect Instagram has on some young minds
Previously, The Wall Street Journal reported on how drug dealers and human traffickers use the platform, and how Instagram creates a feedback loop that harms the mental health of young Americans, particularly adolescent girls. You may remember an earlier whistleblower who alleged that Facebook was an accessory to the sale of opioids on its platform.
Haugen’s key takeaway: When disinformation is the primary weapon in an ongoing war of ideas, the giant company has chosen the profits it makes from capturing attention and engaging users over its moral responsibility to reduce the toxic content that spreads on its platform.
Haugen gave the example that if a new user signed up and followed Donald Trump, it wouldn’t be long before the platform’s algorithm steered them toward QAnon conspiracy theories.
Read the full report here.
She told The Wall Street Journal that she doesn’t want people to dislike or stop using the platform, but that Facebook has to fix it.
“If people just hate Facebook more for what I did, then I failed,” she told the newspaper. “I believe in truth and reconciliation, we have to admit reality. The first step is documentation.”
Everyone, including Facebook, seems to agree that there should be some regulation of social media companies that holds them more accountable for what’s on their platforms. But Facebook is keen to shape that regulation.
What does Facebook say? The company called Haugen’s allegations misleading and said the platform does more good than harm. In an interview leading up to Haugen’s “60 Minutes” interview, Facebook’s vice president and former British liberal politician Nick Clegg spoke at length with Trends Wide’s Brian Stelter.
Here are some quotes from this excellent interview, in which Clegg agrees that there should be some regulation.
Is Instagram toxic to teenage girls? Not for all teenage girls, Clegg argued.
STELTER: For teenage girls, is the world better with Instagram or worse?
CLEGG: Well, the vast majority of adolescent girls, and indeed adolescents, who were covered by some of the surveys you referred to say that, for the vast majority of them, it either makes them feel better or makes no difference either way.
If you are skeptical that adolescent girls are exposed to inappropriate content that could promote eating disorders, read this Trends Wide Business report published Monday, in which Instagram acknowledged promoting pages that glorify eating disorders to teen accounts. It is kind of dark.
Why isn’t Facebook publishing the kind of research Haugen leaked? Clegg argued that Facebook has more than 1,000 PhD holders on its staff and does a lot of research, and that not all of it is meant to be public. This research, he argued, was intended to help Facebook improve its platforms.
CLEGG: So we do a lot of research. We share it with outside researchers as much as we can. But remember, there is a … and I am not a researcher, but researchers will tell you that there is a big difference between doing a peer-reviewed exercise in cooperation with other scholars and preparing articles internally to provoke an informed internal discussion.
Is Facebook like tobacco companies? Making the often-repeated comparison between Facebook trying to hook users and tobacco companies trying to hook smokers, Stelter said he enjoys Instagram but feels the pull of an addiction to it.
CLEGG: Let me give you a very simple reason why this is such a misleading analogy. The people who pay for our lunch are advertisers. Advertisers don’t want their content to be next to hateful, extreme, or obnoxious content.
Is Facebook too big a monster to control? Possibly.
CLEGG: Even with the most sophisticated technology, which I think we implement, even with the tens of thousands of people we employ to try to maintain security and integrity on our platform, you’re right, Brian. We are never going to be absolutely on top of this 100% of the time, because this is a form of instantaneous and spontaneous communication, where billions of human beings can express themselves however they want, whenever they want, with each other.
Is Facebook responsible for the divisions that led to the January 6 insurrection? No, Clegg argued.
CLEGG: I think it gives people false comfort to assume that there must be a technological or technical explanation for the problems of political polarization in the United States.
STELTER: Do you think it’s too easy, too easy to say it’s Facebook’s fault?
CLEGG: Well, well, I think it would be too easy, surely, to suggest that with a tweak to an algorithm, somehow all the disfiguring polarization in American politics would suddenly evaporate. I think it absolves people from asking the toughest questions about the historical, cultural, social, and economic reasons that have led to the politics that we have in America today.
A strange coincidence. To make an already bad day worse, Facebook suffered an outage that robbed many, many people of their midday Instagram fix. From Trends Wide Business: “I don’t know if I’ve seen an outage like this before at a major Internet company,” said Doug Madory, director of Internet Analysis at network monitoring company Kentik. For many people, Madory told Trends Wide, “Facebook is the Internet for them.”