
TomCADem

(17,390 posts)
Sun Mar 14, 2021, 10:07 PM

How Facebook got addicted to spreading misinformation

This article provides a serious discussion of why Facebook continues to be a major conduit for misinformation. It also explains why Republicans are trying to redirect the conversation from combatting misinformation to combatting "bias." Of course, a focus on bias tends to treat facts and "alternative facts" as having equal weight, promoting a false equivalency under which misinformation can thrive.

https://www.technologyreview.com/2021/03/11/1020600/facebook-responsible-ai-misinformation/

By the time thousands of rioters stormed the US Capitol in January, organized in part on Facebook and fueled by the lies about a stolen election that had fanned out across the platform, it was clear from my conversations that the Responsible AI team had failed to make headway against misinformation and hate speech because it had never made those problems its main focus. More important, I realized, if it tried to, it would be set up for failure.

The reason is simple. Everything the company does and chooses not to do flows from a single motivation: Zuckerberg’s relentless desire for growth. Quiñonero’s AI expertise supercharged that growth. His team got pigeonholed into targeting AI bias, as I learned in my reporting, because preventing such bias helps the company avoid proposed regulation that might, if passed, hamper that growth. Facebook leadership has also repeatedly weakened or halted many initiatives meant to clean up misinformation on the platform because doing so would undermine that growth.

In other words, the Responsible AI team’s work—whatever its merits on the specific problem of tackling AI bias—is essentially irrelevant to fixing the bigger problems of misinformation, extremism, and political polarization. And it’s all of us who pay the price.

“When you’re in the business of maximizing engagement, you’re not interested in truth. You’re not interested in harm, divisiveness, conspiracy. In fact, those are your friends,” says Hany Farid, a professor at the University of California, Berkeley who collaborates with Facebook to understand image- and video-based misinformation on the platform.
4 replies
How Facebook got addicted to spreading misinformation (Original Post) TomCADem Mar 2021 OP
This is an important article teach1st Mar 2021 #1
Worth reading the whole thing. Mosby Mar 2021 #2
Fake news is profitable to Facebook Stevano Mar 2021 #3
Works on lefties too Johnny2X2X Mar 2021 #4

teach1st

(5,935 posts)
1. This is an important article
Sun Mar 14, 2021, 10:52 PM

Thanks for posting this, TomCADem!

If a model reduces engagement too much, it’s discarded. Otherwise, it’s deployed and continually monitored. On Twitter, Gade explained that his engineers would get notifications every few days when metrics such as likes or comments were down. Then they’d decipher what had caused the problem and whether any models needed retraining.
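In code terms, the deploy-monitor-retrain cycle Gade describes might look something like this minimal Python sketch. All metric names, thresholds, and numbers here are invented for illustration; this is not Facebook's or Twitter's actual tooling:

```python
# Hypothetical sketch of the monitor-and-retrain loop described above.
# Metric names, thresholds, and numbers are invented for illustration;
# this is not Facebook's or Twitter's real tooling.

ENGAGEMENT_METRICS = ["likes", "comments", "reshares"]
DROP_THRESHOLD = 0.05  # alert if a metric falls more than 5% vs. baseline


def check_engagement(current, baseline):
    """Return the metrics that dropped past the alert threshold."""
    alerts = []
    for metric in ENGAGEMENT_METRICS:
        drop = (baseline[metric] - current[metric]) / baseline[metric]
        if drop > DROP_THRESHOLD:
            alerts.append(metric)
    return alerts


def triage(alerts):
    # Per the article: engineers work out what caused the drop and
    # whether a model needs retraining; a model that reduces engagement
    # too much is simply discarded.
    if alerts:
        return f"investigate {alerts}; retrain or roll back the model"
    return "keep the deployed model; engagement is holding"


baseline = {"likes": 1_000_000, "comments": 200_000, "reshares": 50_000}
current = {"likes": 930_000, "comments": 199_000, "reshares": 50_500}
print(triage(check_engagement(current, baseline)))  # likes fell 7% -> alert
```

Note what the loop optimizes: the only signal that triggers human attention is engagement going down. Nothing in it asks whether the content driving engagement up is true or harmful.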

But this approach soon caused issues. The models that maximize engagement also favor controversy, misinformation, and extremism: put simply, people just like outrageous stuff. Sometimes this inflames existing political tensions. The most devastating example to date is the case of Myanmar, where viral fake news and hate speech about the Rohingya Muslim minority escalated the country’s religious conflict into a full-blown genocide. Facebook admitted in 2018, after years of downplaying its role, that it had not done enough “to help prevent our platform from being used to foment division and incite offline violence.”
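To see the mechanism, consider a toy feed ranker. This is a deliberately simplified sketch with invented posts and scores, not Facebook's actual ranking system: if outrageous content reliably earns higher predicted engagement, an objective with no term for truth or harm puts it on top every time.

```python
# Toy illustration: a ranker that optimizes only predicted engagement
# puts the most inflammatory post on top. Posts and scores are invented.

posts = [
    {"text": "Local library extends weekend hours", "predicted_engagement": 0.02},
    {"text": "Ten tips for spring gardening", "predicted_engagement": 0.03},
    {"text": "THEY are hiding the truth about X!!", "predicted_engagement": 0.11},
]

# The objective has no term for accuracy or harm, so outrage wins.
feed = sorted(posts, key=lambda p: p["predicted_engagement"], reverse=True)
for rank, post in enumerate(feed, start=1):
    print(rank, post["text"])
```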

While Facebook may have been oblivious to these consequences in the beginning, it was studying them by 2016. In an internal presentation from that year, reviewed by the Wall Street Journal, a company researcher, Monica Lee, found that Facebook was not only hosting a large number of extremist groups but also promoting them to its users: “64% of all extremist group joins are due to our recommendation tools,” the presentation said, predominantly thanks to the models behind the “Groups You Should Join” and “Discover” features.
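That statistic suggests a feedback loop: a recommender tuned to maximize joins learns to push whatever converts best per impression, and inflammatory groups often convert best. Here is a toy sketch of that dynamic, with made-up group names and rates; it is not the actual model behind "Groups You Should Join":

```python
# Toy feedback loop: a greedy recommender keeps pushing whichever group
# converts best per impression, so recommendation-driven joins pile up
# in the highest-arousal group. Names and rates are invented.

join_rate = {  # expected joins per recommendation impression
    "Neighborhood Gardening": 0.05,
    "Hiking Meetups": 0.06,
    "Extremist Group A": 0.13,  # outrage converts well
}

joins_from_recs = {name: 0.0 for name in join_rate}

for _ in range(10_000):  # 10k recommendation impressions
    top = max(join_rate, key=join_rate.get)  # greedy: pick the best converter
    joins_from_recs[top] += join_rate[top]   # expected joins accrue there

total = sum(joins_from_recs.values())
for name, joins in joins_from_recs.items():
    print(f"{name}: {joins / total:.0%} of recommendation-driven joins")
```

In this toy version, 100% of recommendation-driven joins go to the extremist group, which is the concentration effect the leaked presentation describes.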
 

Stevano

(24 posts)
3. Fake news is profitable to Facebook
Mon Mar 15, 2021, 10:58 AM

Because right-wing nuts use Facebook more, and Facebook gives them the fake news they crave.

Johnny2X2X

(19,140 posts)
4. Works on lefties too
Mon Mar 15, 2021, 11:04 AM

Sure, the right falls for more fake news, but Dems are not immune either. Another important aspect is that even smart people fall for it. This isn't just idiots being duped; it works on the most educated and intelligent too.
