
swag

(26,490 posts)
Wed Nov 27, 2019, 06:17 PM

Researchers pull back the curtain on online conversations

https://lettersandscience.ucdavis.edu/behind-bots

Excerpt:

From yellow journalism in the 1890s to the Nazi propaganda machine, the spread of misinformation is nothing new in human society. But bots bring new challenges.

For example, bots can give the false impression that a photo or story is highly popular and endorsed by many, whether or not the information is real. This exploits a human bias: people judge information more favorably if their social circle supports it. “People think messages with more views or likes are more credible,” Zhang said.

Human biases can also be leveraged to help us behave in more health-conscious ways. Seeing one’s co-workers, friends, and family take an exercise class or lose weight encourages others to do the same. In research with collaborators at the University of Pennsylvania, Zhang showed peer influence also works within online social networks, spurring young adults to exercise more.

Zhang is currently looking at ways to co-opt our biases to combat anti-vaccine messages. In one recent study, she tested a two-step strategy that laid out anti-vaccine messages and then refuted them point by point. “The results were concerning but also validating,” Zhang said.

It turns out that exposure to misinformation, even when accompanied by detailed fact-checking, made participants view the benefits of vaccines less favorably. The study also revealed the culprit. The fake stories about harm from vaccines made people feel angry, rather than fearful or unsafe. The findings correspond to seminal research in communication, Zhang said: “The reason misinformation spreads so fast is it contains an emotional component.” A possible solution is providing pro-vaccine messages that evoke an emotional response, an approach Zhang plans to test.

. . . more