How “engagement” makes you vulnerable to manipulation and misinformation on social media

“The heart of the matter is the distinction between provoking a response and providing content people want.”

Facebook has been quietly experimenting with reducing the amount of political content it puts in users’ news feeds. The move is a tacit acknowledgment that the way the company’s algorithms work can be a problem.

The heart of the matter is the distinction between provoking a response and providing content people want. Social media algorithms — the rules their computers follow in deciding the content that you see — rely heavily on people’s behavior to make these decisions. In particular, they watch for content that people respond to or “engage” with by liking, commenting and sharing.
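To make that concrete, here is a minimal sketch in Python of what an engagement-driven ranking rule can look like. The Post fields, the weights, and the rank_feed function are illustrative assumptions, not any platform’s actual code; the point is only that whatever provokes the most reactions rises to the top, regardless of what those reactions mean.

```python
# A minimal sketch of engagement-based ranking. Field names and weights
# are illustrative assumptions, not any platform's real formula.
from dataclasses import dataclass


@dataclass
class Post:
    text: str
    likes: int
    comments: int
    shares: int


def engagement_score(post: Post) -> float:
    # Weight comments and shares more heavily than likes, on the
    # (illustrative) assumption that they signal stronger reactions.
    return 1.0 * post.likes + 2.0 * post.comments + 3.0 * post.shares


def rank_feed(posts: list[Post]) -> list[Post]:
    # Posts that provoke the most responses come first, whether the
    # response is approval, curiosity, or outrage.
    return sorted(posts, key=engagement_score, reverse=True)


feed = [
    Post("Calm explainer on local zoning rules", likes=40, comments=3, shares=2),
    Post("Outrage-bait political rumor", likes=25, comments=60, shares=45),
]
for post in rank_feed(feed):
    print(f"{engagement_score(post):6.1f}  {post.text}")
```

In this toy feed, the rumor outranks the explainer simply because it generates more reactions, which is the distinction between provoking a response and providing content people want.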

As a computer scientist who studies the ways large numbers of people interact using technology, I understand the logic of using the wisdom of the crowds in these algorithms. I also see substantial pitfalls in how the social media companies do so in practice.

From lions on the savanna to likes on Facebook

The concept of the wisdom of crowds assumes that using signals from others’ actions, opinions and preferences as a guide will lead to sound decisions. For example, collective predictions are normally more accurate than individual ones. Collective intelligence is used to predict financial markets, sports, elections and even disease outbreaks.
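As a quick illustration of that claim, the short simulation below (made-up numbers, not data from any real market or election) averages many noisy but independent guesses; the crowd’s average lands far closer to the true value than a typical individual guess does.

```python
# A toy illustration of the wisdom of crowds: averaging many noisy,
# independent guesses beats a typical single guess. Simulated numbers only.
import random

random.seed(42)
true_value = 100.0                        # the quantity being estimated
guesses = [random.gauss(true_value, 20)   # each person is noisy but unbiased
           for _ in range(1_000)]

crowd_estimate = sum(guesses) / len(guesses)
avg_individual_error = sum(abs(g - true_value) for g in guesses) / len(guesses)

print(f"crowd error:              {abs(crowd_estimate - true_value):.2f}")
print(f"average individual error: {avg_individual_error:.2f}")
```

The benefit comes from independent errors canceling out when averaged, which is why collective estimates tend to beat individual ones.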

Throughout millions of years of evolution, these principles have been coded into the human brain in the form of cognitive biases that come with...



Read Full Story: https://www.niemanlab.org/2021/09/how-engagement-makes-you-vulnerable-to-manipulation-and-misinformation-on-social-media/
