How Facebook Relies on Accenture to Scrub Toxic Content - The New York Times

In 2019, Julie Sweet, the newly appointed chief executive of the global consulting firm Accenture, held a meeting with top managers. She had a question: Should Accenture get out of some of the work it was doing for a leading client, Facebook?

For years, tensions had mounted within Accenture over a certain task that it performed for the social network. In eight-hour shifts, thousands of its full-time employees and contractors were sorting through Facebook’s most noxious posts, including images, videos and messages about suicides, beheadings and sexual acts, trying to prevent them from spreading online.

Some of those Accenture workers, who reviewed hundreds of Facebook posts in a shift, said they had started experiencing depression, anxiety and paranoia. In the United States, one worker had joined a class-action lawsuit to protest the working conditions. News coverage linked Accenture to the grisly work. So Ms. Sweet had ordered a review to discuss the growing ethical, legal and reputational risks.

At the meeting in Accenture’s Washington office, she and Ellyn Shook, the head of human resources, voiced concerns about the psychological toll of the work for Facebook and the damage to the firm’s reputation, attendees said. Some executives who oversaw the Facebook account argued that the problems were manageable. They said the social network was too lucrative a client to lose.

Facebook and Accenture have rarely talked about their arrangement or even acknowledged that they work...



Read Full Story: https://www.nytimes.com/2021/08/31/technology/facebook-accenture-content-moderation.html

