Facebook reaches $52M settlement with ex-content moderators over PTSD


Angela Lang/CNET

Facebook has reached a $52 million settlement with former content moderators who alleged they suffered psychological trauma and symptoms of post-traumatic stress disorder from repeatedly reviewing violent images on the world’s largest social network.

In a 2018 lawsuit, the moderators alleged that Facebook had violated California law by failing to provide them with a safe workplace. Former moderators reviewed content including murders, suicides and beheadings that were livestreamed on Facebook, according to the lawsuit.

“The harm that can be suffered from this work is real and severe,” said Steve Williams, a lawyer for the Joseph Saveri Law Firm in San Francisco who’s representing the plaintiffs. “This settlement will provide meaningful relief, and I am so proud to have been part of it.”

More than 10,000 current and former content moderators who worked for Facebook’s partners in California, Arizona, Texas and Florida will be eligible for a piece of the settlement, which still needs to be approved, according to a press release from the plaintiffs’ lawyers. Each moderator will receive at least $1,000, and some could receive more. Moderators diagnosed with certain conditions because of their work will receive money that could go toward treatment. Depending on the amount remaining in the settlement fund, they may be eligible for awards of up to $50,000.

The preliminary settlement was filed last week in San Mateo Superior Court, according to The Verge, which reported the agreement earlier. In 2019, the news outlet reported that some content moderators made as little as $28,800 per year and one moderator who worked at a site in Florida operated by Cognizant died after having a heart attack at his desk. 

Selena Scola, the first plaintiff in the lawsuit, alleged she suffered from PTSD after working as a contractor for Facebook from June 2017 to March 2018. Scola was an employee of PRO Unlimited, a Florida staffing business that worked with Facebook to police content. Other former content moderators who contracted with Facebook later joined the lawsuit.

As part of the settlement, Facebook will require staffing firms to provide coaching sessions with licensed mental health counselors, along with other mental health support.

The settlement comes as Facebook relies more on artificial intelligence to help detect content such as coronavirus misinformation and hate speech. 

“We are grateful to the people who do this important work to make Facebook a safe environment for everyone,” a Facebook spokesperson said in a statement. “We’re committed to providing them additional support through this settlement and in the future.”
