Thursday, November 14, 2024

Meta’s content moderator subcontracting model faces legal pressure in Spain



Image credits: Jakub Porzycki/NurPhoto/Getty Images

A Barcelona-based company that provides content moderation services for Facebook and Instagram as a subcontractor to Meta has been found liable by a Spanish court for psychological harm suffered by one of its workers. According to local reports, the verdict, reported on Thursday, is the first time a court in Spain has held a content moderation company responsible for the mental health problems suffered by its workers.

According to a report in El Periodico, the ruling, handed down earlier this month, follows a challenge brought against Meta’s local subcontractor, CCC Barcelona Digital Services, by a 26-year-old Brazilian worker who has been undergoing psychiatric treatment for five years as a result of exposure to extreme content on Facebook and Instagram, including murder, suicide, terrorism, and torture.

The employee in question, who began moderating content on Facebook and Instagram in 2018, suffered a range of psychological harms, according to the newspaper’s report, including panic attacks, avoidance behaviors, excessive worry about suffering illnesses, disturbed sleep, difficulty swallowing, and thanatophobia (acute anxiety driven by a fear of death).

According to the newspaper, the Barcelona court found that the worker’s mental health problems were the result of a workplace accident, not a common illness. Meta’s subcontractor had sought to classify his absences as a common illness, denying responsibility for the psychological harm he suffered from viewing violent content uploaded to Facebook and Instagram.

The law firm Espacio Jurídico Feliu Fins, which represented the worker, described the ruling in a social media post as a major victory for workers suffering mental health problems as a result of the work they do.

“Meta and social media companies in general need to recognize the magnitude of this problem and change their strategy,” the law firm wrote in the post [in Spanish; this is a machine translation]. “Instead of pursuing a strategy of denying the problem, they must accept that this horrifying reality suffered by workers is as real as life itself.

“The day they accept this and face up to it, everything will change. Until that happens, we will work through the legal system to make it happen. We will go step by step, without haste but without pause, and above all with a firm determination to win.”

Meta outsources the review of harmful content to various third-party subcontractors, which typically supply large numbers of low-paid workers to act as human filters for the extreme violence and other horrific material uploaded to its social networks. The practice has drawn criticism for years, yet it continues.

Back in May 2020, Meta agreed to pay $52 million to settle a U.S. class action lawsuit brought by content moderators working for third parties that provide content review services for its social networks, who alleged that reviewing violent and graphic images had caused them to develop post-traumatic stress disorder.

The company is also facing litigation in Africa, where a moderator who worked for Sama, a Meta subcontractor in Kenya, is suing both companies over an alleged failure to provide “adequate” mental health and psychosocial support.

Meta declined to comment on the ruling against its Spanish subcontractor. However, the social networking giant offered general background on its approach to outsourcing content moderation, saying the agreements it has with the third parties it works with on content review include the expectation that they will make provisions in areas such as counseling, training, and other worker support.

The tech giant also said subcontractors are required to provide 24/7 on-site support from trained physicians, along with an on-call service and access to private healthcare from the first day of employment.

Meta also said it provides its subcontractors with technical solutions intended to let content reviewers limit their exposure to the graphic material they are asked to moderate as much as possible. Reviewers can customize these tools to display graphic content in black and white or fully blurred, blur only the first frame, play videos without audio, or opt out of autoplay.

However, the background information the company provided did not address the potential for such support services and screening tools to be undermined by strict productivity and performance quotas that subcontractors may impose on reviewers. In practice, it may be difficult for these employees to access adequate support while also maintaining the level of performance their employer requires.

Back in October, the Barcelona-based newspaper La Vanguardia reported that around 20% of the staff at CCC Barcelona Digital Services were off work as a result of psychological trauma from reviewing harmful content. In its article, the newspaper quoted workers describing the support provided by their employer, Meta’s subcontractor, as “very inadequate.”

Another report from the same month, in El Nacional, described the requirement for employees to achieve a high “success rate” (98%), meaning each moderator’s decisions must match those of their colleagues, and said that even senior auditors risk being fired if their rate falls.

Screening tools that obscure some or all of the content being reviewed could clearly make it harder for reviewers to meet such strict performance targets. Workers may therefore see using tools that could reduce the accuracy of their assessments as a risk, since falling behind colleagues could jeopardize their continued employment, effectively discouraging employees from taking steps to limit their exposure to psychologically harmful content.

Disrupted sleep patterns are a known contributor to stress, and the shift work content moderators are routinely required to do may also contribute to the development of mental health issues. Additionally, the routine use of young, low-paid workers in content moderation farms means a high risk of burnout is built into the model, and the industry sees high churn as a result. This suggests an industry structured around churning workers through the management of harmful content. Or, basically, outsourcing burnout as a service.

However, legal rulings that require third-party content review providers to take workers’ mental health into account could put limits on this model.

Telus, the Canadian company that owns CCC Barcelona Digital Services, had not responded to a request for comment at press time.




