Facebook moderators reportedly complain about Covid risk in Dublin office
The moderators, employed by professional services firm Covalen, claimed in online forums that they were not allowed to work from home and could not physically distance at the office with so many co-workers, according to UK newspaper The Sunday Times.
Staff also questioned why the office had remained open after two of their co-workers tested positive for the coronavirus, The Sunday Times reported. The employees claim they were originally told that the building would be shut for 72 hours if there was a positive case of Covid-19, but in this instance they were only moved to a different floor while deep cleaning was carried out.
In a statement to CNN Business, Facebook said that since the pandemic struck it has enabled “an overwhelming majority” of its reviewers to work from home.
“But considering some of the most sensitive content can’t be reviewed from home, we’ve begun allowing reviewers back into some of our sites as government guidance has permitted,” a spokesperson said, adding that the company prioritizes the health and safety of employees.
Covalen told CNN Business that it has “extensive health and safety procedures in place,” which include strict physical distancing, staggered shifts to allow for deep cleaning, and frequent cleaning of desks and other high-touch surfaces.
If an employee reports Covid-19 symptoms or has been in close contact with an employee who has tested positive, “they are informed, isolated onsite, provided with private transport home and advised to contact their GP,” the company said in a statement. “We are also providing transfers to and from the office so employees do not need to take public transport.”
The company added that it had a policy of shutting the office for 72 hours if a Covid case was detected back in March when it was operating at full capacity. “When we reopened [after lockdown was lifted] with significantly reduced capacity, we updated our procedures to reflect that.”
Facebook’s content moderators are responsible for monitoring the site for misinformation and graphic material such as violence, animal abuse and child pornography.
This is not the first time that individuals whose jobs require them to view such content on Facebook have spoken out about grim working conditions.
Two investigations by The Verge last year found that workers were poorly paid, made to labor at filthy workstations and given little emotional support despite the nature of the work they were doing.
A report last week from nonprofit Avaaz found that purveyors of misinformation have successfully evaded Facebook’s human and automated content review systems, raising questions about the platform’s readiness for a potential wave of misinformation ahead of the US presidential election.