Shameful conditions of content moderators must be regulated

13 September 2023


  • Government must enact robust legislation to protect these workers

Labour workers’ rights spokesperson Marie Sherlock welcomed the news that the Health and Safety Authority (HSA) and the State Claims Agency (SCA) have stepped up and issued guidelines on occupational exposure to sensitive content in the workplace.

However, Senator Sherlock said that legislation is vital and the guidelines must only be seen as the starting point in regulating this work.

Senator Sherlock said:

“Content moderators are the backbone of social media companies. They are on the frontline, protecting us from graphic and violent images, videos and content. It is their work that enables a safe experience for all users. However, they do this work under harrowing conditions.

“These workers face many employment barriers and are forced to sign Non-Disclosure Agreements (NDAs), copies of which they are often not provided.

“These NDAs are consistently dangled over workers’ heads, and moderators are reminded that they cannot discuss any aspect of their work with their families or trade unions. This is cruel considering the often harrowing and emotional content they sift through on a daily basis.

“The outsourcing of this employment can create environments where outsourcing partners exploit moderators’ lack of legal knowledge or training, making demands for secrecy that could be unlawful, a practice highlighted to us by the Communications Workers’ Union.

“It is welcome that the HSA will now look to put guidelines for employers in place, but guidelines alone will not be sufficient.

“In particular, we need to see statutory responsibility placed on employers and, crucially, on the platform providers for protecting content moderators.

“The use of NDAs must be reviewed and regulated.

"There is also a wider issue whereby many content moderators are prohibited from joining a trade union, leading to a total information vacuum of workers’ rights and employers’ obligations.

“The ‘arm’s length’ manner in which many large tech companies employ these content moderators must not prove an obstacle to ensuring responsibility on the part of the major tech firms.

"Outsourcing the role is a particularly insidious feature of how many tech firms operate even though content moderation is a fundamental activity to its operation.

“What is troubling beyond belief is that these outsourcing companies have failed to provide appropriate clinical support for the very serious content that moderators have to deal with.

"Many firms dress up supports with a range of wellbeing classes being made available however this is no substitute for appropriate clinical one on one assistance. These people are looking at traumatic posts, posts that Meta does not deem safe to any user of its sites.

“Now, as a matter of urgency, we need to extend health and safety legislation to cover content moderation activity in Ireland.

"Content regulation needs to be recognised as a hazardous activity, it needs to be directly regulated by the Health and Safety Authority and it needs to be brought in-house within the social media giants.

“Ultimately, it is simply unacceptable that workers are having to sustain very serious impacts on their mental and physical health arising from the work they do and that there is no regulation in place.

“Guidelines are useful, but what is really needed is a Government response with legislation and adequate resources for the HSA to enforce it.

“Ireland must lead the charge in protecting all workers in Silicon Docks.”