The Reality and Risks of Domestic Content Moderator Labor (2024)

No Gabin, Department of Sociology, Seoul National University

Lee So-min, Department of Social Welfare, Yonsei University

2023 Independent Research Grant Program

Korea Institute of Labor Safety and Health

Translated by Cheonghee Yu

 

  1. Research Background and Purpose

Content moderators are workers who sanction and delete user-generated content (UGC) on social media, online platforms, and similar spaces. They do this by applying various curation practices and technologies to align UGC with standards such as current laws, company terms of service, and local and national policies and regulations (Gibson, 2022; Gillespie, 2018; Roberts, 2019). Content moderators determine the appropriateness of content posted by users across various areas of commercial online spaces, such as posts, comments, and profiles, by labeling, filtering, or deleting it according to given regulations. The criteria for judging content inappropriate primarily include whether it is sexually explicit, violent, humiliating, or discriminatory. The review of content appropriateness begins in real time, as social media users create UGC and post it to the online space (Ahmad & Krzywdzinski, 2022). Most companies possess proprietary filtering technology, so automatic filtering occurs at the moment UGC is posted; content that is difficult for the AI to judge is then forwarded to employed content moderators. The constant flood of online content necessitates massive real-time management and a large workforce, a demand that has expanded into a global labor market, a worldwide chain of moderation work extending beyond national borders.
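As a rough illustration of the flow described above, the sketch below shows an automatic filter scoring each post as it is submitted and forwarding only the items it cannot judge confidently to a human moderator queue. It is not any specific platform's system; the classifier, thresholds, and labels are illustrative assumptions.

    # Minimal sketch of the triage flow: automatic filtering at posting time,
    # with ambiguous items forwarded to human content moderators.
    from dataclasses import dataclass
    from queue import Queue

    @dataclass
    class Post:
        post_id: str
        text: str

    human_review_queue: Queue = Queue()  # items awaiting a human content moderator

    def auto_filter_score(post: Post) -> float:
        """Stand-in for a proprietary harmfulness classifier; returns a score in [0, 1]."""
        flagged_terms = {"violence", "self-harm"}  # illustrative only
        return 0.95 if any(term in post.text.lower() for term in flagged_terms) else 0.10

    def triage(post: Post, block_at: float = 0.9, allow_at: float = 0.2) -> str:
        score = auto_filter_score(post)
        if score >= block_at:
            return "auto_removed"            # confidently harmful: filtered automatically
        if score <= allow_at:
            return "published"               # confidently benign: posted without human review
        human_review_queue.put(post)         # ambiguous: forwarded to the human moderator queue
        return "queued_for_human_review"

    print(triage(Post("1", "holiday photos from the weekend")))  # prints "published"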

Interviews were conducted with 18 individuals who had performed content moderation work for at least one month: reviewing and filtering videos, broadcasts, posts, photos, and comments on domestic social media platforms and online communities according to the policies or guidelines of the employing company or institution. The study aimed to understand the labor conditions of content moderators and to assess the work's impact on their mental health.

 

  2. Findings: Work Process

 

1) Exposure to severely harmful content

Content moderators' work processes vary somewhat depending on the employer. In one form, moderators intervene at an intermediate stage, before user-generated content is publicly posted on the platform, assessing its harmfulness and classifying it by tagging or labeling it according to guidelines provided by the company; this is pre-publication intervention. Work performed for public institutions, by contrast, involves reviewing content that has already been published, determining whether it is harmful, and requesting content deletion or user sanctions; this is post-publication intervention.

Online community management involves judging the harmfulness of user behavior and intervening through sanctions and mediation. Because the goal is to attract and retain a large user base, the focus is not only on the quality of the content itself but also on preventing active users from causing disturbances or behaving in ways that clash with the community culture, since such behavior can breed discomfort and resentment among other users and drive them away.

 

2) Labor control

Most social media companies deploy staff so that harmful content can be reviewed without interruption, while keeping content moderator headcount to a minimum to maximize profits. They primarily operate on a three-shift system, and workers often cannot choose their preferred shift times; frequent changes of shift also disrupt circadian rhythms, causing mental and physical distress. Even when statutory leave is granted monthly, companies sometimes penalize its use in order to maintain round-the-clock staffing. Remote workers can be monitored by their employers in real time, and more coercive organizations contact workers frequently to confirm that they are at work.

 

3) Persistent labor amid employment instability

Content moderators must cope with the mental distress of the work on their own in order to demonstrate their capabilities and keep their contracts. Workers reported avoiding counseling centers during work hours for fear of being viewed negatively and labeled as problematic individuals who cannot manage their own mental health. Their unstable employment status, with contracts renewed at intervals ranging from one month to two years, combined with concern about receiving an unfavorable evaluation for apparently poor mental health management, keeps them from seeking psychological support.

Nevertheless, the reasons they continued this work included: having no other options in the labor market; perceiving it as an attractive stepping stone for the future; seeing it as a challenge toward a better life; and taking pride in doing socially valuable work to improve the online environment.

 

4) Impact on mental health

Workers frequently encounter extremely graphic and shocking content, including videos of sexual intercourse, photos of genitals, violent or abusive videos, posts related to self-harm, suicide, or murder, scenes of defecation, and drug-related posts. Interview participants reported psychological shock and stress after witnessing such content; in severe cases, they experienced physical symptoms such as trembling hands and palpitations after exposure. They also reported short-term negative psychological effects, such as loss of appetite or sudden flashbacks.

Differences emerged between domestic and overseas workplaces. Participants working in South Korea described the content as highly graphic and shocking, while those at overseas business process outsourcing (BPO) companies reported somewhat milder reactions to the content they handled.

Differences were also observed by gender: a higher proportion of female participants than male participants reported stress and psychological shock from graphic and shocking content.

When they had only recently started the job, participants felt psychological pressure and stress from viewing unpleasant and unwanted content as part of their work. As exposure continued and stress accumulated, however, their psychological defense mechanisms took over: they gradually adapted to the content and eventually became desensitized to it.

 

5) Psychological and physical changes

Some content moderators reported developing biases toward specific groups due to the content and experiences encountered in their role. Respondents also mentioned that their personalities or thoughts changed as a result of continuous exposure to provocative and shocking videos. They noted becoming desensitized to explicit content and experiencing shifts in their thought processes due to frequent exposure.

The most common complaint was eye strain from sitting at a desk and staring at a computer screen for extended periods. Severe cases included conjunctivitis, cervical disc herniation, and lower back pain. Those working shifts, especially night-to-early morning shifts, reported irregular sleep patterns and disrupted circadian rhythms, leading to severe fatigue, headaches, and insomnia. Some respondents complained of headaches caused by continuous exposure to stimulating and negative content, while others expressed concern that persistently viewing explicit material would inevitably have negative physical effects.

 

  3. Findings: Solutions

 

1) Corporate Support

Overseas BPO companies provided various programs for content moderators, such as counseling and wellness programs. Because they must win content moderation contracts from large clients, such as US companies, they align the standards of their programs for moderators' work capabilities and welfare with those of their large corporate clients.

In contrast, most South Korean companies did not provide services or programs for content moderators. Even when offered, such programs were often mere formalities, and workers hesitated to use them due to workplace pressure and organizational culture.

 

2) Support Demands

Workers demanded improved labor conditions, recognition of content moderators’ expertise, salary system reform, additional staffing during peak workloads, and equipment support for remote work. Regarding welfare, they cited the need for emotional and psychological support (counseling, initial response protocols) and guaranteed vacation and rest periods.

Demands were also directed at the government. They requested state support—both legal and financial—for litigation procedures to secure recognition of content moderators’ worker status under freelance contracts. They called for official counseling support, similar to that provided for call center agents. They emphasized the need for government attention to the content moderator profession and the establishment of concrete online content guidelines covering diverse areas such as discrimination and hate speech.

 

  4. Conclusions and Policy Recommendations

 

Survey on the status of domestic content moderators: Platform companies conceal and devalue the existence of content moderators in order to project a 'clean' image, further obscuring their reality (Kwon Ah-hyun, 2023). It is therefore imperative first to understand their actual conditions across various aspects, such as employment status, poor working conditions, and job characteristics, through official statistical surveys or national-level investigations.

Legislative amendments to guarantee content moderators’ labor rights: Most surveyed content moderators worked as contract or freelance workers and commonly experienced employment insecurity. Amendments are needed, as proposed in the “Partial Amendment Bill to the Trade Union and Labor Relations Adjustment Act” (commonly known as the “Yellow Envelope Act”), to broaden the definition of ‘employer’ from “the party to the labor contract” to “anyone who can substantially control or determine working conditions,” thereby enabling the primary client to bear responsibility for protecting workers.

New discussions on worker definition: Content moderators who find work through platforms such as gig-work websites are not covered even by the provisions for workers in special types of employment (Article 67 of the Enforcement Decree of the Industrial Safety and Health Act), leaving them in a legal blind spot. As a first step, therefore, the scope of the Industrial Safety and Health Act and the Serious Accidents Punishment Act should be expanded to cover "persons providing labor services," explicitly requiring all employers, not only employers of workers under labor contracts, to maintain and promote the safety and health of such persons.

Psychological and emotional support: Corporate occupational safety and health officers and human resources personnel must interpret and apply regulations in a manner consistent with the goals of preventing mental health problems and promoting workers' mental health. Amendments to the Industrial Safety and Health Act and other relevant laws must follow, explicitly stating companies' legal responsibility to protect and promote workers' mental health in all work activities.

Establishing sanctions for posting harmful content: Companies operating online media, such as social media, create their own guidelines and assign tasks to content moderators. Depending on a company's interests and operating methods, however, these guidelines may be biased toward users' freedom of expression at the expense of the public interest in regulating harmful content. If the government establishes standard guidelines through discussions with diverse stakeholders, and requires online media companies to formulate regulatory policies based on these guidelines and to report their content review processes and outcomes, this would help improve the online environment and prevent the posting of harmful content.

