
What’s In It For TikTok Content Moderators? PTSD.

TikTok is a constantly growing social media platform that gives users the freedom to create content and gather an audience with shared interests. However, that freedom is often abused, and offending videos are taken down for violating community guidelines. The general public might assume videos are removed automatically through keywords or visuals, and while there is a new automated review system that scans and removes videos that violate company policies "as soon as they're uploaded," moderation is also done manually by content moderators, since automated systems cannot assess context and underlying meaning as well as real people can. It is no surprise how traumatizing it can be to sit through and view disturbing content.

Candie Frazier, a TikTok content moderator, is suing the social media platform for psychological trauma she claims she suffered as a result of her employment, which required her to review videos containing graphic violence, disturbing imagery, and other unsettling material. According to the lawsuit, TikTok and ByteDance fail to provide the necessary safeguards and psychological support for content moderators.

“Plaintiff Frazier views videos of the genocide in Myanmar, mass shootings, children being raped, and animals being mutilated,” the complaint states. “As a result of constant and unmitigated exposure to highly toxic and extremely disturbing images at the workplace, Ms. Frazier has developed and suffers from significant psychological trauma including anxiety, depression, and posttraumatic stress disorder.”

CNN Business article, "TikTok sued by content moderator who claims she developed PTSD from reviewing disturbing content"

In 2018, a content moderator filed a similar complaint against Facebook (FB), alleging she suffered PTSD after being exposed to content depicting rape, suicide, and violence on the job. Despite the burden of such difficult work, one of the complaints directed at Facebook over its content moderation practices was that moderation contractors did not receive the same benefits as corporate employees. The social media company eventually agreed to a $52 million class action settlement, which included payouts and funding for content moderators' mental health treatment, as well as workplace improvements.

According to the lawsuit, content moderators are obliged to sign non-disclosure agreements, which “exacerbate the harm” caused by their job. The practice of compelling employees to sign non-disclosure agreements has lately come under scrutiny in the technology industry, as a result of employee disputes at Pinterest, Apple, and other major corporations.

Frazier is asking TikTok to pay damages to her and other content moderators, as well as to establish a "medical monitoring fund" to cover psychological screening, diagnosis, and treatment for such employees.
