- Behind every safe social media site, there are people working to view and delete traumatic videos and images for everyone's sake
- In her post, she expressed her sadness upon learning that Filipinos are being outsourced for this kind of job
- Content moderators receive limited training and inadequate psychological support, which can result in post-traumatic stress disorder or, worse, suicide
How much do you know about social media?
Most people love to use social media, especially Facebook, to post pictures and videos that can be seen by anyone. But some people post offensive and vulgar pictures and videos, which is against the guidelines of the social media sites.
Unknown to most users, behind these safe social media sites there are people working to view and delete traumatic videos and images for everyone's sake.
Jasmine Curtis-Smith, who plays the mother of Sahaya in the new GMA Network teleserye “SAHAYA,” retweeted a post by BBC Three featuring content moderators of social media sites like Facebook. In her post, she expressed her sadness upon learning that Filipinos are being outsourced for this kind of job.
Despite the extremely traumatic and highly sensitive nature of their work, these workers are poorly paid and lack psychological support.
This is so low and sad of @facebook 😥 our fellow Filipinos are outsourced, poorly paid, and given little psychological support for this extremely traumatic and sensitive type of work. My heart breaks.. https://t.co/ObEHlVZLFF
— Jasmine Curtis-Smith (@jascurtissmith) March 22, 2019
In a documentary video, BBC Three revealed the dirtiest secret of being a content moderator. Content moderators are responsible for deciding which content stays in the social media feed and which is removed. Social media sites like Facebook have been outsourcing this work to thousands of Filipinos, who delete content that violates the platforms' guidelines. Each worker has a target of 25,000 images to review per 8-to-10-hour shift, covering content that can include terrorist propaganda, child sexual abuse, pornography, and graphic violence; some of the workers are only 18 years old.
“You are not allowed to commit one mistake, you could ruin more than one life, it could trigger a war, it could trigger bullying, it could trigger killing.”
One worker said that when they started the training, they had no idea what a content moderator was. The employer simply explained what they would see and what tools to use. In reality, it was deeply traumatic to witness the kinds of pictures and videos being uploaded. And since they had already signed a contract, quitting the job was not an option.
“The most shocking that I saw was a girl, around six years of age, sucking a dick.”
Content moderators receive limited training and inadequate psychological support, which can result in post-traumatic stress disorder or, worse, suicide.
https://www.instagram.com/p/BvOzuCwAWvc/?utm_source=ig_web_copy_link
They consider themselves like policemen, removing violent content to keep the platform safe for users.
“The world that we are living in today right now, I believe, is not really healthy. We are just like policemen. I’ve seen hundreds of beheadings — not even pictures, but videos. A two-minute act of beheading, it’s really challenging.”
Skipping an image can be marked as a mistake, putting pressure on every decision they make.
One of their tasks is to decide when to stop live video streams, and some of those horrifying videos remain in their memory.
“It seems like it was a joke, we didn’t know it was real, he was attempting to kill himself, as long as he hasn’t committed suicide we’re not allowed to stop his live stream, because if we stop it, the one who gets in trouble kills himself.”
In this type of job, workers must not let the videos affect them; but some cannot avoid being affected by what they see.
These people are not well compensated, but they have no other choice: it is their only means to survive.