
TikTok, Time's Up on Facebook's Content Moderation Sites

Updated on June 29

Key Smash Notes In This Episode


Casey Newton reported that the moderators make about $28,000 a year, and that many leave the job with post-traumatic stress caused by its rough working conditions.


Cognizant is a contractor hired by Facebook to moderate content. Facebook writes the policies, but the moderators decide whether reported content fits them. According to one moderator, they were just "bodies in seats" to the company.


Although Casey does not know how many posts are reported per day, he does know that companies want moderators to take action on at least 250 posts per day. The highest performers review around 400.


In one example, a person came forward to Casey and revealed that he had been diagnosed with PTSD from the videos he moderated for 8 hours each day, which included hate speech and graphic violence. The person also volunteered at an animal shelter, yet at work he had to watch videos of human beings and animals dying in the worst ways.


For most videos, moderators have to watch at least 15 seconds, sometimes 30. Lengths vary, but a video must be watched long enough to identify which section is problematic. Even if the same video has been reported before, moderators must rewatch its contents again, and again, and again. This leads many moderators to burn out, and some even develop PTSD as a result.


Facebook believes that American moderators, with their understanding of American culture and slang, have the context needed to moderate American users' posts.


Moderators have stated that they work in poor conditions. They are not assigned a workspace, so they usually have to find an available area each day, and the work areas are not clean or well-kept.


Facebook does turn videos over to law enforcement when their contents break the law. But the authorities cannot keep up: even after something is sent to the police, the video or post can simply be reposted.


Facebook wants its platform to be a place where people can talk about almost anything. It views content restrictions as a political problem, an infringement on someone's constitutional right to free speech. And if it does enforce certain restrictions, a competing platform may rise and allow the banned content on its site, drawing more users away from Facebook.


Casey does not know, but he hopes nothing bad will happen to them, and he strongly hopes that more attention is brought to this situation. He states that people have an unequivocal right to discuss their workplace conditions; suing them for doing so would violate their First Amendment rights.


Casey thinks that Donald Trump would use another platform to post, and that a bot would then be set up to screenshot each post and upload it to Twitter. It may even be somewhat positive to have Trump on Twitter, so we know whatever topics are on his mind, even when he is talking off the top of his head with no information.
