In 2018, Mark Zuckerberg will focus on improving the worst job in technology

Facebook founder Mark Zuckerberg has revealed his major focus for 2018 is to tackle abuse and extreme content with renewed vigour.

“My personal challenge for 2018 is to focus on fixing these important issues,” he wrote in a Facebook post.

“We won’t prevent all mistakes or abuse, but we currently make too many errors enforcing our policies and preventing misuse of our tools. If we’re successful this year then we’ll end 2018 on a much better trajectory.”

His promise is to rid the platform, as far as possible, of the nightmarish content published on it daily. In a world that feels more anxious and divided than ever, Zuckerberg vows to do his part to make the net a safer, more amicable place.

“This will be a serious year of self-improvement and I’m looking forward to learning from working to fix our issues together.”

Tech giants are hiring legions of content moderators to keep racism, hate speech and depravity off the internet, but the job takes such a psychological toll that workers often quit on the first day.

Although Google, Facebook and YouTube have developed automated tools to keep the worst of the human condition off the web, humans remain the best at policing content.

Searching for the worst of the worst means hunting daily for pornography, racism and violence. Not surprisingly, the role has a high turnover rate, with moderators usually lasting anywhere from a few months to a year.

Even at $30 an hour, it’s no wonder content moderation is considered the worst job in technology.

The stuff these people have to see is so gruesome, it’s not uncommon for workers to leave for lunch on their first day and never return.

A recent piece in The Wall Street Journal by Lauren Weber and Deepa Seetharaman takes a peek at the dirty side of the net, interviewing people who have done the job.

“I was watching the content of deranged psychos in the woods somewhere who don’t have a conscience for the texture or feel of human connection,” said Shaka Tafari, a 30-year-old contractor at Whisper.

On top of the emotional stress content moderators are put through, managers usually monitor performance remotely, prodding workers with messages when they dwell too long on a post.

To Mashable editor Lance Ulanoff, the gig is akin to working on a 24-hour crisis hotline.

“It’s very intense work,” Ulanoff says.

“These are people who are looking specifically for language or images that might indicate self-harm, violence or anything that would indicate someone might harm others.

“These monitors are seeing potentially intense information on a constant basis. At the same time, that’s what they signed up to do.”

Companies try to mitigate the impact on workers by offering counselling sessions and other support, but often that isn’t enough.

Former Microsoft employees Henry Soto and Greg Blauert are seeking damages from the company. Both claim Microsoft failed to provide them with adequate psychological support.

According to the lawsuit, Soto says he was exposed to “many thousands of photographs and video of the most horrible, inhumane and disgusting content you can imagine”. He claims he started having auditory hallucinations after seeing some particularly disturbing footage.