Facebook Hires Thousands to Combat Violent Video Footage
Facebook has announced that it's expanding its community operations team by 3,000 people to review the millions of reports it receives every week and respond more quickly to remove inappropriate footage.

Facebook Live has sparked controversy over the last few weeks with people recording violent and disturbing crimes on the medium and broadcasting them to the world.

"Over the last few weeks, we’ve seen people hurting themselves and others on Facebook — either live or in video posted later," said Facebook's Chief Executive Officer Mark Zuckerberg in a Facebook post. "It’s heartbreaking, and I’ve been reflecting on how we can do better for our community."

With Facebook's massive reach, it has become increasingly difficult for its team to monitor all of the content that's broadcast live or recorded and posted later on the social media platform.

In recent weeks, Facebook Live has seen a lot of violence internationally, including a man who posted footage of himself shooting an elderly man in Cleveland, and another who broadcast himself hanging his 11-month-old daughter in Thailand.

"If we’re going to build a safe community, we need to respond quickly. We’re working to make these videos easier to report so we can take the right action sooner — whether that’s responding quickly when someone needs help or taking a post down," said Zuckerberg.

The global expansion is Facebook's response to this disturbing use of the medium.

"Keeping people safe is our top priority," commented Facebook's Chief Operations Officer Sheryl Sandberg on Zuckerberg's post. "We won't stop until we get it right."

Commenters had mixed feelings about the announcement to expand the community operations team.

Some said Facebook Live should be shut down entirely, arguing that the platform encourages people to do horrible things in the knowledge they'll receive attention. Others applauded Zuckerberg for taking action, and some suggested additional measures, such as an emergency link for viewers to click when they see extremely disturbing or violent footage.

"These reviewers will also help us get better at removing things we don’t allow on Facebook like hate speech and child exploitation. And we’ll keep working with local community groups and law enforcement who are in the best position to help someone if they need it — either because they’re about to harm themselves, or because they’re in danger from someone else," said Zuckerberg.

Facebook's global community operations team currently employs 4,500 people, and that hasn't been enough to monitor reports and content. Zuckerberg aims to grow that team to 7,500 over the next year, both to review the millions of reports it receives each week and to speed up the process for addressing them.

"In addition to investing in more people, we’re also building better tools to keep our community safe," said Zuckerberg. "We're going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards and easier for them to contact law enforcement if someone needs help."

Zuckerberg mentioned that, last week, Facebook received a report that someone on Live was considering suicide. Facebook was able to reach out to law enforcement and prevent him from hurting himself.

"No one should be in this situation in the first place, but if they are, then we should build a safe community that gets them the help they need," said Zuckerberg.

Given Facebook's continued growth in users and revenue, expanding the team makes sense.

Whether the additional 3,000 people monitoring content will reduce the number of people posting disturbing or violent content, or committing crimes on camera, remains to be seen.