Facebook Plans To Add 3,000 Workers To Monitor, Remove Violent Content

Sean Gallup / Getty Images

Faced with a recent spate of violent videos and hate speech posted by users on its network, Facebook has announced plans for a heap of hires: 3,000 new employees worldwide to review and react to reports of harm and harassment.

"Over the last few weeks, we've seen people hurting themselves and others on Facebook — either live or in video posted later. It's heartbreaking, and I've been reflecting on how we can do better for our community," CEO Mark Zuckerberg announced Wednesday in a Facebook post.

"If we're going to build a safe community," he continued, "we need to respond quickly. We're working to make these videos easier to report so we can take the right action sooner — whether that's responding quickly when someone needs help or taking a post down."

Zuckerberg says the move significantly expands the company's workforce dedicated specifically to these concerns, increasing its current staff of 4,500 by nearly 70 percent. The hires will be made over the next year. (Facebook pays NPR and other leading news organizations to produce live video streams that run on the site.)

It's a high-profile announcement for what has become a high-profile problem for the tech giant. In recent months, an elderly man in Cleveland was murdered on camera and the video was later uploaded to Facebook; a teenage girl was sexually assaulted on a live stream; and four people were charged with hate crimes after the assault of a man who authorities said had "mental health challenges" was also streamed on the site.

And then, there have been issues in the other direction — where the problem wasn't that violence went unflagged, but that an overactive flagging process removed less-than-offensive content. Perhaps the most notable of these incidents came last year, when the Pulitzer Prize-winning "Napalm Girl" photograph was removed from the site for violating Facebook's Community Standards, before the company finally relented and allowed it after an international outcry.

"Half the time it's, 'Oh no, Facebook didn't take something down, and we think that's terrible; they should have taken it down,' " Stanford University law professor Daphne Keller told NPR's Laura Sydell last month. "And the other half of the time is, 'Oh no! Facebook took something down and we wish they hadn't.' "

Zuckerberg intends the new hiring spree to help ease this seesaw swing between too little enforcement and too much. Saying the company also aims to combat problems such as hate speech and child exploitation, Zuckerberg explained that the next steps include closer work with law enforcement and streamlined reporting mechanisms.

"As these become available," Zuckerberg wrote, "they should help make our community safer."

Copyright 2021 NPR. To see more, visit https://www.npr.org.

Colin Dwyer covers breaking news for NPR. He reports on a wide array of subjects — from politics in Latin America and the Middle East, to the latest developments in sports and scientific research.