The story of Steve Stephens live streaming a murder on Facebook has dominated the headlines this week. Although the incident occurred on Sunday, April 16, the public did not hear from Facebook CEO Mark Zuckerberg until Tuesday, a mere hour after Stephens committed suicide in Erie, Pennsylvania.
During a speech at Facebook’s annual developers conference, Zuckerberg addressed the murder of 74-year-old Robert Godwin Sr., who was killed by 37-year-old Steve Stephens. Stephens posted video of the shooting to the social networking site.
The main issue people have with Facebook’s handling of the incident is that the harrowing footage stayed up for two hours and 14 minutes. The company defended itself in a statement, saying it did not receive a report about the video until approximately two hours after it was posted.
“We have a lot more to do here. And we’re reminded of that this week by the tragedy in Cleveland. And our hearts go out to the family and friends of Robert Godwin Sr. And we have a lot of work and we will keep doing all we can to prevent tragedies like this from happening,” Zuckerberg said.
The company is currently testing multiple ways to filter out violent content, including AI technology, though Zuckerberg has said that Facebook is still “a few years away” from fully functional artificial intelligence.
Facebook currently deals with violent content through user reports: users flag the unwanted content to human Facebook moderators, who then decide whether to remove the content and disable the poster’s account.
Facebook’s VP of Global Operations released an official statement on the matter:
“On Sunday morning, a man in Cleveland posted a video of himself announcing his intent to commit murder, then two minutes later posted another video of himself shooting and killing an elderly man. A few minutes after that, he went live, confessing to the murder. It was a horrific crime — one that has no place on Facebook, and goes against our policies and everything we stand for.
As a result of this terrible series of events, we are reviewing our reporting flows to be sure people can report videos and other material that violates our standards as easily and quickly as possible. In this case, we did not receive a report about the first video, and we only received a report about the second video — containing the shooting — more than an hour and 45 minutes after it was posted. We received reports about the third video, containing the man’s live confession, only after it had ended.
We disabled the suspect’s account within 23 minutes of receiving the first report about the murder video, and two hours after receiving a report of any kind. But we know we need to do better.
In addition to improving our reporting flows, we are constantly exploring ways that new technologies can help us make sure Facebook is a safe environment. Artificial intelligence, for example, plays an important part in this work, helping us prevent the videos from being reshared in their entirety. (People are still able to share portions of the videos in order to condemn them or for public awareness, as many news outlets are doing in reporting the story online and on television). We are also working on improving our review processes. Currently, thousands of people around the world review the millions of items that are reported to us every week in more than 40 languages. We prioritize reports with serious safety implications for our community, and are working on making that review process go even faster.
Keeping our global community safe is an important part of our mission. We are grateful to everyone who reported these videos and other offensive content to us, and to those who are helping us keep Facebook safe every day.”