On Sunday, the murder of Robert Godwin Sr. was streamed live on Facebook by a man named Steve Stephens, who has been dubbed the ‘Facebook killer’, leading many to question how quickly Facebook dealt with the stream.
Initial reports claimed that Facebook took around three hours to remove the videos, which showed Stephens declaring his intent to kill, then pulling a gun on Godwin, and finally confessing to the murder. Facebook has released a statement, along with a timeline, that puts the removal at just over two hours – though Facebook admits this was still far too long.
In the statement, Justin Osofsky, VP of global operations at Facebook, wrote: “As a result of this terrible series of events, we are reviewing our reporting flows to be sure people can report videos and other material that violates our standards as easily and quickly as possible. In this case, we did not receive a report about the first video, and we only received a report about the second video — containing the shooting — more than an hour and 45 minutes after it was posted. We received reports about the third video, containing the man’s live confession, only after it had ended.
“We disabled the suspect’s account within 23 minutes of receiving the first report about the murder video, and two hours after receiving a report of any kind. But we know we need to do better.”
Facebook also says it will not only be reviewing its reporting flows but is ‘constantly exploring ways that new technologies can help’ keep its platform safe. One of these technologies is AI, which Facebook says it is using to prevent the resharing of certain videos – a tool it recently implemented to stop ‘revenge porn’ from being reshared across its social platforms.
Despite Facebook’s intentions with AI, the technology is not yet capable of detecting whether a live broadcast is offensive or violent. Facebook Live has previously hosted similarly disturbing livestreams – including suicides, sexual assault and torture – so it is unclear how Facebook can really combat this via technology.
Incidents like this also bring brand safety into question, as so much does at the moment. Does a brand really want to advertise on a feature, or a platform, that is home to the streaming of such violent acts?
There is a lot for Facebook to ponder, especially as one in five videos on the social network is now a live video, according to Facebook’s VP of product, Fidji Simo.
Timeline of Events
11:09AM PDT — First video, of intent to murder, uploaded. Not reported to Facebook.
11:11AM PDT — Second video, of shooting, uploaded.
11:22AM PDT — Suspect confesses to murder while using Live, is live for 5 minutes.
11:27AM PDT — Live ends, and Live video is first reported shortly after.
12:59PM PDT — Video of shooting is first reported.
1:22PM PDT — Suspect’s account disabled; all videos no longer visible to public.