Earlier today, the New Zealand Herald published a letter from Facebook COO Sheryl Sandberg outlining how the company is responding to the deadly terror attack in Christchurch two weeks ago. In the letter, she lays out three steps that the company is taking, including that it’s “exploring restrictions” for live video.
She described the attack as “an act of pure evil,” and said that the company is “committed to reviewing what happened” and is working with the country’s authorities. In the aftermath of the attack, Facebook says that it removed 1.5 million videos of the attack around the world, with 1.2 million blocked “at upload.” In her letter, Sandberg says that while Facebook moved quickly to remove the video and the perpetrator’s account, the company could do more, and she laid out three steps that it will take. We’ve reached out to Facebook for clarification on the letter, and will update this post if we hear back.
The first step is that Facebook is “exploring restrictions on who can go Live depending on factors such as prior Community Standard violations,” and that the company is putting more resources into systems that can identify violent videos, even if they’ve been edited. Sandberg noted that the company has found more than 900 variations of the video of the attack.
The second step is to take “even stronger steps to remove hate on our platforms.” Earlier this week, Facebook announced that it was banning white nationalist and separatist content from the site, and would redirect people searching for such content to resources that help people leave hate organizations. Sandberg says that the company has since removed a number of such groups from its platform, and will remove praise of those groups as well.
The final step that Sandberg outlines is that the company is providing support to “four local well-being and mental health organizations” within New Zealand, and she reiterated that the company is “ready to work with a commission designed to examine how online sites and social media platforms played a role in the attack.”
Sandberg’s letter doesn’t provide specifics about what it means by “exploring restrictions” on live video, although it does suggest some criteria — such as whether a person has previously violated the site’s community standards. Earlier this month, the perpetrator of the attacks filmed his attacks on two mosques and uploaded the footage to Facebook, YouTube, Twitter, and Instagram with the goal of making it go viral. While the attack was initially viewed by fewer than 200 people, tech companies have scrambled to keep the videos from spreading further. Facebook has faced issues with people streaming violent videos before, such as in 2017, when a Cleveland man uploaded footage of a murder to his profile page, where it remained for several hours.