Social media platforms fight to stop Buffalo shooting video from spreading

Major social media platforms have tried to improve their response to this type of content since the 2019 mass shooting in Christchurch, New Zealand, which was broadcast live on Facebook. Within 24 hours of that attack, Facebook said it deleted 1.5 million copies of the video. Online extremism experts say such content can act as far-right terrorist propaganda and inspire others to carry out similar attacks; the Buffalo shooter was directly influenced by the Christchurch attack, according to the document he allegedly shared.

The stakes in dealing with the rapid dissemination of such content are high. “This fits into a pattern we’ve seen time and time again,” said Ben Decker, CEO of digital investigative consultancy Memetica and an expert on online radicalization and extremism. “At this point, we know the consumption of these videos creates copycat mass shootings.”

Still, social media companies are struggling to respond as users post a deluge of copies of the Buffalo video and the shooter's document.
Saturday’s attack was streamed live on Twitch, a video streaming service owned by Amazon (AMZN) that is particularly popular with gamers. Twitch said it deleted the video two minutes after the violence began, before it could be viewed widely, but not before it was copied by other users. The video has since been shared hundreds of thousands of times on major social media platforms and also posted on more obscure video hosting sites.
Facebook, Twitter (TWTR), YouTube and Reddit all told CNN they have banned sharing of the video on their sites and are working to identify and remove copies. (TikTok did not respond to requests for comment on its response.) But the companies appear to be struggling to contain the spread and manage users looking for loopholes in their content moderation practices.
CNN observed a link to a copy of the video circulating on Facebook on Sunday evening. Facebook included a disclaimer that the link violated its Community Standards, but still allowed users to click through and watch the video. Facebook parent company Meta said it removed the link after CNN asked about it.

Meta designated the event a “terrorist attack” on Saturday, prompting the company’s internal teams to identify and delete the suspect’s account, as well as begin removing copies of the video and document, and links to them on other sites, according to a company spokesperson. The company added the video and document to an internal database that automatically detects and removes copies if they are re-uploaded. Meta also banned content that praises or supports the attacker, the spokesperson said.

The video was also hosted on a lesser-known video service called Streamable, and was removed only after it had been viewed more than 3 million times and its link shared on Facebook and Twitter, according to The New York Times.

A Streamable spokesperson told CNN the company is “working diligently” to remove copies of the video “promptly.” The spokesperson did not respond when asked how a video reached millions of views before being deleted.

Copies of the document allegedly written by the shooter were uploaded to Google Drive and other smaller online storage sites and shared over the weekend via links to those platforms. Google did not respond to requests for comment on using Drive to deliver the document.

Challenges in countering extremist content

Major tech platforms have gotten better at removing this kind of content, according to Tim Squirrell, head of communications at the Institute for Strategic Dialogue, a think tank dedicated to countering extremism.

But consumer big tech platforms also have to contend with the fact that not all internet platforms want to take action against this content.

In 2017, Facebook, Microsoft (MSFT), YouTube and Twitter founded the Global Internet Forum to Counter Terrorism, an organization designed to promote collaboration to prevent terrorists and violent extremists from exploiting their platforms; it has since grown to include more than a dozen companies. Following the Christchurch attack in 2019, the group pledged to prevent such attacks from being broadcast live on their platforms and to coordinate to tackle violent and extremist content.

“Now, technically, it failed. It was on Twitch. It was then reposted within the first 24 hours,” Decker said, adding that the platforms still had work to do to coordinate effectively to remove harmful content during crisis situations. Still, the work the major platforms have done since Christchurch means their response to Saturday’s attack has been faster and more robust than the response three years ago.

But elsewhere on the internet, smaller sites such as 4chan and messaging platform Telegram provided a place where users could congregate and coordinate to repeatedly upload the video and document, according to Squirrell. (For its part, Telegram says it “expressly prohibits” violence and is working to remove footage of the Buffalo shooting.)

“A lot of the threads on the 4chan message board were just people clamoring for the stream over and over again, and once they got a seven minute version, just repost it over and over again” on bigger platforms, Squirrell said. As with other content on the internet, videos like the one from Saturday’s shooting are also often quickly manipulated by extremist communities online and incorporated into memes and other content that can be harder for consumer platforms to identify and remove.

Like Facebook, YouTube, and Twitter, platforms like 4chan rely on user-generated content and are legally protected (at least in the US) by a law called Section 230 from liability for much of what users post. But while mainstream Big Tech platforms are driven by advertisers, social pressures, and users to fight harmful content, smaller, more fringe platforms aren’t driven by a desire to protect ad revenue or attract a large user base. In some cases, they want to be online homes for speech that would be moderated elsewhere.

“The consequence of that is you can never complete the game of Whack-a-mole,” Squirrell said. “There will always be somewhere, someone passing around a Google Drive link or a Samsung cloud link or something else that allows people to access it… Once it’s in the ether, it is impossible to delete everything.”
