Facebook scrubs 1.5mn Christchurch attack videos but criticism goes viral

CHRISTCHURCH (New Zealand), March 19 (NNN-AGENCIES) – Facebook says it removed a staggering 1.5 million videos of the harrowing Christchurch mosque rampage, but criticism of social media giants for failing to block images of the “real-time terror attack” is also spreading fast.

As the alleged gunman callously picked off his victims in Christchurch’s Al Noor mosque, he livestreamed the gruesome scene on Facebook Live, apparently using a camera mounted on his body, after also tweeting a racist “manifesto.”

Facebook said it “quickly” removed the video, along with the gunman’s Facebook and Instagram accounts, and in the first 24 hours scrubbed 1.5 million videos worldwide, “of which 1.2 million were blocked at upload.”

Spokeswoman Mia Garlick from Facebook New Zealand said the firm was “working around the clock to remove violating content using a combination of technology and people.”

But despite pleas — and official orders from authorities — not to share the content, the footage proliferated widely online and experts said the 17-minute video was easily retrievable several hours after the attack that killed 50 people.

By Facebook’s own figures, at least 300,000 of those videos were not caught at upload and were removed only afterwards, and there is no official data on how many times they were viewed or shared in the meantime.

New Zealand Prime Minister Jacinda Ardern said authorities did whatever they could to purge the web of the images but laid the responsibility at the door of the Silicon Valley giants.

“Ultimately it has been up to those platforms to facilitate their removal,” she told reporters.

“I do think that there are further questions to be answered. Obviously these social media platforms have wide reach. This is a problem that goes well beyond New Zealand.”

“This is an issue that I will look to be discussing with Facebook,” she warned.

This was not the first time Facebook Live had been used to broadcast an atrocity — a murder was livestreamed in the US city of Cleveland in 2017 — and Facebook and Twitter say they have since invested in technology and human resources to combat the problem.

Facebook has hired about 20,000 moderators but critics say they are not doing enough. — NNN-AGENCIES
