A live video streamed by a terrorist as he killed dozens of people in two mosques in New Zealand was only reported to Facebook 29 minutes after the attack started, the social media giant has said. 

The Facebook Live video was viewed fewer than 200 times while it was being broadcast, but none of those viewers reported it.

It was only flagged to Facebook moderators 29 minutes after the attack started, and 12 minutes after it ended. 

“No users reported the video during the live broadcast. Including the views during the live broadcast, the video was viewed about 4,000 times in total before being removed from Facebook," the company’s vice-president Chris Sonderby said in a statement on its site.

“We remain shocked and saddened by this tragedy and are committed to working with leaders in New Zealand, other governments, and across the technology industry to help counter hate speech and the threat of terrorism. We continue to work around the clock to prevent this content from appearing on our site, using a combination of technology and people.”

A group of students sings in front of flowers left in tribute to victims at the Botanical Garden in Christchurch. Photo: AFP


Facebook removed the attacker’s video within minutes of users’ reports, the first of which was made 12 minutes after the live broadcast ended – by which time 49 people were dead. Another person died later.

However, before Facebook was alerted, a user had already posted a link to a copy of the video on a file-sharing site.

Sonderby explained that the original Facebook Live video was removed and marked in such a way that visually similar copies would be detected and automatically removed from Facebook and Instagram.
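Facebook has not published the details of its matching system, but the approach Sonderby describes resembles perceptual hashing: each sampled frame of a video is reduced to a compact fingerprint that changes little under re-encoding or minor edits, so near-duplicate uploads can be caught by comparing fingerprints rather than raw pixels. The following is a minimal sketch of that general idea in Python, using the open-source imagehash library; the threshold, frame-sampling and majority rule here are illustrative assumptions, not Facebook's actual system.

```python
# Toy sketch of perceptual-hash matching for sampled video frames.
# Assumes: pip install pillow imagehash; frame extraction happens elsewhere.
from PIL import Image
import imagehash

MATCH_THRESHOLD = 8  # max Hamming distance to call two frames "visually similar" (illustrative)

def frame_fingerprints(frame_paths):
    """Compute a perceptual hash (pHash) for each sampled video frame."""
    return [imagehash.phash(Image.open(p)) for p in frame_paths]

def looks_like_blocked_video(upload_frame_paths, blocked_fingerprints):
    """Flag an upload if most of its frames closely match a blocked video's frames."""
    hits = 0
    for up in frame_fingerprints(upload_frame_paths):
        for known in blocked_fingerprints:
            if up - known <= MATCH_THRESHOLD:  # imagehash overloads '-' as Hamming distance
                hits += 1
                break
    return hits >= len(upload_frame_paths) // 2  # majority of frames match (arbitrary rule)
```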

Because screen recordings of the stream can evade visual matching, additional detection systems based on audio technology were used to catch them.
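Again, Facebook has not described the mechanism, but audio matching of this kind is typically done by fingerprinting the soundtrack, since the audio of a screen recording stays close to the original even when the picture is degraded. One common fingerprint is the sequence of dominant spectral peaks over time. A toy sketch of that idea follows; every parameter and threshold is an illustrative assumption.

```python
# Toy audio fingerprint: dominant spectral peak per time slice.
# Assumes: pip install numpy scipy; mono PCM audio as a float array.
import numpy as np
from scipy.signal import stft

def fingerprint(samples, sample_rate, nperseg=2048):
    """Return the index of the strongest frequency bin in each STFT frame."""
    _, _, spec = stft(samples, fs=sample_rate, nperseg=nperseg)
    return np.argmax(np.abs(spec), axis=0)  # one dominant bin per time slice

def match_ratio(fp_a, fp_b):
    """Fraction of aligned time slices whose dominant bin agrees."""
    n = min(len(fp_a), len(fp_b))
    return float(np.mean(fp_a[:n] == fp_b[:n]))

def is_rerecording(upload_audio, blocked_fp, sample_rate):
    """Flag an upload whose soundtrack tracks a blocked clip closely enough.
    The 0.6 cutoff is an arbitrary illustrative threshold."""
    return match_ratio(fingerprint(upload_audio, sample_rate), blocked_fp) > 0.6
```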

In the first 24 hours, Facebook removed about 1.5 million videos of the attack globally. More than 1.2 million of those were blocked at upload and so never appeared on its services, leaving roughly 300,000 copies that were taken down after being posted.

“This incident highlights the importance of industry cooperation regarding the range of terrorists and violent extremists operating online,” he said, noting that it had also identified abusive content on other social media sites in order to assess whether or how that content might migrate to one of its platforms.
