The live broadcast of the horrific New Zealand shooting by one of the accused in the rampage turned into a nightmare for internet and social media giants such as YouTube, Facebook and Twitter. After the shooter streamed his mosque rampage online, the firms had to work overtime to remove copies of the footage, which continued to crop up across their platforms. Companies including Facebook, Twitter, Google and Reddit took steps to take down the many versions of the video containing scenes of the ghastly act.
On Saturday, Facebook said it removed 1.5 million videos of the incident within hours of it being live-streamed. The California-based company said around 1.2 million of those videos were blocked at upload. The company also said that fewer than 200 people watched the video while the shooter was live-streaming it on the platform.
Other companies that took similar steps include Twitter, Reddit and YouTube. Reddit banned a forum that had uploaded the video of the attack, saying it violated the company's policy against "glorifying or encouraging violence." However, copies of the video were still available online hours later, as the tech firms were largely only able to remove duplicate versions.
YouTube said it also deleted thousands of videos from its platform. The company removed human review from its usual content moderation process so it could take down violent footage of the massacre more quickly. A YouTube spokesperson told CNBC that it terminated a number of accounts created to promote the incident. YouTube had earlier taken steps to prioritise news reports over videos that may spread misinformation during trending events. However, many copies of the shooting video were altered in ways the platform was unable to detect.
The company also said it suspended the ability to sort search results by upload date while it removed videos of the attack, making the footage tougher to find.

Source: financialexpress.com