A video, seemingly filmed by the man charged with murder after the killing of at least 49 people and the wounding of at least 20 in shootings at two mosques in Christchurch, has been widely seen on social media.
The incident once again highlights how these platforms deal with such content.
While Facebook, Twitter, Reddit and YouTube raced to remove it, they failed to stop it being shared.
It raises questions about who is sharing it and why, and, perhaps more importantly, about how these platforms are dealing with the threat of far-right extremism.
Many members of the public have taken to Twitter to express shock and anger that the video remains in circulation on several platforms, while others have pleaded with people to stop sharing it.
One pointed out: "That is what the terrorist wanted."
The video, which shows a first-person view of the killings, has been widely circulated.
All of the social media firms sent heartfelt sympathy to the victims of the mass shootings and reiterated that they act quickly to remove inappropriate content.
Facebook said: "New Zealand Police alerted us to a video on Facebook shortly after the live-stream commenced and we removed both the shooter's Facebook account and the video.
"We're also removing any praise or support for the crime and the shooter or shooters as soon as we're aware. We will continue working directly with New Zealand Police as their response and investigation continues."
And in a tweet, YouTube said "our hearts are broken", adding it was "working vigilantly" to remove any violent footage.
Historically, however, the platforms' record in combating the threat of far-right extremism has been more chequered.
Twitter acted to remove alt-right accounts in December 2017. It had previously removed, and then reinstated, the account of Richard Spencer, an American white nationalist who popularised the term "alternative right".
Facebook, which suspended Mr Spencer's account in April 2018, admitted at the time that it was difficult to distinguish between hate speech and legitimate political speech.
This month, YouTube was accused of being either incompetent or irresponsible in its handling of a video promoting the banned neo-Nazi group National Action.
British MP Yvette Cooper said the video-streaming platform had repeatedly promised to block it, only for it to reappear on the service.
Dr Ciaran Gillespie, a political scientist from Surrey University, thinks the problem goes far deeper than a video, shocking as that content has been.
"It is not just a question about broadcasting a massacre live. The social media platforms raced to close that down and there is not much they can do about it being shared because of the nature of the platform, but the bigger question is the stuff that goes before it," he said.
As a political researcher, he uses YouTube "a lot" and says that he is often recommended far-right content.
"There is oceans of this content on YouTube and there is no way of estimating how much. YouTube has dealt well with the threat posed by Islamic radicalisation, because this is seen as clearly not legitimate, but the same pressure does not exist to remove far-right content, even though it poses a similar threat.
"There will be more calls for YouTube to stop promoting racist and far-right channels and content."
His views are echoed by Dr Bharath Ganesh, a researcher at the Oxford Internet Institute.
"Taking down the video is obviously the right thing to do, but social media sites have allowed far-right organisations a place for discussion and there has been no consistent or integrated approach to dealing with it.
"There has been a tendency to err on the side of freedom of speech, even when it is obvious that some people are spreading toxic and violent ideologies."
Now social media companies need to "take the threat posed by these ideologies much more seriously", he added.
"It may mean creating a special category for right-wing extremism, recognising that it has global reach and global networks."
Neither underestimates the scale of the task, especially as many exponents of far-right views are adept at what Dr Gillespie calls "legitimate controversy".
"People will discuss the threat posed by Islam and acknowledge it is contentious but point out that it is legitimate to discuss," he said.
These grey areas will be extremely difficult for the social media firms to tackle, they say, but after the tragedy that unfolded in New Zealand, many believe they must try harder.