Children "remain an afterthought" for leading social media companies, England's Children's Commissioner, Anne Longfield, has said.
And she has written to the biggest companies, urging them to commit to tackling the problem of disturbing content on their platforms.
Her letter follows the death of Molly Russell, who killed herself after viewing distressing self-harm images on Instagram.
The companies say they are working hard to keep their platforms safe.
Ms Longfield's letter is addressed to YouTube, Pinterest, Snapchat and Facebook, which also owns WhatsApp and Instagram.

It urges them to back the introduction of a statutory duty of care, under which they would have to prioritise the safety and wellbeing of children using their platforms.
She also calls for a digital ombudsman - paid for by the industry - who would act as an independent arbiter between young users and technology companies.
The letter says: "The tragic suicide of Molly Russell and her father's appalled response to the material she was viewing on social media before her death have again highlighted the horrific amount of disturbing content that children are accessing online.
"I do not think it is going too far to question whether even you, the owners, any longer have any control over their [platforms'] content.
"If that is the case, then children should not be accessing your services at all and parents should be aware that the idea of any authority overseeing algorithms and content is a mirage."
Ms Longfield says that, while the industry has told her the issue is being taken seriously, there is "still a failure to engage" and "children remain an afterthought".
The commissioner calls on the industry to "accept there are problems and to commit to tackling them - or admit publicly that you are unable to".
Her letter ends: "With great power comes great responsibility and it is your responsibility to support measures that give children the information and tools they need growing up in this digital world - or to admit that you cannot control what anyone sees on your platforms."
A spokesman for Instagram and Facebook acknowledged the company had a "huge responsibility" to make sure young people were safe.
"Our thoughts are with Molly's family and with the other families who have been affected by suicide or self-harm," he said.
"We are undertaking a full review of our policies, enforcement and technologies and are consulting further with mental health experts to understand what more we can do.
"In the meantime, we are taking measures aimed at preventing people from finding self-harm related content through search and hashtags."
YouTube says its policies prohibit videos that promote self-harm and that it removes flagged videos which violate them.
A spokeswoman for Snapchat said: "We work hard to keep Snapchat a safe and supportive place for everyone.
"From the outset, we have sought to connect our community with content that is authoritative and credible and safeguard against harmful content and disinformation."
The promotion of self-injury or eating disorders was not allowed, she added.
A spokeswoman for Pinterest said: "We don't want people to ever see disturbing content on our platform, and it is deeply upsetting to us if they do.
"We have assembled a special team that is urgently working to strengthen our technology that helps keep unwanted content off Pinterest.
"In addition, we are working with more outside groups with expertise in these issues to review our policy and enforcement guidelines and ensure we get this right."
Source: bbc.com