The internet was used to commit 53 child sexual offences in the last 12 months, according to the latest figures, prompting the NSPCC to call for tougher safety checks on what children can access online.

Peter Wanless, chief executive of the NSPCC, said: “Leaving social media sites free to make up their own rules when it comes to child safety is unacceptable. We need legally enforceable universal safety standards that are built in from the start.

“We’ve seen time and time again social media sites allowing violent, abusive or illegal content to appear unchecked on their sites, and in the very worst cases children have died after being targeted by predators or seeing self-harm films posted online.

“It makes no sense that in the real world we protect children from going to nightclubs or seeing over-18 films, but in the online world, where they spend so much of their time, we have no equivalent safeguards to protect them from harmful strangers or violent content.

“Enough is enough. Government must urgently bring in safe accounts, groomer alerts and specially trained child safety moderators as a bare minimum to protect our children. And websites that fail to meet these standards should be sanctioned and fined.”

Ahead of the NSPCC’s annual flagship ‘How Safe Are Our Children’ conference, the charity has set out three steps the Government must force social media companies to take:

• Safe accounts for under-18s. Social networks must be required to give all under-18s safe accounts with the highest privacy settings switched on by default, and better controls over who they connect with, so that all followers must be approved by the young person.

• Grooming and bullying alerts. Social media companies must track patterns of grooming or abusive language by users, take swift action against those individuals, and send a grooming or bullying notification to children who are being targeted in this way.

• Hire an army of online child safety guardians. Every social media company must hire dedicated child safety moderators to help protect their young users, and disclose the number of reports those moderators receive to the regulator.