Young people are not equipped to deal with pornography. It can have a deeply damaging effect on their behavior and their understanding of consent and healthy relationships.
NSPCC research shows that by the age of 16, nearly half of young people have viewed pornography, and those children are just as likely to stumble across it accidentally as to search for it deliberately.
At the NSPCC we have worked hard to stress this to Government, so its pledge to introduce a regulator and age verification measures to block children from accessing pornography websites is a vital and welcome first step towards keeping children safe online.
But there is much more work to do.
By the age of 13, three quarters of young people now have a social media account. For children, these networks are a way of chatting with friends, watching funny videos or finding out about the world.
Yet all too often we hear of grooming, hate speech, self-harm content, cyber bullying and even child abuse images cropping up on social media.
Some social networks have designed their platforms with child safety in mind, others have not. Some are good at taking action when harmful content is reported, others are not.
The problem is that each network has its own rules for handling inappropriate content or abusive behaviour on its platform. This leads to inconsistencies in keeping children safe, and ultimately means social networks are marking their own homework.