The world of digital safeguarding is constantly changing; for example, we recently wrote to you about emerging harms in gen AI. This week's newsletter gives a couple of updates on things affecting your students' online worlds which you need to know about and share with colleagues:
The Online Safety Act duties coming into force
Latest IWF stats & trends on child sexual abuse material
Content warning - references to child sexual abuse below - please be aware.
The UK's Online Safety Act
You don't have time to wade through hundreds of pages of guidance and codes, so we put together an infographic to show what's happening with the implementation of the Online Safety Act and the regulator Ofcom. Why not share it with staff?
There is a long way to go, but hopefully these changes will make a real difference to the online harms facing your students (especially regarding pornography, social media and search platforms, and effective age-checking online).
New CSAM figures from the Internet Watch Foundation
The Internet Watch Foundation is the only body outside of law enforcement allowed to actively scan for online CSAM (child sexual abuse material). At LGfL we are proud to be an IWF member, to use their services and to support their aims.
IWF recently released its data for 2024, during which it found 729,696 CSAM images. You can read the full report here, but we would like to draw your attention to the figures below.
As you can see, they found more images showing the sexual abuse of 7-10 year-olds than any other age group. The fact that 3-6 year-olds were seen more often than 14-15 and 16-17 year-olds combined, and that 13,032 images showed children aged 0-2, is beyond words.
The 11-13 age group remains the highest-risk group in secondaries, with both older teen groups increasing in number (and more likely to involve sexual extortion).
Overall, the vast majority of CSAM relates to girls (97%), but a greater proportion of images involving boys shows the most serious abuse (Category A).
The bar chart below highlights that every one of these awful images is criminal and highly serious.
What can we do about it?
Legislation, regulation of the internet and law enforcement action against predators are of course key. But what can or should schools do? It is important to understand this background to the education we deliver.
In schools, we would recommend you:
Consider how you might share relevant data in staff training (mindful that this may be traumatic and should be done by the DSL)
Consider what key areas might be wise to share with parents, and what language could make the issue clear but appropriate for such communications
In the early years and in primary, LGfL's Undressed song and animation is a good way to teach children never to get undressed near a device (and its camera), as is, of course, Pantosaurus.
Relationships, Sex and Health Education across primary and secondary is the home for teaching about consent, bodies, touching and sex, and we know schools are doing great work in this area. However, it may be wise to review the curriculum against what is being seen in these statistics and the recent AI trends we shared. Is your curriculum preparing young people for these realities?
One relatively new resource which may be useful for secondary, and which also crosses over into AI, is Protect Us from WeProtect. Note that it should not be used without consulting the DSL, and excerpts or screenshots of specific moments may work best.
One way to get a round-up of resources and strategies to help you in this area is to attend our free training 'Online Safeguarding Essentials - Online Sexual Abuse and Harms', based on the latest data and practices. Book here.