Regulatory gaps in the Online Safety Act also mean that private messaging services offer children fewer protections than public spaces, raising fears that harms will migrate from public spaces into private ones. Research commissioned by the NSPCC highlights the existing and emerging technological solutions tech platforms can use to prevent, detect and disrupt grooming and online abuse. The report aims to equip policymakers, regulators and the tech industry with an evidence-based framework that supports strategic, collaborative action to protect children online.
Key findings:
+ Tools and interventions should tackle every stage of the online grooming process.
The report identifies four stages of online grooming:
- targeting and approach
- gaining trust
- trust development and isolation
- maintaining control.
+ Online platforms need to share information with each other to keep children safe.
Perpetrators of online grooming often move between platforms to evade detection or re-offend. Cross-platform signal sharing would allow platforms to exchange information about accounts flagged for suspicious behaviour, helping them detect grooming at every stage of the process.
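As an illustration only, cross-platform signal sharing could be sketched as a shared registry in which platforms contribute salted hashes of flagged account identifiers rather than raw personal data. Everything here (the `SignalRegistry` class, its methods, and the salting scheme) is a hypothetical design, not a system described in the report:

```python
import hashlib

class SignalRegistry:
    """Hypothetical shared registry: platforms contribute salted hashes of
    flagged account identifiers instead of raw personal data, so a match
    reveals only that another platform has also flagged the same account."""

    def __init__(self, salt: str):
        self.salt = salt
        # hash of account id -> set of platforms that reported it
        self._flags: dict[str, set[str]] = {}

    def _hash(self, account_id: str) -> str:
        return hashlib.sha256((self.salt + account_id).encode()).hexdigest()

    def report(self, platform: str, account_id: str) -> None:
        """Record that a platform has flagged an account as suspicious."""
        self._flags.setdefault(self._hash(account_id), set()).add(platform)

    def check(self, account_id: str) -> int:
        """Return how many independent platforms have flagged this account."""
        return len(self._flags.get(self._hash(account_id), set()))

registry = SignalRegistry(salt="shared-secret")
registry.report("PlatformA", "user-123")
registry.report("PlatformB", "user-123")
print(registry.check("user-123"))  # → 2
```

A real deployment would need far stronger privacy and governance safeguards; the point of the sketch is only that signals can be cross-referenced without exchanging identifiable user data directly.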
+ A repository of grooming behavioural indicators could help to better detect signs of grooming.
Machine learning models could be trained to detect subtle grooming cues, such as common language used by perpetrators and grooming behavioural patterns. Users would then be notified if a message they received was flagged as unsafe.
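The flagging step described above can be sketched with a toy stand-in for a trained model: a small repository of hypothetical behavioural indicators, each with a weight, and a threshold that triggers a warning to the user. The phrases, weights, threshold and function names below are all illustrative assumptions, not indicators from the report:

```python
# Toy stand-in for a trained classifier: scores a message against a small
# repository of hypothetical grooming indicators. A production system would
# use a model trained on vetted data, not a hand-written keyword list.
INDICATORS = {
    "keep this between us": 3,     # isolation cue
    "don't tell your parents": 3,  # isolation cue
    "you're so mature": 2,         # flattery / trust-building cue
    "what's your address": 2,      # personal-information probe
}
FLAG_THRESHOLD = 3

def score_message(text: str) -> int:
    """Sum the weights of every indicator phrase present in the message."""
    lower = text.lower()
    return sum(weight for phrase, weight in INDICATORS.items() if phrase in lower)

def should_warn_user(text: str) -> bool:
    """Return True if the message should trigger an 'unsafe message' notice."""
    return score_message(text) >= FLAG_THRESHOLD

print(should_warn_user("You're so mature. Keep this between us."))  # → True
```

The design choice worth noting is the separation between scoring and the user-facing decision: the indicator repository can be updated centrally while the warning threshold is tuned per platform.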
+ On-device safety features can be particularly effective in end-to-end encrypted (E2EE) spaces.
On-device solutions, such as features that auto-blur nude images or prompt users before they share personal information, can protect children against a range of online harms. They are particularly valuable in private messaging and E2EE environments, where safety features within the platform itself may be limited.
+ A systemic, collaborative approach is required to build a more secure online world.
Tackling online grooming cannot be the responsibility of a single group or organisation. It requires a systemic, collaborative approach involving government, online platforms, safety tech developers, regulators, device manufacturers, end users and civil society organisations such as the NSPCC and the Internet Watch Foundation (IWF).