New law snares 35 child groomers in Thames Valley

More than 35 child grooming crimes were recorded in Thames Valley in the first six months of a new offence being brought in as a result of the NSPCC's Flaw in the Law campaign.

The figure has been released by the children’s charity after a Freedom of Information request revealed 1,316 offences of Sexual Communication with a Child were recorded by police forces in England and Wales in the six months between April 2017, when the law changed, and October 2017. Of that total, 37 were recorded by Thames Valley Police.

Now the NSPCC is calling on Government and social networks to develop technology already at their disposal to prevent grooming, and bring in grooming alerts for victims and moderators.


The Freedom of Information request to Thames Valley Police also showed that:


· Snapchat was the most common method used by groomers in the county.

· Girls aged 12-15 in Thames Valley were the most likely to be targeted by predators.

Before the new anti-grooming law came into force on April 3 last year, police could not intervene until groomers met their victims. In 2015 former England footballer Adam Johnson sent sexual messages to a 15-year-old girl, before meeting her and engaging in sexual activity. Enabling police to step in sooner is an important change, but the Government must now act to prevent grooming.


Algorithms are already used by social networks to target adverts at social media users, and to detect illegal content online. The same techniques must now be developed to:


· Alert children to potential grooming behaviour from adults they speak to online.

· Alert moderators of suspected groomers and enable them to notify police.

Groomer alerts for children must be introduced as part of the Government’s Internet Safety Strategy.


Where children are speaking to adults online, grooming language could be automatically picked up using algorithms in order to send an alert to children, prompting them to think twice about the chat they’re having and offering them support if needed.

The Department for Digital, Culture, Media and Sport (DCMS) could make this happen. Yet DCMS has said that its Internet Safety Strategy will produce a code for social networks that will only be voluntary, and that the code will not include measures to prevent grooming.

The NSPCC is arguing that this doesn’t go far enough.

The Home Office must work with social networks so grooming suspects can be automatically flagged to moderators.

Algorithms already automatically flag child abuse images, hate speech and extremist content to moderators for removal. The NSPCC is calling on the Home Office to work with industry to use existing technology to flag unusual account patterns associated with grooming behaviours: for example, friending and following many young people with no mutual friends and no geographic links, receiving a high number of rejected friend requests from children, or spikes in views of posts made by under-18 accounts.


Where moderators believe criminal activity is taking place, they can notify police.

Tony Stower, NSPCC Head of Child Safety Online, said: “Despite the staggering number of grooming offences in just six months, Government and social networks are not properly working together and using all the tools available to stop this crime from happening.

“Government’s Internet Safety Strategy must require social networks to build in technology to keep their young users safe, rather than relying on police to step in once harm has already been done.

“If Government makes a code for social networks that is entirely optional and includes no requirement for platforms to tackle grooming, this is a massive missed opportunity and children will continue to be put at risk.”

Children who are worried about inappropriate messages online can contact Childline on 0800 11 11 or use the online chat function at www.childline.org.uk.