The NSPCC says the figures are “overwhelming evidence that keeping children safe cannot be left to social networks”.

 

The number of children targeted for grooming and abuse on Instagram has more than tripled – with some of the victims as young as five years old.

Figures obtained by the NSPCC suggest there were 5,161 reports of sexual communications with a child recorded in just 18 months.

Facebook, Snapchat and Instagram were used in 70% of those incidents.

Girls aged 12 to 15 were most likely to be targeted, but roughly one in five victims were under the age of 11.

The NSPCC’s chief executive, Peter Wanless, has accused social media firms of “10 years of failed self-regulation”.

He said: “These figures are overwhelming evidence that keeping children safe cannot be left to social networks.”

The charity obtained freedom of information data from 39 of the 43 police forces in England and Wales.


In incidents where police recorded the method used to contact a child, Instagram was used by groomers 126 times between April and September 2017.

This rose to 428 over the same period in 2018.

The figures come amid growing criticism of how social networks protect the children using their platforms.

The government is due to release a white paper about online harms, and the NSPCC hopes this will include new laws to tackle grooming.

Mr Wanless warned: “We cannot wait for the next tragedy before tech companies are made to act.

“It is hugely concerning to see the sharp spike in grooming offences on Instagram, and it is vital that the platform designs basic protection more carefully into the service it offers young people.”

Image: Girls aged 12 to 15 were the most likely targets

 

One victim told the NSPCC how she was groomed by a 24-year-old man when she was just 13.

She had met him in person through a friend; he initially claimed to be 16, then 18, and added her on Facebook and Snapchat that same evening.

The girl said it “escalated very quickly” before he encouraged her to share photos of herself and meet for sex after school.

She added: “He drove me somewhere quiet… and took me into the woods and had sex with me.

“He drove me in the direction of home straight afterwards, refusing to even talk, and then kicked me out of the car at the traffic lights. I was bleeding and crying.”

The girl’s mother added: “Somebody has got to take responsibility for what happens to children on their platforms. Simply put, if social media didn’t exist, this would never have happened.”

A National Crime Agency spokesman said: “It is vital that online platforms used by children and young people have in place robust mechanisms and processes to prevent, identify and report sexual exploitation and abuse, including online grooming.

“Children and young people also need easy access to mechanisms allowing them to alert platforms to potential offending.”

A spokesperson for Facebook and Instagram said: “Keeping young people safe on our platforms is our top priority and child exploitation of any kind is not allowed.

“We use advanced technology and work closely with the police and CEOP to aggressively fight this type of content and protect young people.”

On Thursday, YouTube announced it was disabling comments on videos featuring children after a vlogger alleged he had found instances of paedophiles targeting videos of young girls on the site.

Children as young as five years old are being targeted for grooming on Instagram, where attempts have more than tripled, the NSPCC has warned.

More than 5,100 online grooming crimes were recorded by police in just 18 months after a new offence of sexual communication with a child came into force, figures show.

In cases where officers recorded how victims were contacted, Facebook, Snapchat and Instagram were used 70 per cent of the time, according to the data obtained by the NSPCC, with Instagram accounting for 33 per cent.


Facebook was the second most common platform chosen by groomers, used in 23 per cent of offences, followed by Snapchat, used in 14 per cent of crimes.

The data runs from April 2017, when the law came into force, to September 2018, and was obtained through freedom of information requests to 39 of the 43 police forces in England and Wales.

In most instances, police forces did not record which particular website or app was used to groom the victim. But where they did, a steep increase in the use of Instagram was observed.

In the first six months since the law came into force, from April to September 2017, there were 126 recorded instances of Instagram being used to sexually groom a child.

Just one year later during the same time period, that number rose to 428, a 240-per-cent increase.

According to the NSPCC data, the most common target of online groomers were girls aged 12 to 15.

One in five victims, however, were aged under 11. Children as young as five were recorded as victims in some instances.

The government is due to publish a white paper on internet safety before the end of winter and Mr Wanless said it was vital it included tough new regulation.

The NSPCC is campaigning for tech firms to be given a legal duty of care to children who use their platforms and for large fines to be imposed on them when they fail to protect under-18s.

One mother of a 13-year-old girl who was groomed by a 24-year-old man over Facebook and Snapchat said if social media had not existed her daughter would have been spared her ordeal.

“We felt as though we had failed as parents – we knew about these social media sites, we thought we were doing everything we could to ensure our children’s safety when they were online, but we still couldn’t protect her,” she said.

The white paper on internet safety was originally meant to have been published by the end of 2018, although that deadline later slipped to the end of the winter.

In February, a spokesperson for the Department for Digital, Culture, Media and Sport said it had “heard” demands for an internet regulator and a statutory duty of care and was “seriously considering all options”.

A National Crime Agency spokesperson added: “The National Crime Agency helps industry to enhance their reporting tools and where possible, shares knowledge and expertise to support industry to improve standards and security online.”

https://news.sky.com/story/instagram-grooming-of-children-as-young-as-five-triples-11651339

https://www.independent.co.uk/news/uk/crime/instagram-grooming-sex-crime-police-report-nspcc-children-a8801876.html

 
