Police uncover secret Instagram group where teenage girls as young as 12 prepare and plan to commit suicide


Police have uncovered a secret Instagram group named ‘suicide’ involving twelve girls, aged between 12 and 16, from across southern England, in which the girls discussed suicide and several escalated to “suicidal crises” and “serious self-harm”.

According to the BBC, police discovered the online group after three of the girls went missing and were later found seriously unwell in London.

The three had travelled by train to meet in the capital, where they were found in a street and taken by ambulance to hospital for emergency treatment.

According to the police briefing, one of the girls said they had first met online and had discussed suicide with one another.

Police officers then examined digital devices to identify the name of the online group and its other members.

Seven of the 12 girls had self-harmed prior to being traced by the police. Children’s social care services from seven different local authorities have been involved in safeguarding children identified as members of the group.

Police said in a statement to the BBC that “peer-to-peer influence increased suicidal ideation amongst the children involved to the extent that several escalated to suicidal crises and serious self-harm.”


Instagram says it found no evidence of its rules being broken; the platform uses artificial intelligence (AI) to detect and block posts and groups relating to self-harm.

Some of the children had met on other social media platforms but were part of a closed Instagram group – a direct message thread – whose title used the words “suicide” and “missing”.


Facebook, which owns Instagram, does not deny that the closed group’s name referenced “suicide”, but says the group has not been removed from the platform because the content of its messages does not break its rules.

In a statement, a company spokesperson said it was co-operating with the police.

“We reviewed reports and found no evidence of the people involved breaking our rules around suicide and self-harm.

“We don’t allow graphic content, or content that promotes or encourages suicide or self-harm, and will remove it when we find it.

“We’ll continue to support the police and will respond to any valid legal request for information.”
