Instagram still posing serious risks to children, campaigners say

Young Instagram users could still be exposed to "serious risks" even if they use the new Teen Accounts brought in to provide more protection and control, research by campaigners suggests.
Researchers behind a new report have said they were able to set up accounts using fake birthdays, and they were then shown sexualised content, hateful comments, and recommended adult accounts to follow.
Meta, which owns Instagram, says the report is "filled with inaccuracies and demonstrates a misunderstanding" of Teen Accounts, and that parents "find these new protections helpful".
The research, from online child safety charity 5Rights Foundation, is released as Ofcom, the UK regulator, is about to publish its children's safety codes.
They will outline the rules platforms will have to follow under the Online Safety Act. Platforms will then have three months to show that they have systems in place which protect children.
That includes robust age checks, safer algorithms which don't recommend harmful content, and effective content moderation.
Instagram Teen Accounts were introduced in September 2024 to offer new protections for children and to create what Meta called "peace of mind for parents".
The new accounts were designed to limit who could contact young users and reduce the amount of content young people could see.
Existing users would be transferred to the new accounts, and those signing up for the first time would automatically get one.
But researchers from 5Rights Foundation were able to set up a series of fake Teen Accounts using false birthdays, with no additional checks by the platform.
They found that immediately on sign-up they were offered adult accounts to follow and message.
Instagram's algorithms, they claim, "still promote sexualised imagery, harmful beauty ideals and other negative stereotypes".
The researchers said their Teen Accounts were also recommended posts "filled with significant amounts of hateful comments".
The charity also had concerns about the addictive nature of the app and exposure to sponsored, commercialised content.
Baroness Beeban Kidron, founder of 5Rights Foundation, said: "This is not a teen environment."
"They are not checking age, they are recommending adults, they are putting them in commercial situations without letting them know and it's deeply sexualised."
Meta said the report was "filled with inaccuracies and demonstrates a misunderstanding of how Teen Accounts work, which we could have clarified if [5Rights Foundation] had shared the report with us".
A spokesperson said: "We developed Teen Accounts following feedback from parents, and a recent survey showed that 94% of parents find these new protections helpful.
"Fundamentally changing Instagram for tens of millions of teens around the world is a big undertaking, and we know we will need to work tirelessly to get it right and bring parents peace of mind."

In a separate development BBC News has also learned about the existence of groups dedicated to self-harm on X.
The groups or "communities", as they are known on the platform, contain tens of thousands of sharing graphic images and videos of self-harm.
Some of the s involved in the groups appear to be children.
Becca Spinks, an American researcher who discovered the groups, said: "I was absolutely floored to see 65,000 of a community."
"It was so graphic, there were people in there taking polls on where they should cut next."
X was approached for comment, but did not respond.
But in a submission to an Ofcom consultation last year X said: "We have clear rules in place to protect the safety of the service and the people using it."
"In the UK, X is committed to complying with the Online Safety Act," it added.
