“It appears that Facebook is not responding to a need, but instead creating one, as this platform appeals primarily to children who otherwise do not or would not have an Instagram account,” according to an open letter signed by attorneys general from 44 U.S. states and territories, addressed to Facebook CEO Mark Zuckerberg.
Facebook in March confirmed that it’s in the early stages of developing a restricted, special-purpose version of Instagram managed by parents for children under 13.
The attorneys general who cosigned the letter raised several key concerns about the project. They cited research showing that social media can be harmful to the emotional and mental well-being of children; argued that children “do not have a developed understanding of privacy”; and pointed to “alarming rates of cyberbullying” on Instagram in particular.
The AGs also cited a U.K. study that found more cases of “sexual grooming on Instagram than any other platform,” and noted that in 2020 Facebook and Instagram reported 20 million child sexual abuse images. In addition, the attorneys general wrote, “Facebook has a record of failing to protect the safety and privacy of children on its platform, despite claims that its products have strict privacy controls.”
“In short, an Instagram platform for young children is harmful for myriad reasons,” the AGs’ letter concluded.
In response to the AGs’ letter, a Facebook spokesperson said in a statement, “We agree that any experience we develop must prioritize their safety and privacy, and we will consult with experts in child development, child safety and mental health, and privacy advocates to inform it. We also look forward to working with legislators and regulators, including the nation’s attorneys general. In addition, we commit today to not showing ads in any Instagram experience we develop for people under the age of 13.”
The “Instagram for Kids” app would fall under the U.S.’s Children’s Online Privacy Protection Act, a federal law that restricts internet services from collecting personal data from children under 13 without parental consent.
Facebook launched the Messenger Kids app for kids under 13 in 2017 — and the product immediately drew concern from consumer-privacy advocates. In 2019, a bug in Messenger Kids let children join group chats with strangers, The Verge reported; Facebook at the time said the glitch affected only a “small number of group chats.”
Instagram says it is developing new artificial intelligence and machine learning technology to help it detect individual users’ ages. Despite its requirement that users be at least 13, “we know that young people can lie about their date of birth,” Instagram said in a recent blog post. “We want to do more to stop this from happening, but verifying people’s age online is complex and something many in our industry are grappling with.”
At a congressional hearing in March 2021 about social media and misinformation, Zuckerberg was asked about social media’s impact on children but waved off concerns that it could be harmful. “The research we’ve seen is that using social apps to connect to other people can have health benefits,” the Facebook chief said at the hearing. Regarding the Instagram for Kids app, he said, “We’re early in thinking through how this service would work” but added there is clearly “a large number of people under the age of 13 who would want to use a service like Instagram.”
The AGs who sent the letter to Facebook on Monday said that contrary to Zuckerberg’s statements, solid data and research have shown a link between young people’s use of social media and an increase in mental distress, self-injurious behavior and suicidality. In addition, they charged that Instagram has been frequently flagged for increasing suicidal ideation, depression and body-image concerns in children.
“Not only is social media an influential tool that can be detrimental to children who are not of appropriate age, but this plan [for an under-13 version of Instagram] could place children directly in the paths of predators,” New York Attorney General Letitia James said in a statement Monday.