A Facebook rep confirmed the project. “We’re exploring bringing a parent-controlled experience to Instagram to help kids keep up with their friends, discover new hobbies and interests, and more,” company spokesperson Joe Osborne said.
According to Osborne, “Increasingly kids are asking their parents if they can join apps that help them keep up with their friends. Right now there aren’t many options for parents, so we’re working on building additional products — like we did with Messenger Kids — that are suitable for kids, managed by parents.”
Facebook launched the Messenger Kids app for kids under 13 in 2017 — and the product immediately elicited concern from consumer-privacy advocates. In 2019, a bug in Messenger Kids let children join group chats with strangers, The Verge reported; Facebook at the time said the glitch affected only a “small number of group chats.”
The social giant is in the earliest stages of mapping out the Instagram-for-kids app. The app would fall under the U.S.’s Children’s Online Privacy Protection Act, a federal law that bars internet services from collecting personal data from kids under 13 without verifiable parental consent.
Word of Instagram’s plans for the kids app comes two days after the division announced a series of updates on new features and resources “as part of our ongoing efforts to keep our youngest community members safe.” Those include a new Parents Guide for the U.S. created in partnership with the Child Mind Institute and ConnectSafely, along with versions in other countries developed with local experts.
In addition, according to Instagram, it’s developing new artificial intelligence and machine learning technology to help it detect individual users’ ages. Despite its requirement that users be at least 13, “we know that young people can lie about their date of birth,” Instagram said in a blog post. “We want to do more to stop this from happening, but verifying people’s age online is complex and something many in our industry are grappling with.”
Development of the Instagram app for kids is being led by Facebook VP Pavni Diwanji, who joined the company last December, per the BuzzFeed report. She previously worked at Google for 13 years, and from 2013 to 2018 oversaw its kid-focused products, including YouTube Kids.
YouTube Kids, aimed at children 4-12, provides a curated, white-listed video experience with parental controls. YouTube also is rolling out a new “supervised” account option for parents who feel their tweens or teens have outgrown the YouTube Kids app but aren’t yet ready for unrestricted YouTube.
Meanwhile, TikTok in the U.S. offers a specialized version of its service to users under 13 in a limited app experience called TikTok for Younger Users, which the company says “offers additional safeguards and privacy protections designed specifically for a younger audience.” TikTok introduced that experience under a settlement with the FTC over allegations the app illegally collected personal info from children in violation of COPPA.