Facebook has paused development of Instagram Kids, a version of Instagram intended for children younger than 13 years old. The decision was made in response to concerns about the vulnerability of younger users.
“I believe it’s a good idea to create a version of Instagram that’s safe for tweens. But we want to take time to talk with parents, researchers, and safety experts to get more consensus about how we move forward,” Adam Mosseri, head of Instagram, said Monday on NBC’s Today.
The announcement comes after an investigative series by The Wall Street Journal, which reported that Facebook knew Instagram was causing anxiety and other mental health problems for some teenage girls.
The development of an Instagram for a younger audience had met with wide opposition almost immediately.
Facebook announced the Instagram Kids app in March, saying it was “exploring parent-controlled experiences.” Two months later, a bipartisan group of 44 attorneys general wrote to Mark Zuckerberg asking him to end the project, citing the well-being of children.
They also pointed to an increase in cyberbullying, children’s vulnerability to online predators, and Facebook’s “checkered record” in protecting children on its platforms. Facebook faced similar criticism in 2017, when it launched Messenger Kids, an app that allows children to chat with their friends and family.
Fairplay’s executive director, Josh Golin, urged Facebook on Monday to pull the plug on the app permanently. A group of Democratic members of Congress also supported the move.
“Facebook is listening to our calls to halt plowing ahead with plans for launching a version of Instagram for children,” tweeted Massachusetts Senator Ed Markey. “But a pause is not enough. Facebook must abandon this project completely.”
Antigone Davis, Facebook’s global safety chief, is scheduled to appear before the Senate on Thursday. The hearing will examine what Facebook knows about Instagram’s effects on the mental health and well-being of its younger users.
Mosseri said Monday that the company believes children under 13 should have a platform appropriate for their age. Other companies, such as TikTok and YouTube, already offer apps for this age group.
In a blog post, he argued that it is better to have a version of Instagram where parents can monitor and control their children’s experience than to rely on the app’s ability to verify the age of kids too young to hold an account.
Mosseri said Instagram Kids is aimed at children between 10 and 12 years old. It would require parental permission, carry no ads, and contain only age-appropriate content. The app would allow parents to monitor their children’s use, supervise who can message them, and limit who they can follow.
While work on Instagram Kids is paused, the company will expand opt-in parental supervision tools for teen accounts. Mosseri said more details about these tools would be revealed in the coming months.
This isn’t the first time Facebook has been criticized for products targeting children. Child development experts urged Facebook to shut down its Messenger Kids app, saying it wasn’t responding to a “need,” as Facebook claimed, but creating one.
In that instance, Facebook went ahead with the app anyway.