Instagram for Kids: Is This Rumored New App a Good Idea?
With more than a billion users worldwide, Instagram has come under fire recently for the dangers it presents to kids. Bullying, online predators, and adult content are just some of the many threats that make the platform a less-than-ideal place for children. Last week, the company announced a slate of new child safety features to help make Instagram safer for its youngest users.
These rollouts include changes in how adults and kids can communicate as well as in-app experiences aimed at making kids think harder about who they’re talking to and how. An Instagram for Kids app may also be in the works, which is pretty huge news! This unexpected development may be similar in functionality to the Facebook Messenger Kids platform — but it also may just be smoke and mirrors on the part of Instagram.
New Safety Features, Explained
Instagram’s latest safety announcement provides an in-depth explanation of all of the new features aimed at giving kids a safer experience on the platform, including:
- Improving its attempts to understand a user’s real age at the point of sign-up
- Prohibiting DMs between teens and adults they are not following
- Prompting teens to be more cautious when DMing
- Making it harder for adults to search for and follow teens
- Encouraging teens to set their accounts to private
While these can be seen as a step in the right direction and an acknowledgment of the platform’s issues, most of the improvements are aspirational at best. For example, there’s still no way to actually verify that users are over 13 when they create an account. The company’s response to this problem is simply that it’s a challenge common to all social media platforms. Several of the new safety features also use language like “encouraging” and “prompting,” which still leaves the ultimate decision about things like DMing and private accounts completely up to the child.
In addition to these new features, Instagram has also published a Parent’s Guide to the platform. This booklet explains the different privacy options for kids, provides a glossary of common terms, and suggests helpful ways to start safety conversations.
Instagram for Kids: What It May Look Like
Rumors of an Instagram for Kids app emerged last week as well, although Instagram hasn’t posted details about the project yet. Sources like The Associated Press, BuzzFeed News, and People have, however, reported that this new app may really be in the works.
The big question is, of course, what will it look like? We imagine it may follow in the footsteps of parent company Facebook’s family-friendly option, Messenger Kids. With it, parents set up and manage their child’s Messenger Kids account through their own Facebook account. This enables them to have complete oversight and transparency with regards to what their kid is doing online.
An Instagram version of this would likely involve a similar parent-managed account, with approval required for who a child follows and is followed by. And, because many of the dangers of Instagram are external, it’s likely that search, message requests, hashtags, and the Explore page may be turned off, as well. This sort of closed-world environment would provide a much safer way for kids to engage in photo sharing, commenting, replying, and messaging on the app.
Will This New App Just Be Smoke and Mirrors?
This new kids’ version of Instagram sounds like a good way to provide children of that in-between age (around 10 to 13 — the minimum age required to have a full Instagram account) with a more supervised experience. So, what’s the issue?
First, while this development appears on the surface to be a step forward in child safety, it’s tempting to ask: who benefits most from introducing users to an app at a young age? Most of the time, it’s the company. Second, a brand-new app for younger kids does nothing to help protect the kids most at risk — the ones who are already on the platform, whether of legal age or having fudged their birthdate. Right now, there’s a real tension in the world of social media between parents and the platforms. At the end of the day, who’s responsible for child safety?
Parents Need More Tools, Not More Apps
At Bark, we believe that parents are usually in the best position to help protect their kids while they’re online. Big Tech doesn’t need to step in and parent — but we also understand that we need their help. What does that look like? It could be something like parental controls that actually work, or at least ones that don’t let kids unilaterally turn them off whenever they want (sadly, an all-too-common problem for Instagram, TikTok, Snapchat, and countless other apps).
Instagram for Kids could be a good idea, but online safety isn’t a problem with a one-size-fits-all solution. What would truly benefit families is the ability for parents to actually supervise and manage their children’s social media experiences every step of the way. Kids could then grow up with custom settings and experiences to help them as they learn to use tech responsibly — with their parents providing guidance and guardrails along the way.
Bark is a comprehensive online safety solution that empowers families to monitor content, manage screen time, and filter websites to help protect their kids online. Our mission is to give parents and guardians the tools they need to raise kids in the digital age.