Instagram Brings Stricter Safety Features For Teenage Users

Date: September 18, 2024

After mounting congressional pressure from multiple fronts, Meta is revamping safety features for teenage users, a step critics argue should have come much earlier.

Last year, a coalition of state attorneys general sued Meta for allegedly relying on addictive features to hook kids and boost profits. This year, in January, Mark Zuckerberg issued a written apology to the families of victims of online abuse, sextortion, and other negative exposures during a tense hearing on Capitol Hill. The congressional pressure kept intensifying as more organizations joined hands against Meta’s alleged manipulative tactics on underage users. 

As a result, Meta has finally announced a revamp of its safety features and policies for teenage users. The revamp includes strict measures to safeguard under-18 users from nonconsensual explicit content, unwanted interactions, and online predators.

Instagram announced that it will, by default, place users under the age of 18 into a ‘Teen Accounts’ category. Accounts in this category will block unknown users from viewing or interacting with their content. It is unclear whether these accounts will remain discoverable to unknown users.

Instagram said it will also mute all notifications for teen users between 10 PM and 7 AM, and will send time-limit reminders urging underage users to close the app after 60 minutes of continuous use.

Meta also plans stricter parental controls to help parents monitor, review, and adjust settings to safeguard their children. Parents will be able to see which accounts their teen has recently interacted with, including messages, follows, and follow requests. They can also set custom time limits and block access entirely during certain periods. Users under 16 will require parental permission to change their account or safety settings.

Meta’s announcement fell flat with the regulators and organizations that have sued the company, who view the move as an attempt to stave off litigation. Multiple online safety groups have responded to Instagram’s changes, calling them another empty promise that will soon fade away.

“Meta can push out as many ‘kid-’ or ‘teen’-focused ‘features’ as it wants, but it won’t change the fact that its core business model is predicated on profiting off and encouraging children and teenagers to become addicted to its products – and American parents are wise to the hustle and demanding legislative action,” said Haworth in a statement to a news outlet.

The bipartisan Kids Online Safety Act and COPPA 2.0 are two landmark bills that would impose a legal duty of care on social media platforms like Meta, TikTok, and Snapchat. If passed, these bills would push many platforms to take strict, immediate, and adequate measures to ensure children’s online safety.

By Arpit Dubey
