Instagram Unveils Teen Accounts Worldwide for Parental Guidance

Instagram on Tuesday unveiled a round of changes that will make the accounts of millions of teenagers private, enhance parental supervision and set messaging restrictions as the default in an effort to shield kids from harm.

Meta said users under 16 will now need a parent’s approval to loosen the restricted settings of the new accounts, dubbed “Teen Accounts,” which filter out offensive words and limit who can contact them.

“It’s addressing the same three concerns we’re hearing from parents around unwanted contact, inappropriate content and time spent,” said Naomi Gleit, Meta’s head of product, in an interview.

With all teen accounts switched to private, teens can only be messaged or tagged by people they follow. Content from accounts they don’t follow will be set to the most restrictive setting, and the app will send periodic screen-time reminders under a revamped “take a break” feature.

Instagram, which is used by more than 2 billion people globally, has been under intensifying scrutiny over its failure to adequately address a broad range of harms, including the app’s role in fueling the youth mental health crisis and the promotion of child sexualization.

States have sued Meta over Instagram’s “dopamine manipulating” features that authorities say have led to an entire generation becoming hooked on the app.

In January, Meta chief executive Mark Zuckerberg stood up during a Congressional hearing and apologized to parents whose children died of causes related to social media, including suicide following online harassment, a dramatic moment that underscored the escalating pressure he has faced over child safety concerns.

The new features announced on Tuesday follow other child safety measures Meta has recently released, including in January, when the company said content involving self-harm, eating disorders and nudity would be blocked for teen users.

Instagram changes arrive as federal bill stalls

Meta’s push comes as Congress dithers on passing the Kids Online Safety Act, or KOSA, a bill that would require social media companies to do more to prevent bullying, sexual exploitation and the spread of harmful content about eating disorders and substance abuse.

The measure, championed by child safety advocates, passed the Senate but hit a snag in the House over concerns that the regulation would infringe on young people’s free speech.

If it passes, KOSA would be the first new Congressional legislation to protect kids online since the 1990s. Meta has opposed parts of the bill.

Jason Kelley of the Electronic Frontier Foundation said the new Instagram policies appear intended to head off additional regulation at a time when bipartisan support has coalesced around holding Big Tech to account.

“This change is saying, ‘We’re already doing a lot of the things KOSA would require,’” Kelley said. “A lot of the time, a company like Meta does the legislation’s requirements on their own, so they wouldn’t be required to by law.”
