
Instagram Fined €405 million for misusing teens’ data and breaching EU privacy law

Ireland’s regulator says the decision, including the fine, is now final after it adopted revisions requested by a board representing all of the bloc’s privacy authorities.

After reviewing how Instagram handled teenagers’ data, Ireland’s Data Protection Commission (DPC) fined Meta €405 million (about $402 million). Meta Platforms Inc. says it plans to appeal the decision.

The inquiry, launched more than two years ago, focused on two potential GDPR violations by the company. The first concerned Instagram allowing users between the ages of 13 and 17 to create business profiles, which made their contact details publicly visible. Users sometimes switch to business accounts because doing so gives them access to more detailed engagement statistics. The second concerned reports that Instagram made some underage users’ profiles public by default.

Instagram said the investigation and the ruling centered on “old settings” that had been updated more than a year earlier. Since then, the platform has introduced new privacy features for teens, including the option to have accounts set to private at sign-up.

The penalty, finalized last Friday, is the third and the largest the DPC has levied on Meta, far surpassing the €225 million ($267 million) fine the company received after the regulator found that WhatsApp had not adequately informed EU residents about how it collected and handled personal data, notably how it shared that data with Meta.

The fine is the second-largest imposed under the European Union’s strict privacy rules, after the €746 million penalty that Luxembourg’s regulator levied on Amazon last year.

Read More: Instagram to use AI for Age Verification

The decision’s emphasis on children touches on a delicate yet often overlooked issue for social media companies: how they handle minors using their services. Legislators in California last week approved a measure requiring developers of social media apps such as Instagram and TikTok to take minors’ physical and mental health into account when designing new products. The legislation was modeled on a U.K. regulation that requires social media companies to design their products with children’s interests in mind. European lawmakers also established new rules for minors this year: under the Digital Services Act, companies are barred from using certain data to personalize advertising aimed at people under the age of 18.

Ireland is embroiled in a slew of legal fights over Meta’s data-collection practices. One concerns whether Meta can require users to hand over specific types of information in order to use the service; another concerns whether some of the fundamental components of digital ad auctions comply with EU law. Ireland is responsible for monitoring Meta’s compliance with the GDPR, the law that took effect in 2018 to restrict how businesses can gather and use people’s data, because Meta’s European headquarters are in the country; the nation has also faced allegations of being lenient in enforcing data protection rules. In a separate case, the Irish regulator has threatened to prohibit Meta from transferring European users’ data to the company’s U.S. data centers.


Preetipadma K
Preeti is an Artificial Intelligence aficionado and a geek at heart. When she is not busy reading about the latest tech stories, she will be binge-watching Netflix or F1 races!
