Meta has been sued over its age limit enforcement

A complaint from a lawsuit filed against Meta in October was just unsealed to the public.

Miniature figures of children on a playground are seen in front of a displayed Instagram logo in this illustration taken April 4, 2023. REUTERS/Dado Ruvic/Illustration/File Photo

The backstory: Most social media sites set age limits on who can create an account, usually 13 or 14 years old. In the US, where companies like Meta (the parent of Facebook and Instagram), Snapchat and X (formerly Twitter) are based, the federal Children's Online Privacy Protection Act (COPPA) bars companies from collecting personal data from children under 13 without parental consent, which is why most platforms set 13 as their minimum age. In China, where Douyin (China's version of TikTok) and WeChat are based, a similar law limits social media to teens 14 and older. In Europe, the minimum age usually falls somewhere between 13 and 16, depending on the country.

The idea behind these age laws is to protect the personal data and safety of younger users, who can't give the kind of informed consent that adults can when it comes to technology use. Social media can also be a dangerous place, so these laws restrict how children use it for their own protection.

More recently: Authorities all over the world have been looking to regulate how younger teens access the internet and social media, as research has shown how these platforms can harm the health and safety of vulnerable groups (like teenagers). Earlier this year, US Surgeon General Vivek Murthy said he sees 13 as too young to be on social media because kids this age are still “developing their identity.” He said that the “skewed and often distorted environment of social media” can do “a disservice” to many children as they figure out who they are. 

A couple of months ago, China's Cyberspace Administration released draft guidelines restricting minors from most internet services on mobile devices from 10 p.m. to 6 a.m. Teens between 16 and 18 years old would also be limited to two hours of internet use a day, children between 8 and 15 would get one hour, and those under 8 would be allowed just 40 minutes. The agency says the guidelines are also meant to help prevent internet addiction among young people.

Earlier this month, Meta called for more legislation that would require parents to approve app downloads for their teens and kids, putting more of the responsibility on parents and app stores run by Google and Apple.  

The development: A complaint from a lawsuit filed against Meta in October was just unsealed to the public. The document, filed by 33 US states, alleges that Meta has long been aware that children under 13 use its platforms and has "coveted and pursued" this age group on Instagram for years. According to the filing, Meta has received over 1.1 million reports of underage users on Instagram since 2019 but disabled only "a fraction" of those accounts. Allegedly, Meta "routinely continued to collect" children's personal info, like their locations and email addresses, without parental permission, which is against the law.

If Meta is found liable for these offenses, it could face hundreds of millions of dollars in penalties. Meta has responded to the claims, saying that the complaint misrepresents the situation and that there are "measures in place" to get rid of underage accounts when they're found.

Key comments:

“Within the company, Meta’s actual knowledge that millions of Instagram users are under the age of 13 is an open secret that is routinely documented, rigorously analyzed and confirmed, and zealously protected from disclosure to the public,” the complaint against Meta said.

"[...] Verifying the age of people online is a complex industry challenge," Meta's statement read. "Many people — particularly those under the age of 13 — don't have an ID, for example. That's why Meta is supporting federal legislation that requires app stores to get parents' approval whenever their teens under 16 download apps. With this approach, parents and teens won't need to provide hundreds of individual apps with sensitive information like government IDs in order to verify their age."