
4 Shocking Facts About the Facebook Mental Health Lawsuit


As we scroll through social media every day, it is essential to be mindful of its impact on mental health.

While these platforms offer countless benefits, they also come with their own set of challenges. Social media giant Facebook, now Meta, is facing a first-of-its-kind lawsuit alleging that it knowingly created an unsafe environment that has caused significant mental health issues for teenagers. 

Filed in December 2021 by the attorneys general of eight states and Washington, D.C., the lawsuit aims to force Facebook to implement safeguards to protect children online. As adults, we must pay attention to this controversial lawsuit and support efforts to prioritize kids’ well-being over tech companies’ profits. 

Here are four shocking details about the claims against Facebook that everyone should know:

1. Facebook Knew About the Harms

One of the most egregious allegations in the lawsuit is that Facebook has long been aware of the significant harm its platforms cause teenagers, but failed to protect them.

Leaked internal documents revealed how Facebook’s and Instagram’s content affected teens and young adults. The company’s own research linked these platforms to depression, anxiety, harmful social comparison, and body image issues. It also revealed alarming statistics about the negative mental health effects of Instagram usage on teens, particularly teenage girls.

The research suggested that Instagram worsened body image issues for over 30% of girls. Despite having access to years of data signaling real dangers, Facebook did not use or share these insights to protect its young users. Instead, Facebook opted to prioritize engagement-boosting algorithms over implementing safeguards.

Moreover, reports suggest that Facebook ignored the fact that its algorithms promoted harmful content like hate speech, COVID-19 misinformation, and conspiracy theories. Rather than intervene, Meta chose to maintain high user engagement and its multi-billion-dollar profits.

This blatant disregard for expert advice and willful neglect of minors’ safety forms the crux of the lawsuit.

2. Facebook Targeted Vulnerable Users

The lawsuit alleges that Facebook designed its algorithms to target its most vulnerable users to boost engagement and profits.

According to documents leaked by Frances Haugen, a whistleblower and former Facebook product manager, certain users are more likely to see disturbing content in their feeds than others. Despite being aware of the mental health risks for minors, Facebook pushed addictive features like infinite scrolling and notifications to keep them hooked. 

The platform also used data to identify and recommend problematic content to susceptible users, such as those with low digital literacy, because they would be more likely to interact with it. A Facebook team found that, every day, 11% of users see disturbing content, 39% see somewhat hateful content, and 32% see inappropriate content such as nudity. By capitalizing on users’ emotional vulnerabilities, the platform aimed to increase engagement and ad revenue.

This intentional targeting of vulnerable individuals raises ethical concerns, as it points towards a calculated approach to user manipulation.

3. Facebook Prioritized Profits Over People

If you’ve read this far, it shouldn’t come as a surprise that when forced to choose between protecting vulnerable users and safeguarding its profits, Facebook chose money over people.

As TorHoerman Law puts it, the “Facebook Files” are a compilation of Wall Street Journal reports based on leaked internal Facebook documents that detail the company’s knowledge and inaction. These documents reveal that despite repeated warnings from experts about algorithmic harms, hate speech, misinformation, and illegal drug sales on its platforms, Facebook refused to make meaningful changes because doing so would decrease engagement and ad revenue.

The company had the data, tools, and capability to make its sites safer, especially for susceptible teens. Yet, it prioritized high profits over the health and well-being of generations of young users. 

Another alarming example is Meta’s collection of confidential patient information via its tracking tool, Meta Pixel. Pixel is installed on websites to collect analytics for better ad performance. According to The Markup, 33 of the top 100 hospitals in the US use the tool, and investigations found that Pixel was collecting and sharing information such as patients’ health conditions, doctor appointments, and medication allergies. 

Mind you, under the HIPAA Privacy Rule, healthcare providers cannot share identifiable health data without the patient’s consent. Yet Facebook was found serving targeted ads based on patients’ health conditions.

This profit-first mentality led to tremendous real-world harm. The lawsuit aims to finally force Facebook to put people before profits, especially children.

4. Holding Big Tech Accountable

This landmark lawsuit against Facebook represents a crucial moment in the effort to hold Big Tech accountable.

For too long, tech giants like Facebook have operated without sufficient oversight, regulation, or transparency. This enabled companies to pursue profit, scale, and growth at all costs—even if it meant harming vulnerable users. But that era of unfettered Big Tech is coming to an end.

With this lawsuit, Facebook finally faces real consequences for its alleged misconduct and mistreatment of young people. In December 2020, attorneys general from 48 states and territories sued Facebook on antitrust grounds, citing anticompetitive conduct in its acquisitions of WhatsApp and Instagram. In October 2023, 33 states, including California and New York, sued Meta, claiming it deliberately designed its products to be addictive for kids and teens.

Moreover, Selena Scola, a former Facebook moderator, sued the company, claiming she developed PTSD nine months into her job. She was soon joined by fellow moderators across four states. The moderators, who were required to review graphic content daily, allege that Facebook failed to provide them with a safe working environment. 

In response, Facebook set aside $52 million as compensation for current and former moderators.

The reforms being sought aim to restrain Facebook’s worst harms. More broadly, this suit signals that Big Tech can no longer exploit people for profit. Accountability is coming. If successful, this case could spur a wave of increased oversight and protection for the public good.

As the alarming allegations against Facebook work their way through the courts, we must stay informed and push for changes that meaningfully protect vulnerable users like our children.

This lawsuit is just the beginning. We, the users, can also play a role by being mindful of our social media usage and advocating for stronger ethical standards within the tech industry. While social media is here to stay, platforms like Facebook, TikTok, and Snapchat must be held accountable for operating responsibly and ethically, especially where our kids are concerned. Their well-being and safety have to come before tech companies’ bottom lines.

We all have a part to play in driving this vital conversation forward and advocating for reforms that restrain Big Tech’s worst harms. The outcome of this lawsuit could pave the way for a much-needed transformation in the tech industry. 

The future mental health of generations depends on it!