A lawsuit against Meta alleges the company suppressed internal research showing Facebook and Instagram may harm users’ mental health. This report examines how Meta allegedly ‘buried’ evidence about the harmfulness of social networks and what the claims mean for young people worldwide.

A Tech Giant Under Fire

Fresh scrutiny is falling on one of the world’s most influential technology companies as new allegations suggest Meta suppressed evidence pointing to potential mental health risks linked to its platforms. At the heart of the controversy is the claim that Meta halted internal studies after uncovering data suggesting Facebook and Instagram may heighten depression, anxiety, loneliness, and harmful social comparison among users. The unfolding case, summed up in the headline “How META ‘buried’ evidence about the harmfulness of social networks”, raises urgent questions about corporate transparency and the responsibility of social media giants.

The revelations stem from a lawsuit filed by multiple US school districts, accusing Meta and other platforms of knowingly placing young users at risk. The question of how Meta ‘buried’ evidence about the harmfulness of social networks has since become a rallying point in discussions about digital well-being.

Project Mercury: Research That Never Saw the Light

Meta and Nielsen’s Joint Study

According to documents disclosed during court proceedings and reported by Reuters, Meta commissioned a major internal study in 2020 known as Project Mercury. Conducted in partnership with the research firm Nielsen, the project examined the effects of users “deactivating” Facebook and Instagram for one week.

The results, according to internal reports, were stark: individuals who paused their social media activity experienced lower levels of depression, anxiety, loneliness, and social comparison. These findings, which directly contradicted Meta’s public assurances, pointed to a causal link between platform usage and mental health challenges.

How META ‘Buried’ Evidence About the Harmfulness of Social Networks

Internal Pushback and a Sudden Halt

The lawsuit alleges that instead of publishing the findings or commissioning deeper research, Meta swiftly halted Project Mercury. Internal messages cited an “existing media narrative” that, according to Meta’s leadership, rendered the negative findings unreliable. However, behind closed doors, several employees insisted the results were both credible and concerning.

One unnamed researcher warned that the evidence demonstrated a “causal effect of social comparison,” even adding a disgruntled emoji to underscore the frustration. Another employee reportedly compared the situation to the tobacco industry, saying the decision to hide the findings resembled how cigarette makers once “conducted research, knew cigarettes were harmful, and kept the information to itself.”

Despite internal acknowledgement of potential harm, Meta continued telling US lawmakers that it could not accurately measure the impact of its platforms on teenage girls.

Meta Responds: Flawed Science or Corporate Defence?

In a statement issued on Saturday, Meta spokesman Andy Stone rejected the allegations. He argued that the decision to halt Project Mercury was based on concerns about methodological flaws, not a deliberate attempt to hide damaging evidence.

Stone added that Meta has spent more than a decade working to improve safety and implement protections for younger users. “The overall record will show that for more than a decade we have listened to parents, studied the issues that matter most and implemented real changes to protect teenagers,” he stated.

A Debate That Is Far From Over

What the Future Holds for Social Media Accountability

The allegation that Meta ‘buried’ evidence about the harmfulness of social networks reflects a broader global debate about how much responsibility social media companies bear for the well-being of their users. As court proceedings continue, the case could shape future regulation, industry standards, and transparency measures across the technology sector.

For now, the revelations have reignited concerns among parents, educators, and policymakers — and placed Meta at the centre of a conversation it may no longer be able to control.