On March 26th, 2026, Meta and Google were found liable for harming a then-minor’s mental health through the design of their scrolling algorithms, and the plaintiff was awarded $6 million in damages. Lawyers for the plaintiff emphasized that “features of Instagram, such as the infinite scroll, were designed to be addictive.” In addition, they argued that “the company wanted young users because they were more likely to stick with its platforms for longer stretches of time.” That specificity contributed to the victory: the lawyers proved that Meta and Google methodically implemented features designed specifically to exploit teens. Both companies have announced plans to appeal the decision. But the story is not over yet. Somewhere in Silicon Valley, a team of lawyers is already preparing to argue that the teenager’s mental unraveling was, fundamentally, her own fault.
There’s no doubt this verdict is a landmark. It establishes a precedent that social media companies can no longer hide behind a guise of neutrality regarding their algorithms. Design is, after all, an intentional choice. The infinite scroll was deliberately built to keep teenagers hooked on the platform and to maximize the company’s profits. It is right, then, to hold social media companies liable for the harm they cause.
Seen this way, Google and Meta’s appeal is not solely a legal maneuver but a moral referendum on how large companies operate. They will most likely argue that their platforms protect speech, that all users consented to the algorithm, and that the correlation between social media use and poor mental health does not prove causation. These arguments hold some merit: it is nearly impossible to attribute an entire generation’s declining mental health to a single Instagram notification. But they do not amount to a blanket defense against negligence. Tobacco companies in the 20th century hid behind the complexity of cancer research rather than acknowledging the direct link between smoking and cancer. This cycle of deniability is not new.
Furthermore, the companies’ own records undercut their argument. Meta’s researchers warned internally that Instagram was harmful to teenage girls’ self-image and mental health. Young men, meanwhile, are desensitized to unrealistic body standards set by “fitness” influencers championing unsustainable habits. Yet despite these findings, little was done to remedy the problem. Employees inside these companies have actively raised alarms, only to be ignored in favor of profit and growth. The excuse of “we didn’t know” or “it cannot be proven” is therefore not a credible defense.
As I read through the case and tried to understand the details, I kept circling back to a simple question: what exactly are these companies pursuing? Social media giants like Instagram and Snapchat market themselves as the ideal way to connect with the world, through features such as the like button, the Stories format, and the repost function. I find this to be partially an illusion. Yes, used well, social media can foster genuine belonging and community. But a large portion of users experience the opposite: they are bombarded with content that fuels insecurity, anxiety, and comparison. This is by design, since fear and compulsion are, neurologically, more stimulating than contentment. So while these companies emphasize connection as the main purpose of their apps, retention and engagement are what the system is actually built to maximize.
So, when the question arises of whether addiction is the user’s responsibility or the company’s, I find that framing too limited. With other forms of addiction, say gambling, we do not pretend that a casino’s decor, carpet patterns, and lighting play no role in keeping people at the table. We do not ask whether an opioid addiction is the patient’s fault when pharmaceutical companies knowingly overstated the drugs’ safety. The mere presence of agency should not absolve social media companies of responsibility, particularly when there is abundant evidence that users’ addiction was systematically engineered and methodically implemented.
It is also important to understand who is most affected. Teenagers are not uniquely weak-willed; after all, I’m willing to bet everyone knows a grandparent or two addicted to YouTube Shorts. But teenagers are, by nature, more vulnerable, because the brain regions that regulate impulse control and long-term decision-making are still developing. Companies build scrolling algorithms that directly exploit that vulnerability while ignoring the stark consequences. The argument that “kids should have just logged off” is not a defensible position; it is negligence framed as a strategic defense.
Though the March 26th verdict will not immediately change the social media landscape, the precedent it sets is significant. More cases will follow, as will more research documenting the scrolling algorithm’s detrimental effects on adolescent development. The hope is that, as these cases gain public exposure and society grows more cognizant of what social media does to its youngest users, practical policy changes will follow.