Technology: Landmark US case ‘could launch a whole wave’ of addiction litigation
Stephen Cousins
Friday 27 March 2026
A jury in a Californian court has found Meta and Google liable for intentionally building addictive social media platforms that harmed the mental health of a woman who used them as a child. The case, decided in late March, opens the door to new ways of holding tech companies to account for alleged harm to users.
The plaintiff in the trial was a 20-year-old woman identified by the initials KGM. She claimed to have been addicted to social media for over a decade and that her nonstop use of the platforms caused various mental health issues.
The jury awarded $3m in compensatory damages, with Meta – the parent company of Instagram and Facebook – found to be 70 per cent responsible and YouTube’s owner Google 30 per cent. The jury further awarded $3m in punitive damages. Two other companies, Snap and TikTok, settled confidentially before the trial began.
The verdict provides leverage to litigants in a number of lawsuits being brought on similar grounds. More than 20 similar trials are shortly set to begin in the Superior Court of Los Angeles County. ‘This is potentially the most impactful case in the US right now,’ says Philip Yannella, a partner in business litigation at Blank Rome in Philadelphia. ‘It could launch a whole wave of addiction litigation.’
Both companies will appeal. ‘Teen mental health is profoundly complex and cannot be linked to a single app,’ said Meta, adding the company remains ‘confident in our record of protecting teens online.’ Google’s spokesperson said the case ‘misunderstands’ YouTube, which the company argues is ‘a responsibly built streaming platform, not a social media site.’
The ability to pursue product liability design defect claims based on algorithms could greatly expand technology litigation
Philip Yannella
Partner, Blank Rome
Previous legal challenges to social media companies have largely been unsuccessful in the US because internet businesses are protected by Section 230 of the Communications Decency Act 1996, which broadly shields them from liability for third-party content posted online.
Lawyers behind the latest cases have adopted a different legal strategy, centred on negligence-based product liability. In the KGM case, they successfully argued that it was the design and functionality of the software products that caused addiction and harm, rather than content posted online by others.
Making software subject to product liability law is new legal territory in the US. ‘Design defect as a creature of product liability law has historically focused on physical fungible products, typically purchased by consumers over the counter,’ says Yannella. ‘The ability to pursue product liability design defect claims based on algorithms could greatly expand technology or website technology litigation.’
In the KGM case, lawyers for the plaintiff argued that design features such as ‘likes’, algorithmic recommendation engines, infinite scroll, autoplay and deliberately unpredictable rewards have been purposely engineered to maximise engagement by creating dopamine-driven feedback loops. KGM claimed this addiction fuelled her depression, anxiety, body dysmorphia and suicidal thoughts.
Defence lawyers argued, among other things, that social media use can’t be legally or clinically classified as an ‘addiction’. Those representing YouTube, meanwhile, argued that the video-sharing platform is neither social media nor addictive. Testifying in court, Meta CEO Mark Zuckerberg denied that the company seeks to make Instagram addictive to younger people and highlighted its measures to detect and remove underage users.
In another legal first, a jury in a case brought by the New Mexico Justice Department found in March that Meta had violated the state’s Unfair Practices Act by misleading the public about the safety of its platforms for young users, who were, for example, exposed to explicit material. It’s the first time a US state has successfully sued Meta over child safety. Meta has been ordered to pay a penalty of $375m.
Meta will appeal and says it works hard ‘to keep people safe on our platforms’ and is ‘clear about the challenges of identifying and removing bad actors and harmful content.’
Cases such as these may ultimately result in a significant weakening of the legal immunity enjoyed by tech platforms and force the redesign of social media products.
Discussing the California case, Marc Berkman, CEO of the Organization for Social Media Safety, says such litigation plays ‘an important role because it brings internal practices and business incentives into the public conversation. That increased transparency helps drive change by informing families, shaping consumer expectations and encouraging policymakers to pursue stronger protections for young people.’
In the EU, Article 28 of the Digital Services Act (DSA) proactively addresses the protection of minors online by requiring platforms to disable by default features that promote excessive use, such as ‘streaks’ that reward users for increased engagement, push notifications and autoplay.
Tech companies must also remove or restrict persuasive design features intended to keep users engaged, such as infinite scroll, and include child-friendly tools that increase awareness of time spent on a platform.
‘Very large online platforms’, defined as those with over 45 million monthly active users in the EU, must additionally treat the safety of minors as a systemic risk. This means proactively analysing the risks their services pose and implementing ‘targeted measures’ to protect minors, such as age verification and parental controls.
In February, the European Commission preliminarily found TikTok in breach of the DSA on account of its addictive design. ‘TikTok will have to demonstrate that they have taken measures to change their design with respect to addictive functions like scrolling, notifications and recommendations,’ says Innocenzo Genna, Senior Vice-Chair of the IBA Communications Law Committee, highlighting that if TikTok doesn’t comply, it risks being fined or banned.
A TikTok spokesperson has said the findings present a ‘categorically false and entirely meritless depiction of our platform.’ The company intends to challenge them.
Several jurisdictions worldwide are introducing laws that prevent under-16s from accessing social media platforms, potentially reducing the scope for future litigation involving young people. For example, Australia’s new online safety legislation took effect in December. It requires platforms to take reasonable steps to stop children under the age of 16 from creating, or retaining, accounts.
This doesn’t stop under-16s from using platforms that don’t require an account, such as YouTube, but according to Angela Flannery, Co-Chair of the IBA Communications Law Committee, this still makes it ‘more difficult to argue that the platforms were aware that such users were under 16 and targeted by the platforms.’
Header image: stokkete/Adobe Stock