Australian law firms are investigating the scope for future legal cases after a landmark US court ruling that found Meta and YouTube liable for deliberately designing addictive products.
A jury in Los Angeles ruled against the two tech giants on Wednesday, finding that both had been negligent and had failed to provide adequate warnings about the potential dangers of their products.
The jury awarded the plaintiff, known as KGM, US$6m in damages, to be split 70-30 between Meta and Google.
KGM had testified she became addicted to YouTube at age six and Instagram at nine. She told the court that by age 10 she had become depressed and was engaging in self-harm as a result. At 13 she was diagnosed with body dysmorphic disorder and social phobia, which she attributed to use of Instagram and YouTube.
The decision came just one day after Meta was ordered to pay US$375m in civil penalties in a separate lawsuit in New Mexico after a jury found Meta misled consumers about the safety of its platforms and enabled harm.
While the tech giants said they disagreed with the ruling and planned to appeal, the latest decision could reverberate around the globe.
Australian law firm Shine Lawyers “is working through inquiries and investigating how an Australian claim could be run”, the firm’s chief legal officer, Lisa Flynn, told Guardian Australia.
“This $6m verdict is a watershed moment. It signals that courts are increasingly willing to hold tech giants accountable for the real-world harm their products can cause,” she said.
“The [social] media ban has had very little impact on adolescent use of these technology platforms. Meta has targeted generations of people who may never be able to detach from the grip of social media.”
Andy Wei, the principal lawyer for class actions at Slater and Gordon, said the verdict was “a significant moment in global scrutiny of social media platforms and their impact on young people”.
“We’re closely watching developments overseas and continually assessing whether Australian law provides avenues for accountability where harm has occurred,” Wei said.
“This is an evolving area, and we expect the conversation around social media harm, regulation and accountability in Australia to accelerate in the wake of this decision in the US overnight.”
The law firm Maurice Blackburn did not say whether it was considering a case but, as an example of taking on big tech, pointed to its pursuit of Apple and Google in class actions over app store access that have run parallel to the Epic Games case.
“This [Meta] case shows that the law can and should be used to create accountability and guardrails for harmful effects of big tech,” Rebecca Gilsenan, the national head of class actions at Maurice Blackburn, said.
Meta declined to comment beyond its statement on the US ruling. Google was approached for comment.
Associate professor Stan Karanasios, a researcher in information systems at the University of Queensland, said the verdict showed there had been a fundamental shift in how responsibility for social media harm was assigned.
“For too long, the burden has fallen on individuals, parents, and families to resist platforms that, as the evidence in this trial showed, were consciously designed to be addictive,” he said. “Features like infinite scroll, autoplay, and constant notifications were not incidental design choices; they were the architecture of addiction.”
A day before the ruling, the Albanese government extended the definition of social media platforms that must comply with Australia’s under-16s social media ban to include those that have systems “designed to be addictive and provide constant dopamine hits”, including those with infinite scroll and likes or upvotes features, and time-limited features that are “designed to create urgency so young people check apps constantly”.
“Targeted algorithms, doomscrolling, persistent notifications and toxic popularity metres are stealing their attention for hours every day,” the communications minister, Anika Wells, said on Wednesday.
The federal government has committed to legislating a digital duty of care that would require platforms to take reasonable steps to prevent harm taking place on their services. An initial survey on the proposal ended in early December, but the government has yet to announce next steps.
The Greens communications spokesperson, Sarah Hanson-Young, said a digital duty of care “would force big tech giants to prevent harm before it happens – not just apologise after the damage is done”.
“The Greens will introduce new laws that will help keep everyone safe online because the government has refused to act,” she said.
“Social media apps shouldn’t be able to rig their algorithms to force dangerous content on users for the sake of making mega profits.”