this post was submitted on 28 Mar 2026
66 points (98.5% liked)

Technology


The young woman at the heart of what has been called the tech industry’s “big tobacco” moment was on YouTube at six and Instagram by nine. More than a decade later, she says, she still can’t live without the social media she became addicted to.

“I can’t, it’s too hard to be without it,” Kaley, now 20, told a jury at Los Angeles Superior Court. This week, five men and seven women handed down a verdict on the design of two of the world’s most popular apps that vindicated Kaley’s position.

The ruling sent shockwaves through Silicon Valley and sparked hope among families and child safety campaigners that change may finally be coming to social media. Mark Zuckerberg’s Meta and Google’s YouTube were found liable for deliberately designing addictive products used by Kaley and millions of other young people.

It was one case centred on the suffering of one young person who became depressed at 10 and self-harmed, but Kaley, referred to by her first name or the initials KGM in order to protect her privacy, was the figurehead for a much bigger fight.

“We wanted them to feel it,” one of the jurors explained to reporters. “We wanted them to realise this was unacceptable.”

[–] megopie@beehaw.org 34 points 1 day ago (1 children)

Don’t let this become a “protect the kids” thing. The intentionally addictive and manipulative design of these platforms has been just as harmful to people across a wide spectrum of ages. The solution is not to ban kids from using these platforms; the solution is to hold the platforms accountable for their behavior and put regulations in place to ban intentionally manipulative design. Adults are just as much victims of having their brains cooked by this shit, and it’s had larger-scale societal consequences that we need to take seriously.

[–] Mothra@mander.xyz 5 points 1 day ago (1 children)

Agreed. Unfortunately, I think this will only fuel further age and ID verification enforcement, and of course change nothing about the design of the platforms.

[–] definitemaybe@lemmy.ca -1 points 8 hours ago

The article highlights how the UK is moving to ban infinite scrolling and autoplay videos. So, thankfully, those changes are coming in at least some jurisdictions.

That said, the article also helpfully points out that the Republican administration has stuffed their science & tech advisory panel with Meta and Google execs, so I'm doubtful that the US will regulate anything reasonable.

I'd like a ban in effect for children under 16, but enforcement should be a misdemeanor on the parent. It should be a social worker coming to discuss the known harms of these platforms with the parents, letting them off with a warning, but with fines if the damaging behaviour continues, plus an automatic 1-year (or whatever) follow-up. Basically, treat it the way it's treated when parents give cigarettes to their children.