this post was submitted on 31 Dec 2025
174 points (98.9% liked)

Technology

41175 readers
291 users here now

A nice place to discuss rumors, happenings, innovations, and challenges in the technology sphere. We also welcome discussions on the intersections of technology and society. If it’s technological news or discussion of technology, it probably belongs here.

Remember the overriding ethos on Beehaw: Be(e) Nice. Each user you encounter here is a person, and should be treated with kindness (even if they’re wrong, or use a Linux distro you don’t like). Personal attacks will not be tolerated.

Subcommunities on Beehaw:


This community's icon was made by Aaron Schneider, under the CC-BY-NC-SA 4.0 license.

founded 4 years ago
MODERATORS
you are viewing a single comment's thread
view the rest of the comments
[–] schnurrito@discuss.tchncs.de 2 points 1 week ago (1 children)

I've recently said this in another thread, and I'll repeat it here: this problem would easily be solved by changing content liability laws (e.g. Section 230 in the US) so that anything recommended by an algorithm counts as speech by the platform, making the platform liable if that content turns out to be illegal (e.g. libellous).

That would mean you could operate a forum, wiki, Lemmy instance, or Mastodon instance without worrying about liability, but Facebook, YouTube, and TikTok would have to drop the feature where they inject "things that might interest you" (content you didn't actually choose to follow) into your feed.
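The distinction being drawn here can be sketched in code. This is a hypothetical illustration with invented post data and an invented `predicted_engagement` stand-in; real platforms use opaque ranking models, but the structural difference is the same: one feed contains only what the user subscribed to, the other injects content the platform itself selected.

```python
from datetime import datetime

# Invented example data (not from any real platform).
posts = [
    {"author": "alice", "text": "release notes", "time": datetime(2025, 12, 30)},
    {"author": "bob", "text": "cat photo", "time": datetime(2025, 12, 31)},
    {"author": "mallory", "text": "viral claim", "time": datetime(2025, 12, 29)},
]
follows = {"alice", "bob"}

# Subscription-only feed: exactly what the user chose to follow,
# newest first -- the forum/Lemmy/Mastodon model.
subscription_feed = sorted(
    (p for p in posts if p["author"] in follows),
    key=lambda p: p["time"],
    reverse=True,
)

# Recommender feed: the platform also surfaces unfollowed content it
# predicts will drive engagement -- the "things that might interest you" model.
def predicted_engagement(post):
    # Stand-in for an opaque ranking model; scoring by text length
    # is purely illustrative.
    return len(post["text"])

recommender_feed = sorted(posts, key=predicted_engagement, reverse=True)

print([p["author"] for p in subscription_feed])  # followed authors only
print([p["author"] for p in recommender_feed])   # may include mallory
```

Under the proposed rule, the first feed would carry no platform liability, because the platform made no editorial selection; the second would, because mallory's post appears only through the platform's own choice.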

None of that has anything to do with anyone's age.

[–] silentdon@beehaw.org 2 points 6 days ago (1 children)

This could work as long as "algorithm" is sufficiently well defined; otherwise someone could argue that even a plain sorting "algorithm" counts.

[–] schnurrito@discuss.tchncs.de 1 points 5 days ago

Agreed. This is a potential problem, but not an unsolvable one.