this post was submitted on 24 Oct 2025
Comradeship // Freechat
ELIZA was written in the 1960s. It's a natural language processor that can hold reflective conversations with you. It's not incredible, but there have been sixty years of improvements on that front, and modern ones are pretty nice.
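To give a sense of how simple the underlying trick is: ELIZA-style chatbots match keywords against a script of patterns and echo the user's own words back with the pronouns flipped. This is a toy sketch in that spirit; the rules and reflection table are made up for illustration, not Weizenbaum's original script.

```python
import re

# Illustrative pronoun swaps so the reply mirrors the user.
REFLECTIONS = {"i": "you", "my": "your", "am": "are", "me": "you"}

# A tiny, made-up rule script: (pattern, response template).
RULES = [
    (re.compile(r"i feel (.*)", re.I), "Why do you feel {0}?"),
    (re.compile(r"i am (.*)", re.I), "How long have you been {0}?"),
    (re.compile(r"my (.*)", re.I), "Tell me more about your {0}."),
]

def reflect(fragment: str) -> str:
    """Swap first-person words for second-person ones."""
    return " ".join(REFLECTIONS.get(w.lower(), w) for w in fragment.split())

def respond(text: str) -> str:
    """Return the first matching scripted reply, or a generic prompt."""
    for pattern, template in RULES:
        m = pattern.search(text)
        if m:
            return template.format(reflect(m.group(1)))
    return "Please, go on."

print(respond("I feel anxious about my future"))
# → Why do you feel anxious about your future?
```

There's no understanding anywhere in there, just pattern matching, which is what made ELIZA's apparent empathy so striking at the time.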
Otherwise, LLMs are a probabilistic tool: the same input doesn't reliably produce the same output. This makes them useless at the thing tools are good at, which is producing repeatable results from consistent inputs. They generate text in an authoritative voice, but domain experts consistently find they're wrong more often than they're right, which makes them unsuitable as automation for white-collar jobs that require any degree of precision.
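The non-repeatability comes from how the next token is chosen: the model produces a score for each candidate token, and one is drawn at random from the resulting distribution (softmax with a temperature). A minimal sketch, with made-up logits standing in for a real model's output:

```python
import random
from collections import Counter
from math import exp

# Hypothetical next-token scores for some fixed prompt; the numbers
# are invented for illustration.
logits = {"France": 2.0, "the": 1.2, "a": 0.5}

def sample_next(logits, temperature=1.0, rng=random):
    """Softmax over logits, then a weighted random draw."""
    weights = {tok: exp(v / temperature) for tok, v in logits.items()}
    r = rng.random() * sum(weights.values())
    for tok, w in weights.items():
        r -= w
        if r <= 0:
            return tok
    return tok  # numerical fallback

# Identical input, repeated calls: the outputs vary.
print(Counter(sample_next(logits) for _ in range(10)))
```

Lowering the temperature (or taking the argmax outright) makes the draw deterministic, but deployed chatbots typically sample, so two runs on the same prompt can and do diverge.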
Further, LLMs have been shown to degrade thinking skills, memory, and self-confidence. There are published reports of LLMs causing latent psychosis to manifest in vulnerable people, and of LLMs encouraging suicide. They present a social harm that cannot be justified by their limited use cases.
Sociopolitically, LLMs are being pushed by some of the most evil people alive, and their motives must be questioned. You'll find oceans of press about all the fascinating or scary things LLMs can supposedly do, such as the TaskRabbit story (which was entirely fabricated). The media is complicit in the image that LLMs are more capable than they are, or that they may become more capable in the future and thus must be invested in now.