Ask Lemmy
A Fediverse community for open-ended, thought provoking questions
Rules:
1) Be nice and have fun
Doxxing, trolling, sealioning, racism, and toxicity are not welcome in AskLemmy. Remember what your mother said: if you can't say something nice, don't say anything at all. In addition, the site-wide Lemmy.world terms of service also apply here. Please familiarize yourself with them.
2) All posts must end with a '?'
This is sort of like Jeopardy. Please phrase all post titles in the form of a proper question ending with ?
3) No spam
Please do not flood the community with nonsense. Actual suspected spammers will be banned on sight. No astroturfing.
4) NSFW is okay, within reason
Just remember to tag posts with either a content warning or a [NSFW] tag. Overtly sexual posts are not allowed, please direct them to either !asklemmyafterdark@lemmy.world or !asklemmynsfw@lemmynsfw.com.
NSFW comments should be restricted to posts tagged [NSFW].
5) This is not a support community.
It is not a place for 'how do I?' type questions.
If you have any questions regarding the site itself or would like to report a community, please direct them to Lemmy.world Support or email info@lemmy.world. For other questions check our partnered communities list, or use the search function.
6) No US Politics.
Please don't post about current US politics. If you need to do this, try !politicaldiscussion@lemmy.world or !askusa@discuss.online.
Reminder: The terms of service apply here too.
Chatbots can't think or tell whether anything is correct. They can generate fake data, and no amount of "tuning" will fix that. For the love of the planet and your sanity, please never touch one again.
I find DuckDuckGo works better than Google now. I also find that perusing Wikipedia will often give me the information I need or point me in a better direction.
The chatbot isn't the issue here. It's the user treating it like a reliable source of information.
It's a large language model - not a large knowledge model. It gets plenty of stuff right, but that's not because it actually "knows" anything - it's just trained on a massive pile of correct information.
People trash it for the times it gets things wrong, but it should be the other way around. It's honestly amazing how much it gets right when you consider that's not even what it's built to do.
It's like cruise control that turns out to be a surprisingly decent driver too.
I'm fully aware of the technology behind it. I trash it because it's resource-sucking, planet-burning trash that serves no real purpose.
The technology is inherently flawed and fills no niche because you can never trust anything from it. Even if it's right 9 times out of 10, that tenth time can, and has, killed people.
It's a chatbot. You talk to it, and it responds in natural language. That's exactly what it's designed to do - and it does it exceptionally well, far better than any system we've had before.
Faulting it for being untrustworthy just shows most people don't actually understand this tech, even though they claim they do. Like I said before: it's a large language model - not a large knowledge model.
Expecting factual answers from it is like expecting cruise control to steer your car. When you end up in the ditch, it's not because cruise control is some inherently flawed technology with no purpose. It's because you misused the system for something it was never designed to do.
If you just Google something like “health effects of hibiscus,” you’ll find a mixture of information too. Most people probably can’t tell which claims are well researched and which ones aren’t.
You’ll be left with a mixed bag, but reading all that takes more time than it takes to read an equally flawed summary you get from a gas powered AI. From a convenience perspective, I can understand why some people might prefer an LLM. From a reliability perspective, I can’t favour either option. Regardless, the difference in environmental impact should be clear to everyone.
We should enact mandatory media literacy classes alongside killing LLMs. I know Finland has them, and from what I can tell it's been a resounding success.