you are viewing a single comment's thread
this post was submitted on 05 Jul 2023
531 points (90.4% liked)
Technology
The issue I have with LLMs is that while they are great at certain tasks, their nature makes them bad at anything, let's call it, factual.
I can, for example, use one to quickly draft an email or a piece of Python code, and I can immediately see whether the response it generated is actually what I want.
But if I ask it what the hottest day in a given country was, or ask it to explain something, I have absolutely no idea whether the answer is bullshit or not, and I have to double-check it anyway.
I think the learning curve with LLMs as a tool is learning when to use them and when to rely on other sources instead.
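The code-drafting case the comment describes can be sketched. The snippet below is a hypothetical example of the kind of small Python helper an LLM might draft: its correctness is immediately checkable by running it, unlike a factual claim, which can only be checked against an external source.

```python
# Hypothetical example of LLM-drafted code whose output you can
# verify on the spot, simply by running it on a known input.

def chunk(items, size):
    """Split a list into consecutive chunks of at most `size` items."""
    return [items[i:i + size] for i in range(0, len(items), size)]

# Immediate verification: run it and inspect the result.
print(chunk([1, 2, 3, 4, 5], 2))  # [[1, 2], [3, 4], [5]]
```

A factual question ("what was the hottest day in country X?") offers no such self-contained check; the model's answer looks the same whether it is right or wrong.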