ChatGPT is bullshit (link.springer.com)

Here's a good & readable summary paper to pin your critiques on

[-] Frank@hexbear.net 13 points 3 months ago

Hard agree. A hallucination results from a damaged mind perceiving things that aren't there. LLMs have no mind and no perception, and a thing has to work before you can call it damaged. LLMs are exploring brave new frontiers in garbage in, garbage out.

this post was submitted on 24 Jun 2024
31 points (100.0% liked)

technology
