this post was submitted on 20 Mar 2026
109 points (99.1% liked)
Fuck AI
Except, imo, AI searching is literally a regression compared to other search methods.
I work as a field operations supervisor for an ISP, and we use a GPS system to keep track of our fleet. They've been cramming AI into it, and I decided to give it a shot.
I had a report of a van running a stop sign. The report only had a license plate, so I asked the AI which of the vehicles in my fleet had that plate. It thought about it and returned a vehicle. So I followed the link to that vehicle's status page, and the license plate didn't match. It wasn't even close.
It's only recently that searching has turned into such a fuzzy concept, and somehow AI showed up and made everything worse.
So you can trust AI if you want. I'll keep doing things manually and getting them right the first time.
That sounds like a tooling problem.
Either your tooling was outright broken, or it wasn't there at all.
It should be a very trivial task to provide an agent with an MCP tool it can invoke to search for stuff like that.
Searching for a known specific value is trivial; at that point you're back to basic SQL database operations.
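To make the point concrete, here's a minimal sketch of the kind of lookup such a tool would wrap: an exact-match query against a fleet database, so the agent reports real data (or an honest "no match") instead of guessing. The table and column names (`fleet`, `vehicle_id`, `plate`) are hypothetical, not from any real ISP system, and a real deployment would register this function as an MCP tool rather than call it directly.

```python
import sqlite3

def find_vehicle_by_plate(conn: sqlite3.Connection, plate: str):
    """Return the fleet row matching a license plate exactly, or None."""
    row = conn.execute(
        "SELECT vehicle_id, plate FROM fleet WHERE plate = ?",
        (plate.strip().upper(),),  # normalize input before the exact match
    ).fetchone()
    return row  # None means "no such plate" -- the tool reports that, not a guess

# Demo with an in-memory database standing in for the fleet-tracking backend
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE fleet (vehicle_id TEXT, plate TEXT)")
conn.executemany(
    "INSERT INTO fleet VALUES (?, ?)",
    [("VAN-12", "ABC1234"), ("VAN-07", "XYZ9876")],
)

print(find_vehicle_by_plate(conn, "abc1234"))  # -> ('VAN-12', 'ABC1234')
print(find_vehicle_by_plate(conn, "QQQ0000"))  # -> None
```

The key design choice is that an empty result is returned as-is: the deterministic query decides what matched, and the LLM's only job is to relay it.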
These types of issues arise when either:
A: the tool itself gave the LLM bad info, so that's not the LLM's fault. It accurately reported on the wrong data it got handed.
B: the LLM wasn't given a tool at all, and the prompt left room for hallucinating. You asked it "who has this license plate" instead of "use your search tool to look up who has this license plate"; the latter would make it report that it has no tool to search with, while the former heavily encourages it to hallucinate an answer.