"prompt engineering"
(lemmy.world)
LLMs are in a position to make boring NPCs much better.
Once they can be run locally at a good speed, it'll be a game changer.
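Something like this is what I have in mind, sketched with the gpt4all Python bindings; the model file and the persona are made-up examples, not anything a real game actually ships:

```python
# Hypothetical LLM-backed NPC: a fixed persona as the system prompt plus a
# running chat session. The model filename and persona are illustrative only.
from gpt4all import GPT4All

PERSONA = (
    "You are Mara, a tired blacksmith in a small fantasy village. "
    "Answer the player in one or two short sentences, staying in character."
)

model = GPT4All("Meta-Llama-3-8B-Instruct.Q4_0.gguf")  # example model name

with model.chat_session(system_prompt=PERSONA):
    while True:
        player_line = input("Player> ")
        if not player_line:
            break
        # Each generate() call sees the session history so far, so the NPC
        # keeps context across the conversation.
        print("Mara>", model.generate(player_line, max_tokens=60))
```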
I reckon we'll start getting AI cards for computers soon.
We already do! And on the cheap! I have a Coral TPU running presence detection on some security cameras. I'm pretty sure they can run LLMs, but I haven't looked into it.
GPT4All runs rather well on a 2060, and I'd imagine it runs a lot better on newer hardware.
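If anyone wants to check how well it runs on their own card, a rough throughput test with the gpt4all Python bindings looks something like this (the model filename and device flag are just examples; a small quantized model from the catalogue downloads on first use):

```python
# Crude tokens-per-second check for a local model via the gpt4all bindings.
import time
from gpt4all import GPT4All

model = GPT4All("orca-mini-3b-gguf2-q4_0.gguf", device="gpu")  # drop device= to run on CPU

start = time.time()
out = model.generate("Write a limerick about prompt engineering.", max_tokens=128)
elapsed = time.time() - start

# Rough throughput estimate: generated words per second, not exact tokens.
print(out)
print(f"~{len(out.split()) / elapsed:.1f} words/sec")
```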