this post was submitted on 10 Aug 2025
20 points (76.3% liked)

DIY Electronics and Hardware

[–] wizzor@sopuli.xyz 3 points 2 weeks ago (1 children)

The models an RPi Zero can run are very limited though. 512 MB of RAM is very, very little for AI models.

[–] Redex68@lemmy.world 9 points 2 weeks ago (1 children)

The LLM isn't local.

For the actual conversational responses, the project typically utilizes cloud-based large language models accessed via APIs.
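
Roughly, that split looks like the sketch below, assuming an OpenAI-style chat-completions endpoint (the URL, model name, and `LLM_API_KEY` environment variable are illustrative placeholders, not the project's actual code). The Pi itself only handles audio in/out and a thin HTTP call; the text generation runs server-side.

```python
import os
import requests

API_URL = "https://api.openai.com/v1/chat/completions"  # or any OpenAI-compatible provider
API_KEY = os.environ["LLM_API_KEY"]                      # hypothetical env var holding the key

def ask_llm(user_text: str) -> str:
    """Send one user utterance to the cloud model and return the reply text."""
    resp = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={
            "model": "gpt-4o-mini",  # placeholder model name
            "messages": [{"role": "user", "content": user_text}],
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]

if __name__ == "__main__":
    # On the device this would be fed by speech-to-text and read back via TTS;
    # the Pi Zero itself never runs the model.
    print(ask_llm("Tell me a one-line fun fact."))
```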

[–] rbn@sopuli.xyz 16 points 2 weeks ago (1 children)

Then, from my perspective, there's little to no value in having a dedicated piece of hardware for it. I'd guess that 99.999% of the target audience for such a thing already has a smartphone with them. What, if not for the sake of privacy, is the added value of a special chatbot device?

[–] Redex68@lemmy.world 4 points 2 weeks ago

I guess it makes it a bit easier to access, plus it's a fun project to DIY; not much else.