Then, from my perspective, there's little to no value in having a dedicated piece of hardware for it. At least I'd guess that 99.999% of the target audience for such a thing already has a smartphone with them. What - if not for the sake of privacy - is the added value of a special chatbot device?
The models an RPi Zero can run are very limited though. 512 MB of RAM is very, very little for AI models.
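A rough back-of-envelope sketch of why (the overhead and quantization numbers are assumptions for illustration, not measurements):

    # How many model parameters could fit in a Pi Zero's 512 MB?
    # Assumptions (illustrative): ~150 MB reserved for the OS and the
    # inference runtime, and 4-bit quantized weights (0.5 bytes/param).

    TOTAL_RAM_MB = 512
    OS_OVERHEAD_MB = 150        # rough guess for Linux + runtime
    BYTES_PER_PARAM = 0.5       # 4-bit quantization

    usable_bytes = (TOTAL_RAM_MB - OS_OVERHEAD_MB) * 1024 * 1024
    max_params = usable_bytes / BYTES_PER_PARAM

    print(f"~{max_params / 1e6:.0f}M parameters fit in RAM")  # roughly 750M
    # And that's before KV cache and activations, so anything much bigger
    # than a few-hundred-million-parameter model is out of reach.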
The LLM isn't local
I guess it makes it a bit easier to access + it's a fun project to DIY, not much else.