this post was submitted on 19 Jan 2026
Open Source
I would check out Open WebUI, which can be self-hosted via Docker (or similar) and configured with any OpenAI-compatible endpoint, so you can use a service like OpenRouter to run nearly any LLM remotely. Most of the open-weights models, like Qwen 3 or Kimi K2 Thinking, are great, cost pennies per query, and can be configured with Zero Data Retention (ZDR) so your data isn't recorded. If you want even more privacy and have the hardware (typically a modern Nvidia GPU with at least 16-24 GB of VRAM), you could also use something like Ollama to run LLMs locally.
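For reference, a minimal Docker setup along those lines might look something like this. This is a sketch, not a definitive config: the `OPENAI_API_BASE_URL`/`OPENAI_API_KEY` variables are the ones Open WebUI documents for OpenAI-compatible backends, the base URL is OpenRouter's, and `sk-or-...` is a placeholder for your own API key:

```shell
# Run Open WebUI in Docker, pointed at OpenRouter's OpenAI-compatible API.
# Replace sk-or-... with your own OpenRouter key.
docker run -d \
  -p 3000:8080 \
  -v open-webui:/app/backend/data \
  -e OPENAI_API_BASE_URL=https://openrouter.ai/api/v1 \
  -e OPENAI_API_KEY=sk-or-... \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main
```

Then open http://localhost:3000 in a browser and pick a model. The same UI can also be pointed at a local Ollama instance later if you go the self-hosted-model route.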