this post was submitted on 07 Dec 2025
91 points (98.9% liked)
technology
I could see a hypothetical machine-translation suite integrated directly into the reader being a useful tool, especially if it let the user interrogate, correct, and revise the output in a way an LLM could actually almost sort of do well enough for a casual context. I mean, it would still be frustrating and error-prone, but for a book without extant translations it could potentially be better than trying to bulk-process it with a separate translation tool.
Although that's not what they added. If I'm reading this right, what they added is the ability to make API calls to LM Studio, a framework (I believe open source too) for running text models locally with (also open-source) model weights. The current integration amounts to being able to "discuss selected books" with that local chatbot or ask it for recommendations, though I have no idea how any of that is supposed to work in practice. Still, since it adds backend compatibility with local models, the machine-translation angle I mentioned is at least a feasible addition a plugin could provide.
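To be concrete about what "API calls to LM Studio" means in practice: LM Studio serves an OpenAI-compatible HTTP API on localhost (port 1234 by default), so any client, including a hypothetical translation plugin, would just be POSTing JSON chat requests to it. Here's a rough sketch; the prompt, the `"local-model"` name, and the `send()` helper are made-up illustrations, not anything the reader app actually ships:

```python
import json
import urllib.request

# LM Studio's local server speaks the OpenAI chat-completions protocol.
# Port 1234 is its default; nothing leaves the machine.
LM_STUDIO_URL = "http://localhost:1234/v1/chat/completions"

def build_translation_request(passage: str, target_lang: str = "English") -> dict:
    """Build an OpenAI-style chat payload asking a local model to translate."""
    return {
        # LM Studio routes this to whatever model is currently loaded,
        # so the name here is mostly a placeholder.
        "model": "local-model",
        "messages": [
            {"role": "system",
             "content": f"Translate the user's text into {target_lang}. "
                        "Preserve formatting and flag anything you are unsure about."},
            {"role": "user", "content": passage},
        ],
        "temperature": 0.2,  # low temperature: translation, not creative writing
    }

def send(payload: dict) -> str:
    """POST the payload to the local server and return the model's reply text."""
    req = urllib.request.Request(
        LM_STUDIO_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

The point being: "AI integration" here is just a generic HTTP client for a server you run yourself, which is also why swapping in a translation workflow instead of a chatbot would be a plugin-sized change rather than a rearchitecting.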
The whole thing's silly and has extremely limited actual use cases, but anyone getting up in arms over it allowing compatibility with other, entirely locally run open-source programs is being even sillier. It's not like they're replacing extant functionality with ChatGPT API calls or some nonsense; they're just letting hobbyists who go to the trouble of setting up this entire other suite of unrelated shit, and manage to get it working, then do something sort of silly with it.