I use this app personally
LocalLLaMA
Welcome to LocalLLaMA! Here we discuss running and developing machine learning models at home. Let's explore cutting-edge, open-source neural network technology together.
Get support from the community! Ask questions, share prompts, discuss benchmarks, and get hyped about the latest and greatest model releases! Enjoy talking about our awesome hobby.
As ambassadors of the self-hosting machine learning community, we strive to support each other and share our enthusiasm in a positive, constructive way.
Rules:
Rule 1 - No harassment or personal character attacks of community members, i.e. no name-calling, no generalizing about entire groups of people who make up our community, no baseless personal insults.
Rule 2 - No comparing artificial intelligence/machine learning models to cryptocurrency, i.e. no comparing the usefulness of models to that of NFTs, no claiming the resource usage required to train a model is anything close to that of maintaining a blockchain/mining crypto, no implying it's just a fad/bubble that will leave people with nothing of value when it bursts.
Rule 3 - No comparing artificial intelligence/machine learning to simple text prediction algorithms, i.e. no statements such as "LLMs are basically just simple text prediction like what your phone keyboard autocorrect uses, and they're still using the same algorithms since <over 10 years ago>."
Rule 4 - No implying that models are devoid of purpose or potential for enriching people's lives.
Ah cool, perhaps I can figure out what model they're using since it's open source. Thanks for sharing!
God damn, this app is amazing! I've never seen such translation speed: the SMS text I paste in is translated in less than half a second, and the translation quality is extremely good!
On top of all those good things, it even exposes an Android service so other apps can send in a string and it translates it for them and sends it back asynchronously. Holy shit!
With this speed I can remove all the background translation code, the storage and database, etc., and just translate each SMS when it becomes visible!
I will try to integrate it. Perhaps it is so little code that it'd be worth upstreaming to Fossify if it works well.
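The asynchronous request/response pattern that service exposes can be sketched language-agnostically; on Android it would be a bound service (e.g. via a Messenger or AIDL), but the core idea is the same. A minimal sketch, where the `translate` stub and all names are hypothetical stand-ins for the app's actual engine:

```python
import queue
import threading

# Hypothetical stand-in for the app's translation engine; on Android this
# would live inside the exposed service, not in the client's process.
def translate(text: str) -> str:
    return f"[translated] {text}"

# "Service" side: consume (request, reply_queue) pairs and answer each one
# asynchronously -- send in a string, get the translation sent back.
def translation_service(requests: "queue.Queue") -> None:
    while True:
        text, reply_to = requests.get()
        if text is None:  # shutdown sentinel
            break
        reply_to.put(translate(text))

requests: "queue.Queue" = queue.Queue()
worker = threading.Thread(target=translation_service, args=(requests,))
worker.start()

# "Client" side: hand over a string, then pick up the reply when it arrives.
reply: "queue.Queue" = queue.Queue()
requests.put(("Hej världen", reply))
print(reply.get())  # the translated string comes back asynchronously
requests.put((None, None))  # stop the worker
worker.join()
```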
looks promising
Firefox includes/supports translation models, IIRC on mobile as well. They're pretty small and probably good.
Another place I would check is F-Droid, which serves OSS apps, so they can be inspected. I thought I had a translator app installed from there, but it doesn't seem to use models. I probably mixed it up with a different app that uses models for voice-to-text.
I don't think a multi-billion-parameter LLM counts as proper machine translation... That'd be something like Argos Translate or Mozilla's Bergamot Project.
Gemma 4 is designed to fit on phones; you could try that.
They have new MTB versions out for Gemma 4. Seen on their blog.
Edge models:
I use this app, which auto-exports SMS/contacts/call logs to JSON on my local storage every day. Maybe write a script to auto-transfer that exported JSON to your PC, diff it for new messages, translate them using the local LLM, and send them back to your phone?
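The diff step could look something like this. A minimal sketch, assuming each daily export is a JSON list of message objects with some stable unique field (called `id` here hypothetically; adjust to the real export schema), and leaving the LLM call as a placeholder for whatever local server you run:

```python
import json
from pathlib import Path

def load_export(path: Path) -> list[dict]:
    """Load one daily JSON export, assumed to be a list of message objects."""
    return json.loads(path.read_text())

def new_messages(old: list[dict], new: list[dict], key: str = "id") -> list[dict]:
    """Return messages present in the newer export but not the older one,
    keyed on a hypothetical unique field."""
    seen = {m[key] for m in old}
    return [m for m in new if m[key] not in seen]

def translate_with_local_llm(text: str) -> str:
    """Placeholder: wire this to the local LLM, e.g. an HTTP request to a
    llama.cpp or Ollama server running on the PC."""
    raise NotImplementedError

# Inline sample data standing in for two days of exports:
yesterday = [{"id": 1, "body": "hej"}]
today = [{"id": 1, "body": "hej"}, {"id": 2, "body": "tack"}]
print(new_messages(yesterday, today))  # only the message with id 2
```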
Yeah, it could also just send the translation to a Matrix channel, but then there are more moving parts and a one-day delay. It's still much better than what I have now, though, so if I can't solve it on-device this is a good fallback.