this post was submitted on 05 May 2026
19 points (95.2% liked)

LocalLLaMA

4706 readers
49 users here now

Welcome to LocalLLaMA! Here we discuss running and developing machine learning models at home. Let's explore cutting-edge open-source neural network technology together.

Get support from the community! Ask questions, share prompts, discuss benchmarks, get hyped at the latest and greatest model releases! Enjoy talking about our awesome hobby.

As ambassadors of the self-hosting machine learning community, we strive to support each other and share our enthusiasm in a positive constructive way.

Rules:

Rule 1 - No harassment or personal character attacks on community members, i.e. no name-calling, no generalizing about entire groups of people who make up our community, no baseless personal insults.

Rule 2 - No comparing artificial intelligence/machine learning models to cryptocurrency, i.e. no comparing the usefulness of models to that of NFTs, no claiming that the resource usage required to train a model is anything close to that of maintaining a blockchain/mining crypto, no implying it's just a fad/bubble that will leave people with nothing of value when it bursts.

Rule 3 - No comparing artificial intelligence/machine learning to simple text prediction algorithms, i.e. statements such as "LLMs are basically just simple text prediction like what your phone keyboard autocorrect uses, and they're still using the same algorithms as <over 10 years ago>".

Rule 4 - No implying that models are devoid of purpose or potential for enriching people's lives.

founded 2 years ago
I live in Korea but still don't speak the language. I get a lot of SMS in Korean; 95% are spam, but the other 5% are important. I already failed to pay my phone bill twice, for months, because the credit card I had on file was no longer valid and I didn't realize it. They kept sending me SMS about it, which I ignored because most SMS are spam, and copying and pasting every one into Google Translate is tedious work that I just don't do.

So my idea was to take an open-source SMS app like Fossify Messages and add automatic translation to it. And because SMS is used for security-relevant things like two-factor authentication, I really need the translation to run locally and not on a third-party server.

On my PC I have a really good model, Aya 8B, which fits well into the 12 GB of VRAM on my RTX 3060, and the Korean -> English results are outstanding!
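For context, a minimal sketch of what such a local translation call can look like, assuming the model is served through Ollama's default HTTP API (the post doesn't say which runtime is used, so the endpoint and the `aya:8b` model tag here are assumptions):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default endpoint

def build_request(text: str, model: str = "aya:8b") -> dict:
    """Build the JSON payload for a single non-streaming translation call."""
    return {
        "model": model,
        "prompt": (
            "Translate the following Korean SMS to English. "
            "Keep all numbers, URLs and codes unchanged.\n\n" + text
        ),
        "stream": False,
    }

def translate(text: str) -> str:
    """Send one SMS to the locally running model and return the translation."""
    payload = json.dumps(build_request(text)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

The prompt wording is just one option; instruction-tuned models like Aya generally follow a plain "translate X to Y" instruction without extra scaffolding.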

But when I put it on the phone (a Samsung S24 Ultra), it fills up the RAM and gets killed quite quickly. I tried configuring it so it's allowed to use more RAM for a longer period of time, etc., but even then it's extremely slow and translates about 3 SMS per hour, and I have about 5000 in the database (I only translate the Korean ones).

I tried some other models like Gemma 3 and NLLB-200, which just output garbage; the latter in particular dropped numbers, URLs, and codes, which are important in SMS translations.
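A cheap guard against that failure mode is to check, after translating, that every number and URL from the source SMS still appears verbatim in the output, and fall back to the untranslated text otherwise. A sketch (the regex is a rough assumption about what counts as a critical token):

```python
import re

# Tokens that must survive translation verbatim: URLs and digit runs
# (verification codes, amounts, phone numbers).
TOKEN_RE = re.compile(r"https?://\S+|\d[\d,.-]*\d|\d")

def critical_tokens(text: str) -> list[str]:
    """Extract the URLs and numbers that must not be dropped or altered."""
    return TOKEN_RE.findall(text)

def translation_is_safe(source: str, translated: str) -> bool:
    """True if every number/URL from the source survived into the translation."""
    return all(tok in translated for tok in critical_tokens(source))
```

A translation that fails this check could be shown with the original Korean alongside it, so at least the code or URL is never lost.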

Anyway, does anyone have tips on what I could do?

top 10 comments
[–] TheLeadenSea@sh.itjust.works 6 points 1 day ago (2 children)
[–] jeena@piefed.jeena.net 2 points 1 day ago (1 children)

Ah cool, perhaps I can figure out what model they're using, since it's open source. Thanks for sharing!

[–] jeena@piefed.jeena.net 1 points 1 day ago

God damn, this app is amazing! I've never seen such translation speed; the SMS text I paste in is translated in less than half a second! And the translation quality is extremely good!

On top of all that, it even exposes an Android service, so other apps can send in a string and it translates it and sends it back for them to use asynchronously. Holy shit!

With this speed I can remove all the background translation code, the storage, the database, etc., and just translate each SMS when it becomes visible!

I will try to integrate it. Perhaps it's so little code that it'd be worth upstreaming to Fossify if it works well.

[–] Tundra@sh.itjust.works 1 points 1 day ago

looks promising

[–] Kissaki@programming.dev 4 points 1 day ago

Firefox includes/supports translation models, IIRC on mobile as well. They're pretty small and probably good.

Another place I would check is F-Droid, since it serves OSS apps that can be inspected. I thought I had a translator app installed from there, but it doesn't seem to use models. I probably mixed it up with a different app that uses models for voice-to-text.

[–] hendrik@palaver.p3x.de 3 points 1 day ago* (last edited 1 day ago)

I don't think a multi billion parameter LLM counts as proper machine translation... That'd be something like Argos Translate or Mozilla's Bergamot Project.

[–] librekitty@lemmy.today 3 points 2 days ago (1 children)

Gemma4 is designed to fit on phones; you could try that.

[–] farsinuce@feddit.dk 2 points 1 day ago* (last edited 1 day ago)
[–] hexagonwin@lemmy.today 1 points 1 day ago (1 children)

I use this app, which auto-exports SMS/contacts/call logs to JSON on my local storage every day. Maybe write a script to auto-transfer that exported JSON to your PC, diff it for new messages, translate them using the local LLM, and send the result back to your phone?
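A minimal sketch of that diff step on the PC side; the file name and the `id`/`body` field names are assumptions about the export format, and the actual translation call and transfer back to the phone are left out:

```python
import json
from pathlib import Path

SEEN_FILE = Path("seen_ids.json")  # IDs already processed on earlier runs

def is_korean(text: str) -> bool:
    """Rough check: does the text contain any Hangul syllables?"""
    return any("\uac00" <= ch <= "\ud7a3" for ch in text)

def new_korean_messages(export_path: str) -> list[dict]:
    """Diff the daily JSON export against already-seen message IDs and
    return only the unseen Korean messages, ready for translation."""
    messages = json.loads(Path(export_path).read_text(encoding="utf-8"))
    seen = set(json.loads(SEEN_FILE.read_text())) if SEEN_FILE.exists() else set()
    fresh = [m for m in messages
             if m["id"] not in seen and is_korean(m["body"])]
    seen.update(m["id"] for m in fresh)
    SEEN_FILE.write_text(json.dumps(sorted(seen)))
    return fresh
```

Tracking seen IDs in a small state file keeps the script idempotent, so running it on the same export twice doesn't re-translate anything.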

[–] jeena@piefed.jeena.net 1 points 1 day ago

Yeah, it could also just send the translation to a Matrix channel, but then there are more moving parts and a day's delay. It's still much better than what I have now, though. So if I can't solve it on-device, this is a good fallback.