this post was submitted on 08 May 2026
3 points (80.0% liked)

Ollama - Local LLMs for everyone!

350 readers

A place to discuss Ollama, from basic use to extensions and addons, integrations, and using it in custom code to create agents.

founded 10 months ago

How to provide local files for Ollama models to use on replies?

For individual files, from what I could find, #Ollama apparently supports this natively, but I couldn't understand the instructions. Is there a tutorial I may have missed that explains it in layman's terms?

I also remember someone on the threadiverse posting some tool or plugin for that too, which I'll try looking for again when I'm not busy.

And what about whole directories? Would that be possible without converting everything into a single file, if the folder contains only (or mostly) TXT files or similar?
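To frame what I mean by the directory case: the workaround I imagined is just stitching all the TXT files together into one big context before prompting. A rough Python sketch of that idea (the `gather_txt` helper and the "llama3" model name are made up for illustration; only the file-gathering part is shown, the Ollama call is just a comment):

```python
from pathlib import Path

def gather_txt(folder: str) -> str:
    """Concatenate every .txt file under `folder` into one labelled context string."""
    parts = []
    for path in sorted(Path(folder).rglob("*.txt")):
        parts.append(f"--- {path.name} ---\n{path.read_text(encoding='utf-8')}")
    return "\n\n".join(parts)

# The combined text could then be handed to a local model, e.g. via the CLI:
#   ollama run llama3 "Answer using this context: <output of gather_txt>"
# ("llama3" is only a placeholder for whichever model you have pulled)
```

But that feels clunky for large folders, which is why I'm hoping there's a proper built-in or plugin way to do it.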

Thanks in advance, and also thanks for the patience with the newbie questions! "<.<

@Ollama@lemmy.world

[–] Auster@thebrainbin.org 1 point 4 days ago

Will take a look. Thanks!