
I have a laptop with a Ryzen 7 5700U and 16 GB of RAM, running Fedora 38 Linux.
I'm looking to run a local, uncensored LLM, and I'd like to know the best model and software to run it with.
I'm currently running KoboldAI with Erebus 2.7B. It's okay in terms of speed, but I'm wondering if there's anything better out there. If possible, I'd prefer something that isn't web-UI based, to lower the overhead.
I'm not very well versed in all the lingo yet, so please keep it simple.
Thanks!

[-] olicvb@lemmy.ca 5 points 6 months ago* (last edited 6 months ago)

Take a look at GPT4All; it's very user friendly.
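
If you'd rather skip the GUI to keep overhead down, GPT4All also ships Python bindings you can script directly. Here's a minimal sketch; the model filename is just an example pulled from its catalog, swap in whichever uncensored model you prefer:

```python
# Minimal sketch using GPT4All's Python bindings (pip install gpt4all).
# The model filename below is an assumption -- GPT4All downloads catalog
# models on first use; pick whatever model suits you instead.
from gpt4all import GPT4All

# Runs on CPU by default, which fits a Ryzen 7 5700U with 16 GB of RAM.
model = GPT4All("mistral-7b-instruct-v0.1.Q4_0.gguf")

with model.chat_session():
    reply = model.generate(
        "Write a short scene set in a haunted lighthouse.",
        max_tokens=200,
    )
    print(reply)
```

Quantized 7B models in this style generally fit in 16 GB of system RAM, though generation will be slower than with smaller models like the 2.7B you're using now.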
