18
submitted 1 year ago* (last edited 3 weeks ago) by cll7793@lemmy.world to c/localllama@sh.itjust.works

(Deleted; no longer relevant)

[-] cll7793@lemmy.world 4 points 1 year ago

Of course. I know some open source devs who advise backing up raw training data, LoRAs, and the original base models used for fine-tuning.

Politicians sent out an open letter in protest when Meta released LLaMA 2. It is not unreasonable to assume they will intervene before the next release unless we speak out against this.

this post was submitted on 01 Aug 2023
18 points (90.9% liked)

LocalLLaMA

2249 readers

Community to discuss LLaMA, the large language model created by Meta AI.

This is intended to be a replacement for r/LocalLLaMA on Reddit.

founded 1 year ago
MODERATORS