submitted 1 year ago* (last edited 3 weeks ago) by cll7793@lemmy.world to c/localllama@sh.itjust.works

(Deleted; no longer relevant)

[-] Saledovil@sh.itjust.works 0 points 1 year ago

DRM on the chip doesn't seem feasible to me. In the end, the chip doesn't know what it is doing; it just does math. So how could DRM at that level recognize that it is running a forbidden model, or that a jailbreak prompt is being executed? Figuring out what a program does is already non-trivial when you have the source code, and chip-level DRM wouldn't even have that — it would only see the compiled instructions.
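To illustrate the point: from the hardware's perspective, running a model is just multiply-accumulate loops over numbers, and nothing in those operations marks the operands as "a forbidden model" versus any other data. A minimal sketch (the weights here are hypothetical stand-ins, not a real model):

```python
import random

random.seed(0)

def matmul(a, b):
    # Plain multiply-accumulate: at the hardware level, model inference
    # reduces to loops of exactly this operation.
    n, k, m = len(a), len(b), len(b[0])
    return [[sum(a[i][t] * b[t][j] for t in range(k)) for j in range(m)]
            for i in range(n)]

# Hypothetical stand-ins: "forbidden" model weights vs. arbitrary data.
# Both are just grids of numbers; nothing distinguishes them to the chip.
forbidden_weights = [[random.gauss(0, 1) for _ in range(4)] for _ in range(4)]
benign_weights    = [[random.gauss(0, 1) for _ in range(4)] for _ in range(4)]
x = [[random.gauss(0, 1) for _ in range(4)]]

y1 = matmul(x, forbidden_weights)
y2 = matmul(x, benign_weights)

# Identical code path, identical instruction stream; only the operand
# values differ, and values alone don't identify a model or a prompt.
print(len(y1) == len(y2) and len(y1[0]) == len(y2[0]))
```

Both calls take the exact same instruction path; the chip would have to infer intent from raw numbers, which is the same (undecidable in general) problem as determining what an arbitrary program does.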

this post was submitted on 01 Aug 2023
18 points (90.9% liked)

LocalLLaMA

2249 readers

Community to discuss LLaMA, the large language model created by Meta AI.

This is intended to be a replacement for r/LocalLLaMA on Reddit.

founded 1 year ago