[-] turkishdelight@lemmy.ml 3 points 3 months ago

My mind was already blown that models like Llama work with 4-bit quantization. But this is just insane.
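For readers unfamiliar with the technique: a minimal sketch of what symmetric per-tensor 4-bit weight quantization looks like. Real schemes used for Llama-class models (e.g. GPTQ, AWQ) use per-group scales and cleverer rounding, so this is only an illustration; all names here are made up for the example.

```python
import numpy as np

def quantize_4bit(weights: np.ndarray):
    """Quantize a float tensor to signed 4-bit integers in [-8, 7].

    Uses one scale for the whole tensor (symmetric, per-tensor);
    production quantizers typically use one scale per small group
    of weights instead.
    """
    scale = np.abs(weights).max() / 7.0  # map the largest magnitude to +/-7
    q = np.clip(np.round(weights / scale), -8, 7).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate float weights from 4-bit codes."""
    return q.astype(np.float32) * scale

# Toy demonstration on a random weight matrix.
w = np.random.randn(4, 4).astype(np.float32)
q, s = quantize_4bit(w)
w_hat = dequantize(q, s)

# Rounding error is bounded by half a quantization step (scale / 2).
max_err = np.abs(w - w_hat).max()
```

The surprise the comment points at is that a 16-bit model survives being squeezed into these 16 levels per weight with only a modest quality loss.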
