this post was submitted on 17 Jun 2024
42 points (97.7% liked)

Stable Diffusion


Discuss matters related to our favourite AI Art generation technology

brucethemoose@lemmy.world 2 points 2 years ago

Yeah, and it's just fp8 truncation, right? Not actual "smart" quantization? Plain truncation causes a big quality hit even for huge decoder-only LLMs.
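To illustrate the distinction the comment is drawing, here is a toy sketch. It does not use real fp8 (no fp8 type exists in the Python standard library); instead it contrasts naive truncation onto a fixed low-precision grid with a simple per-tensor scaled quantizer, using a hypothetical signed 4-bit integer grid. The point is only that adapting a scale to the tensor's range ("smart" quantization, in the loosest sense) preserves small-magnitude weights that blind truncation rounds away.

```python
def truncate_to_grid(weights, step=0.125):
    """Naive 'truncation': snap each weight to the nearest multiple of a
    fixed step, with no per-tensor scaling. Small weights collapse to 0."""
    return [round(w / step) * step for w in weights]

def scaled_quantize(weights, levels=7):
    """Per-tensor scaled quantization: pick a scale so the largest weight
    maps to the top integer level, quantize, then dequantize."""
    scale = max(abs(w) for w in weights) / levels
    return [round(w / scale) * scale for w in weights]

# Typical small-magnitude weights, as found in many trained layers.
weights = [0.01, -0.02, 0.015, -0.005]

trunc_err = max(abs(w - q) for w, q in zip(weights, truncate_to_grid(weights)))
scaled_err = max(abs(w - q) for w, q in zip(weights, scaled_quantize(weights)))

# Fixed-grid truncation zeroes every weight here (max error 0.02);
# the scaled version keeps the error roughly an order of magnitude lower.
print(trunc_err, scaled_err)
```

Real "smart" schemes (per-channel scales, calibration data, outlier handling) go well beyond this, but the gap shown here is the basic reason a straight fp8 cast costs more accuracy than a calibrated quantizer.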