this post was submitted on 04 Dec 2025
7 points (100.0% liked)
Stable Diffusion
Stupid question, what is a distilled model?
It's basically when you use a larger model to train a smaller one. You train the student model on a dataset of outputs generated by the teacher model, mixed with ground-truth data, and by some strange alchemy I don't quite understand you end up with a much smaller model that behaves like the teacher.
It's really hard to train on top of a distilled model without breaking it, so people prefer undistilled models whenever possible. Without the teacher model, distilled models are basically crippleware.
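To make the teacher/student idea concrete, here's a minimal toy sketch of knowledge distillation in plain NumPy. Everything in it is a made-up illustration (a linear "teacher" and a smaller linear "student" on random data, not an actual diffusion model): the student's loss mixes cross-entropy against the teacher's temperature-softened outputs with cross-entropy against the ground-truth labels.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z, T=1.0):
    # Numerically stable softmax with optional temperature T.
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

# "Teacher": a fixed, pre-trained map (here just random weights standing in
# for a big model). "Student": a model we train from scratch.
W_teacher = rng.normal(size=(8, 4))
W_student = np.zeros((8, 4))

x = rng.normal(size=(64, 8))          # toy inputs
y = rng.integers(0, 4, size=64)       # toy ground-truth labels
y_onehot = np.eye(4)[y]

T, alpha, lr = 2.0, 0.5, 0.5
soft_targets = softmax(x @ W_teacher, T)   # teacher's "soft" outputs

losses = []
for _ in range(200):
    p_soft = softmax(x @ W_student, T)     # student at teacher temperature
    p_hard = softmax(x @ W_student)        # student at T=1 for real labels
    # Blend of: match the teacher (distillation) + match the ground truth.
    distill = -np.mean(np.sum(soft_targets * np.log(p_soft + 1e-9), axis=1))
    hard = -np.mean(np.sum(y_onehot * np.log(p_hard + 1e-9), axis=1))
    losses.append(alpha * distill + (1 - alpha) * hard)
    # Gradient of the blended cross-entropy loss w.r.t. the student weights.
    grad = (alpha * x.T @ (p_soft - soft_targets) / (len(x) * T)
            + (1 - alpha) * x.T @ (p_hard - y_onehot) / len(x))
    W_student -= lr * grad
```

The `alpha` knob is the part people usually tune: all teacher and the student just imitates; all ground truth and it's ordinary training with no distillation.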
Thanks for explaining!
A distilled model is a more lightweight version of a full model which can run in fewer steps at slightly reduced quality.
Z-image-turbo is a distilled model, and the full version of the model will be released soon.
This post is referring to someone attempting to somehow undo the distillation to make an approximation of the full model, I guess. That's basically pointless, because as I said, the actual full model will be released soon.
Thanks!