I don't like that first article; it gives contradictory information about the energy usage per image, saying 0.29kWh/image and then saying 0.29kWh/1000 images.
The article is way, waaaaaaay off. My PC generates images at a rate of about one per second (SDXL Turbo) with an Nvidia 4060 Ti, which draws 160W under load (~12W when idle). Let's assume I have it generate images constantly for one hour:
3,600 images per 0.16kWh
About 22,500 images per kWh.
In other words, generating a single image is a trivial amount of power.
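The arithmetic above can be sketched in a few lines (the power and throughput figures are the ones claimed in the post, not measurements):

```python
# Back-of-envelope check of the claim: 1 image/sec on a 160W GPU.
GPU_POWER_W = 160        # 4060 Ti draw under load, per the post
IMAGES_PER_SECOND = 1    # SDXL Turbo throughput, per the post

images_per_hour = IMAGES_PER_SECOND * 3600     # 3,600 images
kwh_per_hour = GPU_POWER_W / 1000              # 0.16 kWh
images_per_kwh = images_per_hour / kwh_per_hour
kwh_per_image = kwh_per_hour / images_per_hour

print(images_per_kwh)           # 22500.0
print(f"{kwh_per_image:.6f}")   # 0.000044 kWh per image
```

So even taking the article's smaller figure (0.29kWh per 1000 images, i.e. 0.00029kWh/image), it is still several times higher than this setup's cost per image.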
How are you able to generate images so quickly? When I tried to run Stable Diffusion on my nominally comparable GPU (RX6800M, similar to a 6700XT, 12GB VRAM), it took over a minute to get halfway through generating a medium-sized image before my entire computer crashed.
Nvidia cards are so much faster than AMD for Stable Diffusion it's ridiculous.
That, and Turbo, like that other person said.