Generating 1,000 images with a powerful AI model, such as Stable Diffusion XL, is responsible for roughly as much carbon dioxide as driving the equivalent of 4.1 miles in an average gasoline-powered car.
Oh no! 5 seconds of GPU time on consumer grade hardware!
It’s not nearly as small as you think it is.
https://www.technologyreview.com/2023/12/01/1084189/making-an-image-with-generative-ai-uses-as-much-energy-as-charging-your-phone/
Even better when you take into account the scale that these run at:
https://www.washingtonpost.com/business/2024/03/07/ai-data-centers-power/
Consider the average toaster: roughly 1100W, with toast taking 1–4 minutes to cook (for the purposes of this we’ll call it 2 minutes).
Doing the math (kWh = (watts × hours) ÷ 1000), toasting 1 slice of bread works out to roughly 0.037 kWh of electricity.
Now I’m running a 7900 XTX (OC) whose peak power draw is 800W (300W less than the toaster), and it legit takes 5–10 seconds to generate an image. Realistically I might do a couple of runs (some small, then one big one) and use 30 seconds of peak compute time. That works out to about 0.0067 kWh of electricity.
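If you want to sanity-check that arithmetic yourself, here’s a minimal sketch in Python, assuming the same figures I used above (1100W toaster for 2 minutes, 800W GPU at peak for 30 seconds):

```python
# Quick sanity check of the energy figures above.
# Assumed numbers: 1100W toaster running for 2 minutes,
# 800W GPU at peak for 30 seconds of image generation.

def kwh(watts: float, seconds: float) -> float:
    """kWh = (watts × hours) ÷ 1000"""
    return watts * (seconds / 3600) / 1000

toast = kwh(1100, 2 * 60)   # ~0.0367 kWh per slice
image = kwh(800, 30)        # ~0.0067 kWh per generation session

print(f"toast: {toast:.4f} kWh, image: {image:.4f} kWh, ratio: {toast / image:.1f}x")
# toast: 0.0367 kWh, image: 0.0067 kWh, ratio: 5.5x
```

The ratio comes out to roughly 5.5x in the toaster’s favour.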
Toasting bread quite literally uses way more electricity than it takes for me to generate one AI image.
So are you out there hassling people cooking their morning toast for their criminally high power usage?
Also, some further context for you: I don’t use Stable Diffusion XL (the model listed in your article), as old-school 512x512 is more than enough for my needs (as demonstrated in this post ^^). Your second article is paywalled (not great to share if ppl can’t access it), but it appears to be about data center use, which, as described above, is not what I’m doing here.
I know exactly how much goes into it: 5 seconds of GPU time on my own computer. That’s why I said it. How many phone charges do you think it would take to fully create a digital drawing on a laptop? It’s not going to be much different IME.
You aren’t taking into account the resources required to train the model. You clearly have very little idea how much goes into it other than running software someone else wrote.
Of course I’ve taken into account model training costs; was that supposed to be your gotcha? You don’t actually think the amortized energy cost of training still accounts for the bulk of the energy expended per generated image, do you?
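Just to spell out what amortization means here, a minimal sketch (the function and parameter names are mine, purely for illustration, not measured figures):

```python
# Amortization, spelled out: whatever energy went into training gets spread
# across every image the model is ever used to generate, and that share is
# added on top of the per-image inference cost.

def energy_per_image_kwh(training_kwh: float,
                         total_images_generated: float,
                         inference_kwh_per_image: float) -> float:
    """Energy attributable to a single generated image, training included."""
    return training_kwh / total_images_generated + inference_kwh_per_image
```

The training term shrinks as total_images_generated grows, which is exactly why a one-off training run doesn’t stay the dominant per-image cost for a widely used model.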
We aren’t training models here; this isn’t training-ai-with-massive-datacentres@lemmy.