The numbers are also clearly fictitious. Driving a car for 4 miles uses about half a liter of fuel. A liter of gasoline contains about 9 kWh of energy, meaning you would use about 4.5 kWh per half hour of streaming. So the servers would have to draw about 9 kW to serve a single person? That would be like 10 gaming PCs running at full power to serve one person. Are they animating the shows in real time? No compression algorithm is that inefficient, and no hard drive uses that much energy.
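That back-of-envelope reasoning is easy to redo. Here's a minimal Python sketch, assuming the rough figures from the comment above (about 0.5 L of gasoline per 4 miles, about 9 kWh per litre); these are approximations, not measured data:

```python
# Sanity check: what continuous power draw does the claim imply?
# Assumed rough figures from the comment above.
FUEL_PER_4_MILES_L = 0.5      # litres of gasoline burned driving 4 miles
ENERGY_PER_LITRE_KWH = 9.0    # approximate energy content of gasoline
STREAM_TIME_H = 0.5           # the claim: 4 miles of driving == 30 min of streaming

energy_kwh = FUEL_PER_4_MILES_L * ENERGY_PER_LITRE_KWH   # ~4.5 kWh
implied_power_kw = energy_kwh / STREAM_TIME_H            # ~9 kW per viewer

print(f"Energy per half hour of streaming: {energy_kwh} kWh")
print(f"Implied continuous draw: {implied_power_kw} kW per viewer")
```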
edit: also, they could never be profitable like that. Let's say you watch three hours per day. That would be 9 kW × 3 h × 30 days = 810 kWh per month. Even if they only paid 5 cents per kWh, that would still be over $40 per month in electricity costs for a single user.
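The profitability check from the edit works out the same way; a quick sketch under the same assumptions (3 h/day of viewing, and $0.05/kWh as a deliberately low electricity price):

```python
# Monthly electricity cost per user if the implied 9 kW figure were real.
POWER_KW = 9.0          # implied per-viewer draw from the calculation above
HOURS_PER_DAY = 3
DAYS_PER_MONTH = 30
PRICE_PER_KWH = 0.05    # deliberately generous (low) electricity price

monthly_kwh = POWER_KW * HOURS_PER_DAY * DAYS_PER_MONTH   # 810 kWh
monthly_cost = monthly_kwh * PRICE_PER_KWH                # $40.50

print(f"{monthly_kwh:.0f} kWh/month -> ${monthly_cost:.2f}/month per user")
```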
Thanks for doing the math. I'm not gonna check it, you seem trustworthy enough.
I’m not gonna check the numbers either. Because I have no idea how. And I don’t even understand them.
So obviously he’s right!
The numbers aren't too difficult to verify.
I found this Canadian government web page that says a litre of gasoline contains roughly 8.9 kWh, so that checks out.
Looking at the fuel efficiency table on that same website, it looks like OP used a reasonable average fuel economy of 30 mpg, or slightly under 8 L/100 km: 4 miles ÷ 30 mpg ≈ 0.133 gallons, or about 0.50 liters, so their claim of half a liter of gas also checks out.
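For anyone who wants to redo the unit conversion, here's a short sketch; 30 mpg is the assumed average fuel economy from above, and 3.785 L/gal and 1.609 km/mile are standard conversion factors:

```python
# Re-derive the fuel figures for driving 4 miles at 30 mpg.
MILES = 4
MPG = 30                 # assumed average fuel economy
L_PER_GAL = 3.785        # litres per US gallon
KM_PER_MILE = 1.609

gallons = MILES / MPG                                 # ~0.133 gal
litres = gallons * L_PER_GAL                          # ~0.50 L
l_per_100km = litres / (MILES * KM_PER_MILE) * 100    # ~7.8 L/100 km

print(f"{gallons:.3f} gal = {litres:.2f} L ({l_per_100km:.1f} L/100 km)")
```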
The cheapest commercial electricity in the US appears to be in North Dakota, at $0.0741/kWh, so using $0.05/kWh was very generous.
The average Netflix user watches about 2 hours per day, or 60 hours per month.
Just to be a bit more accurate, let's assume the individual user's television and internet router account for about 900 W of that draw, leaving a final figure of roughly 8 kW for Netflix's power use per user.
8 kW × 60 hours = 480 kWh
And the cost of all of those kWh at $0.05/kWh: 480 kWh × $0.05 = $24.00
Or, at the rate in the least expensive state in the US: 480 kWh × $0.0741 = $35.57
The national average is $0.14/kWh, so unless Netflix were serving everyone out of North Dakota and Texas, their average cost would be much closer to $70 per user.
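The same monthly-cost arithmetic at all three electricity prices mentioned, as a sketch; the 8 kW draw and 60 h/month of viewing are the figures assumed above:

```python
# Hypothetical per-user monthly cost at the three rates discussed above.
POWER_KW = 8.0          # assumed Netflix-side draw per user (from above)
HOURS_PER_MONTH = 60    # ~2 h/day of viewing

monthly_kwh = POWER_KW * HOURS_PER_MONTH   # 480 kWh

for label, rate in [("generous", 0.05), ("North Dakota", 0.0741), ("US average", 0.14)]:
    cost = monthly_kwh * rate
    print(f"{label:>12}: {monthly_kwh:.0f} kWh x ${rate}/kWh = ${cost:.2f}/month")
```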
OP's numbers were definitely already accurate enough for the point. Basically, there's no possible way Netflix needs that much electricity to serve their users.
An average router uses between 5 and 20 W, and modern LED televisions use between 30 and 180 W on the high end. Even in a worst-case scenario, an uncommonly large 60" older plasma TV would only use around 600 W.
Yeah, I almost added "and they most certainly do not" to the end of that sentence, but I was trying to underestimate a little as well.