this post was submitted on 27 Nov 2025
195 points (100.0% liked)

Fuck AI


"We did it, Patrick! We made a technological breakthrough!"

A place for all those who loathe AI to discuss things, post articles, and ridicule the AI hype. Proud supporter of working people. And proud booer of SXSW 2024.

founded 2 years ago
[–] very_well_lost@lemmy.world 32 points 2 days ago* (last edited 2 days ago) (3 children)

The article didn't really give much background, but I think this is important for understanding the current bubble so here's my (non-economist) attempt at explaining:

Let's say you run a print business. Customers bring you stuff and you make copies. To run your business, you need to buy a fancy copier/printer. Let's say that costs $10,000. In your first year in business, let's say you make $2,000 in revenue. Since you bought the printer, your total profit is... -$8,000. Not great if you're trying to attract potential investors. But! There's a neat accounting trick you can do called depreciation. See, you expect your expensive printer to be in service for 10 years, so instead of putting a $10k expense on your books in year one, you spread the cost over 10 years ($1k every year). Just by changing your accounting strategy, you've gone from losing $8k your first year to making a profit of $1k. Much more attractive to potential investors.
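The arithmetic in that example can be sketched in a few lines (the $10k printer, $2k revenue, and 10-year life are the hypothetical numbers from above; straight-line depreciation assumed):

```python
def book_profit(revenue, equipment_cost, service_life_years):
    """First-year book profit when the equipment cost is spread
    evenly (straight-line) over its expected service life."""
    annual_depreciation = equipment_cost / service_life_years
    return revenue - annual_depreciation

# Expensing the whole printer in year one:
print(book_profit(2_000, 10_000, 1))   # -8000.0
# Spreading it over a 10-year service life:
print(book_profit(2_000, 10_000, 10))  # 1000.0
```

Same cash leaves your pocket either way; only the number shown to investors changes.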

Now imagine that instead of a print business you're an AI company, and instead of a $10k printer, you need to buy thousands of specialty GPUs that cost $50,000 a pop. Obviously you'll want to spread those massive costs out over the service life of the GPUs, but how long is that? Most experts and most evidence will tell you that these things will probably be good for 2-3 years before needing to be replaced, so your depreciation expense should be somewhere between $15k and $25k per GPU per year. But fuck the experts and the evidence: you're Oracle, and according to you, the expected service life is 6 years. Now those GPUs are only costing you ~$8k per year each, and your books look much better to investors.
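Plugging in the GPU numbers from the comment ($50k unit cost; a realistic 3-year life vs the claimed 6-year life; straight-line, per GPU):

```python
def annual_expense(unit_cost, service_life_years):
    """Straight-line depreciation expense per year for one unit."""
    return unit_cost / service_life_years

gpu_cost = 50_000
realistic = annual_expense(gpu_cost, 3)  # ~$16,667/yr, inside the 2-3 year range
claimed = annual_expense(gpu_cost, 6)    # ~$8,333/yr, roughly half the expense
print(f"3-year life: ${realistic:,.0f}/yr  vs  6-year life: ${claimed:,.0f}/yr")
```

Doubling the assumed service life halves the reported expense, and that difference is multiplied across every GPU in the fleet.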

What's crazy is that you can look at the books of these publicly-traded tech giants and see that they've been lowering the depreciation cost of their GPU assets (by extending their expected service life) year after year. Nothing about that hardware has changed and those companies are using that same hardware for more and more compute each year... but somehow they're going to last longer than expected? They're lying about their hardware expenses to make their revenue shortfalls seem smaller. And the situation gets even crazier when you consider that all of those rapidly-depreciating GPUs are being used as collateral as these companies take on massive debt to buy even more GPUs — GPUs that will be worthless long before those loans are expected to be paid back. If OpenAI goes bust, not even the bank is getting paid because all their "collateral" won't even be worth the silicon it's printed on.
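To put numbers on the collateral problem: under straight-line depreciation, book value after t years is cost × (1 − t/life). Using the hypothetical figures from this thread (not real balance-sheet data), a GPU on a claimed 6-year schedule still carries half its purchase price on the books at year three, even if its actual resale value is close to zero by then:

```python
def book_value(cost, service_life_years, years_elapsed):
    """Remaining straight-line book value, floored at zero."""
    return max(cost * (1 - years_elapsed / service_life_years), 0.0)

gpu_cost = 50_000
# Claimed 6-year life: the "collateral" still shows $25k at year 3.
print(book_value(gpu_cost, 6, 3))  # 25000.0
# Realistic 3-year life: fully written off by year 3.
print(book_value(gpu_cost, 3, 3))  # 0.0
```

That gap between book value and resale value is what the lender is actually exposed to.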

It's like one house of cards built on top of another, even shittier house of cards... and then the entire US economy is balanced on top of that. WTF.

[–] zqwzzle@lemmy.ca 7 points 1 day ago* (last edited 1 day ago)

And as a counter-argument to the dot-com bubble comparison: back then, companies spent enormous amounts of money laying fibre (so-called dark fibre), but the lifetime of fibre is effectively infinite, and it can still be used to this day. These GPUs, on the other hand, either

  1. burn out from continual use or
  2. are superseded by a newer model within a few years (see A100 → H100 → Blackwell), at which point it becomes economically infeasible to keep running the older models

An excellent explanation for the uninitiated, thank you for taking the time to write it.

[–] Saledovil@sh.itjust.works 2 points 1 day ago (1 children)

I'm only familiar with how depreciation works in Germany, but here, deducting that $10,000 printer in one go would not fly, since the reduced profit would also reduce your tax burden.

[–] boonhet@sopuli.xyz 1 points 1 day ago

Here in Estonia you can claim all the VAT back at once if it was a one-off purchase rather than a lease so you get a bunch of money back, but you don't get taxed on your profit anyway, so it's up to you how you want to depreciate it, within reason. But dividends you're allowed to pay are based on last year's profit - and as such, by depreciating over a longer period, you don't diminish your dividends as much (but you're diminishing them for a longer time). Dividends are great to pay out because they only incur income tax, not social tax.

Probably most people would depreciate that fancy printer over 5-10 years if it's expected to last 10.

[–] eleijeep@piefed.social 8 points 2 days ago

Only one question remains: Will this bubble last longer than Liz Truss?

[–] arin@lemmy.world 6 points 2 days ago (2 children)

You can eat lettuce, you can't eat AI. WTF is this analogy??

[–] very_well_lost@lemmy.world 8 points 2 days ago (2 children)
[–] arin@lemmy.world 4 points 2 days ago (2 children)

If e-waste rotted quicker I could compost it... STILL a terrible analogy

[–] absentbird@lemmy.world 5 points 2 days ago

The idea is that GPUs are a bit like a lettuce crop. They are only really able to give you a return when they're fresh and vibrant. Over time the health of the chips declines and their power gets overshadowed; that's like your lettuce crop wilting.

It's not a great analogy, but I think that's what they were going for.

[–] SaveTheTuaHawk@lemmy.ca 2 points 2 days ago

dude, rent a sense of humor. Lettuce is also a metaphor for cash. It has a short life.

[–] SaveTheTuaHawk@lemmy.ca 1 points 2 days ago

should be digital raspberries, they rot before you get them home.