this post was submitted on 23 Mar 2026

Technology


This is the official technology community of Lemmy.ml for all news related to creation and use of technology, and to facilitate civil, meaningful discussion around it.


top 9 comments
[–] skuzz@discuss.tchncs.de 2 points 11 hours ago (1 children)

It will be once the bubble pops. Small, locally tuned models for specific tasks, powered by the user's own hardware, are much less expensive for tech companies than powering and watering datacenters themselves.

Right now the tech bros genuinely think people will be cool paying hundreds of dollars a month to rent a GPU for all their Internet tasks. AI fatigue is already setting in.

The tech bros' investors will pull funding once they realize how asinine that is long-term. Probably already starting to, with the likes of Zuck trying to use green charity money to fund his LLMs.

[–] yogthos@lemmy.ml 2 points 9 hours ago

I'm fully expecting the current bubble to pop in the near future as well. Incidentally, the whole war on Iran could serve as a catalyst, given that it's going to drive energy prices to the moon.

[–] FEIN@lemmy.world 3 points 13 hours ago

That would be preferable. If ML optimization work is open-sourced and progresses significantly, that would be good for the little guy.

[–] TrippinMallard@lemmy.ml 3 points 14 hours ago* (last edited 14 hours ago) (1 children)

OpenAI and Anthropic are incentivized to prevent this.

They are also big enough and unregulated enough that they could use their power and political/industry relationships to drive up the price of local AI ownership (RAM, GPUs, etc.).

[–] orc_princess@lemmy.ml 3 points 14 hours ago (2 children)

I'm not sure how much they can actually prevent us from just running FOSS Chinese alternatives locally, though.

[–] yogthos@lemmy.ml 3 points 13 hours ago

Exactly, and a lot of big companies in the US are heavily reliant on Chinese models already. For example, Airbnb uses Qwen because they can self-host and customize it. Cursor built their latest Composer model on top of Kimi, and so on. There are far more companies using these tools than making them, so while open models hurt the companies that want to sell them as a service, they're lowering the cost for everyone else.

[–] TrippinMallard@lemmy.ml 2 points 13 hours ago

Not for everyone, but they are aiming to increase hardware ownership costs so that fewer people can afford local AI.

[–] AnAnonymousApe@lemmy.ml 1 points 16 hours ago (1 children)
[–] yogthos@lemmy.ml 3 points 15 hours ago

Do elaborate. The tech industry has gone through many cycles of moving between mainframes and personal computers over the years. When new tech appears, it initially requires a huge amount of computing power to run. But over time people figure out how to optimize it, hardware matures, and it becomes possible to run this stuff locally. I don't see why this tech should be any different.