this post was submitted on 22 Jan 2026
324 points (99.1% liked)

Technology

(page 2) 34 comments
[–] Aceticon@lemmy.dbzer0.com 4 points 3 months ago

Like when the coach of a big sports team is interviewed at the half-time break of a game they're losing, this guy is of course going to say they're still going to win.

It would only be news if he confessed they were fucked.

[–] HugeNerd@lemmy.ca 4 points 3 months ago (1 children)

Why is that wizened old man dressing in a leather jacket?

[–] ripcord@lemmy.world 3 points 3 months ago (1 children)
[–] Almacca@aussie.zone 1 points 3 months ago

wizened; intransitive verb : to become dry, shrunken, and wrinkled often as a result of aging or of failing vitality

[–] homes@piefed.world 3 points 3 months ago

Yeah, I’m sure they do

[–] Formfiller@lemmy.world 3 points 3 months ago

These people make me thirsty for guillotine time

[–] Snowclone@lemmy.world 2 points 3 months ago

Very hard to convince someone of something when their paycheck requires they don't understand it. Seriously: "Hey, I'm one of the people who has been demanding my company finish AI and roll it out and stuff it into everything we sell to look like massive growth so my stock will start going up, even though the language models aren't even designed to do what we're claiming, they don't even work well as language models, we have no real use for them, and they're horrifyingly costly to run. But no, I don't think it's a bubble."

[–] Itdidnttrickledown@lemmy.world 2 points 3 months ago

He isn't Fonzie but he has definitely jumped the shark.

[–] nonentity@sh.itjust.works 2 points 3 months ago

Autoerotic asphyxiation from farts in an echo chamber produces the wildest trips.

[–] melfie@lemy.lol 1 points 3 months ago* (last edited 3 months ago)

I regularly use GH Copilot with Claude Sonnet at work and it’s a coin toss whether it’s actually useful, but I overall do find value in using it. For my own use at home, I don’t do subscriptions for software and I’m also not giving these companies my data. I would self-host something like Qwen3 with Llama.cpp, but running the flagship MoE model would basically require a $10k GPU and one hell of a PSU. I could probably self-host a smaller model that wouldn’t be nearly as useful, but I’m not sure that would even be worth the effort.

Therein lies the problem. My company is paying a monthly fee for me to use Copilot that would take like 20 years to pay for even one of the $10k GPUs that I'm likely hogging for minutes at a time, and these companies are going to spend trillions building data centers full of these GPUs. It's obvious that the price we are paying for AI now doesn't cover the expense of actually running it, but it might when these models become less resource-intensive to the extent that they can run on a normal machine. However, in that case, why even run them in a data center instead of just running them on the user's local machine? I'm just not following how these new data centers are going to pay for themselves, though maybe my math is wrong, or I'm ignorant of the economies of scale of hosting these models for a large user base.
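The "like 20 years" figure above checks out as a rough back-of-envelope calculation. A minimal sketch, assuming a hypothetical ~$40/month subscription price (the thread doesn't state the actual fee) and ignoring power, cooling, and the rest of the server around the GPU:

```python
# Back-of-envelope payback check: how long does one subscription take
# to pay off one data-center GPU? The $40/month seat price is an
# assumption for illustration, not a figure from the thread.
GPU_COST = 10_000        # one high-end GPU, USD
SEAT_PRICE = 40          # assumed monthly subscription, USD

months = GPU_COST / SEAT_PRICE
years = months / 12

print(f"{months:.0f} months ≈ {years:.1f} years")  # 250 months ≈ 20.8 years
```

At that assumed price, a single seat takes roughly two decades to cover one GPU's purchase cost alone, which is the gap the comment is pointing at.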
