[–] frightful_hobgoblin@lemmy.ml 0 points 12 hours ago (2 children)
[–] lime@feddit.nu 2 points 11 hours ago (1 children)

that's old data. inference uses more than training now, since usage has gone up significantly. they traded places in march or april 2025.

[–] elvith@feddit.org 2 points 9 hours ago (1 children)

That's the problem with inference. Your individual queries might not consume much, especially compared to the training, but the more people use it, the higher the total consumption gets. At some point, running those models will consume more than training them did.
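The crossover described here is just cumulative per-query energy catching up with a one-off training cost. A minimal back-of-envelope sketch, using entirely hypothetical placeholder numbers (none of these figures come from the thread or from any measurement):

```python
# Hypothetical sketch of the training-vs-inference energy crossover.
# All values below are made-up placeholders for illustration only.

TRAINING_ENERGY_MWH = 1_000      # one-off cost to train the model (hypothetical)
ENERGY_PER_QUERY_KWH = 0.003     # energy per single inference query (hypothetical)
QUERIES_PER_DAY = 10_000_000     # aggregate usage across all users (hypothetical)

# Daily inference energy in MWh (kWh -> MWh).
daily_inference_mwh = ENERGY_PER_QUERY_KWH * QUERIES_PER_DAY / 1_000

# Days until cumulative inference energy exceeds the fixed training cost.
days_to_crossover = TRAINING_ENERGY_MWH / daily_inference_mwh

print(f"Inference per day: {daily_inference_mwh:.1f} MWh")
print(f"Cumulative inference exceeds training after ~{days_to_crossover:.0f} days")
```

With these placeholder numbers the fleet burns 30 MWh/day on queries and passes the training cost in about a month; the point of the argument is only that the crossover is inevitable once usage is high enough, whatever the real figures are.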

[–] lime@feddit.nu 2 points 1 hour ago

we passed that point last year, yes.

[–] MotoAsh@piefed.social 1 points 12 hours ago

That does less than nothing to disprove my point...