this post was submitted on 04 Feb 2026
27 points (93.5% liked)

BuyFromEU


Welcome to BuyFromEU - A community dedicated to supporting European-made goods and services!

Feel free to post, comment and vote, be excellent to each other and follow the rules.

[–] MotoAsh@piefed.social 30 points 11 hours ago

Global warming is global. Accelerate it at your own risk.

[–] morgunkorn@discuss.tchncs.de 35 points 11 hours ago

don't use AI, it's a scam

[–] Sineljora@sh.itjust.works 2 points 6 hours ago

Just run local models at home.
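
A minimal sketch of what that can look like, assuming the Ollama daemon and its Python client are installed and a model such as `mistral` has already been pulled (the comment doesn't name a tool, so those choices are my assumption):

```python
# Minimal sketch: query a locally hosted model through the Ollama Python client.
# Assumes `ollama serve` is running and `ollama pull mistral` was done beforehand;
# the prompt never leaves your machine, so no cloud provider sees it.
import ollama

response = ollama.chat(
    model="mistral",
    messages=[{"role": "user", "content": "Summarise the GDPR in two sentences."}],
)
print(response["message"]["content"])
```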

[–] birdwing@lemmy.blahaj.zone 15 points 11 hours ago* (last edited 11 hours ago)

Use a European AI like Mistral, or run models locally with Ollama, Llamafile and so on.

Your data will then at least be stored in Europe (or stay on your own machine), which matters in case of spying. And GDPR > USA.

[–] kutt@lemmy.world 14 points 11 hours ago

Your data is more valuable to them than water and energy

[–] FortyTwo@lemmy.world 5 points 10 hours ago

Your interactions will generally be stored and used for further training, so using their services gives them an advantage over European companies. I suspect (though I haven't read about this in detail) that the plan for eventual monetisation, once they have a monopoly and people are too dependent to switch back, is to build a profile of you and use it for targeted advertising and influencing, much like social network services do. IMO it's much better not to use their services; they want the data more than they care about electricity bills.

[–] starlinguk@lemmy.world 6 points 11 hours ago

The American ones have AI farms in Europe.

[–] frightful_hobgoblin@lemmy.ml -1 points 11 hours ago (1 children)

Training GPTs takes a lot of water and energy; running them doesn't take massive amounts.

[–] MotoAsh@piefed.social 4 points 11 hours ago* (last edited 11 hours ago) (2 children)

Yes it does. "Not as much as training" is a stratospheric bar...

You may as well say, "my car doesn't use gas because American semis use way more". They both still use a resource we should be more careful with.

[–] Europellinore@europe.pub 2 points 7 hours ago (1 children)

I recently tried to calculate this for my company. I wouldn't call it negligible, but the impact of all video calls turned out to be much greater than the impact of AI.
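
For anyone curious what that kind of calculation looks like, here is a rough sketch; the per-unit energy figures and headcounts are placeholder assumptions, not measurements, so substitute whatever estimates you trust:

```python
# Back-of-envelope comparison of daily energy for AI queries vs. video calls.
# All figures below are illustrative assumptions, not published measurements.
WH_PER_AI_QUERY = 3.0      # assumed Wh per chatbot query
WH_PER_CALL_HOUR = 150.0   # assumed Wh per person-hour of video conferencing

employees = 200
queries_per_employee_per_day = 10
call_hours_per_employee_per_day = 2

ai_kwh = employees * queries_per_employee_per_day * WH_PER_AI_QUERY / 1000
call_kwh = employees * call_hours_per_employee_per_day * WH_PER_CALL_HOUR / 1000

print(f"AI queries:  {ai_kwh:.1f} kWh/day")
print(f"Video calls: {call_kwh:.1f} kWh/day")
```

With these placeholder numbers the video calls come out roughly ten times higher, which matches the shape of the result described above.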

[–] MotoAsh@piefed.social 1 points 5 hours ago* (last edited 5 hours ago)

Of course that's going to produce a heavier load on your part of the infrastructure... That stuff is running locally.

Also, it's pretty easy to have effective calls and turn off the damn video. Most people don't need to stare at everyone staring at their computers.

[–] frightful_hobgoblin@lemmy.ml 0 points 10 hours ago (2 children)
[–] lime@feddit.nu 1 points 10 hours ago (1 children)

that's old data. inference uses more than training now, since usage has gone up significantly. they traded places in march or april 2025.

[–] elvith@feddit.org 2 points 7 hours ago

That's the problem with inference. Your individual queries might not consume much, especially compared to the training, but the more people use it, the higher the total consumption gets. At some point, running those models consumes more than training them did.
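
As a rough sketch of that crossover, assuming a one-off training cost and a fixed per-query cost (all numbers below are illustrative assumptions, not published figures):

```python
# Sketch: when does cumulative inference energy overtake the one-off training cost?
# Every number here is an illustrative assumption, not a published figure.
TRAINING_MWH = 1_300            # assumed one-off training energy for a large model
WH_PER_QUERY = 3.0              # assumed energy per served query
QUERIES_PER_DAY = 100_000_000   # assumed global daily query volume

daily_inference_mwh = QUERIES_PER_DAY * WH_PER_QUERY / 1_000_000
crossover_days = TRAINING_MWH / daily_inference_mwh

print(f"Inference per day: {daily_inference_mwh:.0f} MWh")
print(f"Cumulative inference passes training after ~{crossover_days:.1f} days")
```

With enough users the crossover comes quickly, which is the point: the per-query cost stays small while the total keeps growing.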

[–] MotoAsh@piefed.social 1 points 10 hours ago

That does less than nothing to disprove my point...