Ooooh, I'm totally adding that to my email signature.
Aaaaaaand done.
If you don't mind the additional real estate, it would also be great to have a version where "printing" is still there but struck out, for people who aren't aware of the original.
o7
Is that good? I'm old and have no idea what that means. 😆
Haha it’s good! It’s supposed to be a little guy saluting. The o is the head and the 7 is the arm and hand doing a salute pose.
🏞 Please consider the environment before issuing a return-to-office mandate
This is the right response - RTO is much worse for the climate than GenAI.
So I'm not saying RTO is worse than AI or vice versa. But do you have any data to back up that statement? I've been seeing nothing but news about AI data centers being an absolute nightmare for the planet. And even more so when corrupt politicians let them be built in places that already have trouble maintaining normal water levels.
I get both are bad for the environment.
Edit: thanks for all the sources, everyone. TIL
Well, real quick, my drive to the office is ~10 miles. My car gets ~3.1 miles/kWh. So let's say I use 3 kWh per trip; two trips a day makes it 6 kWh. A typical LLM request uses 0.0015 kWh of electricity, so my single day's commute in my car uses ~4,000 LLM queries' worth of electricity.
Yeah RTO is way worse, even for an EV that gets 91MPGe.
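For anyone who wants to poke at that math, here's the same back-of-envelope calculation as a quick Python sketch; the 0.0015 kWh per-query figure is the rough estimate from the comment above, not a measured value:

```python
# Rough commute-vs-LLM comparison using the figures quoted above.
commute_miles = 10            # one-way distance to the office
efficiency_mi_per_kwh = 3.1   # EV efficiency
trips_per_day = 2
kwh_per_llm_query = 0.0015    # ~1.5 Wh per request (rough, commonly cited estimate)

daily_commute_kwh = commute_miles / efficiency_mi_per_kwh * trips_per_day
equivalent_queries = daily_commute_kwh / kwh_per_llm_query

print(f"Daily commute: {daily_commute_kwh:.1f} kWh")          # ~6.5 kWh
print(f"Equivalent LLM queries: {equivalent_queries:,.0f}")   # ~4,300 (the comment rounds to ~4,000)
```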
The thing is: those AI datacenters are used for a lot of things. LLM usage amounts to only about 3% of it; the rest goes to stuff like image analysis, facial recognition, market analysis, recommendation services for streaming platforms, and so on. And even the water usage is not really the big-ticket item:
The issue of placement of data centers is another discussion, and I agree with you that placing data centers in locations that are not able to support them is bullshit. But people seem to simply not realize that everything we do has a cost. The US energy system uses 58 trillion gallons of water in withdrawals each year. ChatGPT uses about 360 million liters/year, which comes down to 0.006% of America's water usage per year. An average American household uses about 160 gallons of water per day; ChatGPT requests use about 20-50 ml/request. If you want to save water, go vegan or fix water pipes.
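As a rough sanity check on those per-request figures, here's a sketch comparing them to the quoted household number; both inputs come from the comment above and are approximations:

```python
# Per-request water use vs. one household's daily use (figures from the comment above).
GALLONS_TO_LITERS = 3.785
household_liters_per_day = 160 * GALLONS_TO_LITERS   # ~606 L/day

for ml_per_request in (20, 50):
    requests = household_liters_per_day * 1000 / ml_per_request
    print(f"At {ml_per_request} ml/request, one household-day of water ≈ {requests:,.0f} requests")
# roughly 30,000 requests at 20 ml, or 12,000 at 50 ml
```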
Proof was during COVID:
In many megacities of the world, the concentration of PM and NO2 declined by > 60% during the lockdown period. The air quality index (AQI) also improved substantially throughout the world during the lockdown. SOURCE
This! There's no reason to go back to the office for some professions, like programmers, managers, etc.
I don't think regular people really understand the power needed for AI. It's often assumed that we just have it, without thinking about where it comes from.
True, but most people don't realise how little not printing an email 'helped' the environment.
It would have been significant if a lot of people did it.
I'm doing my part. Can't remember the last time I had to print anything.
I don't think regular people really understand how little 3 Wh per request is. It's the energy you take in by eating 3 kcal. Or what your WiFi router uses in half an hour. Or your clothes dryer in 5 seconds.
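For the curious, here's that equivalence spelled out; the router and dryer wattages below are assumed typical values, not measurements:

```python
# Sanity-checking the ~3 Wh-per-request equivalences.
request_wh = 3.0

kcal = request_wh * 3600 / 4184     # 1 kcal = 4184 J
router_watts = 6                    # assumed typical Wi-Fi router draw
dryer_watts = 2200                  # assumed typical clothes dryer draw

print(f"{request_wh} Wh ≈ {kcal:.1f} kcal of food energy")                   # ~2.6 kcal
print(f"≈ {request_wh / router_watts * 60:.0f} min of the router running")   # ~30 min
print(f"≈ {request_wh / dryer_watts * 3600:.0f} s of the dryer running")     # ~5 s
```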
People keep telling us that ai energy use is very low, but at the same time, companies keep building more and more giant power hungry datacenters. Something simply doesn't add up.
Sure, a small local model can generate text at low power usage, but how useful will that text be, and how many people will actually use it? What I see is people constantly moving to the newest, greatest model, and using it for more and more things, processing more and more tokens. Always more and more.
To be fair, they never cared about the environment. Paper is easy to recycle and certainly not the most polluting material to produce.
It was more about saving money, greenwashing, and pushing a conversion towards digital archiving (which is much more efficient than paper).
This is the main reason I am hesitant about using AI. I can get around its functional limitations, but I need to know they have brought the energy usage down.
It's not that bad when it's just you fucking around having it write fanfics instead of doing something more taxing, like playing an AAA video game or, idk, running a microwave or whatever it is normies do. Training a model is very taxing, but running one isn't, and the opportunity cost might even be net positive if you tend to use your GPU a lot.
It becomes more of a problem when everyone is doing it for things that don't need it, like reading and writing emails. There's no net positive, it's very large-scale usage, and brains are a hell of a lot more efficient at it. This use case has got to be one of the dumbest imaginable, all while making the people using it legitimately dumber over time.
Oh, you're talking about running it locally, I think. I play games on my Steam Deck, as my laptop can't handle them at all.
Yup, and the Deck can do stuff at astoundingly low wattage, in the 3 W to 15 W range. Meanwhile there are GPUs that can run at like 400-800 W, like when people used to run two 1080s in SLI. I always found it crazy when I saw a guy running a system burning as much electricity as a weak microwave just to play a game, lol. Kept his house warm, tho.
Your Steam Deck at full power (15 W TDP by default) equals 5 ChatGPT requests per hour. Do you feel guilty yet? No? And you shouldn't!
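Same rough per-request figure as above, spelled out for the Steam Deck comparison:

```python
# Steam Deck power draw expressed in ChatGPT-request equivalents.
deck_watts = 15          # default TDP cap
wh_per_request = 3.0     # same rough per-request estimate used above

requests_per_hour = deck_watts / wh_per_request
print(f"{deck_watts} W for one hour = {deck_watts} Wh ≈ {requests_per_hour:.0f} requests")
# -> 15 Wh ≈ 5 requests per hour
```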
You can run one locally on your PC, so you know how much power it's consuming.
It's the same as playing a 3d game. It's a GPU or GPU equivalent doing the work. It doesn't matter if you are asking it to summarize an email or play Red Dead Redemption.
I mean if every web search I do is like playing a 3d game then I will stick with web searches. 3d gaming is the most energy intensive thing I do on a computer.
How much further down than 3 Wh/request can you go? I hope you don't let your microwave run 10 seconds longer than optimal, because that's exactly the amount of energy we are talking about. Or a 5 W nightlight running for a bit over half an hour.
LLMs and image generation are not what kills the climate. What does is flights, cars, meat, and bad insulation of houses leading to high energy usage in winter. Even if we turned off all GenAI, it wouldn't even leave a dent compared to those behemoths.
Text generation uses hardly any energy at all, though. Most phones do it locally these days. In fact, it likely takes less energy to generate an email in 5 seconds than it would take for you to type it out manually in 5 minutes with the screen on the whole time.
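That claim is hard to check without measurements, but here's a minimal sketch of the comparison under assumed phone power draws; the wattages are ballpark guesses, not measured values:

```python
# Generate-vs-type comparison with assumed phone power figures.
soc_watts_generating = 5.0   # assumed SoC draw while running a small local model
generate_seconds = 5

screen_watts_typing = 1.5    # assumed display + SoC draw while typing
typing_seconds = 5 * 60

generate_wh = soc_watts_generating * generate_seconds / 3600
typing_wh = screen_watts_typing * typing_seconds / 3600

print(f"Generate: {generate_wh * 1000:.0f} mWh, type: {typing_wh * 1000:.0f} mWh")
# ~7 mWh vs ~125 mWh under these assumptions
```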
“If everyone is littering, it’s not a big deal if I throw the occasional can on the ground”
I mean, it depends on the email. If you spend more time answering it yourself than the AI would, you almost certainly emit more greenhouse gases, use more fresh water and electricity, and burn more calories. Depending on the email, you might have also decreased net happiness generally.
Do we care about the environment or not? Please, oppose datacenters in deserts and stop farming alfalfa where water supplies are low. But your friend using AI to answer an email that could have been a Google search is not the problem.
I miss the days when climate activists didn't get distracted by small change like GenAI. The big-ticket issues haven't changed since the beginning of the climate movement: cars, flights, industry (mainly concrete), meat, and heating/AC are what drive climate change. Any movement that polices individual usage with negligible CO2 emissions will fail, because no one likes to be preached at.
Please consider the environment before sending me an email, seriously, I won't read it.
I want to see the "cease and desist, you may not use my Facebook posts without my express permission" type footers start showing up, but aimed at AI.
But it's everywhere now, and it's almost impossible to use mainstream services without it being used. I can't just go to Google anymore, type a search query, and get a reply without AI bs being involved. How long before it's baked into the Gmail compose window and does it whether I want it to or not?
Then we stop using it.
I think we need a Rule 34 of open-source programs:
Rule 34: If it exists, there is an open-source version of it
i) If no open-source version exists, it is currently being created
ii) If no open-source version is being created, you must create it yourself
Doesn't Gmail already do this? I seem to remember there being 'suggested response' options, before I turned them off in the settings, that were definitely AI generated. That option being presented to me creeps me out, because you can't know if what you're receiving was actually written by the person sending it.
Thanks. You reminded me to turn Gemini off. Did that once and it came back on.