this post was submitted on 04 Dec 2024
104 points (90.6% liked)

Technology

Pretty much the only thing I think AI could be useful for: forecasting the weather based on tracking massive amounts of data. I look forward to seeing how this particular field of study improves.

Bonus points: AI weather modeling, for once, saves energy relative to physics models. Pair it with some sort of lightweight physical model to keep the hallucinations at bay, and you've got a good combo.
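As a minimal sketch (with made-up function names and thresholds, not anything from GenCast), that lightweight sanity check could be as simple as clamping the ML output to physically plausible bounds and falling back to a persistence forecast wherever the model produces garbage:

```python
import numpy as np

def sanity_check(ml_forecast: np.ndarray, persistence: np.ndarray,
                 t_min: float = 180.0, t_max: float = 340.0) -> np.ndarray:
    """Clamp an ML temperature field (in Kelvin) to plausible bounds and
    fall back to a persistence forecast wherever the model output is NaN."""
    checked = np.clip(ml_forecast, t_min, t_max)                  # hard physical bounds
    return np.where(np.isnan(checked), persistence, checked)      # patch hallucinated gaps

# Toy usage: yesterday's observed 2 m temperature grid as the fallback.
obs_yesterday = np.full((64, 64), 288.0)                          # hypothetical observations, K
ml_out = obs_yesterday + np.random.normal(0, 2, size=(64, 64))    # stand-in for an ML forecast
print(sanity_check(ml_out, obs_yesterday).mean())
```

A real hybrid would be more involved (checking conservation laws, for example), but the idea is the same: the physics side only vetoes or corrects, it doesn't redo the heavy simulation.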

[–] Buffalox@lemmy.world 43 points 2 weeks ago (2 children)

> what’s perhaps most striking about GenCast is that it requires significantly less computing power than traditional physics-based ensemble forecasts like ENS. According to Google, a single one of its TPU v5 tensor processing units can produce a 15-day GenCast forecast in eight minutes. By contrast, it can take a supercomputer with tens of thousands of processors hours to produce a physics-based forecast.

If true, this is extremely impressive, but it's their own evaluation, so it may be biased.

[–] RvTV95XBeo@sh.itjust.works 16 points 2 weeks ago (2 children)

What they leave out is how much goes into training the model, but I imagine that once they settle on a trained model, it can carry on pretty efficiently for a long time, especially if they're baking in things like atmospheric CO2 levels to help keep forecasts in line with global warming.

[–] Buffalox@lemmy.world 10 points 2 weeks ago* (last edited 2 weeks ago) (2 children)

Absolutely, but the training only happens once. With the actual forecast being so cheap to make, you could have a forecast made personally for your own garden, which may be very different from a generic one covering hundreds of km². Then the roughly 90% accuracy will feel WAY more accurate.

[–] RvTV95XBeo@sh.itjust.works 6 points 2 weeks ago (1 children)

I feel this personally; I live in the hills outside of a valley metro. All the weather forecasts are based on valley sensors, but shit gets weird when you suddenly climb 2,000+ ft.

The best weather services in my area are the ones that can factor people's household meters into their forecasting, but even those services still aren't perfect.

[–] futatorius@lemm.ee 1 points 1 week ago

I live in a hilly county in a country at the intersection of two weather cells, with a warm ocean current bathing our coast. Prediction in those conditions is a real challenge. For example, my neighbors 50 metres from me get consistently more snow and ice than I do. More stations would really help, but moving from there to crowd-sourced forecasting has issues due to lack of calibration and other biases. It can help, but not as much as you might think.

[–] futatorius@lemm.ee 2 points 1 week ago

The non-AI models in use now all get feedback from actual observations on each run; that feedback is used to correct model parameters for later runs.
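As a toy illustration (not anything from an operational forecasting system, and with made-up names), that feedback loop amounts to nudging a tunable parameter in proportion to the last run's error against observations:

```python
def nudge_parameter(param: float, forecast: float, observed: float,
                    learning_rate: float = 0.1) -> float:
    """Shift a model parameter proportionally to the previous run's error."""
    error = observed - forecast
    return param + learning_rate * error

# Example: the last run came out 1.5 degrees too cold, so the bias term gets raised a bit.
bias_term = 0.0
bias_term = nudge_parameter(bias_term, forecast=12.0, observed=13.5)
print(bias_term)  # ~0.15
```

Operational systems do this with far more sophisticated data assimilation, but the principle is the same: each run gets corrected by what actually happened.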

[–] Zarxrax@lemmy.world 4 points 2 weeks ago (2 children)

I'm sure the model would need to be continuously updated to take in more recent weather data.

[–] Beacon@fedia.io 6 points 2 weeks ago

Inputting newer weather condition data is different from changing the model. The model is the machine that does the computing; the weather data is just the input variables. As an analogy, it's like a computer: the hardware itself doesn't change, but different clicks and keystrokes make it output different things on screen. The AI model itself only changes when you train it differently.
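A toy example of that distinction (entirely made-up numbers and names): the weights are frozen after training, and only the input conditions change between runs:

```python
import numpy as np

rng = np.random.default_rng(0)
weights = rng.normal(size=3)              # "the model": fixed once training is done

def predict(conditions: np.ndarray) -> float:
    """Run the frozen model on today's conditions."""
    return float(conditions @ weights)

monday = np.array([18.0, 0.6, 1012.0])    # temperature, humidity, pressure
tuesday = np.array([15.0, 0.9, 1004.0])

print(predict(monday), predict(tuesday))  # different outputs, same weights
# Only retraining (i.e. changing `weights`) changes the model itself.
```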

[–] RvTV95XBeo@sh.itjust.works 1 points 2 weeks ago

There's a difference between the near-real-time weather data continuously fed in to produce predictions and the decades of weather data used to build the model. That cheap, continuous inference is more than likely where the significant energy savings Google claims come from.

It's the training on decades of information, and the occasional updates to those trained models, that take a significant amount of resources, but hopefully only in relatively short bursts.

[–] jacksilver@lemmy.world 2 points 2 weeks ago (1 children)

It actually makes sense if you think about it from the perspective that ML is about generalizing trends/functions. Simulating the world is hard; generalizing the world based on past observations is easy (with some lossiness).
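A rough way to see the contrast, using a toy cooling problem (illustrative only): the physics route steps a differential equation forward in time, while the ML route just fits a function to past observations and interpolates:

```python
import numpy as np

# Physics: explicit Euler steps of Newton's law of cooling, dT/dt = -k * (T - T_env).
def simulate(t0: float, k: float = 0.1, t_env: float = 10.0,
             dt: float = 0.1, steps: int = 10) -> float:
    t = t0
    for _ in range(steps):
        t += dt * (-k * (t - t_env))
    return t

# "ML": a least-squares fit of next-hour temperature against past starting temperatures.
past_t0 = np.linspace(0.0, 30.0, 50)
past_t1 = np.array([simulate(t) for t in past_t0])   # pretend these were measured
slope, intercept = np.polyfit(past_t0, past_t1, 1)   # the learned (lossy) surrogate

print(simulate(25.0), slope * 25.0 + intercept)      # physics vs learned approximation
```

The learned version is cheap to evaluate and close enough here, which is the "easy, with some lossiness" part; the catch with real weather is that the underlying system is vastly more complex than one cooling curve.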

[–] futatorius@lemm.ee 1 points 1 week ago

> generalizing the world based on past observations is easy (with some lossiness)

I know people who spend their lives working on climate models, and none of them would say it's easy. And climate models are "generalizing the world based on past observations" plugged into some very complex physical models.