this post was submitted on 27 Dec 2024
58 points (90.3% liked)
Games
Someone mentioned Neural Radiance Caching to me recently, which Nvidia has been working on for a while. They presented it at an event in 2023 (disclaimer: the full talk is account-gated and I haven't watched it, but a 6-minute "teaser" is available on YouTube).
I don't fully understand how it works after skimming some material about it, but it sounds like it could be one of several ways to tackle this specific problem?
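From the bits I did absorb, the rough idea seems to be: a small neural network is trained *while* the frame is rendering to predict how much indirect light arrives at a surface point, so most rays can stop after a couple of bounces and look the rest up in the cache instead of tracing the full path. Here's a toy sketch of that idea (purely illustrative, in no way Nvidia's actual implementation or API; every name in it is made up):

```python
import random


class ToyRadianceCache:
    """Stand-in for NRC's small neural net; here just a per-cell running average."""

    def __init__(self):
        self.cells = {}  # spatial cell -> (radiance sum, sample count)

    def _key(self, position):
        # Crude spatial binning; the real cache learns a continuous function instead.
        return tuple(round(p) for p in position)

    def query(self, position):
        total, count = self.cells.get(self._key(position), (0.0, 0))
        return total / count if count else 0.0

    def train(self, position, radiance):
        total, count = self.cells.get(self._key(position), (0.0, 0))
        self.cells[self._key(position)] = (total + radiance, count + 1)


def fake_bounce(position):
    """Pretend to trace one bounce: returns (next hit position, light gathered there)."""
    next_pos = tuple(p + random.uniform(-0.5, 0.5) for p in position)
    return next_pos, random.uniform(0.0, 0.2)


def trace_training_path(cache, start, max_bounces=8):
    """Trace a full-length path and update the cache: each visited vertex
    learns how much radiance arrived from the remainder of the path."""
    position, vertices, gathered = start, [], []
    for _ in range(max_bounces):
        position, emitted = fake_bounce(position)
        vertices.append(position)
        gathered.append(emitted)
    radiance_to_go = 0.0
    for vertex, emitted in zip(reversed(vertices), reversed(gathered)):
        radiance_to_go = emitted + 0.7 * radiance_to_go  # pretend BRDF attenuation
        cache.train(vertex, radiance_to_go)


def trace_render_path(cache, start, cache_after=2):
    """Trace only a couple of bounces, then read the remaining light from the cache."""
    position, radiance, throughput = start, 0.0, 1.0
    for _ in range(cache_after):
        position, emitted = fake_bounce(position)
        radiance += throughput * emitted
        throughput *= 0.7
    return radiance + throughput * cache.query(position)


if __name__ == "__main__":
    random.seed(0)
    cache = ToyRadianceCache()
    camera_point = (0.0, 0.0, 0.0)
    # A small number of long "training" paths keep the cache up to date each frame...
    for _ in range(200):
        trace_training_path(cache, camera_point)
    # ...while the bulk of the rays stop after a couple of bounces and use the cache.
    print(trace_render_path(cache, camera_point))
```

The part that sounded interesting to me (if I understood it right) is that the training paths are just a few of the regular paths traced a bit further each frame, so the cache adapts to dynamic scenes without any pre-baking. Happy to be corrected by someone who actually knows this stuff.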
It's at least used in RTX Global Illumination, as far as the Nvidia site mentions, and I've heard rumors about Cyberpunk getting it, but I'm unsure whether it's used in anything currently shipping. I think I heard it mentioned in some graphics review.
Yeah, I'm also confused about its status in currently released games. It seems like a significant enough feature that I'd naively assume the devs would boast about it if it were implemented in a shipped game, so I guess it's not there yet?