this post was submitted on 11 Feb 2026
862 points (99.5% liked)
memes
I've never even owned an NVIDIA product; AMD has always offered me obviously better value. I've always seen the logos from the deals these companies make with each other, pretending their games run better on one brand than the other...
I will never buy NVIDIA as long as I live. Time for new hobbies, I guess. Sad.
That's absolutely a thing. There are a lot of benchmark channels showing noticeable changes in fps for some games between brands. Depending on the game, that might change the value proposition.
CUDA support is what really pissed me off. I wanted to do some early machine learning (photogrammetry and computer vision stuff) 10-15 years ago, but the only way to do it was on NVIDIA hardware.
An industry that relies on a proprietary platform always ends up stuck like this.
You're right, but the further from release you are, the less relevant this becomes. Another win for patient gamers.
Over the lifetimes of the GPUs, many games that benchmarked higher on NVIDIA early on swapped places as the AMD drivers matured.
That has more to do with driver compatibility. "Engineered for NVIDIA" does not mean your AMD card is at a disadvantage unless the game just came out. And with AAA titles, you've either got the power or you don't. If the drivers are up to date for the game in question, how a game is programmed isn't really that big a concern.
It's just branding deals most of the time. They aren't using some secret proprietary tech.
The true catch-22 is that devs use DLSS as a crutch. Escape From Tarkov has had a bug on AMD causing distant objects to shimmer.
I was AMD-only for years, but since it's one of the main games we play, I caved, and it feels shitty.
The developments of the last couple of years have definitely changed the tech available on cards. With AI upscaling coming and games rendering at SNES resolutions, I'm out of my depth going forward.
Me neither! I don't care about all these made-up, so-called state-of-the-art extra-features BS.