[–] DickFiasco@sh.itjust.works 132 points 17 hours ago (7 children)

I've had so many problems with Nvidia GPUs on Linux over the years that I now refuse to buy anything Nvidia. AMD cards work flawlessly and get very long-term support.

[–] criss_cross@lemmy.world 6 points 7 hours ago

Same. Refuse to use NVIDIA going forward for anything.

[–] chocrates@piefed.world 2 points 8 hours ago

Sadly GPU passthrough only worked on Nvidia cards when I was setting up my server, so I had to get one of them :(

[–] ColeSloth@discuss.tchncs.de 14 points 13 hours ago

I just replaced my old 1060 with a Radeon RX 6600 myself.

[–] BarbecueCowboy@lemmy.dbzer0.com 34 points 16 hours ago* (last edited 16 hours ago) (3 children)

I'm with you, I know we've had a lot of recent Linux converts, but I don't get why so many who've used Linux for years still buy Nvidia.

Like yeah, there's going to be some cool stuff, but it's going to be clunky and temporary.

[–] notfromhere@lemmy.ml 17 points 13 hours ago

Even now, CUDA is the gold standard for data science / ML / AI research and development. AMD is slowly bringing its ROCm platform around, and Vulkan is gaining steam in that area. I'd love to ditch my Nvidia cards and go exclusively AMD, but Nvidia supporting CUDA on consumer cards was a seriously smart move that AMD needs to catch up with.
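
For anyone curious what "catching up" looks like on the software side, here's a minimal sketch (assuming a recent PyTorch install; ROCm builds of PyTorch reuse the torch.cuda namespace, so the same script runs against either vendor's cards):

```python
# Minimal sketch: report which GPU backend this PyTorch build can actually use.
# Assumes a recent PyTorch; ROCm builds expose GPUs through torch.cuda and set
# torch.version.hip, while CUDA builds set torch.version.cuda instead.
import torch

def describe_accelerator() -> str:
    if torch.cuda.is_available():
        backend = "ROCm/HIP" if torch.version.hip else "CUDA"
        return f"{backend}: {torch.cuda.get_device_name(0)}"
    return "CPU only (no usable GPU backend in this build)"

print(describe_accelerator())
```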

[–] bleistift2@sopuli.xyz 19 points 15 hours ago (3 children)

When people switch to Linux they don’t do a lot of research beforehand. I, for one, didn’t know that Nvidia doesn’t work well with it until I had been using it for years.

[–] DickFiasco@sh.itjust.works 9 points 10 hours ago (1 children)

To be fair, Nvidia supports their newer GPUs well enough, so you may not have any problems for a while. But once they decide to end support for a product line, it's basically a death sentence for that hardware. That's what happened to me recently with the 470 driver: my older GPU worked fine until a kernel update broke the driver. There's nobody fixing it anymore, and they won't open-source even obsolete drivers.

[–] ChogChog@lemmy.world 4 points 7 hours ago (1 children)

I JUST ran into this issue myself. I'm running Proxmox on an old laptop and wanted to use its 750M… which is now one of those legacy cards, which I guess means I'd need to downgrade the kernel to use it?

I'm not knowledgeable enough to know the risks or the work I'd be looking at to get it working, so for now it's on hiatus.

[–] DickFiasco@sh.itjust.works 3 points 7 hours ago

You might be able to use the Nouveau driver with the 750M. Performance won't be great, but it might be sufficient if it's just for server admin.

[–] devfuuu@lemmy.world 11 points 12 hours ago

It's a good way for people to learn about companies that are fully hostile to the Linux ecosystem.

[–] Manticore@lemmy.nz 0 points 13 hours ago* (last edited 6 hours ago) (2 children)

Similar for me. For all the talk about what software Linux couldn't handle, I didn't learn that Linux is incompatible with Nvidia until AFTER I upgraded my GPU. I don't want to buy another GPU after less than a year, but Windows makes me want to do a sudoku in protest... but also my work and design software won't run properly on Linux, and all anybody can talk about is browsers and games.

I'm damned whether I switch or not.

[–] Damage@feddit.it 17 points 12 hours ago (2 children)

Linux hates Nvidia

got that backwards

[–] Manticore@lemmy.nz 2 points 6 hours ago

My point is they don't work together. I can believe Nvidia 'started' it, but it doesn't matter or help me solve my problem. I've decided I want to try Linux, but I can't afford another card, so I'm doing what I can.

[–] chocrates@piefed.world 2 points 8 hours ago (1 children)

Linus openly hated Nvidia, but I suspect Nvidia started it

[–] woelkchen@lemmy.world 4 points 7 hours ago

If you only suspect then you never heard the entire quote and only know the memes.

[–] M137@lemmy.world 0 points 9 hours ago (1 children)

You somehow still learned it wrong, and I don't understand how any of that happened. Nvidia not working well with Linux is so widely known and talked about that I knew about it, and the actual reason (which is the reverse of what you think), for several years before switching. I feel like you must never have looked anything up or spent any time somewhere like Lemmy or any Linux-focused forum, and basically decided to keep yourself in a bubble of ignorance with no way to learn anything.

[–] Manticore@lemmy.nz 4 points 6 hours ago* (last edited 5 hours ago)

This is an uncharitable interpretation of what I said.

Nvidia doesn't tell me it doesn't work. Linux users do. When I first used Linux for coding all those years ago, my GPU wasn't relevant, nobody mentioned it during my coding bootcamp or computer science certification several years ago, and Ubuntu and Kubuntu both booted fine.

When I upgraded my GPU, I got Nvidia. It was available and I knew what to expect. Simple as.

Then, as W10 (and W11) got increasingly intolerable, I came to Linux communities to learn about using Linux as a Windows replacement, looking into distros like Mint and Garuda, and behold: I came across users saying Linux has compatibility issues with Nvidia. Perhaps because it is 'so well known', most don't think to mention it; I learned about it from a random comment on a meme about gaming.

I also looked into tutorials on getting Affinity design software to work, and on which distros, and the best I could find was shit like "I finally got it to run, so long as I don't [do these three basic functions]."

I don't care who started it; I can already believe it's the for-profit company sucking up to genAI. But right now that doesn't help me. I care that it's true and that it's the card I have, and I'm still searching for a distro that will let me switch and meet my work needs, not just browsing or games.

I'm here now, aware that they don't work well together, still looking for the best solution I can afford, because I did look up Linux.

[–] moody@lemmings.world 7 points 14 hours ago

People buy Nvidia for different reasons, and not everyone runs into issues with it on Linux, so they see no reason to change what they're already familiar with.

[–] ashughes@feddit.uk 9 points 15 hours ago

Yeah, I stopped using Nvidia like 20 years ago. I think my last Nvidia card may have been a GeForce MX, then I switched to a Matrox card for a time before landing on ATI/AMD.

Back then AMD was only just starting their open-source driver efforts, so the “good” driver was still proprietary, but I stuck with them to support those efforts with my wallet. I'm glad I did, because it's been well over a decade since I had any GPU issues, and I no longer stress about whether the hardware I buy is going to work (so long as the kernel is up to date).

[–] DaddleDew@lemmy.world 6 points 15 hours ago* (last edited 15 hours ago)

I had an old Nvidia GTX 970 in my previous machine when I switched to Linux, and it was the source of 95% of my problems.

It died earlier this year, so I finally upgraded to a new machine and put an Intel Arc B580 in it as a stopgap, hoping video card prices would regain some sanity in a year or two. No problems whatsoever with it since then.

Now that AI is about to ruin the GPU market again, I decided to bite the bullet and get myself an AMD RX 9070 XT before prices go through the roof. I ain't touching Nvidia's cards with a 10-foot pole. I might be able to sell my B580 in a few months for the same price I originally bought it for.

[–] GraveyardOrbit@lemmy.zip 4 points 14 hours ago (2 children)

That's fine and dandy until you need to do ML; then there is no other option.

[–] SillySausage@lemmynsfw.com 9 points 13 hours ago (1 children)

I successfully ran a local Llama model with llama.cpp and an old AMD GPU. I'm not sure why you think there's no other option.

[–] GraveyardOrbit@lemmy.zip 5 points 12 hours ago (2 children)

AMD had approximately one consumer GPU with ROCm support, so unless your framework supports OpenCL, or you want to fuck around with unsupported ROCm drivers, you're out of luck. They've completely failed to meet the market.

[–] SillySausage@lemmynsfw.com 3 points 6 hours ago (1 children)

Llama.cpp now supports Vulkan, so it doesn't matter what card you're using.
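
As a rough sketch of what that looks like from Python (the model path is a hypothetical placeholder, and enabling Vulkan is a build-time option whose exact flag name depends on your llama.cpp / llama-cpp-python version):

```python
# Rough sketch using the llama-cpp-python bindings; assumes the package was
# built with the Vulkan backend enabled, e.g. something along the lines of
#   CMAKE_ARGS="-DGGML_VULKAN=on" pip install llama-cpp-python
# (flag name varies by version). The GGUF path below is a placeholder.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/example-7b.Q4_K_M.gguf",  # hypothetical local model file
    n_gpu_layers=-1,  # offload all layers to whatever GPU backend was built in
    n_ctx=2048,
)

out = llm("Q: Why do people pick AMD GPUs on Linux?\nA:", max_tokens=64)
print(out["choices"][0]["text"])
```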

[–] GraveyardOrbit@lemmy.zip 1 points 6 hours ago

Devs need actual support, for TensorFlow and the other frameworks.

[–] piccolo@sh.itjust.works 6 points 11 hours ago (1 children)

I mean... my 6700 XT doesn't have official ROCm support, but the ROCm driver works perfectly fine with it. The difference is that AMD hasn't put the effort into testing ROCm on their consumer cards, so they can't claim support for them.
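
For reference, the off-label workaround people commonly report for RDNA2 cards like the 6700 XT is an environment override; a hedged sketch (the override value is the widely cited one for that chip, not something verified here, and it must be set before any ROCm library initializes):

```python
# Hedged sketch: tell the ROCm runtime to treat the 6700 XT's gfx1031 chip as
# the officially supported gfx1030 target. Check your card's ISA name first
# (e.g. with rocminfo); the value below is the commonly reported one for RDNA2.
import os
os.environ.setdefault("HSA_OVERRIDE_GFX_VERSION", "10.3.0")

import torch  # ROCm build of PyTorch, imported after the override is set

print(torch.cuda.is_available())       # True if the override is accepted
print(torch.cuda.get_device_name(0))   # should report the Radeon RX 6700 XT
```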

[–] GraveyardOrbit@lemmy.zip 1 points 10 hours ago

The fact that it's still off-label like that is kinda nuts to me. ML and AI have been huge money makers for a decade and a half, and AMD seemingly doesn't care about GPUs. I wish they would invest more in testing and packaging drivers for the hardware that's out there. In the year of our lord 2025, I shouldn't have to compile from source or use AUR packages for drivers 😭