I can't use AMD (midwest.social)

It's necessary for my very important hobby of generating anime nudes.

[-] const_void@lemmy.ml 158 points 3 months ago

Can we stop using the Steven Crowder meme already? The guy is a total chode.

[-] NegativeInf@lemmy.world 65 points 3 months ago

Lol. He gives chodes a bad rep. Call him what he is: a christofascist misogynist grifter.

[-] Vorticity@lemmy.world 38 points 3 months ago

I don't really disagree, but I think that was the original intent of the meme: to show Crowder as a complete chode by having him assert really stupid, deeply unpopular ideas.

Lately, though, I think the meme's use has become too soft on Crowder.

[-] Chewy7324@discuss.tchncs.de 15 points 3 months ago

I've noticed lately that many memes' origins are worse than I'd assumed from the context they're used in. Racist, homophobic, lying people are not something I usually accept as entertainment, but they sneak into my (non-news) feed unnoticed through memes. I guess most people don't know a meme's origins and use it according to the meaning they formed on their own. Other memes, like the distracted-boyfriend meme, are meaningless stock photos, so I understand why many people use memes without thinking about their origins.

Anyway, thanks for pointing out who the person in the picture actually is.

[-] slacktoid@lemmy.ml 123 points 3 months ago
[-] pelotron@midwest.social 23 points 3 months ago

I must admit, when I learned this was Crowder, I had a sad

[-] Pantherina@feddit.de 16 points 3 months ago

Just change and reupload :D

[-] Diplomjodler@feddit.de 93 points 3 months ago

Oh please. There are better templates than this stupid Nazi cunt. I really don't want to see this fuckface.

[-] TheFadingOne@feddit.de 140 points 3 months ago

Yes! This, for example, is a nice alternative template.

[-] DumbAceDragon@sh.itjust.works 18 points 3 months ago

For the longest time I just thought he was that one guy from Modern Family.

[-] null@slrpnk.net 81 points 3 months ago

I thought NixOS was the new Arch btw

[-] JovialSodium@lemmy.sdf.org 35 points 3 months ago* (last edited 3 months ago)

From: https://knowyourmeme.com/memes/btw-i-use-arch

BTW I Use Arch is a catchphrase used to make fun of the type of person who feels superior because they use a more difficult Linux distribution.

I think that's fair to apply to some NixOS users. Source: BTW I use NixOS.

[-] wurstgulasch3000@lemmy.world 7 points 3 months ago

I mean, the barrier to entry is kind of high if you're used to more traditional package managers.

Source: I tried using nix on my Debian machine

[-] PlexSheep@feddit.de 9 points 3 months ago

Damn you're kinda right

[-] turkishdelight@lemmy.ml 60 points 3 months ago

At least the Arch people are not shilling for some corp.

[-] nexussapphire@lemm.ee 12 points 3 months ago

I'm tired of people taking sides like companies give a shit about us. I wouldn't be surprised to see five comments saying something like "you shouldn't buy Nvidia, AMD is open source" or "you should sell your card and get an AMD card."

I'd say whatever you have is fine; it's better for the environment if you keep it longer anyway. There are so many people who parrot things without giving much thought to an individual's situation or the complexity of a company's behavior. Every company's job is to maximize profit while minimizing loss.

Basically, if everyone blindly chose AMD over Nvidia, the roles would flip: AMD would start doing the things Nvidia is doing to maintain dominance, increase profit, and reduce cost, and Nvidia would start trying to win market share back from AMD by opening up, becoming more consumer friendly, and pricing competitively.

For individuals, selling your old card and buying a new AMD card for the same price will net you a slower card in general, or if you go used there's a good chance it doesn't work properly and the seller ghosts you. I should know; I tried to get a used AMD card and it died every time I ran a GPU-intensive game.

I also went the other way, upgrading my mother's Nvidia card with a new AMD card that cost three times what her Nvidia card ($50) would fetch on eBay, and it runs a bit slower than her Nvidia card did. She was happy about the upgrade though, because I put that Nvidia card in her movie server, where it does better live video transcoding than a cheap AMD card would.

[-] ReveredOxygen@sh.itjust.works 11 points 3 months ago

Who is saying to sell your card so you can buy AMD?

[-] Phegan@lemmy.world 58 points 3 months ago

Steven Crowder is a despicable human and does not deserve a meme template.

[-] RizzRustbolt@lemmy.world 16 points 3 months ago

I thought we were using the Calvin and Hobbes image now.

[-] hperrin@lemmy.world 44 points 3 months ago

I run Stable Diffusion with ROCm. Who needs CUDA?
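
For anyone wanting to try, the setup is roughly this (just a sketch; the index URL is PyTorch's published ROCm wheel index, and the ROCm version here is only an example, so match it to your driver stack):

```bash
# Install the ROCm build of PyTorch (example version; check
# PyTorch's site for the wheel index matching your ROCm install)
pip install torch torchvision --index-url https://download.pytorch.org/whl/rocm5.7

# ROCm builds expose the same torch.cuda API, so this should
# print True on a supported Radeon card
python -c "import torch; print(torch.cuda.is_available())"
```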

[-] HakFoo@lemmy.sdf.org 8 points 3 months ago

What distro are you using? Been looking for an excuse to strain my 6900XT.

I started looking at getting it running on Void and it seemed like (at the time) there were a lot of specific version dependencies that made it awkward.

I suspect the right answer is to spin up a container, but I resent Docker's licensing BS too much for that. Surely by now there'd be a purpose-built live image: write it to a flash drive, reboot, and boom, anime ~~vampire princes~~ hot girls.

[-] PumpkinEscobar@lemmy.world 7 points 3 months ago

If you don't like Docker, take a look at containerd and podman. I haven't done any CUDA with podman, but it's supposed to work.
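
For the ROCm side specifically, a rough sketch of what that looks like with podman (flags follow AMD's documented docker invocation, which podman accepts unchanged; the image tag is illustrative):

```bash
# Pass the ROCm device nodes through to AMD's published PyTorch image
podman run -it \
  --device=/dev/kfd \
  --device=/dev/dri \
  --group-add video \
  --security-opt seccomp=unconfined \
  rocm/pytorch:latest
```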

[-] carl@upload.chat 7 points 3 months ago* (last edited 3 months ago)

I can confirm that it works just fine for me. In my case I'm on Arch Linux (btw) with a 7900XTX, but it needed a few tweaks:

- Having xformers installed at all would sometimes break startup of stable-diffusion depending on the fork
- I have an internal and an external GPU, so I had to set HIP_VISIBLE_DEVICES so that it only sees the correct one
- I had to update torch/torchvision and set HSA_OVERRIDE_GFX_VERSION

I threw what I did into https://github.com/icedream/sd-multiverse/blob/main/scripts/setup-venv.sh#L381-L386 to test several forks.
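
Concretely, the tweaks boil down to something like this (values are from my machine and only illustrative; gfx1100 cards like the 7900XTX map to the 11.0.0 override, and the device index depends on your setup):

```bash
# Only expose the external GPU to ROCm (index is illustrative)
export HIP_VISIBLE_DEVICES=0

# Override the reported GPU architecture for torch's ROCm build
export HSA_OVERRIDE_GFX_VERSION=11.0.0

# Then launch whichever fork's start script you use, e.g.:
./webui.sh
```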

[-] anarchy79@lemmy.world 4 points 3 months ago

CUDA?! I barely even know'a!

[-] RustyShackleford@programming.dev 26 points 3 months ago* (last edited 3 months ago)

Earlier in my career, I compiled TensorFlow with CUDA/cuDNN (NVIDIA) in one container, and on another machine compiled it with ROCm (AMD) in a container, for cancerous-tissue detection in computer vision tasks. GPU-accelerated model training was significantly more performant with the NVIDIA libraries.

It's not like you can't train deep neural networks without NVIDIA, but their deep learning libraries, combined with the tensor cores in Turing-era and later GPUs, make things much faster.
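
If you're unsure which backend a given TensorFlow build actually sees, a quick sanity check like this works the same for CUDA and ROCm builds (standard TF API, nothing vendor-specific):

```bash
# Lists detected GPU devices; an empty list means CPU-only
python -c "import tensorflow as tf; print(tf.config.list_physical_devices('GPU'))"
```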

[-] lorty@lemmy.ml 17 points 3 months ago

Brother of "I need nVidia for raytracing" while only playing last-decade games.

[-] dinckelman@lemmy.world 9 points 3 months ago

I completely unironically know people who bought a 4090 exclusively to play League

[-] MossyFeathers@pawb.social 7 points 3 months ago

Not gonna lie, raytracing is cooler on older games than on newer ones. Newer games use a lot of smoke and mirrors to fake realistic lighting, which means raytracing isn't as obvious an upgrade, and can even be a downgrade depending on the scene. Older games don't have as much smoke and mirrors, so raytracing can offer more of an improvement.

Also, stylized games with raytracing are 10/10. Idk why, but applying RTX to highly stylized games always looks way cooler than on games with realistic graphics.

[-] RaoulDook@lemmy.world 4 points 3 months ago* (last edited 3 months ago)

Quake 2 does look pretty rad in RTX mode

[-] aniki@lemm.ee 12 points 3 months ago* (last edited 3 months ago)

I'm holding out on building a new gaming rig until AMD sorts out better ray-tracing and CUDA support. I'm playing on a Deck for now, so I have plenty of time to work through my old backlog.

[-] t0fr@lemmy.ca 14 points 3 months ago

I was straight up thinking of going to AMD just to have fewer GPU problems on Linux myself

[-] Rikj000@discuss.tchncs.de 10 points 3 months ago

In my experience,
AMD is bliss on Linux,
while Nvidia is a headache.

Also, AMD has ROCm,
their equivalent of Nvidia's CUDA.

[-] TropicalDingdong@lemmy.world 6 points 3 months ago

Yeah but is it actually equivalent?

If so I'm 100% in, but it needs to actually be a drop-in replacement for "it just works" like CUDA is.

Once I've actually got drivers set up, CUDA "just works". Is it equivalent in that way? Or am I going to run into library compatibility issues in R or Python?

[-] Deckweiss@lemmy.world 7 points 3 months ago* (last edited 3 months ago)

Not all software that uses CUDA has support for ROCm.

But as far as setup goes, I just installed the correct drivers and ROCM compatible software just worked.

So - it can be an equivalent alternative, but that depends on the software you want to run.
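
The reason supported software "just works" is that PyTorch's ROCm build keeps the CUDA-named API, so unmodified torch.cuda code paths run on a Radeon card. A quick way to check you're on the happy path (sketch; output is illustrative):

```bash
python -c "import torch; print(torch.cuda.get_device_name(0))"
# -> e.g. "AMD Radeon RX 7900 XTX" on a ROCm build
```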

[-] grue@lemmy.world 6 points 3 months ago

CUDA isn't the responsibility of AMD to chase; it's the responsibility of Nvidia to quit being anticompetitive.

[-] fl42v@lemmy.ml 11 points 3 months ago

Maybe ZLUDA also works

[-] possiblylinux127@lemmy.zip 9 points 3 months ago

ROCm is the AMD version

[-] Scraft161@iusearchlinux.fyi 7 points 3 months ago

Last I heard, AMD was working on getting CUDA working on their GPUs, and I saw a post saying it was pretty complete by now (although I myself don't keep up with that sort of stuff)

[-] CeeBee@lemmy.world 9 points 3 months ago

Well, right after that, Nvidia amended their license agreement to state that you cannot use CUDA with any translation layers.

The project you're thinking of is ZLUDA.

[-] Scraft161@iusearchlinux.fyi 10 points 3 months ago

NVIDIA finally being the whole bitch it seems, not unexpected when it comes to tech monopolies.

In the words of our lord and savior Linus Torvalds "NVIDIA, fuck you! 🖕", amen.

In all reality, a lot of individuals aren't gonna care when it comes to EULA BS unless they absolutely depend on it, and this whole move has me wanting an AMD GPU even more.

[-] anarchy79@lemmy.world 5 points 3 months ago* (last edited 3 months ago)

I need NVDA for the gainz

Edit: btw Raspberry Pi is doing an IPO later this year, bullish on AMD

[-] turkishdelight@lemmy.ml 4 points 3 months ago
[-] ByteWelder@lemmy.ml 4 points 3 months ago

My only regret about picking team red is that DaVinci Resolve doesn't support hardware encoding.

[-] XEAL@lemm.ee 4 points 3 months ago

Stable Diffusion works on Radeon 7900XTX on Ubuntu.

[-] ornery_chemist@mander.xyz 4 points 3 months ago* (last edited 3 months ago)

Man I just built a new rig last November and went with nvidia specifically to run some niche scientific computing software that only targets CUDA. It took a bit of effort to get it to play nice, but it at least runs pretty well. Unfortunately, now I'm trying to update to KDE6 and play games and boy howdy are there graphics glitches. I really wish HPC academics would ditch CUDA for GPU acceleration, and maybe ifort + mkl while they're at it.

this post was submitted on 16 Mar 2024
260 points (74.3% liked)
