this post was submitted on 04 Aug 2023
89 points (94.9% liked)

PC Gaming

top 39 comments

Thanks, I hate it

[–] GrossGhost@kbin.social 19 points 1 year ago (3 children)

I love my 1080ti (╥﹏╥) I really do...

[–] adora@kbin.social 4 points 1 year ago (1 children)

lol i'm running the same card!

[–] GrossGhost@kbin.social 3 points 1 year ago

I know it's old and there are newer and better ones out now, but it really is a fantastic card. It's served me well for a long time, and I guess it will have to continue doing so. Keep chugging little guy ❤︎

[–] dandroid@dandroid.app 1 points 1 year ago

The only reason I upgraded my 1080 Ti was because I'm a huge dummy and got a really expensive monitor that only had HDMI 2.1 and no DisplayPort, so my 1080 Ti couldn't use G-Sync on it. So after overpaying for a monitor, I overpaid for a graphics card as well. Yay, sunk cost fallacy.

My system kicks ass now, though. Still can barely play Jedi Survivor.

[–] Robdor@lemmy.world 1 points 1 year ago

That's a great card though, and if it does what you want/need then you're doing just fine.

[–] ono@lemmy.ca 12 points 1 year ago (3 children)

Increased sales volume should be bringing prices down, not up. Perhaps the real problem is that we need more production capacity?

In the meantime, I'll happily use an AMD card for gaming.

[–] jjagaimo@lemmy.ca 6 points 1 year ago (1 children)

Limiting supply tends to increase prices. There's no incentive to significantly increase supply when they can milk it for all it's worth and push prices up between generations faster than inflation. Just look at the price increases after covid. Prices have come down slightly, but there has been a permanent upward shift in GPU prices.
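
A quick back-of-the-envelope illustration of that "faster than inflation" point; a minimal sketch, assuming approximate launch MSRPs for the top consumer card of each generation and a flat 4% annual inflation rate (both rough assumptions, not sourced figures):

```python
# Compare flagship GPU launch prices against what inflation alone would predict.
# The MSRPs and the 4% inflation rate are illustrative assumptions.
ASSUMED_INFLATION = 0.04  # ~4%/yr, assumed average for the period

launches = {        # year: approximate launch MSRP in USD (assumed)
    2017: 699,      # GTX 1080 Ti
    2018: 999,      # RTX 2080 Ti
    2020: 1499,     # RTX 3090
    2022: 1599,     # RTX 4090
}

base_year, base_price = 2017, launches[2017]
for year, price in launches.items():
    inflation_only = base_price * (1 + ASSUMED_INFLATION) ** (year - base_year)
    print(f"{year}: actual ${price}, inflation-only baseline ${inflation_only:.0f}")
```

That's roughly a 2.3x jump at the top of the stack over a window where inflation alone would account for about 1.2x.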

[–] ono@lemmy.ca 4 points 1 year ago* (last edited 1 year ago)

Just look at the price increases after covid.

It's not really an apples-to-apples comparison. Most of the things people continue buying at prices inflated by covid issues and corporate greed are things they need, like food. Graphics cards are more of a luxury.

If GPU prices stay high, I simply won't buy them. Even if I were to finally relent when my current one eventually dies, I would be buying them far less often than I otherwise would, meaning less profit for the sellers in the long term.

[–] Kolanaki@yiffit.net 3 points 1 year ago* (last edited 1 year ago)

This sounds more like prices going up because a new emerging hobby needs multiple extremely powerful GPUs, the same way crypto mining did. That hobby just happens to be running AI at home instead of using cloud services. So there would be a drop in supply as a few hoarders snap up everything for that, or to scalp at higher prices.

They just literally can't make them fast enough to keep prices stable against the shitty kinds of consumers.

Yes, the problem is production capacity, but it's very difficult to get that capacity up and running. For example, Intel started building two factories in Ohio last year. They won't be up and running until at least 2025.

This stuff is complicated, and nobody predicted the rise of covid, cryptocurrencies, or AI, or if they did, nobody was convinced enough to dedicate potentially billions of dollars to building capacity to capitalize on it.

[–] charles@lemmy.ca 12 points 1 year ago (3 children)

Guess I'll be hanging on to my 1060 for a bit longer... Been looking to upgrade for a while now but I just can't justify the prices, especially in Canadian dollars.

[–] Nevrome@lemmy.ca 5 points 1 year ago (1 children)

GPUs tend to be cheaper/more affordable on AMD's side. My local PC shop had decent deals on the RX6600XT and RX6750XT lately. For PC gaming only, they do the job.

If you don't mind switching from Nvidia to AMD, I'd check them.

[–] charles@lemmy.ca 1 points 1 year ago (1 children)

Yeah I don't really do much with my desktop anymore outside of gaming so AMD would definitely be good enough. I just haven't gotten lucky when I remember to check prices lol, I seem to miss a lot of the deals when they come up.

[–] Nevrome@lemmy.ca 3 points 1 year ago (1 children)
[–] charles@lemmy.ca 2 points 1 year ago (1 children)

Would that be decent for 1440p at 100-120Hz?

[–] Nevrome@lemmy.ca 1 points 1 year ago* (last edited 1 year ago) (1 children)

I do run 1440p/120Hz on most of my games and I don't even have the 6650XT (I own an ASRock PG 6600XT).

It would be more than enough for most titles at medium-to-high settings at 1440p/120Hz. Some recent games (yes, you, Diablo 4) will max out the 8GB of VRAM, so you'll have to tinker with settings until you hit a sweet spot, but it manages 1440p.

Edit: as I was writing this reply, the card's price increased to 400 CAD. Keep looking for deals, add the cards you want to your saved/watch list, and order when a deal comes up.

[–] charles@lemmy.ca 2 points 1 year ago

Awesome!! Really appreciate the recommendation!! I'll definitely be keeping an eye out for it. I'd love to be able to finally move my 1060 to my unraid server for transcoding.

[–] UncleBadTouch@lemmy.ca 2 points 1 year ago

Got an RX 6650 XT Mech from Amazon for just under $400 CAD after tax and other BS. Keep an eye on the card(s) you want and hold out till a sale. Also, I only looked at the MSI store, not the randos selling cards. I know lots can be trusted, but so many are just scammers that I didn't want to chance a problem.

[–] notst@lemmy.world 2 points 1 year ago (1 children)

Was about to upgrade my 1080, then the Steam Deck came along and suddenly I have no desire for a new GPU. I enjoy nice graphics, but not THAT much, I guess.

[–] charles@lemmy.ca 2 points 1 year ago

Completely agree, there's a good chance I'd have upgraded by now if it weren't for my Steam Deck. Especially after getting a dock, I love being able to quickly hook it up to a monitor/TV when I want a bigger screen.

[–] a1studmuffin@aussie.zone 9 points 1 year ago (1 children)

I'm surprised we haven't seen TPU cards (think Coral AI but at a larger scale) being made and sold for this purpose, especially if they're faster and more energy efficient at AI-oriented tasks than GPUs.

[–] You999@lemmy.world 3 points 1 year ago

Well, there is the Asus AI Accelerator card, but that's just 8 Coral TPUs. I think the real reason we don't see large TPUs is that Nvidia cards have had tensor cores built into the architecture since Volta, and with a GPU you don't have to worry about system memory speed since you have 80GB of HBM.

[–] dmm@lemmyfly.org 7 points 1 year ago

Glad I snatched a used 3090 for like USD 600

[–] pixelscience@lemm.ee 5 points 1 year ago* (last edited 1 year ago) (1 children)

I feel like this could be completely avoided if Nvidia would just make a reasonably priced, widely available specialized AI card with no video outputs or game features.

Currently their cards geared towards AI are insanely priced and not very attractive with such a poor price/performance ratio vs relatively cheap gaming cards.

[–] mosiacmango@lemm.ee 2 points 1 year ago* (last edited 1 year ago)

And all of their insanely overpriced AI-dedicated cards are selling out at max production, leading to record earnings for Nvidia.

I wouldn't expect them to upset that apple cart until AMD/Intel force their hand.

[–] NaoPb@beehaw.org 5 points 1 year ago

And prices hadn't even dropped to a level where I could afford a new GPU.

I guess I'll keep sticking to what I have.

[–] Mandy@beehaw.org 4 points 1 year ago (1 children)
[–] Psythik@lemm.ee 1 points 1 year ago (1 children)

You poor bastard. My GF has that GPU. The only game she can max out is The Sims (but that's all she plays, so it's good enough for her). The 1050 Ti can't even handle Wallpaper Engine beyond 15 FPS.

[–] Mandy@beehaw.org 1 points 1 year ago

Max settings are pretty meaningless tbh, I don't notice much difference in a lot of cases anyway; FPS is where it's at.

And before that it was a 750 Ti, soooo

[–] MangoPenguin@lemmy.blahaj.zone 3 points 1 year ago* (last edited 1 year ago) (1 children)

Hmm, archive.is is blocking me with a constant captcha; passing it just reloads the block page again. It seems like their captcha system is broken or something.

[–] alessandro@lemmy.ca 1 points 1 year ago (1 children)

Well, that's counter-intuitive, since archive.org is all about preservation and availability of data. Here's the full link: https://metro.co.uk/2023/08/04/pc-graphics-cards-to-get-more-expensive-again-thanks-to-ai-boom-19269496/

[–] villasv@lemmy.ca 3 points 1 year ago* (last edited 1 year ago)

archive.is is privately funded by its owners, while archive.org is a registered non-profit. They are not the same.

[–] villasv@lemmy.ca 3 points 1 year ago* (last edited 1 year ago) (1 children)

I guess that should also translate into possibly higher prices for the Mac Pro/Studio line? Or not? I don't really know, but not worrying about GPU prices feels nice.

[–] Logster998@sh.itjust.works 2 points 1 year ago

Apple's Neural Engine cores aren't as good as Nvidia's tensor cores, which are considered “the best” hardware for AI (I don't know what AMD does). Even the Mac Pro with PCIe can't run GPUs, so that shouldn't be an issue. Nobody bought Macs for mining crypto in 2021, so it's less likely people will do it for AI.

[–] Syldon@feddit.uk 2 points 1 year ago (2 children)

How big do they think the AI market is going to be? It is not going to compete with consumer demand for very long. Chips last for years, so once an AI chip is purchased it will be in use until the next-generation GPU arrives. So yes, the initial purchases may be predominantly expensive AI chips that will make AMD and Nvidia a boatload of cash. But that is a finite market, and TSMC makes a lot of chips. Chips for industry have always been at the forefront of the cash cow that is GPU and CPU sales. Intel is also making an entrance at the low end.

I think I will be waiting a while. I have little interest in being gouged for no other reason than greed.

[–] altima_neo@lemmy.zip 2 points 1 year ago

Yeah, they're nuts if they think consumer-grade graphics cards are of any use to anyone seriously dealing with AI.

The biggest thing holding back cards right now, even the 4090, is VRAM. AI needs a ton of it, especially if you're trying to train an AI.

More than likely, there will be more demand for those 48GB+ enterprise cards.
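
To put rough numbers on the VRAM point, here's a minimal sketch of why training doesn't fit on consumer cards, assuming mixed-precision training with a plain Adam optimizer (the bytes-per-parameter counts are the usual rule of thumb, not measurements from any particular framework):

```python
# Rule-of-thumb VRAM estimate for training with mixed precision + Adam:
# roughly 16 bytes per parameter, before activations are even counted.
def training_vram_gb(params_billion: float) -> float:
    params = params_billion * 1e9
    weights   = 2 * params   # fp16 weights
    grads     = 2 * params   # fp16 gradients
    optimizer = 12 * params  # fp32 master weights + two fp32 Adam moments
    return (weights + grads + optimizer) / 1e9

for size in (1, 7, 13):
    print(f"{size}B params -> ~{training_vram_gb(size):.0f} GB before activations")
```

Even a 7B-parameter model lands around ~112GB before activations, way past a 24GB 4090, which is exactly why the 48GB+ enterprise cards are the ones in demand.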

[–] keldzh@lemmy.ca 1 points 1 year ago

But AI is starting to become a consumer product too. It's no longer just bots to help tech support, but products used by regular people (Bing search with the help of ChatGPT, GitHub Copilot helping developers write code, and so on). With increased usage it'll need more GPUs to generate more answers. And I'm not sure which market is bigger.

[–] Sigmatics@lemmy.ca 1 points 1 year ago* (last edited 1 year ago)

I can wait a couple of generations till the next AI winter. Or until we have widespread dedicated AI chips; this training on graphics cards needs to stop.