My 1080Ti finally died this year (started overheating). I've kept it though, in the hope I can fix it one day...
Every other part is just cobbled together from older rigs or sporadic upgrade pushes when a sale looks good.
It's easy to go too far in either direction instead of just doing what fits your needs (which in fairness, can sometimes be difficult to precisely pin down). Blindly going "it's old, I need to upgrade" or "it still runs, it's not worth upgrading" will sometimes be right but it's not exactly tailored advice.
Someone I know was holding out for ages on a 4790K (2014) and upgraded a year or two ago to a then-current-gen system. They said the difference it made to their workflow was huge - enough that they used the experience to tell their boss that the work systems (similar to what they'd had themselves) should get upgraded.
At the end of 2022 I'd had my current monitor(s) for about 10 years and had spent years hearing everyone say "wow, upgrading my monitor was huge", whether that meant 1440p being such an upgrade over 1080p and/or high refresh rates (120+ Hz) being such an upgrade over 60Hz. I am (or at least was in the past) a pretty competitive player in games, so you'd think I'd be a prime candidate for it, but after swapping my primary monitor from a 60Hz 1200p screen to a 144Hz 1440p screen I... honestly could barely notice the difference in games (yes, the higher refresh rate is definitely enabled, and ironically I can tell the difference easily outside of games lol).
I'm sensitive to input latency, so I can (or at least could, don't know if I still can) easily tell the difference between the responsiveness of ~90 FPS and ~150 FPS in games, so it's extra ironic that pumping the refresh rate of the screen itself didn't do much for me.
I noticed a night and day difference myself going from 60Hz to 120Hz after waiting years to do so. I noticed it immediately in first-person games because everything went buttery smooth.
I can't tell the difference anywhere else
I know someone like this, who also insisted that Windows 7 was just better.
He backtracked immediately after a system upgrade, updated to Win 10, and started bragging about his specs.
Maybe it's just my CPU or something wrong with my setup, but I feel like new games (especially ones that run on Unreal Engine 5) really kick my computer's ass at 1440p. I got a 7900 XTX just last year and I'm using a Ryzen 9 3900XT from 2020, for reference. I remember getting new cards like 10 years ago and being able to crank the settings up to max with no worries, but nowadays I feel like I have to worry about lowering settings or resorting to upscaling or frame generation.
Games don't feel very optimized anymore, so I can see why people might be upgrading more frequently, thinking it's just their PC being weak. I miss the days when we could just play games at native resolution.
Not just you. The difference between a poorly optimized game and a game that looks even better but is well optimized is insane these days.
Can confirm, pc bought in 2016, upgraded CPU and GFX card, can play VR games and games at 4k with decent framerates.
There is no way in hell a 2014 computer is able to run modern games on medium settings at all, let alone run them well. My four-year-old computer (Ryzen 5 4000, GTX 1650, 16 GB RAM) can barely get 30-40 fps in most modern games at 1080p even on the absolute lowest settings. Don't get me wrong, it should still work fine. However, almost no modern games are optimized at all and the "low" settings are all super fucking high now, so anon is lying out of his ass.
I was with them until my girlfriend gifted me a 180Hz monitor last year and now I can't deal with less than 90 FPS so I had to finally upgrade my RX580 (I just found out it stopped getting driver updates in January 2024 so I guess it was about time). High refresh rates ruin you.
Because you're the lemming who isn't running off the cliff. It pisses them off.
My 2008 librebooted T440p ThinkPad says hold my beer. Browses the web like it's a 2025 desktop. It's amazing. Except for the compile times (it runs Gentoo :D)
5 years here, a lower-end purchase to begin with. Still works fine. Only a few games I need to lower to True Potato settings to run.
I want to say I upgrade every 6 years. Get mid-to-upper specs and a midrange video card and it'll last you a long time.
Same, same. Except I don't really play games, but use the computer for other hobbies. It's still plenty fast and does everything I need it to do. So why buy something that does exactly the same, just is newer and looks different?
I had to replace my computer because it died.