submitted 2 years ago* (last edited 2 years ago) by cosecantphi@hexbear.net to c/technology@hexbear.net

I'm not much of a tech person and I have no idea if my observations are worth anything, but from where I'm sitting it seems computer technology isn't advancing anywhere near as quickly as it was from the 80s to the early 2010s.

The original Moore's law has been dead for a long time, but the broader trend of rapidly increasing computational power doesn't seem to be holding up anymore either. The laptop I have now doesn't feel like much of an improvement over the laptop I had four years ago at a similar price point, and the laptop I had six years ago is really only marginally worse.
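
For a rough sense of scale, here's a back-of-the-envelope sketch (the two-year doubling period is just the classic rule of thumb, and treating it as raw perceived speed is a simplification on my part):

```python
# Hypothetical illustration: what a two-year doubling cadence would predict
# for relative performance after a few years (no real benchmark data here).
def projected_speedup(years: float, doubling_period_years: float = 2.0) -> float:
    """Relative capability if the historical doubling trend had held."""
    return 2 ** (years / doubling_period_years)

for years in (2, 4, 6, 10):
    print(f"After {years:>2} years: ~{projected_speedup(years):.0f}x")
```

Under that trend, a four-to-six-year-old laptop should feel roughly 4-8x slower than a new one, which is nothing like my experience.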

So for those in the know on the relevant industry, how are things looking in general? What is the expected roadmap for the next 10 to 20 years? Will we ever get to the point where a cheap notebook is capable of running today's most demanding games at the highest settings, 144fps, and 4k resolution? Sort of like how today's notebooks can run the most intensive games of the 90s/early 2000s.

top 4 comments
[-] Diglie@hexbear.net 0 points 2 years ago

I think people's perception of what a "computer" is hasn't kept pace with how the technology has actually advanced.

Everyone arguing about "Moore's Law" focuses on the individual PC as their target of investigation, while computational power is still rapidly increasing. It's just happening in a much more distributed fashion than in the times of yore & Moore. Focusing on individual devices is pointless. Soon we're liable to be computing on glorified screens with hardly any graphics or processing capability of their own, and it will all happen somewhere far away.

Look at the overall network, the vast array of servers and clouds and the massive computational power of our planet: it's skyrocketing at a breakneck pace. Who cares about transistor size, CPU speed, all that shit when in 5 seconds of interfacing with the internet you're liable to pull data from 5,000 different sources with an astronomical number of data points.

[-] cosecantphi@hexbear.net 0 points 2 years ago* (last edited 2 years ago)

I don't get how cloud computing can really beat having your own powerful PC. No matter how powerful the server you're connecting to is, won't the limitations imposed by distance and the speed of light always make it preferable for the computation to be done on a chip a few centimeters across rather than on a server hundreds of miles away?
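
As a rough sketch of the physics (the 500 km distance and the two-thirds-of-c fiber speed are assumed numbers, not anything from this thread):

```python
# Back-of-the-envelope propagation delay only; real networks add routing,
# queueing, and server processing time on top of this physics-limited floor.
C_FIBER_KM_PER_S = 200_000  # signals in optical fiber travel at roughly 2/3 of c

def round_trip_ms(distance_km: float) -> float:
    """Minimum round-trip time over the given one-way distance."""
    return 2 * distance_km / C_FIBER_KM_PER_S * 1000

print(f"server ~500 km away : {round_trip_ms(500):.1f} ms")            # ~5 ms
print(f"chip ~5 cm away     : {round_trip_ms(0.00005) * 1e6:.1f} ns")  # ~0.5 ns
```

Even at the physics limit, a data center a few hundred kilometers away sits orders of magnitude further away in latency terms than local hardware, which seems like it would matter a lot for interactive stuff and much less for bulk work.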

[-] CanYouFeelItMrKrabs@hexbear.net 0 points 2 years ago

For gaming it's definitely better to run games locally. But I remotely access my office computer from laptops and can hardly tell that I'm using a remote connection.

And also, right now with us on this website, there is computing going on keeping the site running and transmitting the info across the network. The computing our personal devices do is one part of that.

[-] Des@hexbear.net 1 points 1 year ago

Requires infrastructure investment, however, which means state capacity. Unless the neoliberal order folds, I see that being a hard limit for universal cloud computing outside of major urban areas.
