A.I. and big tech do not want you to have computing power to challenge their digital hegemony.
They will start pushing dumber and dumber devices and making development boxes so far out of reach that only the mega-wealthy can afford to buy them.
Dumb devices will not be able to run shitty vibe-coded OSes and apps. Your modern Android phone has orders of magnitude more computing power than a 20-year-old PDA despite having the same (or even less) functionality. Or even compared to a 10-year-old Android device. Software has been becoming slower and more bloated for decades, and it's only going to accelerate with "ai".
There will be more software restrictions and locked down "ecosystems" but I don't see the hardware becoming weaker. There is no going back.
I uninstalled Google services and shit from a 60€ Android phone and boom! Now stand-by battery life is 7 days, where before it was ~2 days.
Well yeah, if it's not doing anything the battery will last longer, yup.
Microsoft and Nvidia have been trying for years to offload computing power to their own systems, while your computer becomes little more than a remote access terminal into this power when these companies allow you access to it.
See: Nvidia's GeForce Now, Xbox Cloud Gaming, and pretty much every popular LLM (there are self-hosted options, but that's not the major market rn, or the direction it's headed).
There are, of course, struggles there that they have had a hard time overcoming. Particularly with something like gaming, you need a low-latency, high-speed internet connection; but that's not necessary for all applications, and it has been improving (slowly).
Software has been becoming slower and more bloated for decades and it's only going to accelerate with "ai".
This is mostly true, but a little misleading. (although the AI part is absolutely correct)
This is mostly a result of having more powerful hardware. When you're working with very limited hardware, you have to be clever about the code you write. You're incentivized to find trade-offs and workarounds to get past physical limitations. Computer history is filled with stuff like this.
Starting around the mid 90s, computer hardware was advancing at such a rapid pace that the goalposts shifted. Developers had fewer limitations, software got more ambitious and teams got larger. This required a methodology change. Code suddenly needed to be easier to understand and modify by devs who might not have a full understanding of the entire codebase.
This also benefited the execs: entirely unoptimized, or sometimes even unfinished, code could be brought to market, which meant a faster return on investment.
Today we are seeing the results of that shift. Massive amounts of RAM and powerful CPUs are commonplace in every modern device, and code is inefficient, but takes up basically the same percentage of resources that it always has.
This change to AI coding is unavoidable because the industry has decided that they want development to be fast and cheap at the cost of quality.
The goal here isn't to have personal devices run the shitty vibe-coded apps; it's to lease time in datacenters to have them run it for you and stream it to your device for a monthly fee.
It could when you're literally just running a basic OS and everything else is in "the cloud". Like that Windows 365 box Microsoft released recently that doesn't actually run Windows itself.
And who is going to create this perfect and resource efficient OS? Literally all tech corporations are headed in the opposite direction. All proprietary consumer OSes are getting more bloated by the hour, and their developers are being replaced with incompetent vibe coders.
Incompetent? I'll have you know that I'm a prompt engineer. 😏
I'm no amateur. I end every query with "and no bugs, please."
Does it at least apologize to you when it adds bugs anyway?
Imagine if people who know how to use a search engine properly called themselves "search engine prompt engineers". Maybe people who are good at communicating should start calling themselves "human interface prompt engineers".
Social engineer
It already exists: just the shitty, cut-down operating systems used by existing thin clients.
Yeah and the fucking FTC isn’t going to do anything about it.
We need Lina Khan back.
What was that book again?
I don't know, but I do know that the reason Sparc boxes and Solaris/SunOS are only known by people who worked in business or academia is that affordable Intel PCs let computing reach the masses even while Crystal Tower computing existed.
Now it seems that affordable PCs are not what the mega-wealthy want, so they will make every computing device capable of mounting a challenge to A.I. as expensive as possible, just like Sun did with their hardware.
They can do this because the market can't respond to make more competition. And tariffs make that worse.
For a purpose nobody wants, and it will be used for mass surveillance and bleeding-edge sentiment manipulation.
Well, hundreds of CEOs and billionaires want it. I know some that made AI their whole case to get the CEO job. Just not us normal 99% of the world.
Correct. I will draw your attention to onion futures
Wow that was a fascinating read.
Very interesting story
My favorite part is that even if these data centers get built, hardware to support LLMs is improving at a pretty fast rate due to the stupid amounts of money being burned on it.
This hypothetical hardware will be out of date by the time the data centers are ready for it, and they'll either be built with out-of-date tech or they'll blow past their budget due to the actual difference between what it costs to make the up-to-date hardware and what was planned.
That's why they're making it expendable. Those chips are designed to survive no more than 5 years of service in data centers, an unprecedentedly low level of durability provisioning. They are intentionally making e-waste.
My homelab welcomes the "outdated" or in need of maintenance chips coming my way.
Oh no, I can't use the latest text generation tool and all I can do is crazy simulation, 3D model, and graphics work. What a shame. /s
Uhh. Just chiming in here as someone that does business-to-business IT support... Most of the NPC office workers are almost demanding access to "AI" stuff.
I'm not saying this will turn out well; in fact, I think it will probably end poorly, but I'm not in charge around here. There's a nontrivial outcry for "AI" tools in business.
There's profit happening with it right now. Maybe not enough to offset costs yet, but there's a market for these things in the mindless office drone space.
To be absolutely clear, I think it's an idiotic thing to have/use, especially for any work in IT, but here we are. I have middle managers quoting ChatGPT as if it's proof that what they want can be done. I've been forwarded complete instructions to use fictional control panels and fictional controls to do a thing, when it's not possible for that thing to be done.
"AI" is a corporate yes-man in the worst ways possible. Not only does it agree that something can be done, even when it's not possible, but it provides real-enough-looking directions that what it's proposing seems actually possible and reasonable. I once asked Copilot how to drive to the moon and it told me I'd run out of gas. While I would definitely run out of gas trying to get to the moon by car, when I was done trying and had run out of gas, I wouldn't be any closer to the moon than I usually am.
The thing is an idiot on wheels at the best of times, and a blatant liar the rest of the time. I don't know how people can justify using it in business when a mistake can lead to legal action, and possibly a very large settlement. It's short-sighted, and it's not worth the time or effort involved in the whole endeavor.
Simply because they can read the writing on the wall. Corporate made every single decision possible to signal "use AI or get fired." With mass layoffs being driven mainly by whole industries pivoting to AI, people are fighting desperately to stay relevant. Every pundit and tech guru is singing "learn AI or become unemployable." It is a struggle for survival, not heartfelt trust or belief in the tech. Hell, they might not even understand how it works; people just know they need it on their CV to keep their meager income coming.
As someone who works in a knowledge industry: anyone relying on AI for their workload will end up with more errors than solutions. IT requires a high degree of accuracy in the information you handle to get to a solution. Out of everything you can say about AI, you can't say that it's highly accurate.
Any time I've given a technical question to Copilot or ChatGPT, I usually get nonsense back. It will not help me solve any of the issues I need to solve as part of my job.
I understand how the current version of "AI" works, and from that knowledge I know that for any meaningful task I face with even a small amount of complexity, these so-called "AI" bots can't possibly have any relevant answers. Most of the time I can't find relevant answers on the Internet myself either; sometimes I only get adjacent information that helps lead me to the unique solution I need to implement.
"AI" in IT support actually makes things go slower and causes more issues and frustration than it provides tangible help with anything that needs to be done. You end up going down rabbit holes of misinformation, wasting hours trying to make an ineffective "solution" work, just because some "AI" chatbot sent you on a wild goose chase.
For the most part, AI is the best OCR ever designed, and when used for that it really is great. Most AI agents you see out there are mostly just used for that: OCR.
It's also nice-ish for starting to write simple programs; if you know how it works, it sets you more or less on the right path in a few prompts. That head start can be nice.
It also helps in Excel with charting.
It also is helpful for acquiring knowledge. AS LONG AS YOU CHECK THE LINKED SOURCES.
If you don't, you will crash and burn. Not eventually, but quickly.
So yes, AI does have uses, and yes, it will cost some people their jobs, especially in knowledge industries and IT.
But then again, that's a tale as old as time. Stuff changes.
(AI) datacenters will not go away. Desktop processing will vanish. And then, 15 years from now, someone gets a great idea and starts selling Personal AI computers. And this cycle will redo from start.
Machine learning for OCR makes a ton of sense. Human writing is highly dynamic, especially handwriting. It makes sense that OCR would benefit from a trained model for recognising words.
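The point above is that recognition works better by comparing against learned examples than by fixed rules. As a toy illustration (everything here is invented for the sketch; real OCR uses trained neural networks, not hand-made templates), here's a minimal nearest-neighbour classifier over 5x5 glyph bitmaps:

```python
# Toy nearest-neighbour "OCR": classify a noisy glyph by comparing it
# against stored reference bitmaps (a stand-in for a trained model's
# learned examples). Purely illustrative.

# 5x5 reference bitmaps for two letters (1 = ink, 0 = blank).
TEMPLATES = {
    "T": [
        [1, 1, 1, 1, 1],
        [0, 0, 1, 0, 0],
        [0, 0, 1, 0, 0],
        [0, 0, 1, 0, 0],
        [0, 0, 1, 0, 0],
    ],
    "L": [
        [1, 0, 0, 0, 0],
        [1, 0, 0, 0, 0],
        [1, 0, 0, 0, 0],
        [1, 0, 0, 0, 0],
        [1, 1, 1, 1, 1],
    ],
}

def hamming(a, b):
    """Number of pixels where two bitmaps disagree."""
    return sum(x != y for ra, rb in zip(a, b) for x, y in zip(ra, rb))

def classify(glyph):
    """Return the template letter closest to the input bitmap."""
    return min(TEMPLATES, key=lambda k: hamming(TEMPLATES[k], glyph))

# A "handwritten" T with a stray pixel and a missing one.
noisy_t = [
    [1, 1, 1, 1, 0],
    [0, 0, 1, 0, 0],
    [0, 0, 1, 0, 0],
    [0, 1, 1, 0, 0],
    [0, 0, 1, 0, 0],
]
print(classify(noisy_t))  # -> T
```

The sloppy "handwriting" still lands closer to the right letter than to any other, which is the intuition behind why dynamic human writing needs example-driven recognition rather than exact matching.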
This actually isn't that weird; it happens all the time.
However, it's less common that it impacts a common consumer product of the same type.
But a material being used in a huge project causing prices to shoot up ahead of time is very normal.
It's just usually stuff like concrete, steel, lumber, etc. that is impacted the most, but it turns out RAM as a global industry wasn't ready to scale up to a sudden huge spike in demand.
Give it a couple of years and it'll level out as producers scale up to meet the new demand.
I just don't understand how selling everything they produce increases costs. Are they just charging more for more profit, or are the increased prices funding increased production?
The vast difference in economics between retail and manufacturing doesn't make sense to me.
A bigger share of the current factory output is being routed to the tech companies rather than to the consumer component shops, meaning there's less to go around. Since there's less to go around, consumer-facing stores are forced to bid higher to maintain their stock, and that cost is then passed on to us plebs.
At that scale it's kind of like an auction for the capacity to produce the chips (and it's the DRAM chips, not the finished modules, in this case).
So for a DIMM retailer to get enough chips to make a product, they need to outbid Nvidia :-(
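The auction dynamic can be sketched crudely: fixed capacity is filled from the highest bid down, so a smaller buyer gets only whatever is left. All names and numbers here are invented for illustration:

```python
# Toy model of scarce chip capacity allocated to the highest bidders.
# Buyers, prices, and quantities are made up for the sketch.

def allocate(capacity, bids):
    """bids: list of (buyer, price_per_unit, units_wanted).
    Fill orders from the highest bid down until capacity runs out."""
    filled = {}
    for buyer, price, wanted in sorted(bids, key=lambda b: -b[1]):
        units = min(wanted, capacity)
        if units:
            filled[buyer] = units
            capacity -= units
    return filled

bids = [
    ("hyperscaler", 12.0, 800),   # deep pockets, huge order
    ("dimm_retailer", 5.0, 300),  # consumer-facing module maker
]
print(allocate(1000, bids))  # hyperscaler gets 800, retailer only 200
```

With 1000 units of capacity, the high bidder takes 800 and the retailer gets 200 of the 300 it wanted, which is exactly the "bid higher or go short" squeeze described above.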
I can help:
That was the excuse for memory.
They're also stopping production of affordable desktop graphics cards that don't do AI well.
Next up, they're going to stop making SSDs and force the price of NVMe up.
They've yet to reveal that they're going to drive up the prices of CPUs, motherboards and power supplies.
AI companies have found ways to force consumers to use their products against their will. These hardware vendors no longer need to compete to get you to buy: if you use a search engine, a browser, a mail service or a major social network, you're using their stuff, and the AI company buys it on your behalf. Since they can just charge you for that and you can't do anything about the price, your purchases no longer matter.
It's like food, cars, you know: everything but paychecks is going up in numbers.
We're pushing the gas pedal toward a boring dystopia.
It is happening to cars as well. Eventually you will only be able to rent a car and pay for each feature by the minute. Features such as brakes, lights, windshield wipers...
As long as I can still own a bicycle.
This is arguably the best, brief Analysis of Bullshit that I’ve ever seen.
One nice thing about the current situation is we have a very effective and intuitive way to explain to gamers that capitalism is bad.
Time to make our own gear, I guess. Anyone got chip production skills?
I've got an air fryer, a deep fryer, a standard oven... Hell, I think I even have a spare potato lying around. Let's do this shit!
Bluesky user discovers futures
That's not entirely fair, though. The key part of the statement is, "to service a demand that doesn't exist." The problem with AI is that there is little user demand for it, so all the capacity being aggressively built is going to eventually hit a brick wall.
I shouldn't say there is little user demand. There is little paying user demand, but that's the thing that makes and breaks investments the size of these. Enshittification is built-in, and for a change the cost of the data centers is a visible reminder that someone is going to pay, eventually.
Given the size of the players involved and their political connections, I assume the chump is going to be the taxpayer in the end. That is going to be fun to watch: billionaires getting a massive bailout while you are getting kicked off your health insurance.
And the fact that it's normalized to someone like you is a bigger problem.
It's not even futures, just basic demand through scarcity.
There is an expectation that there will be a severe shortage of RAM soon, which causes people to panic buy it now, which drives prices up. It's not that complicated.
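The mechanism is just a feedback loop: an expected shortage pulls demand forward, and price rises with the gap between demand and supply. A back-of-the-envelope sketch (the adjustment constant and all quantities are invented):

```python
# Toy price-adjustment loop: expected shortage -> panic buying raises
# demand now, and price climbs with the relative demand/supply gap.
# The constant k and all quantities are made up for illustration.

def step_price(price, demand, supply, k=0.5):
    """Adjust price in proportion to relative excess demand."""
    return round(price * (1 + k * (demand - supply) / supply), 2)

price = 100.0
supply = 1000
for demand in (1000, 1400, 1800):  # panic buying ramps demand up
    price = step_price(price, demand, supply)
print(price)  # 100.0 -> 120.0 -> 168.0
```

Three rounds of escalating panic demand against flat supply are enough to push the toy price up 68%, no futures market required.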
Tulip mania.