GamingChairModel

[–] GamingChairModel@lemmy.world 2 points 1 day ago (1 children)

Apple supports its devices for a lot longer than most OEMs do after release (a minimum of 5 years after the device was last available for sale from Apple, which itself might come after 2 years of sales), but the impact of dropped support is much more pronounced, as you note. Apple usually declares a device obsolete 2 years after support ends, too, and stops selling parts and repair manuals, except for a few batteries supported to the 10-year mark. On the software/OS side, that usually means OS upgrades for 5-7 years, then 2 more years of security updates, for a total of 7-9 years of keeping a device reasonably up to date.

So if you're holding onto a 5-year-old laptop, Apple's support tends to be much better than what a Windows OEM offers for a 5-year-old laptop (especially with Windows 11's upgrade requirements, which fail to support some devices that were still on sale when Windows 11 was released).

But if you've got a 10-year-old Apple laptop, it's harder to use normally than a 10-year-old Windows laptop.

Also, don't use the Mac App Store for software on your laptop. Use a reasonable package manager like Homebrew that doesn't have the problems you describe. Or go find a mirror that hosts old macOS packages and install them yourself.

Most Costco-specific products, sold under their Kirkland brand, are pretty good. They're always a good value, and some are among the best in class even setting aside cost.

I think Apple's products improved when they started designing their own silicon for phones, then tablets, then laptops and desktops. I have beef with their operating systems, but there's no question that they're better able to squeeze battery life out of their hardware because of that tight control.

In the restaurant world, there are plenty of examples of a restaurant having a better product because they make something in house: sauces, breads, butchery, pickling, desserts, etc. There are counterexamples, too, but sometimes that kind of vertical integration can result in a better end product.

Their horizontal integration is made more seamless by the vertical integration.

On an Apple laptop, Apple is the OEM of the hardware product itself, while also being the designer of the CPU and GPU and the developer of the operating system. For most other laptops, that's 3 or 4 distinct companies.

Yeah, getting too close turns into an uncanny valley of sorts, where people expect all the edge cases to work the same. Making it familiar, while staying within its own design language and paradigms, strikes the right balance.

Even the human eye basically follows the same principle. We have 3 types of cones, each sensitive to a different range of wavelengths. Our visual cortex combines each cone cell's one-dimensional input (the intensity of light hitting that cell within its sensitivity range) from both eyes, plus the information from the color-blind rods, into a seamless single image.
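If you want to see how drastic that dimensionality reduction is, here's a toy sketch in Python. The Gaussian sensitivity curves and peak wavelengths are rough stand-ins I made up for illustration, not real colorimetric data:

```python
import numpy as np

# Toy sketch: collapsing a full light spectrum into three cone responses.
# The Gaussian curves below are made-up approximations, not real data.

wavelengths = np.arange(380, 781)  # visible range in nm

def cone_sensitivity(peak_nm, width_nm=40.0):
    """Made-up Gaussian stand-in for a cone's sensitivity curve."""
    return np.exp(-0.5 * ((wavelengths - peak_nm) / width_nm) ** 2)

# Rough peak wavelengths for the L, M, and S cones.
cones = {
    "L": cone_sensitivity(565),
    "M": cone_sensitivity(535),
    "S": cone_sensitivity(445),
}

# An arbitrary incoming spectrum (here: flat, daylight-ish light).
spectrum = np.ones_like(wavelengths, dtype=float)

# Each cone collapses the entire spectrum to a single number: the
# intensity-weighted overlap with its own sensitivity curve.
responses = {name: float(np.trapz(s * spectrum, wavelengths))
             for name, s in cones.items()}
print(responses)  # three scalars are all the color info the brain gets
```

Everything we perceive as color downstream is reconstructed from just those three numbers per patch of retina.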

[–] GamingChairModel@lemmy.world 23 points 6 days ago (1 children)

This write-up is really, really good. I think about these concepts whenever people dismiss astrophotography or other computation-heavy photography as fake, software-generated images, when in reality, translating sensor data into a graphical representation for the human eye (with all the quirks of human vision, especially around brightness and color) requires conscious decisions about how the charges or voltages on a sensor get turned into pixels in a digital file.
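For anyone curious what I mean by "conscious decisions", here's a minimal Python sketch with a fake 16-bit sensor frame. The asinh stretch and every parameter in it are arbitrary choices of mine, which is exactly the point: there's no single "true" rendering of the raw counts:

```python
import numpy as np

# Minimal sketch: turning raw sensor counts into displayable pixels
# forces choices. The stretch function and parameters here are
# arbitrary picks, not anyone's canonical pipeline.

rng = np.random.default_rng(42)
# Fake 16-bit sensor frame: faint noisy background plus a bright "star".
raw = rng.poisson(lam=200, size=(64, 64)).astype(float)
raw[30:34, 30:34] += 50_000

def to_display(raw, black_level=180.0, softening=100.0):
    """Map raw counts to 8-bit pixels with an asinh stretch.

    Every parameter is a judgment call: where black is, how hard to
    compress highlights, what to clip. Different choices give
    different-looking but equally 'real' images."""
    x = np.maximum(raw - black_level, 0.0)     # choose a black point
    stretched = np.arcsinh(x / softening)      # compress the highlights
    stretched /= stretched.max()               # normalize to 0..1
    return (stretched * 255).astype(np.uint8)  # quantize for display

img = to_display(raw)
print(img.min(), img.max())  # full 0..255 range, faint detail preserved
```

Swap the asinh for a gamma curve or move the black level, and the same sensor data gives you a different picture, and neither one is more "fake" than the other.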

[–] GamingChairModel@lemmy.world 3 points 1 week ago (2 children)

"my general computing as a subscription to a server."

You say this, but I think most of us have offloaded formerly local computing to a server of some kind:

  • Email organization, including folders and attachments, has mostly shifted from a desktop client that downloaded messages and then deleted them from the server, to web, app, and even IMAP interfaces where the canonical copy stays organized on the server.
  • A huge chunk of users have shifted their productivity tasks (word processing, spreadsheets, presentations, image editing and design) to web-based software.
  • A lot of math functionality is honestly just easier to plug into web-based calculators for finance and accounting, and even the higher-level math that Wolfram Alpha excels at.
  • Lots of media organization, from photos to videos to music, has now moved into cloud-based searchable albums and playlists.

All these things used to be local uses of computing, and they can now be accessed from low-powered smartphones. Things like Chromebooks give a user access to 50-100% of what they'd be doing on a full-fledged, high-powered desktop, depending on individual needs and use cases.

[–] GamingChairModel@lemmy.world 10 points 1 week ago (1 children)

Do MSI and ASUS have enough corporate/enterprise sales to offset the loss of consumer demand? With the RAM companies, the consumer crunch is caused by AI companies bidding up the price of raw memory silicon well beyond what makes financial sense to package and solder onto DIMMs (or to solder directly onto boards for ultra-thin laptops).

[–] GamingChairModel@lemmy.world 1 points 1 week ago (2 children)

The key part of the statement is, "to service a demand that doesn't exist."

But that's basically always true of big projects. The people financing the project believe that the demand will exist in the future, and know it will take time and investment of resources to get to the point where they will meet that future demand.

They can be wrong on their projections of future demand, but that happens all the time, too. A classic example is when a city hosts the Olympics or World Cup and builds out a lot of infrastructure to meet that anticipated demand for both that specific event and the long term needs of the resident population. Sometimes it works, like with certain mass transit systems expanded for those events, and sometimes it doesn't, like when there are vacant stadiums sitting underused for decades after.

Or, the analogy I always draw is to the late '90s, when telecoms were building out a bunch of fiber networks for the anticipated future demand for Internet connections. Most of those companies ended up in bankruptcy, with the fiber assets sold for a fraction of the cost of building them. But the fiber still ended up being useful. Just not worth what it cost.

I think the same will happen with a lot of the data center infrastructure. Data centers will still be useful. A lot of the infrastructure for supporting those data centers (power and cooling systems, racks, network connections) will still be useful. There's just no guarantee that they'll be worth what they cost to build. And when that happens, we might see a glut in used data-center-grade computing equipment, and maybe hobbyists will score some deals at auctions to make their own frankenservers for their own purposes, and completely blow normal homelabbing out of the water.

"As long as the music's playing, you've got to get up and dance."

That's Citi's former CEO, Chuck Prince, who explained during the 2007 housing bubble that he would keep devoting his company's resources to making money in a bubble even when he knew it was a bubble.

The memory chip producers are absolutely going to try to maximize production during this bubble. The normal life cycle is to run fabs on staggered cycles, so that at any given time the company has a few fabs in planning, a few in construction, and others in R&D, early "risk" production, high-volume production, or retooling for a new process.

That means that during a bubble, it makes sense to try to accelerate the speed at which new fabs come online or old fabs get retooled. It makes sense to keep old fabs running longer at high yields, even for previous-generation product. These aren't mature businesses that were already planning on running the same factories forever. They already anticipate a cycle of multiple generations, and that cycle is going to be run more aggressively during periods when customers are throwing money at them.

[–] GamingChairModel@lemmy.world 1 points 2 weeks ago

I think with cheaper consumer desktops using IDE hard drives, it worked out of the box, but some more exotic storage configurations (SCSI, anything to do with RAID) were a little harder to get going.

[–] GamingChairModel@lemmy.world 1 points 2 weeks ago (1 children)

My first Linux distro was Ubuntu in 2006, with a graphical installer on the boot CD. It was revolutionary in my eyes, because WinXP was still installed through a curses-like text interface at the time. As I remember, installing Ubuntu was significantly easier than installing WinXP (and wireless Internet support was basically shit in either OS at the time).


Curious what everyone else is doing with all the files that are generated by photography as a hobby/interest/profession. What's your working setup, how do you share with others, and how are you backing things up?
