this post was submitted on 10 Feb 2026
86 points (96.7% liked)
Technology
you are viewing a single comment's thread
view the rest of the comments
32GB is actually considered the bare minimum for most of the commonly used locally run LLM models. Most folks don't run an LLM locally; they use a cloud service, so they don't need a huge pile of RAM on their own machine. However, more privacy-focused or heavy users with cost concerns might choose to run a model locally so they're not paying per token. For local LLMs it's comparable to renting a car when you need one versus buying one outright: if you only need a car once a year, renting is clearly the better choice, but if you're driving to work every day, buying the car yourself is the better deal overall.
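Here's a rough sketch of the sizing and rent-vs-buy math behind that. Every number in it (model sizes, quantization level, API price, hardware cost) is an illustrative assumption, not a figure from the article:

```python
# Back-of-envelope only: all numbers here are assumptions for illustration.

def model_ram_gb(params_billion: float, bits_per_weight: int, overhead_gb: float = 2.0) -> float:
    """Rough RAM needed to hold the weights plus context/runtime overhead."""
    weights_gb = params_billion * bits_per_weight / 8  # 1B params at 8-bit ~= 1 GB
    return weights_gb + overhead_gb

# A small 8B model quantized to 4 bits fits in roughly 6 GB...
print(f"8B  @ 4-bit: ~{model_ram_gb(8, 4):.0f} GB")
# ...but a 30B-class model at 4-bit wants ~18 GB before the OS and browser,
# which is roughly where the "32GB minimum" advice comes from.
print(f"32B @ 4-bit: ~{model_ram_gb(32, 4):.0f} GB")

# The rent-vs-buy angle: break-even between paying a cloud API per token
# and buying the extra RAM to run a model yourself (both prices assumed).
api_price_per_million_tokens = 3.00   # USD, assumed
ram_upgrade_cost = 150.00             # USD for a 32 GB upgrade, assumed

break_even_millions = ram_upgrade_cost / api_price_per_million_tokens
print(f"Break-even: ~{break_even_millions:.0f} million tokens")
# Occasional use -> cloud ("renting") wins; heavy daily use -> local ("buying") wins.
```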
It's perfectly fine not to like AI, but you're also out of touch if you think 32GB is more than anyone could need. Lots of other use cases call for 32GB or more and have nothing to do with AI.
I share your frustration with subscription laptops. I hope people don't buy into them.
Well, HP is aware that laptops are quickly becoming unaffordable for a larger and larger chunk of consumers; they just had to figure out some way to exploit that.
$420 a year for a laptop doesn't sound like robbery at first, until you consider that it's just money out the window, and that they're 100% harvesting every 1 and 0 going into and out of a laptop they still own and control. I haven't even looked at the fine print, which I'm willing to bet makes the whole thing considerably worse.
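To put numbers on the "money out the window" point, a tiny sketch; only the $420/year figure comes from the thread, and the lease lengths are hypothetical:

```python
# Only the $420/year figure comes from the thread; lease lengths are hypothetical.
yearly_fee = 420
for years in (2, 3, 4):
    print(f"{years} years of leasing: ${yearly_fee * years} paid, $0 of hardware owned")
# versus a one-time purchase of a comparable laptop that you actually keep.
```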
It all reads like a giant racket: AI requires 32GB of RAM on your laptop; 32GB of RAM is expensive because AI datacenters are buying up memory; so you have to lease. It's a solution in search of a problem, and it keeps creating new problems along the way.
It's only a problem if you want to run AI. If you don't want AI, locally or cloud-based, then there's no need to spend the money on the high-end 32GB model (for AI purposes) or to pay for a cloud subscription. No one is required to get the 32GB model if they don't want it.