this post was submitted on 18 Feb 2026
786 points (99.5% liked)
Greentext
7884 readers
938 users here now
This is a place to share greentexts and witness the confounding life of Anon. If you're new to the Greentext community, think of it as a sort of zoo with Anon as the main attraction.
Be warned:
- Anon is often crazy.
- Anon is often depressed.
- Anon frequently shares thoughts that are immature, offensive, or incomprehensible.
If you find yourself getting angry (or god forbid, agreeing) with something Anon has said, you might be doing it wrong.
founded 2 years ago
MODERATORS
I've tried a number of local models and even the 8b models aren't that good. Unless there's some insane breakthrough, much better hardware will be required to get the kind of results that would be timely enough or high quality enough to be useful.
So it might drive the kind of performance enhancement needed to truly democratize the technology and make it accessible, but until then, more performance is needed.
My 2024 laptop has gone up about $800 in price because of the buy-ups. This will either drive optimization, or kill progress, or maybe some of each on a continuum.
I also firmly believe that part of the storage and RAM buy-up was intended to make higher-end compute further out of reach of us plebs, forcing us further into the "everything as a service" model, and that corporate AI is a big bet that they can lay off even more people </foil hat time>
That said, if you found good results with Qwen 8b, dm me a link to the specific model, I'd love to try it. I'm still a hobbyist. 😁
I use the Alpaca flatpak; it lets you download a variety of models and manages them all inside a contained local environment.
It even has some tool support that is expanding: basic web searches, speech to text, text to speech... and if you can find a GGUF-format model, Alpaca can supposedly run that manually, and there are a good deal of them on huggingface.
https://github.com/Jeffser/Alpaca
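If your distro already has Flatpak set up, installing Alpaca from the command line is just a couple of lines. A sketch, assuming the Flathub app ID is `com.jeffser.Alpaca` (that's what the project's page lists, but double-check):

```shell
# Install Alpaca from Flathub (app ID assumed from the project's Flathub listing)
flatpak install -y flathub com.jeffser.Alpaca

# Launch it
flatpak run com.jeffser.Alpaca
```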
Unfortunately, if you're running Windows, I... have no clue how to set up an LLM there.
Also your tin foil hat thing isn't even tin foil hat.
Like, various people in the AI space have outright stated that they want to see a paradigm where everyone just rents compute time from them because PCs are otherwise too expensive, while acting like it just happens to be the new reality that everything is so expensive, for some reason.
Nvidia went from gaming GPUs being about 50% of its business to something more like 5%, in about 5 years.
Fortunately the AI bubble will be popping soon, as ... everyone has run out of money to lend.
Unfortunately this will destroy the economies of the West.
Yay capitalism!
For the moment, I haven't had the motivation to switch everything over to Linux, but it is coming down the line. To that end, I do know how to set up models on Windows, and it's not all that hard, but what is the specific model name? Is it just the Qwen 8b?
Come to think of it, I might actually be able to install the flatpak into the Windows Subsystem for Linux, if it behaves the way I think it's supposed to.
Could be a very interesting experiment.
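For what it's worth, a rough sketch of that experiment on a WSL Ubuntu install might look like this (untested on my end; the GUI would presumably need WSLg, and package names can vary by distro):

```shell
# Inside a WSL2 Ubuntu shell: install Flatpak itself
sudo apt update
sudo apt install -y flatpak

# Add the Flathub repository for the current user
flatpak remote-add --user --if-not-exists flathub https://dl.flathub.org/repo/flathub.flathubrepo

# Install and run Alpaca; the window should show up via WSLg if it works at all
flatpak install --user -y flathub com.jeffser.Alpaca
flatpak run com.jeffser.Alpaca
```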
Well, if you're coming from a Windows background, a flatpak is roughly, to the user at least, similar to an exe.
You download a flatpak, install it, blingo blango it has its own environment that is essentially sandboxed, as it pulls in its own dependencies and such.
But, you'll need to either go with a linux distro that comes with flatpak support pre-configured, or, set up flatpak support on a different distro.
Once you've got either of those, there are free app 'stores' for flatpak that make it extremely simple to browse, download, and install a flatpak program.
Then you just click to download Alpaca, run it, and it's got a menu: add new models, search through what it has access to, pick "Qwen 3", 8b parameter variant, download, then use it.
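If you'd rather grab the model from a terminal instead, Alpaca manages an Ollama instance under the hood, so the equivalent pull with a standalone Ollama install is roughly this (the `qwen3:8b` tag is my assumption for the Qwen 3 8b variant; check the Ollama library for the exact name):

```shell
# Pull the Qwen 3 8B model via Ollama (tag name is an assumption)
ollama pull qwen3:8b

# Quick smoke test from the terminal
ollama run qwen3:8b "Say hello in one sentence."
```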
I am personally using Bazzite at the moment, I used to use a bunch of Debian, variants of Debian (Ubuntu, PopOS), have futzed around with Arch and even Void... Bazzite is so far the happy medium I've found between stability, extensibility, and also being pretty close to cutting edge in terms of driver updates and kernel updates.
If you wanna try WSL (which is named backwards, but whatever), I... I have no idea what you'd have to do to get flatpaks working... on... Windows... but if you think you can, best of luck!