this post was submitted on 22 Mar 2026

Technology
This is the official technology community of Lemmy.ml for all news related to creation and use of technology, and to facilitate civil, meaningful discussion around it.


rolandtb303@lemmy.ml 4 points 1 day ago

> self-contained, offline

While I do like the all-in-one UI and services on one hand, I feel it could have been done a little better without hosting a mini web server just to serve everything over localhost.

Most if not all of the tools here are based on snapshots of online websites running in a browser, with Docker on top of it. While the intention is good and there are some neat ideas in here, why not just bundle native, offline FOSS programs that already do the job? For instance, CyberChef can be replaced with the respective Linux programs (e.g. base64, hexdump, grep, awk/sed, and gpg, just to name a few; graphical versions of these programs exist as well, so it's not like you need to use the terminal, it's just the most versatile environment for this type of stuff). No need for a web server or anything.
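For what it's worth, the CyberChef point is easy to demonstrate with stock tools; a few illustrative one-liners covering common CyberChef recipes (assuming GNU coreutils and gnupg are installed):

```shell
# Base64 encode/decode (CyberChef "To/From Base64")
printf 'hello' | base64            # -> aGVsbG8=
printf 'aGVsbG8=' | base64 -d      # -> hello

# Hex view (CyberChef "To Hex")
printf 'hello' | hexdump -C

# Hashing (CyberChef "SHA2")
printf 'hello' | sha256sum

# Symmetric encryption (CyberChef "AES Encrypt"); interactive, prompts for a passphrase:
# gpg --symmetric --cipher-algo AES256 somefile
```

All of these run offline with zero web server involved, and they pipe into each other, which CyberChef's recipe chaining only approximates.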

That said, the offline Wikipedia and maps are cool; unfortunately, they're about the only neat things in this project.

Now let's get to the point: an AI chatbot. What, does the dev think we have money to burn? Much less if SHTF and NVIDIA RTX GPUs are being scrapped for metal (which they should be anyway)? I know it's local, and that it most likely ships with its training data baked in so it supposedly has a 100% guarantee of not huffing its own fumes and hallucinating, but given the sheer power draw and the resources it hogs just trying to spit out an answer, a local search engine could do just as good a job without tying up your GPU. That's not even getting into the current SSD/GPU/RAM pricing situation.

The project's own front page recommends 32 GB of RAM, which is a bit steep. The 1 TB SSD I can kind of see; if most of the information is just text you don't really need 1 TB, but better safe than sorry. Still, that'll be pretty expensive at today's prices. When SHTF, do you really think most people will be rocking killer rigs with 8/16-core CPUs, 32+ GB of RAM, and an RTX GPU? The millionaires and spoiled gamers who already have those? Sure, but the masses? They'll mostly be using laptops with 4-6 cores, 8 GB of RAM, and a mid-range GPU if they're lucky, or integrated graphics.
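To put rough numbers on the resource complaint: a local LLM's weights alone take roughly params × bits-per-weight ÷ 8 bytes, before you add KV cache and runtime overhead. A back-of-envelope sketch (the 7B / 4-bit figures are illustrative assumptions, not this project's actual model):

```shell
# Rough memory needed just to hold model weights (ignores KV cache and overhead).
# Approximation: params (billions) * bits per weight / 8 = gigabytes.
params_b=7   # assumed model size: 7 billion parameters
bits=4       # assumed 4-bit quantization
awk -v p="$params_b" -v b="$bits" \
    'BEGIN { printf "%.1f GB for weights alone\n", p * b / 8 }'
# -> 3.5 GB for weights alone
```

Even a heavily quantized 7B model eats close to half the RAM of that hypothetical 8 GB laptop before it answers a single question, which a plain full-text search index never would.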

Sure, you can argue that having AI in it is somehow beneficial and tout how "everyone is using it", but don't get all pissy when your power bank runs out of juice at the worst time, let alone when word gets out, your place gets raided, and your 20-year-old 5090 is turned into scrap. All because you thought AI was good enough.

All in all, it's a good premise, but it could be executed far better than snapshotting websites, slapping AI on top, and calling it a day.