this post was submitted on 15 Apr 2025
416 points (97.7% liked)

Privacy

A chart titled "What Kind of Data Do AI Chatbots Collect?" lists and compares seven AI chatbots—Gemini, Claude, CoPilot, Deepseek, ChatGPT, Perplexity, and Grok—based on the types and number of data points they collect as of February 2025. The categories of data include: Contact Info, Location, Contacts, User Content, History, Identifiers, Diagnostics, Usage Data, Purchases, Other Data.

  • Gemini: Collects all 10 data types; highest total at 22 data points
  • Claude: Collects 7 types; 13 data points
  • CoPilot: Collects 7 types; 12 data points
  • Deepseek: Collects 6 types; 11 data points
  • ChatGPT: Collects 6 types; 10 data points
  • Perplexity: Collects 6 types; 10 data points
  • Grok: Collects 4 types; 7 data points
[–] pennomi@lemmy.world 142 points 8 months ago (2 children)
[–] exothermic@lemmy.world 17 points 8 months ago (5 children)

Are there tutorials on how to do this? Should it be set up on a server on my local network??? How hard is it to set up? I have so many questions.

[–] pennomi@lemmy.world 11 points 8 months ago

Check out Ollama; it's probably the easiest way to get started these days. It provides tooling and an API that different chat frontends can connect to.
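As a quick sketch of what that API looks like: assuming the Ollama daemon is running on its default port (11434) and a model such as llama3 has already been pulled, a one-shot completion is a single HTTP call:

```shell
# Ask the local Ollama server for a completion via its REST API.
# Assumes the daemon is running on the default port 11434 and
# "llama3" is an example model name you have already pulled.
curl http://localhost:11434/api/generate -d '{
  "model": "llama3",
  "prompt": "Why is the sky blue?",
  "stream": false
}'
```

Chat frontends talk to this same endpoint (or /api/chat), which is why so many of them can sit on top of Ollama.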

[–] TangledHyphae@lemmy.world 5 points 8 months ago

https://ollama.ai/ — this is what I've been using for over a year now. New models come out regularly, and you just "ollama pull <model name>" and then it's available to run locally. Then you can use Docker to run https://www.openwebui.com/ locally, giving it a ChatGPT-style interface (but even better and more configurable, and you can run prompts against any number of models you select at once).

All free and available to everyone.
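The workflow above can be sketched in two commands. The Open WebUI invocation follows its documented Docker quick start; the model name, port, and volume name are examples you'd adjust to taste:

```shell
# Pull a model so it's available to run locally (llama3 is an example).
ollama pull llama3

# Run Open WebUI in Docker; host.docker.internal lets the container
# reach the Ollama daemon running on the host.
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main
# Then open http://localhost:3000 in a browser.
```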

[–] skarn@discuss.tchncs.de 1 points 8 months ago* (last edited 8 months ago) (3 children)

If you want to start playing around immediately, try Alpaca on Linux or LM Studio on Windows. See if it works for you, then go from there.

Alpaca actually runs its own Ollama instance.

[–] smee@poeng.link 1 points 8 months ago

Ollama recently became a Flatpak extension for Alpaca, but it's a one-click install from Alpaca's software management entry. All storage locations are the same, so there's no need to re-download any open models or remake tweaked models from the previous setup.

[–] SeekPie@lemm.ee 1 points 8 months ago* (last edited 8 months ago)

And if you want to be 100% sure that Alpaca doesn't send any info anywhere, you can restrict its network access in Flatseal, since it's a Flatpak.
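The same restriction can be applied from the command line with flatpak override; com.jeffser.Alpaca is assumed to be Alpaca's Flatpak app ID (verify with "flatpak list"):

```shell
# Deny the Alpaca sandbox all network access for the current user.
# The app ID below is an assumption; confirm it with: flatpak list
flatpak override --user --unshare=network com.jeffser.Alpaca
```

Flatseal does the same thing through a GUI; either way the sandbox simply has no network to use.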

[–] nimpnin@sopuli.xyz 1 points 8 months ago

I used this a while back, it was pretty straightforward https://github.com/nathanlesage/local-chat

[–] TuxEnthusiast@sopuli.xyz 13 points 8 months ago* (last edited 8 months ago) (2 children)

If only my hardware could support it...

[–] skarn@discuss.tchncs.de 7 points 8 months ago

I can actually run some smaller models locally on my 2017 laptop (though I have upgraded the RAM to 16 GB).

You'd be surprised how much can be done with how little.

[–] smee@poeng.link 2 points 8 months ago

It's possible to run local AI on a Raspberry Pi; it's all just a matter of speed and complexity. I run Ollama just fine on the two P-cores of my older i3 laptop. Granted, running it on the CUDA accelerator (graphics card) in my main rig is far faster.