this post was submitted on 23 Feb 2026
112 points (100.0% liked)
Slop.
801 readers
For posting all the anonymous reactionary bullshit that you can't post anywhere else.
Rule 1: All posts must include links to the subject matter, and no identifying information should be redacted.
Rule 2: If your source is a reactionary website, please use archive.is instead of linking directly.
Rule 3: No sectarianism.
Rule 4: TERF/SWERFs Not Welcome
Rule 5: No bigotry of any kind, including ironic bigotry.
Rule 6: Do not post fellow hexbears.
Rule 7: Do not individually target federated instances' admins or moderators.
founded 1 year ago
Where I work just had a consultant present to the board, and to the senior execs, in two different sessions, on how to integrate AI across the workplace. I saw the slides. It was literally 'here's how to prompt AI, and here's some freeware for creating your own agent'.
Now, we're likely to go on the hook for ~~$45~~ $30 AUD* per user per month for Copilot M365 (not free Copilot, that's entirely different; M365 is the 'enterprise' version, which does the same thing while promising it won't use company data for training).
The threat is that 'if your workers don't use the Copilot from your tenant, they will be putting company data into public LLMs', so you have to cough up. It's a direct threat, as Microsoft integrates the free version of Copilot across all its apps. Insidious, planet-destroying, criminal stuff: a big stick being presented as a productivity-enhancing carrot.
Edit: *(I had assumed it was USD and did a rough conversion into my currency; it's actually 30 AUD, ~22 USD. It's still more than we can afford and infinitely more than it's worth.)
I’m honestly shocked my employer (and basically every university system) hasn’t considered switching away, given the huge liability this exposes them to.
Copilot is 100% going to get someone in huge trouble for exposing protected health information, and it should be considered malware on any computer in a health system
i respect the racket way more than the actual product
That's something I've been thinking about ever since ChatGPT went public.
Years ago, it was revealed that some online translation site had stored a bunch of documents from various companies because people kept pasting them in. There's no way company documents aren't being put into the slop machines all the time.
With AI specifically, caching like that is a LOT harder, which is one of the major reasons the cost of compute has only gone up over time instead of going down as the tech matures.
A lot of NDAs are gonna be worthless when all the chats leak