this post was submitted on 22 Jan 2026
83 points (100.0% liked)

Slop.


https://www.nature.com/articles/d41586-025-04064-7

Why does nature even have columns anyway

top 31 comments
[–] Crucible@hexbear.net 55 points 4 days ago (3 children)

If a single click can irrevocably delete years of work, ChatGPT cannot, in my opinion and on the basis of my experience, be considered completely safe for professional use.

I think the fact that it's trained off stolen material, or that it literally just makes shit up, should probably be bigger red flags on how acceptable it is to use professionally

[–] very_poggers_gay@hexbear.net 23 points 4 days ago

astronaut-2

tbh, ever believing that ChatGPT was completely safe for professional use shows a lack of critical thinking and questionable fit for a job in academia

[–] Le_Wokisme@hexbear.net 21 points 4 days ago (3 children)

stolen

please stop upholding the capitalist farce of intellectual property. the making shit up part is way more important and doesn't depend on structures we're aiming to tear down

[–] Crucible@hexbear.net 41 points 4 days ago (1 children)

I don't give a shit about intellectual property, I care that the artists who make the art can't buy food because Grok and Microsoft have trained bots to remove those artists' livelihoods with slop generation

[–] Le_Wokisme@hexbear.net 8 points 4 days ago (2 children)

yeah so the operative word there isn't "stolen" then?

[–] chgxvjh@hexbear.net 7 points 3 days ago* (last edited 3 days ago) (1 children)

Ignoring the theft legitimizes the idea that the training data, the models trained on the data, and output produced from those models belongs to the corporations.

[–] Le_Wokisme@hexbear.net 2 points 3 days ago (1 children)

the bad thing they're doing isn't depriving anyone of the original thing, as would occur if i went to someone's house and physically stole their painting.

if copying isn't theft when I copy stuff, it's not theft when somebody we don't like does it either.

[–] chgxvjh@hexbear.net 2 points 3 days ago

In a bourgeois context it's a matter of property, in a post capitalist context it's a matter of consent

[–] chgxvjh@hexbear.net 10 points 3 days ago

It's enclosure of intellectual commons and misuse of individual creations.

[–] aanes_appreciator@hexbear.net 2 points 3 days ago

even all that shit aside, would he have used facebook messenger as the corpus of his research if it had convinced him to? Even Microsoft hamfists AI into shit that you can save to your local fucking hard drive !!!!

[–] save_vs_death@hexbear.net 44 points 4 days ago* (last edited 4 days ago) (3 children)

Finally a good feature from chat-gpt. I wish every chat-gpt user a very lose all your work made in it. Also, reading the article, the guy seemingly never made any real backups and completely relied on him being able to use chat-gpt discussions as data archives he could always refer back to. It always astounds me how people who work in research can be so completely ignorant about backing up the fucking data.

[–] Kumikommunism@hexbear.net 27 points 4 days ago (1 children)

I wouldn't even trust any amount of career-essential documents to be solely stored in Google Drive, and people are doing it in a fucking chat bot.

[–] chgxvjh@hexbear.net 4 points 3 days ago

It's like projects keeping all their documentation in discord channels.

[–] DasRav@hexbear.net 20 points 4 days ago (1 children)

As someone who has worked in customer support IT, this guy sounds like the worst kind of guy you can be saddled with when trying to help with PC issues. Unbothered by how things work, just so long as they do, and entirely unable to cope with even the smallest disruption to his self-made and arcane process. Of course in this case it wasn't a small disruption, but that only makes the helpless meltdown worse.

My anxiety always makes me assume I'm the worst person bothering IT with my issues, but these things remind me I'm on the high end of the tech literacy bell curve (scary!). TYFYS o7

I've only lost meaningful work once through lazy backup - that was enough for me! In my work I store files locally and have a backup on an external hard drive as well as on a server. All organized with the same file structure so it's a simple dragon drop to back things up. The thought of not backing up data or archiving it in any meaningful way gives me too much anxiety otherwise lol.
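For anyone wondering what the "same structure in three places" setup looks like if you want to script it instead of dragging folders around, here's a minimal Python sketch. The work folder and the mount points for the external drive and the server are made up for illustration; swap in your own paths.

```python
# Minimal sketch: mirror one work folder to two backup locations,
# keeping the directory structure identical everywhere.
# Paths below are assumptions for illustration only.
from pathlib import Path
import shutil

WORK = Path.home() / "work"
BACKUPS = [
    Path("/mnt/external/work"),  # external hard drive (assumed mount point)
    Path("/mnt/server/work"),    # server share (assumed mount point)
]

def mirror(src: Path, dst: Path) -> None:
    """Copy src into dst, preserving the same folder layout."""
    dst.mkdir(parents=True, exist_ok=True)
    # dirs_exist_ok=True lets repeated runs overwrite files in place
    shutil.copytree(src, dst, dirs_exist_ok=True)

if __name__ == "__main__":
    for target in BACKUPS:
        mirror(WORK, target)
        print(f"backed up {WORK} -> {target}")
```

Something like rsync does the same job with deletion handling and less copying, but the point is just that the backup targets mirror the working tree, so restoring is a copy in the other direction.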

[–] microfiche@hexbear.net 51 points 4 days ago (1 children)

IMO the issue is not with the data consent checkbox, but this here:

I had come to rely on the artificial-intelligence tool, for my work as a professor of plant sciences at the University of Cologne in Germany.

[–] Keld@hexbear.net 34 points 4 days ago (1 children)

If your professor grades you using chat gpt no jury would convict you.

[–] WokePalpatine@hexbear.net 13 points 4 days ago

A man and his chatbot wife can't be convicted of the same crime.

[–] mayakovsky@hexbear.net 36 points 4 days ago* (last edited 4 days ago) (1 children)

Having signed up for OpenAI’s subscription plan, ChatGPT Plus, I used it as an assistant every day — to write e-mails, draft course descriptions, structure grant applications, revise publications, prepare lectures, create exams and analyse student responses, and even as an interactive tool as part of my teaching.

Idk what the rules are in Germany, but feeding student work into AI or really any tool not managed by the school is generally a big no no. He didn't even have the "use this data to train models" turned off. Wtf

[–] Keld@hexbear.net 8 points 3 days ago

GDPR is pathetic when it comes to AI

[–] MF_COOM@hexbear.net 37 points 4 days ago

So wait hang on you're saying I shouldn't be relying on a guessing machine for my elite, highly technical and highly remunerative position?

[–] save_vs_death@hexbear.net 29 points 4 days ago (1 children)

A lot of these places should "allow" AI use for their employees but require them to complete a form that says "I will henceforth be completely responsible for all the inaccuracies and errors AI slop will introduce in my work, the buck stops at me" and you'd see a lot of people throw it completely in the bin.

[–] aanes_appreciator@hexbear.net 3 points 3 days ago

nah my last job had an "AI use policy" and I still regularly saw slop all over the place because the CTO was too busy squirting his vital essence onto his keyboard over every mention of "AI" to bother enforcing any of his bullshit rules. It's all a façade.

[–] Llituro@hexbear.net 29 points 4 days ago (1 children)

if i'm this guy's co-faculty, i'm trying to get his tenure yoinked posthaste

[–] JustSo@hexbear.net 5 points 4 days ago

Based if true

[–] AntiOutsideAktion@hexbear.net 18 points 4 days ago

Why does nature even have columns anyway

For something to willfully distract yourself with

[–] Tychoxii@hexbear.net 8 points 3 days ago

Sorry Director, the stochastic parrot ate my homework

[–] aanes_appreciator@hexbear.net 3 points 3 days ago

lmao "i used the software for something it wasn't designed to do, then accidentally deleted everything without a backup" did this dipshit think chat gpt dot com was some kind of notes app come ON