this post was submitted on 19 Feb 2024
318 points (97.0% liked)

Study featuring AI-generated giant rat penis retracted entirely, journal apologizes::A peer-reviewed study featured nonsensical AI images including a giant rat penis in the latest example of how generative AI has seeped into academia.

top 50 comments
[–] TransplantedSconie@lemm.ee 118 points 10 months ago* (last edited 10 months ago)

Me:

Impactful world news: Pass

Troubling local US news: Pass

News about giant rat penis: Click

sips coffee slowly

[–] rtxn@lemmy.world 86 points 10 months ago* (last edited 10 months ago) (5 children)

I hate the way AI is being used here, but those labels are fucking GOLD.

  • Senctollic stem cells
  • Dizlocttal stem ells
  • Dissilced
  • Rat
  • Testtomcels
  • Iollotte sserotgomar cell
  • Spermatocial stem cells
  • Stenm cells
  • Retat
  • dck
[–] ThrowawaySobriquet@lemmy.world 42 points 10 months ago (1 children)

It's like trying to read in a dream

[–] rtxn@lemmy.world 18 points 10 months ago

At least it correctly labelled the Rat, but kinda missed the dck

[–] otp@sh.itjust.works 20 points 10 months ago

See Figure 1 for a diagram of retat dck

[–] rob_t_firefly@lemmy.world 6 points 10 months ago
  • Air vent
  • Fan
  • Saddam Hussein
[–] Rukmer@lemmy.world 4 points 10 months ago

I loved the labels. Testtomcels.

[–] Hamartiogonic@sopuli.xyz 3 points 10 months ago

Ever tried asking for “an anatomical diagram of a spider, school book style”? Just start by counting the legs, and once you’ve stopped laughing you can dive into the labels. It’s going to be wild. If you’re into microbiology, try asking for a similar diagram of a prokaryotic cell for extra giggles.

[–] thehatfox@lemmy.world 66 points 10 months ago (1 children)

Well that’s a headline I didn’t expect to see this morning.

Regardless of the rights and wrongs of AI generated images, it’s quite concerning something like this makes it into a scientific journal at all.

[–] doctorcrimson@lemmy.today 14 points 10 months ago (1 children)

Yeah, the journal has taken a huge credibility hit with this. Their entire purpose is to be respectable and to review submissions with a high degree of scrutiny.

[–] notthebees@reddthat.com 7 points 10 months ago

Frontiers isn't the greatest journal to begin with.

[–] ApeNo1@lemm.ee 45 points 10 months ago (1 children)
[–] nieceandtows@programming.dev 7 points 10 months ago

Stuart Hugh Mungus

[–] starman2112@sh.itjust.works 44 points 10 months ago* (last edited 10 months ago) (1 children)

It's not so much the use of AI that's upsetting as it is the "peer review" process. There needs to be a massive change in how journals review studies, before reasonable people start to question every study based on cases like this. How many false studies are currently used for important shit that we just haven't caught yet?

[–] brsrklf@jlai.lu 13 points 10 months ago* (last edited 10 months ago) (2 children)

It got published, people noticed it, people saw it was bullshit, it got retracted. Publishing is not the end of the line.

It's an extreme example, but it's still an example of the system working in the end. Reasonable people are supposed to question what they read, not blindly trust it, that's how you catch "important shit".

The problem is not that some bad papers get published. The problem would be them staying unchallenged. It's also a problem that laymen treat one random study as undeniable proof of their argument (potentially ignoring the thousands of studies contradicting it).

[–] agamemnonymous@sh.itjust.works 25 points 10 months ago (2 children)

Of course some things will always slip through the cracks, but this is egregious. What does their peer-review process look like that this passed through it?

[–] candybrie@lemmy.world 16 points 10 months ago

Right? Even when skimming papers, it's usually: read title & abstract, look at figures, skim results & conclusion. If you don't notice that the figure doesn't have real words, how is anyone making sure the methodology makes sense? That the results show what the conclusion says they show?

[–] brsrklf@jlai.lu 1 points 10 months ago* (last edited 10 months ago)

I am not disagreeing that this is ridiculous. I was just saying that this stupidity is not what should convince people not to take some random paper as absolute truth just because it was published.

Even if you eliminate fraud, bullshit and even honest mistakes, that's just not how science works.

[–] DudeDudenson@lemmings.world 2 points 10 months ago (1 children)

A shame most people are trained by both the school system and society to just take things at face value

[–] rusticus@lemm.ee 1 points 10 months ago

An even greater shame is that almost no one is trained in basic statistics, yet people think they can debunk a published study in PNAS with a Google search and some random guy's blog.

[–] 4grams@awful.systems 36 points 10 months ago (3 children)

AI is going to speedrun us into Idiocracy, isn't it? Why learn when you can ask Dr. Sbaitso to just do it for you?

[–] TurtleJoe@lemmy.world 13 points 10 months ago (1 children)

It's certainly not helping.

We're already dealing with the problem of half the (US) population only believing things when they align with their political views, and now one can't even Google something and be sure that the entire first page of results isn't SEO AI hallucinated misinformation.

[–] Grandwolf319@sh.itjust.works 5 points 10 months ago* (last edited 10 months ago)

“Search engine optimized artificial intelligence hallucinated misinformation”

Omg, we are in a cyberpunk dystopia aren’t we?

[–] BearOfaTime@lemm.ee 4 points 10 months ago (1 children)

Peer review was already a joke, as exposed a couple years ago by two researchers who got a paper full of BS published.

It's been well established that nearly all published research papers are irreproducible.

[–] 4grams@awful.systems 5 points 10 months ago

How long until there's an accepted study of the benefits of electrolytes on plants? Probably already exists in Gatorade's filing cabinet.

[–] Shadywack@lemmy.world 1 points 10 months ago (1 children)

Doctooore Sbaitso, please enter your name.

Man I wondered if I'd ever talk to anyone else that used it. I liked asking him to pronounce "abcdefghijklmnopqrstuvwxyz", and he actually did a pretty good job.

[–] 4grams@awful.systems 1 points 10 months ago

I support a law that all AI voices must use the Dr. Sbaitso voice. Imagine the impressive inefficiency of training an AI voice on the output of a 1980s Sound Blaster.

[–] drislands@lemmy.world 35 points 10 months ago (3 children)

A few things came together for me here.

"The paper had two reviewers, one in India and one based in the U.S."

"...a reviewer of the paper had raised concerns about the AI-generated images that were ignored."

"...the U.S.-based reviewer who said that they evaluated the study based solely on its scientific merits and that it was up to Frontiers whether or not to publish the AI-generated images..."

"The authors failed to respond to these requests. We are investigating how our processes failed to act on the lack of author compliance..."

They don't outright say it in the article, but it looks like the reviewer based in India was the one who actually raised concerns about the garbage images. The authors were supposed to respond, but didn't, and the journal published anyway.

I will readily admit that this is just my own conclusion, but I wonder if there was an element of racism in the decision to ignore the reviewer's concerns.

[–] asdfasdfasdf@lemmy.world 15 points 10 months ago (7 children)

Why do you bring up race? Is there anything that would imply that?

People are lazy and incompetent as fuck, and it's an industry-wide problem that publishing companies in general have lower and lower standards of quality.

[–] MirthfulAlembic@lemmy.world 10 points 10 months ago

Check out the controversies section on their Wikipedia page. This doesn't seem out of character for this publication. It's more likely incompetence than malice.

[–] Meron35@lemmy.world 30 points 10 months ago (2 children)

AI-generated medical research can't make it past peer review, it can't hurt you

AI-generated medical research that made it past peer review:

[–] Silentiea@lemm.ee 3 points 10 months ago

I mean "made it past"...

Both reviewers flagged the images as weird, and the journal published anyway, so...

[–] Shadywack@lemmy.world 1 points 10 months ago

Giant rat penises will only hurt you if you have an underlying medical condition (anal fissures, etc).

[–] NutWrench@lemmy.world 22 points 10 months ago

The paper was authored by three scientists in China, edited by a researcher in India, reviewed by two people from the U.S. and India, and published in the open-access journal Frontiers in Cell and Developmental Biology on Monday.

Now THAT is Maximum Trolling. I hope someone at Cell Development got fired

[–] Evil_incarnate@lemm.ee 18 points 10 months ago (1 children)

I was out in the snow and mine retracted entirely as well.

[–] frozenicecube@lemmy.ca 11 points 10 months ago

"I WAS IN THE POOL!!!!!"

[–] Macaroni_ninja@lemmy.world 14 points 10 months ago

South Park strikes again

[–] flop_leash_973@lemmy.world 7 points 10 months ago

Jesus, talk about getting rat fucked.

[–] psy32nd@lemmy.world 6 points 10 months ago

Time for a "dick of a rat" joke

[–] doctorcrimson@lemmy.today 5 points 10 months ago

At first I was like "why?", and then I realized the study was about rat penises and not about AI, so now I'm furious and I hope that researcher's school rescinds his degrees.

[–] glowie@h4x0r.host 2 points 10 months ago

Trust the science

[–] NigelFrobisher@aussie.zone 2 points 10 months ago