this post was submitted on 26 Apr 2024
347 points (95.5% liked)

science


A London librarian has analyzed millions of articles in search of uncommon terms abused by artificial intelligence programs

Librarian Andrew Gray has made a “very surprising” discovery. He analyzed five million scientific studies published last year and detected a sudden rise in the use of certain words, such as meticulously (up 137%), intricate (117%), commendable (83%) and meticulous (59%). The librarian from University College London can only find one explanation for this rise: tens of thousands of researchers are using ChatGPT — or other similar artificial intelligence tools based on large language models — to write their studies, or at least to “polish” them.

There are blatant examples. A team of Chinese scientists published a study on lithium batteries on February 17. The work — published in a specialized journal from the Elsevier publishing house — begins like this: “Certainly, here is a possible introduction for your topic: Lithium-metal batteries are promising candidates for….” The authors apparently asked ChatGPT for an introduction and accidentally copied it in as is. A separate article in a different Elsevier journal, published by Israeli researchers on March 8, includes the text: “In summary, the management of bilateral iatrogenic I’m very sorry, but I don’t have access to real-time information or patient-specific data, as I am an AI language model.” And, a couple of months ago, three Chinese scientists published an absurd drawing of a rat with a kind of giant penis, an image generated with artificial intelligence for a study on sperm precursor cells.

[–] Silentiea@lemmy.blahaj.zone 12 points 6 months ago (1 children)

Honestly, the worst part isn't that its penis is so gigantic, it's that the labels are nonsense. An image like that is already not perfectly to scale or anything, so something being exaggerated can be weird but isn't necessarily a deal breaker (albeit that one is pretty darn weird).

Usually when things aren't to scale, they tend to extract and isolate them, rather than pull this kind of shit, though sometimes I've seen similar things, just without the monstrous misrepresentation of biology lol.