this post was submitted on 13 Jul 2025
511 points (96.9% liked)

Comic Strips


[–] logicbomb@lemmy.world 92 points 1 day ago (33 children)

My knowledge of this is several years old, but back then, there were some types of medical imaging where AI consistently outperformed all humans at diagnosis. They gave both humans and AI the same images from existing data and asked them to make a diagnosis, already knowing the correct answer. Sometimes, even when humans reviewed the image after knowing the answer, they couldn't figure out why the AI was right. It would be hard to imagine that AI has gotten worse in the years since.

When it comes to my health, I simply want the best outcomes possible, so whatever method gets the best outcomes, I want to use that method. If humans are better than AI, then I want humans. If AI is better, then I want AI. I think this sentiment will not be uncommon, but I'm not going to sacrifice my health so that somebody else can keep their job. There's a lot of other things that I would sacrifice, but not my health.

[–] DarkSirrush@lemmy.ca 60 points 1 day ago (3 children)

iirc the reason it still isn't used is that even though it was trained by highly skilled professionals, it had some pretty bad race and gender biases, and its headline accuracy only held for white, male patients.

Plus the publicly released results were fairly cherry-picked for their quality.
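
That kind of bias is exactly what a single aggregate accuracy number hides. A minimal sketch of a per-subgroup audit (all data here is fabricated, and the group labels are just illustrative):

```python
# Toy subgroup audit: an overall accuracy figure can mask large gaps
# between demographic groups. Every data point below is made up.
from collections import defaultdict

# (group, prediction_correct) for each evaluated case -- hypothetical
results = [
    ("white male", True), ("white male", True),
    ("white male", True), ("white male", True),
    ("black female", False), ("black female", True), ("black female", False),
    ("white female", True), ("white female", False),
]

totals = defaultdict(lambda: [0, 0])  # group -> [correct, total]
for group, correct in results:
    totals[group][0] += int(correct)
    totals[group][1] += 1

overall = sum(c for c, _ in totals.values()) / sum(t for _, t in totals.values())
print(f"overall accuracy: {overall:.2f}")        # looks respectable
for group, (c, t) in sorted(totals.items()):
    print(f"  {group}: {c}/{t} = {c/t:.2f}")     # until you split it out
```

The overall number here is ~0.67, but it is carried entirely by one subgroup, which is the shape of problem the comment describes.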

[–] Ephera@lemmy.ml 17 points 20 hours ago (1 children)

Yeah, there were also several stories where the AI just detected that all the pictures of the illness had e.g. a ruler in them, whereas the control pictures did not. It's easy to produce impressive results when your methodology sucks. And unfortunately, those results get reported on before peer review is in and before anyone has attempted to reproduce them.
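
The ruler story is the classic shortcut-learning failure: if a spurious feature perfectly tracks the label in the dataset, a model can score perfectly without learning anything about the disease. A toy illustration with fabricated data, where `has_ruler` stands in for the spurious feature:

```python
# Shortcut learning in miniature: every "positive" image in the flawed
# dataset happens to contain a ruler, so detecting the ruler is enough.
# All data is fabricated for illustration.

train = [  # (has_ruler, true_label)
    (1, 1), (1, 1), (1, 1),
    (0, 0), (0, 0), (0, 0),
]

def predict(has_ruler: int) -> int:
    """A degenerate 'model' that only looks at the confound."""
    return has_ruler

train_acc = sum(predict(x) == y for x, y in train) / len(train)
print(f"accuracy on the confounded data: {train_acc:.0%}")   # perfect

# On properly collected data the confound is gone, and the shortcut fails:
test = [(0, 1), (0, 1), (0, 0), (0, 0)]  # no rulers anywhere
test_acc = sum(predict(x) == y for x, y in test) / len(test)
print(f"accuracy without the confound: {test_acc:.0%}")      # coin flip
```

This is why reproduction on independently collected data matters: the impressive number only survives as long as the artifact does.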

[–] DarkSirrush@lemmy.ca 7 points 18 hours ago

That reminds me, I'm pretty sure in at least one of these AI medical tests, the model was reading metadata on the input image that included the diagnosis.
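
That failure mode is preventable by stripping everything but the pixels before the model ever sees the file. A minimal sketch of the idea (the `scan` dict is a stand-in for a parsed image file; the field names are illustrative, not any real format's API):

```python
# If the diagnosis rides along in the file's metadata, any pipeline that
# doesn't strip it is handing the model the answer key.
# 'scan' is a hypothetical parsed image; names are illustrative only.

scan = {
    "pixels": [[0.1, 0.4], [0.3, 0.9]],  # what the model should see
    "metadata": {
        "modality": "XR",
        "diagnosis": "pneumonia",        # leaked label!
    },
}

def sanitize(image: dict) -> dict:
    """Keep only pixel data; drop all tags before training or inference."""
    return {"pixels": image["pixels"]}

clean = sanitize(scan)
assert "metadata" not in clean
print(sorted(clean))  # only 'pixels' survives
```

Real formats like DICOM carry dozens of tags, so in practice this means an explicit allowlist of fields rather than trusting the file as-is.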
