this post was submitted on 01 May 2026
PC Gaming
[–] FauxLiving@lemmy.world 3 points 15 hours ago (2 children)

Sure, tools make people worse at doing the thing without tools.

Using AutoCAD made draftsmen worse at hand drafting, but that doesn't matter because there is no occasion where you need to draft complex plans without a computer. If AI diagnosis makes doctors worse at reading MRIs, that would only matter in a world where they're reading MRIs but don't have access to a computer. No hospital with a functional MRI machine would be unable to access these tools.

The important thing is that the doctors, when using these AI tools, are measurably more effective. The result is the thing that matters for public health, not any individual's ability to operate without their tools.

https://newsnetwork.mayoclinic.org/discussion/mayo-clinic-ai-detects-pancreatic-cancer-up-to-3-years-before-diagnosis-in-landmark-validation-study/

Researchers used the AI model to analyze nearly 2,000 CT scans, including scans from patients later diagnosed with pancreatic cancer — all originally interpreted as normal. The system, called the Radiomics-based Early Detection Model (REDMOD), identified 73% of those prediagnostic cancers at a median of about 16 months before diagnosis — nearly double the detection rate of specialists reviewing the same scans without AI assistance.

Doubling the early detection rate of one of the most deadly types of cancers will result in many more lives being saved.

[–] SaveTheTuaHawk@lemmy.ca 1 points 14 hours ago (1 children)

That's not AI. Those algorithms are pattern matching, developed at Carnegie Mellon 15 years ago; now they want to call it AI.

Radiologists and pathologists have always had a massive error rate because of human cognitive bias.

[–] FauxLiving@lemmy.world 1 points 13 hours ago

Machine Learning isn't restricted to neural networks.
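For example, a k-nearest-neighbor classifier is textbook machine learning with no neural network involved. A minimal sketch (toy data with hypothetical "scan feature" vectors and labels, nothing to do with the actual REDMOD model):

```python
# k-nearest-neighbor: classify a point by majority vote among the k
# closest labeled training examples. Pure pattern matching, and still
# machine learning -- no neural network anywhere.
from collections import Counter
import math

def knn_predict(train, query, k=3):
    """Return the majority label among the k training points
    nearest to `query` (Euclidean distance)."""
    by_dist = sorted(train, key=lambda p: math.dist(p[0], query))
    votes = Counter(label for _, label in by_dist[:k])
    return votes.most_common(1)[0][0]

# Toy feature vectors labeled normal/abnormal (made up for illustration).
train = [((0.1, 0.2), "normal"), ((0.2, 0.1), "normal"),
         ((0.9, 0.8), "abnormal"), ((0.8, 0.9), "abnormal")]

print(knn_predict(train, (0.15, 0.15)))  # → normal
```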

Radiologists and pathologists have always had a massive error rate because of human cognitive bias.

Seems like that's a bad thing and we should be happy that there are tools which improve their accuracy.

[–] XLE@piefed.social 1 points 14 hours ago (2 children)

Are you confident that the American healthcare system wouldn't declare experts to be a redundancy and simply replace them with the AI? Not only would that fit with their well-known profit motive, it is explicitly what AI companies claim they want to do.

I would love to live in a utopia where AI can be used ethically, but it is dangerous to promote the assumption that it magically just will be.

[–] SaveTheTuaHawk@lemmy.ca 2 points 14 hours ago (1 children)

Are you confident that the American healthcare system wouldn’t declare experts to be a redundancy and simply replace them with the AI?

That would create legal liability. The reality is that all radiology scans and pathology slide images are already cross-checked by software, and if there is a discrepancy, another pathologist is consulted. This is because the error rate of pathologists and radiologists is conservatively 1%, which is far too high.

[–] FauxLiving@lemmy.world 2 points 14 hours ago (1 children)

Are you confident that the American healthcare system wouldn’t declare experts to be a redundancy and simply replace them with the AI?

Yes.

Nothing about this tool replaces experts any more than a calculator or computer can replace a human mathematician.

I would love to live in a utopia where AI can be used ethically, but it is dangerous to promote the assumption that it magically just will be.

I don't assume that AI will always be used ethically (see: War, LLM propaganda bots, etc). Like every technology it is possible to do bad things with it and it will require regulations and laws addressing this.

Dismissing a technology because bad people use it is a standard that, if you actually applied it consistently in your life, would have you living naked in a cave without access to fire or tools.

You don't need to believe in a utopia to understand that a world where 70% of pancreatic cancer is detected 3 years earlier is better than one where 30% of pancreatic cancer is detected 3 years earlier.

[–] XLE@piefed.social 1 points 13 hours ago (1 children)

FauxLiving, I appreciate your guarantees about the future, but can you demonstrate why the for-profit medical and AI industries wouldn't cut corners if the AI behaved the way you hope it will?

[–] FauxLiving@lemmy.world 1 points 12 hours ago (1 children)

First, this is a peer-reviewed result, not me expressing my hopes.

Second, this application does not replace radiologists. It is a tool for radiologists in one specific type of diagnosis.

If you have some hypothetical future outcome in mind, then the burden of proof is on you to prove your position, not on me to disprove it.

The data shows that this system works.

[–] XLE@piefed.social 1 points 3 hours ago

FauxLiving, a paper isn't the same as you predicting the future. It does not show the system will work in practice. Playing astrologer on the basis of these opinions is disgusting.