this post was submitted on 01 May 2026
159 points (89.6% liked)

[–] Flatfire@lemmy.ca 6 points 17 hours ago (1 children)

There's a balance to be struck here. Relying wholesale on automation tooling will always make you worse. There's a reason that, even though we have calculators, it's important to know the fundamental maths that would let you perform those same calculations yourself. For the majority of people it's probably not critical, but if you ever need to validate the result, you certainly want to be able to understand how the original conclusion was drawn.

The same goes for software engineering, where AI is seeing heavy use. People asking it to build whole programs receive bug-riddled, inefficient code, but software engineers who use it for rapid prototyping, or to reduce the work of rewriting common functions across projects, are going to be more effective because they understand what the resulting structure should look like.

AI is not a replacement for the human, and if there's a future for it, it will be as an assistant to the fundamentals and knowledge human specialists already possess. But that requires the continued education and development of skills within the industries these tools are deployed in.

[–] XLE@piefed.social 1 points 17 hours ago* (last edited 17 hours ago)

Code generation and medical result generation are similar enough to compare (I think), but to expound on the point I was making to the other person I replied to: there is far less medical data online than there is code. We have basically every coding textbook online, and tons of examples to build scaffolds from. We don't have nearly as much medical data, and the people promoting these tools to the medical field tend to be tech bros who don't mention the caveats of what their products can actually do.

In other words, if AI is to be good in medicine, it needs to be rolled out by none of the people currently pushing for it, and the caveats need to be explained in a way that none of them explain them. (It's not objective, it will not create new science like OpenAI CEO Sam Altman says, etc.) If AI boosters managed to convince the medical field of the same things they have already convinced politicians and journalists of, I think the result would be a rapid degradation of treatment quality, deskilling, and many unnecessary deaths. And boosters who promote potential benefits without acknowledging that are being very reckless.