this post was submitted on 13 May 2026
462 points (97.1% liked)
LinkedinLunatics
6831 readers
475 users here now
A place to post ridiculous posts from linkedIn.com
(Full transparency.. a mod for this sub happens to work there.. but that doesn't influence his moderation or laughter at a lot of posts.)
founded 2 years ago
I can’t speak for veterinary, but in dentistry AI can be problematic because:
A) it really over-diagnoses - it's very sensitive, meaning it flags things that aren't necessarily clinically relevant
B) it does not compare to previous radiographs, so it cannot give reasonable clinical judgement on whether decay is active or arrested.
It could be a helpful tool for giving you a laundry list of places to check. However, I've used demo software and did not find it added anything for me, though I do have 15 years' experience. You still need to use your clinical judgement.
I do worry about the younger clinicians being over-reliant on it, as they have it pushed on them by multi-practice dental corporations.
Yep, the "laziness" of reliance was a big part of my immediate reaction. I've recently found out that veterinary clinics across the country, and especially in Texas, were aggressively targeted by investment capital in the years since I last had dogs. The pet economy, including medical care, has long been seen as recession-proof. Medical care in particular is seen as emotionally driven and ripe for greater exploitation, so of course the vultures moved in. Having AI flag everything, allowing you to show a customer tons of things to worry about and milk them for more diagnostics, a.k.a. more money, is the logical conclusion.
It’s a disturbing trend - dental offices are also being acquired by investment groups that run them as purely a business rather than a health care service.
I have spoken to some younger dentists when I go to courses, and it sounds as though they are receiving disciplinary conversations/actions when they do not treat what the AI tells them to. The business expects them to transfer the responsibility of diagnosis to a computer program (and frankly a shitty one) yet still be medicolegally responsible for making the decision to treat.
New fear unlocked