this post was submitted on 07 Feb 2026
342 points (98.9% liked)

Technology

[–] Perspectivist@feddit.uk 42 points 1 day ago (3 children)

I think the interventions here are more like: "that's a trash can someone pushed onto the road - let me help you around it" rather than: "let me drive you all the way to your destination."

It's usually not the genuinely hard stuff that stumps AI drivers - it's the really stupid, obvious things it simply never encountered in its training data before.

[–] MoffKalast@lemmy.world 7 points 1 day ago* (last edited 1 day ago) (1 children)

Saw a blog post recently about Waymo's sim setup for generating synthetic data, and they really do seem to be generating pretty much everything in existence. Either the level of generalization of the model they're using is shockingly low, or they abort immediately at the earliest sign of high perplexity.

[–] Kushan@lemmy.world 5 points 21 hours ago

I'm guessing it's the latter: they need to keep accidents to a minimum if they're ever going to get broad legislation to legalise them.

Every single accident is analysed to death by the media and onlookers alike, with a large group of people wanting it to fail.

This is a prime example: we've known about the human intervention for a while now, but people seem surprised that those people are in another country.

[–] Zwuzelmaus@feddit.org 1 points 1 day ago* (last edited 1 day ago) (1 children)

it's the really stupid, obvious things

Hm. Interesting. But that makes them look even more incapable than I feared.

[–] Perspectivist@feddit.uk 5 points 1 day ago (2 children)

Broadly speaking, an AI driver getting stumped means it's stuck in the middle of the road - while a human driver getting stumped means plowing into a semi truck.

I'd rather be inconvenienced than killed. And from what I've seen, even our current AI drivers are already statistically safer than the average human driver - and they're only going to keep getting better.

They'll never be flawless though. Nothing is.

[–] MrScottyTay@sh.itjust.works 5 points 1 day ago (1 children)

AI drivers have slowly run over and crushed people before too, though, because they didn't register the person as an "obstacle" to be avoided, or because the person was on the ground where the car couldn't see them.

[–] Perspectivist@feddit.uk 3 points 1 day ago (3 children)

And they always will. You need to look at the big picture here, not individual cases. If we replaced every single car on US roads with one driven by AI - proven to be 10 times better a driver than a human - that would still mean 4,000 people getting killed by them each year. That, however, doesn't mean we should go back to human drivers and 40,000 people killed annually.
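The arithmetic here is just a straight division; a quick sketch makes it explicit (both inputs are the comment's hypotheticals, not measured data):

```python
# Back-of-the-envelope check of the hypothetical above.
# Both numbers are assumptions from the comment, not real measurements.
human_deaths_per_year = 40_000  # approximate annual US road fatalities
safety_factor = 10              # "10 times better a driver than a human"

ai_deaths_per_year = human_deaths_per_year // safety_factor
print(ai_deaths_per_year)  # 4000
```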

[–] ltxrtquq@lemmy.ml 16 points 1 day ago (1 children)

You need to look at the big picture here, not individual cases.

By that logic...

We should really be investing in trains and buses, not cars of any type.

[–] walden@wetshav.ing -5 points 1 day ago (2 children)

I think your logic is flawed. The discussion is about a specific form of transportation. By your own logic, you should be suggesting that people fly everywhere.

[–] Clent@lemmy.dbzer0.com 1 points 22 hours ago

Yes. AI human transportation drones make far more sense. Much easier to avoid things because airspace can be controlled. We just need to figure out how to make them efficient enough that the ride is more than 5 minutes.

[–] ltxrtquq@lemmy.ml 1 points 1 day ago (1 children)

For long distance maybe, but immediately saying we should all fly everywhere because it has the fewest deaths per passenger mile would really not be looking at the big picture.

[–] walden@wetshav.ing 0 points 21 hours ago (1 children)

Ah, so you do understand there's a difference in why someone would choose one type of transportation over another.

[–] zbyte64@awful.systems 2 points 16 hours ago (1 children)

But you don't understand that they're talking about systems while you're talking about personal choice.

[–] walden@wetshav.ing 0 points 15 hours ago

I may have gotten sucked into the .ml user's what-about-ism, but I started off by just trying to point out the flaw in their logic.

System, personal choice, whatever -- it doesn't really matter, because the .ml user is trying to spin facts to support their agenda. I don't know what their agenda is other than just being contentious.

[–] zbyte64@awful.systems 1 points 16 hours ago* (last edited 16 hours ago)

The big picture is that AI not being able to operate under unusual conditions means the "10 times better" stat (if it were even true) comes with a big fucking caveat: we can't say it would hold if we replaced all drivers.

[–] Wildmimic@anarchist.nexus 4 points 1 day ago (2 children)

I fully agree with you, but there is the issue of robotaxis crashing 3x as often as human drivers - and that's with a human supervisor on board. So if we switched completely to AI cars at the current level of integration, that's 120,000 people killed.

[–] RobotToaster@mander.xyz 8 points 1 day ago

Tesla made the idiotic decision to rely entirely on cameras; Waymo uses lidar and other sensors to augment vision.

[–] pennomi@lemmy.world 3 points 1 day ago (1 children)

That’s Tesla, not Waymo. Tesla’s hardware is shit and does not even include lidar. You can’t judge the entire industry by the worst example.

[–] Perspectivist@feddit.uk -1 points 1 day ago* (last edited 1 day ago)

New HW4 Teslas do in fact include a front-facing radar, but it's currently only used for collecting data - not for FSD.

Still, gotta give them credit for getting by with vision-only quite well. I don't personally see any practical reason why you absolutely must include LiDAR. We already know driving relatively safely with vision only is possible - all the best drivers in the world do it.

[–] Zwuzelmaus@feddit.org 1 points 1 day ago (1 children)

current AI drivers are already statistically safer than

As long as they use level 3 autonomous cars and then cheat with remote operators instead of using real level 5 cars, such statistics remain quite meaningless.

However, they do say something about the people who use them as arguments.

[–] errer@lemmy.world 3 points 23 hours ago (1 children)

As the OP stated, the low velocity cases are not causing deadly accidents. And you can’t drive by wire at high speed (too much latency). So I doubt it’s affecting the stats in any meaningful way.

Honestly I much prefer they have a human as a backup than not.

[–] snooggums@piefed.world 0 points 23 hours ago (1 children)

As the OP stated, the low velocity cases are not causing deadly accidents.

Make humans drive as slow as these cars and deaths will drop too.

[–] errer@lemmy.world 2 points 23 hours ago

The cars aren’t driving that slow the vast majority of the time…