this post was submitted on 02 Apr 2025
1114 points (98.8% liked)


TL;DR: Self-Driving Teslas Rear-End Motorcyclists, Killing at Least 5

Brevity is the soul of wit, and I am just not that witty. This is a long article; here is the gist of it:

  • The NHTSA’s self-driving crash data reveals that Tesla’s self-driving technology is, by far, the most dangerous for motorcyclists, with five fatal crashes that we know of.
  • This issue is unique to Tesla. Other self-driving manufacturers have logged zero motorcycle fatalities with the NHTSA in the same time frame.
  • The crashes are overwhelmingly Teslas rear-ending motorcyclists.

Read our full analysis as we go case-by-case and connect the heavily redacted government data to news reports and police documents.

Oh, and read our thoughts about what this means for the robotaxi launch that is slated for Austin in less than 60 days.

[–] brygphilomena@lemmy.dbzer0.com 9 points 4 days ago (1 children)

Don't Waymos have remote drivers that take control in unexpected situations?

[–] dogslayeggs@lemmy.world 6 points 4 days ago (2 children)

They have remote drivers that CAN take control in very rare corner-case situations that the software can't handle. The vast majority of driving is done without humans in the loop.

[–] NotMyOldRedditName@lemmy.world 2 points 3 days ago* (last edited 3 days ago) (1 children)

They don't even do that, according to Waymo's claims.

They can suggest what the car should do, but they aren't actually doing it. The car is in complete control.

It's a nuanced difference, but it is a difference. A Waymo employee never takes control of or operates the vehicle.

[–] KayLeadfoot@fedia.io 2 points 3 days ago (1 children)

Interesting! I did not know that - I assumed the teleoperators took direct control, but that makes much more sense for latency reasons (among others)

I always just assumed it was their way to ensure the vehicle was really autonomous. If you have someone remotely driving it, you could argue it isn't actually an AV. Your latency idea makes a lot of sense as well, though. Imagine taking over and causing an accident because of latency! This way, even if the operator gives a bad suggestion, it was the car that ultimately did it.