577 points (94.9% liked) · submitted 21 Aug 2023 (10 months ago) by L4s@lemmy.world to c/technology@lemmy.world

Tesla knew Autopilot caused death, but didn't fix it::Software's alleged inability to handle cross traffic central to court battle after two road deaths

[-] anlumo@feddit.de -3 points 10 months ago

I've never sat in a Tesla, so I'm not really sure, but based on the things I've read online, autopilot and FSD are two different systems on Tesla cars you can engage separately. There shouldn't be any confusion about this.

[-] Miqo@lemmy.world 4 points 10 months ago

I've never sat in a Tesla, so I'm not really sure

There shouldn't be any confusion about this.

U wot m8?

[-] Thorny_Thicket@sopuli.xyz 2 points 10 months ago

There are three tiers: Autopilot, Enhanced Autopilot, and FSD Beta.

[-] r00ty@kbin.life 1 points 10 months ago

Well, if it's just the lane-assistance Autopilot that is causing this kind of crash, I'd agree it's likely user error. The reason I say "if" is that I don't trust journalists to know the difference, or to report on it.

I am still concerned the FSD beta is "out there" though. I do not trust normal users to understand what beta means, and of course no-one is going to read the agreement before clicking agree. They just want to see their car drive itself.

[-] anlumo@feddit.de 2 points 10 months ago

If it were about the FSD implementation, things would be very different. I'm pretty sure FSD is designed to handle cross traffic, though.

I do not trust normal users to understand what beta means

Yeah, Google kinda destroyed that word in the public consciousness when they kept their search under a beta flag for more than a decade while growing into one of the biggest companies on Earth with it.

When I first heard about it, I was very surprised that the US even allows vehicles with beta self-driving software on public roads. That's like testing a new fire truck by randomly setting buildings on fire around a city and then trying to put them out with the truck.

[-] Ocelot@lemmies.world 0 points 10 months ago* (last edited 10 months ago)

Yeah, I don't trust a machine that has been trained for millions of hours, has simulated every possible traffic scenario tens of millions of times, has millisecond reaction times, and sees the world in a full 360 degrees. A system that never drives drunk, distracted, or fatigued. You know who's really good at driving, though? Humans. Perfect track record, those humans.
