submitted 1 year ago by L4s@lemmy.world to c/technology@lemmy.world

95 Tesla deaths have involved fires or Autopilot. How the EV maker fares in fatalities per million miles.::Tesla deaths from fires and Autopilot make up 24% of the fatalities in crashes. Learn about the death rates per million miles and vs other cars.

[-] kokesh@lemmy.world 13 points 1 year ago

Far fewer deaths than without the feature turned on, and far fewer than in other cars.

[-] ghariksforge@lemmy.world 8 points 1 year ago

The evidence for this claim is lacking.

[-] rm_dash_r_star@lemm.ee 11 points 1 year ago

In the fourth quarter of 2022, Tesla said its cars using Autopilot were involved in one crash per every 4.85 million miles or 0.2 crashes per million miles. That's compared with around 0.7 crashes per million miles for Teslas not using Autopilot

Well, if you can believe Tesla's numbers, Autopilot is still safer than manual driving. Though really, I think the criteria for safety should be stricter than just being somewhat safer. Other car makers won't take autopilot beyond self-parking for good reason. It's not that Tesla's technology is so much better; other makers are just more responsible about liability.

[-] vin@lemmynsfw.com 3 points 1 year ago

Tesla is using stats to lie. Autopilot can only be used on highways. If you look only at the crash rate on highways, it's very close to Autopilot's rate. And ideally it should be compared against other assistive technologies.
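To make the base-rate point concrete, here's a toy sketch (every number below is hypothetical, invented purely for illustration, not from any real dataset) of how a highway-only system can look better than it is when compared against an all-roads average:

```python
# Hypothetical illustration: Autopilot engages mainly on highways, where
# crash rates are lower to begin with, so comparing its rate against an
# all-roads average flatters it. All figures below are made up.
highway_rate_manual = 0.25   # crashes per million highway miles, human-driven
city_rate_manual = 1.2       # crashes per million city miles, human-driven
highway_share = 0.4          # fraction of all miles driven on highways

# Blended all-roads rate for human drivers:
blended = (highway_share * highway_rate_manual
           + (1 - highway_share) * city_rate_manual)

autopilot_rate = 0.2         # hypothetical highway-only assisted rate

print(f"vs all roads: {blended / autopilot_rate:.1f}x better")
print(f"vs highways only: {highway_rate_manual / autopilot_rate:.2f}x better")
```

With these made-up numbers the assisted system looks roughly 4x safer against the blended rate, but only about 1.25x safer against the like-for-like highway rate, which is the comparison the comment above is asking for.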

[-] jocanib@lemmy.world 2 points 1 year ago

Narrator: you cannot believe Tesla's numbers.

[-] machinin@lemmy.world 2 points 1 year ago

Other car makers won't take autopilot beyond self-park for good reason.

Mercedes offers level 3, which is fully hands off, and they take full responsibility for accidents that happen while the system is engaged. Ford and GM also have systems that are on par with Tesla and likely have better safety measures.

It's not that Tesla's technology is so much better, other makers are just more responsible about liability.

Definitely agree with this. Tesla just cuts corners.

[-] vacuumpizzas@t.bobamilktea.xyz 9 points 1 year ago

In 2016, the first known fatality linked to a self-driving car took place when a Tesla Model S failed to stop and crashed into a semitrailer truck.

Ah, this one is hard to forget. I remember it vividly because it sparked all sorts of philosophical discussions around the use of self-driving cars: hypothetical scenarios like "should the car choose to kill the driver to save a family of five with children?" and the different variations of the trolley problem.

Determining the responsible party was always a puzzle to me. The current state of Autopilot requires hands-on attention from the driver, so the accountable party is arguably the driver. But with a fully autonomous vehicle, where no steering wheel is installed, is the car manufacturer accountable for deaths and accidents?

[-] Terevos@lemm.ee 5 points 1 year ago

But with a fully autonomous vehicle, where the steering wheel isn’t installed, is the car manufacturer accountable for deaths and accidents?

Yes

[-] vacuumpizzas@t.bobamilktea.xyz 1 points 1 year ago

A contrived metaphor: if an unleashed dog bites a person, is the dog owner no longer responsible for the incident?

You could say that it's up to the car owner to install a steering wheel, like how a dog owner should use a leash. But there would be a gap between when the person receives the car and when they could install the steering wheel (assuming the installation can be performed at all).

[-] machinin@lemmy.world 5 points 1 year ago

with a fully autonomous vehicle, where the steering wheel isn’t installed, is the car manufacturer accountable for deaths and accidents?

We already have this. Mercedes offers level 3 capability where they take responsibility for any accidents using the system. We'll know that Tesla has made real progress when they stop blaming drivers for every single accident that happens.

[-] Burns@lemm.ee 5 points 1 year ago

In the fourth quarter of 2022, Tesla said its cars using Autopilot were involved in one crash per every 4.85 million miles — or 0.2 crashes per million miles. That's compared with around 0.7 crashes per million miles for Teslas not using Autopilot, and a US average of 1.5 crashes per million miles. The company has not released any figures for deaths per million miles, or the total number of miles driven, which would be needed to calculate that figure.
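As a sanity check on the arithmetic in the quoted passage, the per-million-mile rate is just the reciprocal of the miles-per-crash interval. A quick sketch, using only the figures quoted above:

```python
# Convert "one crash per N million miles" into "crashes per million miles".
def crashes_per_million_miles(millions_of_miles_per_crash: float) -> float:
    """Invert a miles-per-crash interval into a crash rate."""
    return 1.0 / millions_of_miles_per_crash

autopilot = crashes_per_million_miles(4.85)  # ~0.21, rounded to 0.2 in the quote
manual_tesla = 0.7                           # Teslas without Autopilot, as quoted
us_average = 1.5                             # US fleet average, as quoted

print(f"Autopilot: {autopilot:.2f} crashes per million miles")
print(f"US average is {us_average / autopilot:.1f}x the Autopilot rate")
```

Note that, as the comment says, these are crash rates only: without a deaths-per-mile or total-miles figure, no fatality rate can be derived from them.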

[-] ghariksforge@lemmy.world 4 points 1 year ago

Why do we trust Tesla's figures? Elon is a pathological liar.

Independently collected evidence that does not rely on Tesla shows that Teslas are not safe.

[-] weinermeat@lemmy.world 2 points 1 year ago

This is FUD. You must pay attention at all times when using Autopilot, the car begs you to do so. Not paying attention is equivalent to setting cruise control in any other vehicle and closing your eyes.

[-] ghariksforge@lemmy.world 4 points 1 year ago

Tesla runs ads saying that their cars drive themselves and don't need a driver.

[-] weinermeat@lemmy.world 6 points 1 year ago

Can you provide an example? I genuinely haven’t seen that but I live a relatively advertisement-free lifestyle. As a Tesla owner, I’ve never heard that from anyone in real life either.

[-] ghariksforge@lemmy.world 2 points 1 year ago
[-] PipedLinkBot@feddit.rocks 2 points 1 year ago

Here is an alternative Piped link(s): https://piped.video/shorts/FfJ_Hw0kJl4


[-] weinermeat@lemmy.world 1 points 1 year ago* (last edited 1 year ago)

Ok, counter point is this: https://electrek.co/2023/01/17/real-story-behind-tesla-staged-self-driving-video/

The engineer added: "The intent of the video was not to accurately portray what was available for customers in 2016. It was to portray what was possible to build into the system." Elluswamy also confirmed that the Autopilot team put the video together as a "demonstration of the system's capabilities" at the request of Musk.

I do remember that video and did not take it as an advertisement of what the product would do in 2016.

[-] ghariksforge@lemmy.world 1 points 1 year ago

The first seconds of the video literally say "our cars don't need a driver".

[-] LibertyLizard@slrpnk.net 3 points 1 year ago* (last edited 1 year ago)

While it’s always easy to blame irresponsible drivers, when you market your product as “full self driving”, the behavioral consequences of that decision are obvious, and yes, you share in the blame for the carnage you helped create.

[-] weinermeat@lemmy.world 2 points 1 year ago

Autopilot is not FSD. I agree the advertising of FSD is poor, but the FSD you’re thinking of is in beta, opt-in only, and also tries to force you to pay attention.

[-] bluekieran@lemmy.world 2 points 1 year ago

He doesn't need to lie when he can just choose what to release - if they're not releasing deaths per million miles, it's because the numbers aren't as favourable.

[-] TheGreenGolem@lemm.ee 5 points 1 year ago

If someone is from the US, please explain it to me: how is this shit still legal there?

Aren't there regulations about what you can put on the roads? Aren't you responsible for proving that your solution works before you get a green light to put it on the roads? Was Autopilot ever approved by any regulator?

[-] vin@lemmynsfw.com 4 points 1 year ago

Autopilot is Tesla's name for adaptive cruise control, which has been in cars for decades. Assistive technologies don't need approval.

[-] rm_dash_r_star@lemm.ee 3 points 1 year ago* (last edited 1 year ago)

Honestly, I don't know how Tesla got its Autopilot past US regulation. In the USA, aircraft have to meet FAA safety standards: a new model aircraft has to go through extensive flight testing and receive FAA certification before it can go into production. I'm not sure who governs automotive safety in the USA, but either Tesla did something unusual to push it through, or automotive safety regulations in the USA are just not that strict.

There's also the fact that the batteries they use can spontaneously catch fire (properly known as thermal runaway). I would think that poses a big safety concern, but evidently it's not a problem for US regulations. There is a Li-ion battery chemistry much less prone to thermal runaway (LiFePO4), and some cars use it. It's greatly safer and has about five times the battery longevity, but it's also about twenty percent heavier. I think that's a fair trade-off to avoid a fiery death.

[-] Yendor@reddthat.com 4 points 1 year ago

There's also the fact the batteries they use can spontaneously catch fire (properly known as thermal runaway).

Thermal runaway is not spontaneous. It requires sustained heating, which is typically caused by serious damage. So yes, lithium batteries can catch fire after a crash. But do you know what else catches fire after a crash? ICE-powered cars, at a rate 10x higher than EVs. And ICE cars are FAR more dangerous, because unlike an EV battery that burns slowly for hours, an exploding gas tank releases its energy in an instant. Ask a firefighter which is more dangerous.

There is a Li-Ion battery technology much less prone to thermal runaway (LiFePO4) and some cars use it. It's greatly safer and has about five times greater battery longevity, but it's also about twenty percent heavier. I think it's a fair trade-off to avoid a fiery death.

Most Teslas use LiFePO4 (commonly called LFP) batteries. Most other manufacturers still use NMC batteries. Both are far safer than the explosive dinosaur juice that ICE vehicles run on.

[-] ghariksforge@lemmy.world 0 points 1 year ago

These don't test self-driving features, which are the source of the problem. FSD and Autopilot are unsafe according to NHTSA data.

[-] Chreutz@lemmy.world 4 points 1 year ago

Euro NCAP tests lane keeping, auto braking, etc., and the 2022 Model Y scored 98%, the highest ever.

[-] weinermeat@lemmy.world 1 points 1 year ago

Do you own a Tesla? Basic autopilot is merely lane keeping combined with automatic cruise control. The car begs you to pay attention when you turn it on. Would you turn on cruise control and then take a nap? It’s the same thing. Users of autopilot need to pay attention at all times.

[-] Yendor@reddthat.com 0 points 1 year ago

TACC (traffic-aware cruise control), AEB (automatic emergency braking), FCW (forward collision warning), lane assist, and a bunch of other aspects of Autopilot are all tested, and have done extremely well.

The IIHS has probably the most stringent testing in the world, and it ranked the Model 3 the safest car ever tested. The Model Y was the second safest car ever tested.

[-] FuglyDuck@lemmy.world -1 points 1 year ago

Maybe if they remembered to do the quality control check.

this post was submitted on 15 Jul 2023
161 points (92.1% liked)
