this post was submitted on 14 Aug 2023
503 points (96.7% liked)
Technology
This is stupid. Teslas can park themselves, they're not just on rails. It should be pulling over and putting the flashers on if a driver is unresponsive.
That being said, the driver knew about this behavior, acted with wanton disregard for safe driving practices, and so the incident is the driver's fault and they should be held responsible for their actions. It's not the court's job to legislate.
It's actually the NTSB's job to regulate car safety, so if they don't already have that authority, Congress needs to grant it to them so they can regulate what AI behavior is acceptable and define safeguards against misbehaving AI.
There's no way the headline is true. Zero percent. The car will literally do exactly what you stated if it goes too long without driver engagement and I've experienced it first hand.
Evidently, he was aware enough to respond to the alerts, per the logs (as stated in the WSJ video that's in the article). It shows a good bit of the footage, too.
Seems like they need something better for awareness checking than just gripping the wheel and checking where your eyes are pointed. And obviously better sensors for object recognition.
The headline doesn't state that the warnings were consecutive.
Perhaps the driver was just aware enough to keep squelching warnings and prevent the car from stopping altogether?
I'll grant you, though, 150 warnings is still a little tough to believe...
I turned off the "lane assist" in our Mazda because it kept steering me back toward obstacles I was trying to avoid, like cyclists, oversized loads, potholes, etc. I don't know why anyone thought that was a good idea.
But try buying a car without those features now...sigh.
Use your turn signal to indicate your direction change and it won't do that.
If you're swerving to avoid a sudden obstacle, you may reasonably not have the foresight or reaction time to flip on a signal. The car still needs to not force you back onto a collision course.
That's a good point, and is probably why they designed it so that if you swerve hard, lane assist shuts off. It only nudges you back to the middle of the lane if you are gently drifting to a side, so it only works in situations where your turn signal can be used to avoid it. Or you can just disable it if you drive a BMW or otherwise can't use turn signals.
Even moving over slightly in the lane to avoid a pothole triggers it; it doesn't seem like a turn signal should be necessary in that situation. Instead the situation seems to be that I'm seeing the pothole and altering the car's course gently to avoid it, and I get close to the line and it freaks out.
I guess if I drove right up to the obstacle then swerved, it wouldn't do it...but I was always taught swerving was a last-resort thing, best to drive as smoothly as possible. (This was my dad's argument, and I said, "Uh, SOMEONE taught me to not swerve unless it was necessary..." (him). He laughed.)
The driver is responsible for this accident, but Tesla should still be liable, imo, for all the shady and outright misleading advertising around their so-called "self driving". Compare Tesla's marketing to, say, GM's or Hyundai's, both of which essentially have feature parity with Tesla's system, and you'll see a big difference.
Sounds like the injured officers are suing. It's a civil case, not criminal, so I'm not sure how much the court would actually be asked to legislate. I'd be interested to hear their arguments, though I'm sure part of their reasoning for suing Tesla rather than the driver is that Tesla has more money.
Yes. Actually, just stopping in the middle of the road with hazard lights would be sufficient.
You say that, yet a Tesla did exactly that, which caused some tailgaters to crash into the back of it, and everyone blamed the Tesla for causing an accident.
https://theintercept.com/2023/01/10/tesla-crash-footage-autopilot/