this post was submitted on 23 Jun 2025
903 points (97.7% liked)

Technology

[–] spankmonkey@lemmy.world 231 points 3 days ago (5 children)

Paraphrasing:

"We only have the driver's word they were in self driving mode..."

"This isn't the first time a Tesla has driven onto train tracks..."

Since it isn't the first time I'm gonna go ahead and believe the driver, thanks.

[–] Pika@sh.itjust.works 151 points 3 days ago (2 children)

Furthermore, with the amount of telemetry those cars have, the company knows whether it was in self-driving mode when it went onto the track. So the fact that they didn't go public saying it wasn't means it was in self-driving mode, and they want to save face on PR and limit liability.

[–] IphtashuFitz@lemmy.world 95 points 3 days ago (1 children)

I have a nephew who worked at Tesla as a software engineer for a couple of years (he left about a year ago). I gave him the VIN of my Tesla and the amount of data he shared with me was crazy. He warned me that one of my brake lights was regularly logging errors. If their telemetry includes that sort of information, then clearly they are logging a LOT of data.

[–] atomicbocks@sh.itjust.works 24 points 2 days ago (1 children)

Modern cars (in the US) are required to have an OBD-II port for on-board diagnostics. I always assumed most cars these days were just sending some or all of the real-time OBD data to the manufacturer. GM definitely has been.
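For what it's worth, the decoding side of that data is public: OBD-II responses are raw bytes whose formulas are published in the SAE J1979 standard. A minimal sketch (the function names here are mine, but the two formulas are the standard ones for PIDs 0x0C and 0x05):

```python
def decode_rpm(a: int, b: int) -> float:
    """Mode 01 PID 0x0C: engine RPM = (256*A + B) / 4."""
    return (256 * a + b) / 4

def decode_coolant_temp(a: int) -> int:
    """Mode 01 PID 0x05: coolant temperature in deg C = A - 40."""
    return a - 40

# Raw response bytes 0x1A 0xF8 decode to 1726 RPM
print(decode_rpm(0x1A, 0xF8))      # 1726.0
# Raw byte 0x7B decodes to 83 deg C
print(decode_coolant_temp(0x7B))   # 83
```

Streaming a handful of these per second per PID is trivial bandwidth-wise, which is why "send it all to the manufacturer" is entirely plausible.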

[–] Pika@sh.itjust.works 24 points 2 days ago (1 children)

Dude, in today's world we're lucky if it stops at the manufacturer. I know of a few insurers that have contracts with major dealers and just automatically get the data registered by the car's systems, so they can make better decisions about people's car insurance.

Nowadays it's a red flag if you join a car insurer and they don't offer a discount for putting on something like drive pass, which logs your driving, because it probably means your car is already feeding them that data.

We just got back from a road trip in a friend's '25 Tundra and it popped up a TPMS warning for a faulty sensor then minutes later he got a text from the dealership telling him about it and to bring it in for service.

[–] catloaf@lemm.ee 43 points 2 days ago (2 children)

I've heard they also like to disengage self-driving mode right before a collision.

[–] Mouselemming@sh.itjust.works 24 points 2 days ago* (last edited 2 days ago) (2 children)

Since the story has 3 separate incidents where "the driver let their Tesla turn left onto some railroad tracks" I'm going to posit:

Teslas on self-drive mode will turn left onto railroad tracks unless forcibly restrained.

Prove me wrong, Tesla

[–] Tarquinn2049@lemmy.world 2 points 2 days ago (1 children)

Map data obtained and converted from other formats often ends up accidentally condensing labeling categories. One result is train tracks being categorized as generic roads instead of retaining their specific sub-heading. Another, unrelated to this but common for people who play geo games, is forests and water areas ending up tagged as the wrong specific types.

[–] Mouselemming@sh.itjust.works 1 points 2 days ago

Aha. But that sounds correctable.... So not having any people assigned to checking on railroads and making sure the system recognizes them as railroads would be due to miserliness on the part of Tesla then.... And might also say something about why some Teslas have been known to drive into bodies of water (or children, but that's probably a different instance of miserliness)

[–] AA5B@lemmy.world 1 points 2 days ago (1 children)

I mean… Tesla self-driving allegedly did this three times in three years, but we don't yet have public data to verify that's what happened, nor do we have any way to compare it to what human drivers do.

Although one of the many ways I think I’m an above average driver (just like everyone else) is that people do a lot of stupid things at railroad crossings and I never would

[–] Mouselemming@sh.itjust.works 6 points 2 days ago

I'm pretty sure Tesla self-drive does a lot of stupid things you never would, too. That's why they want you at the wheel, paying attention and ready to correct it in an instant! (Which defeats the whole benefit of self-drive mode imho, but whatever)

The fact that they can avoid all responsibilities and blame you for their errors is of course the other reason.

[–] XeroxCool@lemmy.world 14 points 3 days ago (3 children)

The ~2010 runaway Toyota hysteria was ultimately blamed on mechanical problems in less than half the cases. Floor mats jamming the pedal, drivers mixing up the gas and brake pedals in a panic, or outright lying to evade a speeding ticket accounted for many of the rest.

Should a manufacturer be held accountable for legitimate flaws? Absolutely. Should drivers be absolved without the facts just because we don't like a company? I don't think so. But if Tesla has proof fsd was off, we'll know in a minute when they invade the driver's privacy and release driving events

[–] spankmonkey@lemmy.world 66 points 3 days ago (2 children)

Tesla has constantly lied about their FSD for a decade. We don't trust them because they are untrustworthy, not because we don't like them.

[–] BlueLineBae@midwest.social 9 points 3 days ago* (last edited 3 days ago) (4 children)

I have no sources for this so take it with a grain of salt... but I've heard that Tesla turns off self-driving just before an accident so they can say it was the driver's fault. Now in this case, if it was on while the car drove onto the tracks, I would think that would prove Tesla's faulty self-driving, plus human error for not correcting it. Either way it would be partly Tesla's fault if it was on at the time.

[–] meco03211@lemmy.world 6 points 3 days ago (1 children)

Pretty sure they can tell the method used when disengaging FSD/AP, so they would know whether it was manually turned off or whether the system lost enough info and shut itself down. They should be able to reconstruct the order of events to within a few seconds of accuracy. I can't imagine a scenario, one that wouldn't be blatantly obvious, where the Tesla was able to determine an accident was imminent and shut off FSD/AP with enough time to "blame it on the driver". What might be possible is that the logs show FSD shut off a millisecond before impact and someone merely reported that FSD was not engaged at the time of the accident. Technically true, and Tesla lawyers might fight like hell to maintain that theory, but if an independent source is able to review the logs, I don't see it holding up.

[–] pixeltree@lemmy.blahaj.zone 9 points 3 days ago

Of course they know, they're using it to hide the truth. Stop giving a corporation the benefit of the doubt where public safety is concerned, especially when they've been shown to abuse it in the past

[–] AA5B@lemmy.world 4 points 2 days ago

They supposedly also have a threshold, like ten seconds - if FSD cuts out less than that threshold before the accident, it’s still FSD’s fault

That would require their self driving algorithm to actually detect an accident. I doubt it's capable of doing so consistently.

[–] spankmonkey@lemmy.world 2 points 3 days ago (2 children)

On a related note, getting unstuck from something like train tracks is a pretty significant hurdle. The only real way out is to back up, IF the turn onto the tracks wasn't a drop of the same depth as the rails. Someone caught off guard isn't going to be able to turn a passenger car off the tracks, because the rails are tall and there's no angle at which the wheels can get over them.

So while in a perfect world the driver would have slammed on the brakes before the car got onto the tracks, if they weren't fast enough and even the front wheels ended up over the rails, it may have been impossible to recover from, and going forward might have been their best bet. Depends on how the crossing is built.

[–] ayyy@sh.itjust.works 0 points 2 days ago (2 children)

If you’re about to be hit by a train, driving forward through the barrier is always the correct choice. It will move out of the way and you stay alive to fix the scratches in your paint.

[–] Ledericas@lemm.ee 1 points 2 days ago* (last edited 2 days ago)

not if you're in a tesslar.

[–] spankmonkey@lemmy.world 2 points 2 days ago (1 children)

Maybe you should read the article.

[–] ayyy@sh.itjust.works 0 points 2 days ago

I meant more in the general sense, I recognize that cars can get stuck places.

[–] roguetrick@lemmy.world -1 points 3 days ago

I guess I'm a train now.

[–] AA5B@lemmy.world 2 points 2 days ago (1 children)

They promote it in ways that make people sometimes trust it too much… but specifically on releasing telemetry, I don't remember that ever being an accusation.

[–] ayyy@sh.itjust.works 5 points 2 days ago

It’s more about when they don’t release it/only selectively say things that make them look good and staying silent when they look bad.

[–] aramis87@fedia.io 7 points 2 days ago

How is a manufacturer going to be held responsible for their flaws when musk DOGE'd every single agency investigating his companies?

[–] lka1988@lemmy.dbzer0.com 5 points 3 days ago

The ~2010 runaway Toyota hysteria was ultimately blamed on mechanical problems less than half the time. Floor mats jamming the pedal, drivers mixing up gas/brake pedals in panic, downright lying to evade a speeding ticket, etc were cause for many cases.

I owned an FJ80 Land Cruiser when that happened. I printed up a couple stickers for myself, and for a buddy who owned a Tacoma, that said "I'm not speeding, my pedal's stuck!" (yes I'm aware the FJ80 was slow as dogshit, that didn't stop me from speeding).

[–] TheKingBee@lemmy.world 3 points 2 days ago* (last edited 2 days ago) (2 children)

Maybe I'm missing something, but isn't it trivial to take it out of their bullshit dangerous "FSD" mode and take control? How does a car go approximately 40-50 feet down the tracks without the driver noticing and stopping it?

[–] spankmonkey@lemmy.world 6 points 2 days ago

On some railroad crossings you might only need to go off the crossing to get stuck in the tracks and unable to back out. Trying to get out is another 30-40 feet.

Being caught off guard when the car isn't supposed to do that is how to get stuck in the first place. Yeah, terrible driver trusting shit technology.

[–] shaggyb@lemmy.world 3 points 2 days ago (1 children)
[–] TachyonTele@piefed.social 7 points 2 days ago

Ideally you hit the brakes before buying the Tesla.

[–] NuXCOM_90Percent@lemmy.zip 2 points 3 days ago (1 children)

I mean... I have seen some REALLY REALLY stupid drivers so I could totally see multiple people thinking they found a short cut or not realizing the road they are supposed to be on is 20 feet to the left and there is a reason their phone is losing its shit all while their suspension is getting destroyed.

But yeah. It is the standard tesla corp MO. They detect a dangerous situation and disable all the "self driving". Obviously because it is up to the driver to handle it and not because they want the legal protection to say it wasn't their fault.

[–] AA5B@lemmy.world 1 points 2 days ago (2 children)

At my local commuter rail station the entrance to the parking lot is immediately next to the track. It's easily within the margin of error for GPS, and if you're only focusing immediately in front of you, the pavement at the entrance probably looks similar.

There are plenty of cues, so a driver shouldn't be fooled, but perhaps FSD wouldn't pay attention to them since it's a bit of an outlier.

That being said, I almost got my Subaru stuck once because an ATV trail looked like the dirt road to a campsite from the GPS, and I missed any cues there may have been

[–] XeroxCool@lemmy.world 2 points 2 days ago

Sounds reasonable to mix up dirt roads at a campsite. Idk why the other commenter had to be so uptight. I get the mixup in the lot if it's all paved and smooth, especially if say you make a left into the lot and the rail has a pedestrian crossing first. Shouldn't happen, but there's significant overlap in appearance of the ground. The average driver is amazingly inept, inattentive, and remorseless.

I'd be amused if your lot is the one I know of where the train pulls out of the station, makes a stop for the crosswalk, then proceeds to just one other station.

But the part of rail that's not paved between? That should always be identifiable as a train track. I can't understand when people just send it down the tracks. And yet, it still happens. Even at the station mentioned above where they pulled onto the 100mph section. Unreal.

[–] NuXCOM_90Percent@lemmy.zip 1 points 2 days ago

You uh... don't need to tell people stuff like that.