Over the past 18 months, Tesla has been working with the Dutch vehicle approval organization, RDW, to get approval for its Full Self-Driving (Supervised) semi-autonomous driving system.
After a long consultation period, which included covering almost one million miles with FSD (Supervised) active and offering ride-along trials with 13,000 people in numerous European countries, the RDW deemed the technology safe to be given the green light.
Tesla's controversial CEO, Elon Musk, has long promised to introduce the partially autonomous cruise control system to markets outside of the US, where it has been on sale for years. But the company has regularly butted up against regulatory red tape.
According to a press statement put out by Tesla to promote its European debut, the company says that when FSD (Supervised) is engaged, collisions are up to "seven-times less likely per kilometer driven compared to manual driving alone".
However, safety campaigners, such as Dan O'Dowd of The Dawn Project, reiterate that "59 people have been killed in over 3,000 crashes involving Tesla's self-driving software in the U.S. since 2021 alone".
"The RDW's decision is deeply troubling given Tesla FSD's myriad of well-documented safety defects," OโDowd adds.
What's more, the company's Robotaxis, which use a similar hardware suite that relies on the vehicle's external cameras and artificial intelligence to navigate the world, as opposed to the plethora of radar and lidar sensors used by rivals, have made headlines because data suggests they crash four times more often than the average human driver, according to Fortune.
In a bid to bolster its safety credentials, Tesla has made a number of changes to its software for the version that will go on sale in the Netherlands.
Not a Tesla App reported that customers who had first-hand ride-along experience with Euro-spec FSD (Supervised) noticed that it differed from the technology found in the US.
Dutch owners will have to pass a mandatory safety quiz before FSD activates, for example, while the 'Sloth' to 'Mad Max' speed profiles in the US version have been ditched in favor of a more straightforward 'Max Speed' setting in the Netherlands.
Analysis: Europe will be watching closely
While it is easy to think that the recent ruling in the Netherlands will automatically open the door for FSD (Supervised) to be used in the rest of Europe, it is highly likely that many other markets will continue to exercise caution.
Even RDW, the organization that gave the green light to FSD (Supervised) in the Netherlands, says that the system is not "self-driving," adding that the "driver remains responsible and must always remain in control."
This muddled messaging around the technology's capabilities has caused plenty of problems in the US, including the National Highway Traffic Safety Administration launching an investigation into the safety of the technology.
Recently, it escalated its probe to "Engineering Analysis", which it says will evaluate the system's ability to operate in reduced roadway visibility.
All the while, Elon Musk continues to claim that each iteration of the FSD software will "far exceed human levels of safety" and that users will soon be able to text and drive, when realistically it's simply a Level 2 semi-autonomous cruise control system of the kind also offered by the likes of Ford and BMW.
"reiterates that โ59 people have been killed in over 3,000 crashes involving Teslaโs self-driving software in the U.S. since 2021 aloneโ
more than 42,000 people were killed on the road in the US alone in a single recent year, almost all by human drivers!
The real need is not perfection, it is at least as good as humans. Better than humans would be nice. Don't be fooled by statistics, they can be manipulated to say whatever you want, and it isn't just Tesla doing it!
Sadly, even though I'm sure the data exists, I've never seen anyone independent publish anything trustworthy on how self-driving compares to humans. All I've seen is data from someone with obvious biases. (They might be right, but they still have a bias and so need independent analysis.)
It's easy to make numbers look scary, yes, and it'd be better if we had more accurate numbers. The only way we're going to get that, though, is better laws mandating their release. Even now, Tesla is allowed to redact information about crashes in its Robotaxis (which this isn't about), which is a problem.
But let's just go by what he uses on his site, which today lists 3,013 crashes and 59 fatalities since 2021. Keep in mind, though, that the FSD software was very much in beta until early 2024, when it finally went wide. This guy HATES Tesla, so I'm sure that if there were more he'd have updated it accordingly.
Today, just over 9 billion miles have been driven on FSD since 2021.
So, 3,013 crashes over 9 billion miles is 1 crash every 2,987,056 miles, or 1 crash every ~4,807,200 km.
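That division is easy to check. Here's a minimal Python sketch using the commenter's quoted figures (3,013 crashes, 9 billion FSD miles; neither verified here) and the exact miles-to-kilometers factor of 1.609344:

```python
# Back-of-envelope crash rate from the figures quoted above
crashes = 3013            # Dawn Project tally since 2021, per the comment
miles = 9_000_000_000     # FSD miles since 2021, per the comment

miles_per_crash = miles / crashes
km_per_crash = miles_per_crash * 1.609344   # exact miles -> km factor

print(f"1 crash every {miles_per_crash:,.0f} miles (~{km_per_crash:,.0f} km)")
```

The headline rate depends entirely on trusting both the numerator and the denominator, which is the commenter's whole point about needing independently audited data.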
NHTSA says 1.06 fatalities happen every 100 million miles; the year before, it was 1.16.
So, 9 billion miles / 100 million miles * 1.06 = ~95.4 fatalities, and the year before would be 104.4
So Tesla has had 59, against an expected 95.4. And while it's sad that there were fatalities on earlier versions of the software, that isn't the software on the road today, so the rate should hopefully be declining as well, but we'll need more data to track that.
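The expected-fatalities comparison above is the same kind of arithmetic. A sketch in Python, again taking the quoted NHTSA per-100-million-mile rates and the 9-billion-mile figure at face value (this reproduces the comment's math; it doesn't validate the comparison itself):

```python
miles = 9_000_000_000        # FSD miles since 2021, per the comment
rate_current = 1.06          # quoted NHTSA fatalities per 100M vehicle miles
rate_prior = 1.16            # quoted rate for the year before

# Fatalities you'd expect over the same mileage at the national rate
expected = miles / 100_000_000 * rate_current      # ~95.4
expected_prior = miles / 100_000_000 * rate_prior  # ~104.4
reported = 59                                      # Dawn Project tally

print(f"expected {expected:.1f} (or {expected_prior:.1f} at the prior rate) "
      f"vs {reported} reported")
```

Note the caveat the thread itself raises: the national rate mixes highway and city driving, while FSD mileage has historically skewed toward highways, so the two rates aren't directly comparable.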
Before, people used to say that Tesla's numbers are all highway miles, so they're not a good comparison against the NHTSA numbers, which include city driving. And you know what, they're right about that, and it is a problem. But Tesla has finally started sharing city miles, which are up to 3.3 billion now, so these statistics are starting to include them and will start giving a better picture.
And I just want to stress, this is for SUPERVISED driving. Tesla's Robotaxis are NOT this safe.
Human + Robot > Human OR Robot in our current state.
Edit: Oh, and it looks like Dan's numbers include Autopilot, not only the FSD this is about. Autopilot is far, far inferior, but including it probably increases the miles driven substantially, since Tesla's reporting covers FSD miles specifically. They've also stopped offering AP on new cars.
Edit: Updated expected fatalities to 95.4; I accidentally used 1.05 instead of 1.06 when I calculated it.
It literally says the robotaxis crash four times as often as humans
That is a headline, yes. However, I have not been able to trace it back to anyone independent who has looked at what it really means. There are too many ways to cherry-pick numbers to make a point. I don't want headlines; I want someone who understands traffic safety and how to work with numbers to do an in-depth analysis. That is hard work, and so far it is missing (except as done by the self-driving companies themselves, and thus biased).