Teslas' Habit Of Driving Onto Train Tracks Has Senators Demanding An Investigation, Like That's Bad Or Something
If you haven't heard, Teslas have a habit of driving onto train tracks while using their so-called "self-driving" software. That's also what those in the business call "a bad thing." You know, because trains drive on those tracks, and when a train hits a car, it typically ends horribly. Of course, Tesla's "Full Self-Driving" software is only a Level 2 system and isn't capable of actual self-driving, but when you name something "Full Self-Driving," people are going to believe the name. Now, two senators are calling for an investigation, NBC News reports.
On Monday, Massachusetts Senator Ed Markey and Connecticut Senator Richard Blumenthal sent a letter to the National Highway Traffic Safety Administration saying:
We write today to express deep concern regarding the failure of Tesla's Full Self-Driving (FSD) system to safely detect and respond to railroad crossings. Despite past investigations by the National Highway Traffic Safety Administration (NHTSA) into Tesla's system, FSD reportedly continues to pose an ongoing threat to safety on our public roads. Because collisions between trains and cars often cause significant fatalities and injuries, FSD's failure to safely navigate railroad crossings creates serious risk of a catastrophic crash. We urge NHTSA to immediately launch a formal investigation into this disturbing safety risk and take any necessary action to protect the public.
NHTSA still hasn't responded to NBC News' request for comment, but a few weeks ago, when the outlet published its investigation into Teslas failing to stop at railroad crossings, the agency said, "We are aware of the incidents and have been in communication with the manufacturer." And if that doesn't give you full confidence that they plan to do something about it, I don't know what will.
Will NHTSA do anything?
For some reason, NHTSA's response wasn't enough for Markey or Blumenthal, who cited the NBC News investigation in their letter, saying:
Over the past year, a growing number of Tesla drivers have reported incidents in which vehicles equipped with FSD failed to recognize or properly respond to railroad crossings. According to a September investigation by NBC News, at least six Tesla owners experienced near-collisions or crashes while FSD was engaged at or near train tracks.
Several of these incidents were documented with video evidence showing the vehicle driving past warning signs, ignoring active crossing gates, or otherwise requiring sudden human intervention to avoid disaster. ... Something is clearly wrong with the operation of Tesla's FSD near train tracks.
The senators then went on to call out NHTSA directly, writing:
The potential consequences of this kind of failure at a railroad crossing cannot be overstated and necessitate urgent action by NHTSA. ... Unfortunately, NHTSA's response to the reports concerning railroad crossings has, so far, been deeply insufficient, with the agency doing nothing more than acknowledging it is aware of the incidents and stating it is in contact with the manufacturer. The seriousness of these incidents warrants an immediate official investigation. The traveling public desperately needs a traffic safety agency aggressively investigating vehicle safety defects, not an agency on autopilot.
Get it? Autopilot? Like Tesla's other driver-assistance software? Will NHTSA actually do anything, though? That remains to be seen.
Other problems with Tesla's FSD
When NBC News spoke with industry experts, they said Tesla uses a "black-box AI model in which errors can't be easily explained even by its creators" and explained that the electric automaker likely hadn't spent enough time training its model on railroad crossings. Railroad crossings aren't the only serious safety scenario Tesla apparently didn't consider important enough to train its software on, though. Full Self-Driving also doesn't stop for school buses.
Back in May, independent testing from safety advocate Dan O'Dowd's Dawn Project showed that not only did the latest version of Tesla's so-called "Full Self-Driving" software fail to stop for a school bus in every single test it ran, it also ran over the child-size mannequin they used to simulate a child crossing the street at the bus stop. The school bus had its stop sign deployed and its lights flashing, but the Tesla didn't stop. Even worse, the system detected the child but just didn't think it was worth stopping for. Legally, you're supposed to stop for the bus in that situation, and you definitely aren't supposed to run over any kids.
And it isn't like Tesla only learned about this problem a couple of months ago. It's an issue the Dawn Project has been harping on for nearly three years, starting with a full-page ad in the New York Times, and in 2023, the group even ran a Super Bowl ad. When I spoke to Dan O'Dowd a couple of months ago, he told me:
We saw that happen. We reproduced it again and again to make sure it was real, and it was real. And then we thought this was going to be a huge scandal, and we were going to publish it. So we published it as a New York Times full page ad and said, "This is going to create a huge scandal, and then they're going to have to fix it, right? Or they'll have to pull it off the road, or they're going to do something about it." But they didn't. I mean...they didn't do a thing. And the government didn't recall it. The government didn't test it.
Musk still hasn't taught his cars to stop for school buses, either. "He doesn't care," O'Dowd told me. "Empathy is not in his playbook. If it kills other people, it doesn't matter. That's where we stand."