Elon Musk is disputing the claim, News.Az reports, citing foreign media.
***
The allegation is that the “Full Self-Driving” mode failed to detect the right-hand turn as the vehicle approached a Y-shaped overpass split; the truck slammed into the barrier, leaving driver Justine Saint Amour with nerve and spinal injuries.
The hood also flew open, and pieces of the truck careened off. Saint Amour’s infant in the back seat was unharmed.
Musk responded in a post on X that “Logs show driver disengaged Autopilot four seconds before crashing.”
Though it’s difficult to determine from the short video, Saint Amour’s lawyer contends that she had no choice but to intervene. “She tried to take control, but crashed into the barrier,” said attorney Bob Hilliard in a statement. “That’s not self-driving.”
The central question is precisely when and why the driver chose to take the wheel manually during the incident, which occurred last year. In the lawsuit, Hilliard notes that the so-called “Full Self-Driving” mode is classified as SAE Level 2, meaning drivers must be prepared to take over at any moment if there’s a safety issue. He argues that both the sensing technology and the driver alert system are insufficient.
“While engineers at Tesla recommended the super-human vision of lidar be included for self-driving vehicles, and competitors like Waymo and Cruise relied heavily on lidar, Musk chose instead to rely only upon cheap video cameras,” the lawsuit says.
Lidar uses pulses of light to measure distances to surrounding objects while driving, but Musk has eschewed it as expensive and “freaking stupid.”
Musk and many enthusiastic Tesla fans argue that disengaging FSD four seconds before the crash deprived the Cybertruck of the chance to make the turn automatically. They believe the driver is at fault for disengaging it and that the self-driving feature is being unfairly maligned.
“Every time there’s an accident, the headline rushes to blame Tesla. Then the telemetry comes out and the narrative quietly dies,” wrote one X user. “350 ft is what four seconds of travel time equates to at 60 mph. Plenty of time to avoid a crash,” argued another.
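The X user’s figure is simple arithmetic, and it roughly checks out: at 60 mph a vehicle covers 88 feet per second, so four seconds is about 352 feet. A minimal sketch of that conversion, noting that the 60 mph speed is the commenter’s assumption, not a figure reported in the article:

```python
# Sanity-check the X user's claim: distance covered in a given time
# at a given speed. The 60 mph figure is the commenter's assumption;
# the article does not report the Cybertruck's actual speed.

FEET_PER_MILE = 5280
SECONDS_PER_HOUR = 3600

def distance_traveled_ft(speed_mph: float, seconds: float) -> float:
    """Feet covered at speed_mph over the given number of seconds."""
    feet_per_second = speed_mph * FEET_PER_MILE / SECONDS_PER_HOUR
    return feet_per_second * seconds

print(distance_traveled_ft(60, 4))  # 352.0 — close to the quoted "350 ft"
```

Whether 352 feet is “plenty of time to avoid a crash,” of course, depends on the actual speed, road geometry, and how quickly the driver reacted.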
The deciding factor in the case may be the Cybertruck’s speed at the moment. Even if the driver disengaged FSD too soon, and that’s a big if, the vehicle may have been travelling too fast to make the turn at all, which would explain why the driver had to take over. But neither the vehicle’s speed nor the level of alert is evident in the dashcam video.
This is not the first time the system has come under scrutiny. Last year, a California judge allowed a class action lawsuit to proceed on claims that marketing allegedly over-promising the full self-driving feature may have misled thousands of customers. But it’s also not the first time Musk has challenged a crash account with vehicle log data. In 2021, the Wall Street Journal reported that a “Fatal Tesla Crash in Texas [Was] Believed to Be Driverless,” with Musk angrily responding that FSD had not been purchased for that vehicle and Autopilot was not enabled, implying the “driverless” claim was a misrepresentation.
Fault in this case may be more complicated to determine. What the courts ultimately see regarding speed and timing will likely decide it. Armchair judges on social media will just have to wait and see.
19 Mar