Could Tesla CEO Elon Musk’s casual relationship to the truth be his undoing? The world’s richest man is being blamed by family members for the death of a California man whose Model S crashed while Autopilot was engaged. They say claims made by Musk that self-driving technology was perfect and ready for road use contributed to driver Genesis Giovanni Mendoza Martinez’s Tesla crash in February of 2023.
He was killed while behind the wheel of the Model S he bought, thinking it could drive itself. This comes from a lawsuit filed by Mendoza’s parents and his brother, who was also severely injured in the crash, according to The Independent. Tesla, of course, did not take these allegations lightly. The Austin, Texas-based automaker argued that its cars have “a reasonably safe design as measured by the appropriate test under the applicable state law,” adding that the accident “may have been caused in whole or in part” by the 31-year-old’s “own negligent acts and/or omissions.”
Tesla went on to say that “no additional warnings would have, or could have prevented the alleged incident, the injuries, losses and damages alleged.” Maybe, but Musk has spent years at this point making false claims about the abilities of both Autopilot and Full Self-Driving. It’s not unreasonable to think a Tesla buyer would take the company’s CEO at his word, but what do I know?
Here’s more on the lawsuit, from The Economic Times:
The lawsuit alleges that Tesla’s Autopilot system is flawed and unable to recognize emergency vehicles, and that this defect led to the fatal collision, per The Independent. It also accuses Tesla of neglecting to address known issues with Autopilot and of misleading consumers about the technology’s capabilities.
The complaint also highlights numerous statements by Tesla CEO Elon Musk that allegedly misrepresented Autopilot’s functionality and contributed to public misconceptions about the system’s safety, The Independent noted. The case has drawn renewed attention to ongoing concerns about Tesla’s self-driving technology and its implications for public safety.
Here’s what Schreiber, a lawyer for the Mendoza family, told The Independent about the suit and where things are now:
“This is yet another example of Tesla using our public roadways to perform research and development of its autonomous driving technology. The injuries suffered by the first responders and the death of Mr. Mendoza were entirely preventable. What’s worse is that Tesla knows that many of its earlier model vehicles continue to drive our roadways today with this same defect putting first responders and the public at risk.”
Schreiber said Tesla puts cars on the road with an Autopilot feature he described as “ill-equipped to perform,” and that instead of announcing a recall to correct problems, the company simply releases new software and calls it an “update.”
“It’s this rush of pushing product out that is not really ready for primetime,” Schreiber said.
The lawsuit alleges Mendoza was essentially duped by the things Musk, the world’s richest man, had posted on social media about Autopilot. The lawsuit reportedly says he “believed those claims were true, and thus believed the ‘Autopilot’ feature with the ‘full self driving’ upgrade was safer than a human driver and could be trusted to safely navigate public highways autonomously.” Unfortunately for him, the system very much could not be trusted to do those things.
This is a bit more information on the crash itself, from The Independent:
Shortly after Valentine’s Day last year, at around 4 a.m., Giovanni was driving his Tesla northbound on Interstate 680, with Caleb in the passenger seat and the Autopilot engaged, according to the complaint.
In the distance, a fire truck was parked diagonally across two lanes of traffic, with its emergency lights flashing, to divert oncoming cars away from a collision site, the complaint continues. It says a second fire truck was also on the scene, along with two California Highway Patrol vehicles, all of which also had their emergency lights activated.
As the brothers made their way down the road, the vehicle suddenly broadsided the first fire truck, slamming into it at high speed, the complaint states.
“At the time of the collision, Giovanni was not controlling the Subject Vehicle, but he was instead passively sitting in the driver’s seat with the ‘Autopilot’ feature engaged,” the complaint continues. “In fact, data from the Tesla itself showed that the Subject Vehicle was in ‘Autopilot’ for approximately 12 minutes prior to the crash, with no accelerator pedal or brake pedal inputs from Giovanni during that time. The approximate speed of the Subject Vehicle was 71 mph during the 12-minute period.”
The data further showed that Giovanni “generally maintained contact with the steering wheel until the time of the crash,” according to the complaint.
“As a result of the collision, the Subject Vehicle sustained major frontal damage, crushing Giovanni’s body,” it says. “Giovanni survived, at least momentarily, but subsequently died from the injuries he sustained in the collision.”
The lawsuit also apparently goes into detail about other Autopilot and FSD crashes – alleging Musk’s team neglected to fix existing bugs before releasing features to the public.
Listen, folks like you and me know better than to take the things Musk says at face value, but for people who aren’t as savvy about the car world, it isn’t a huge leap to trust what the automaker’s CEO is saying. It’s terrible what happened to this poor guy, and it’s hard not to put at least some of the blame at the feet of a CEO who convinced him what he was doing was actually safe.