
Tesla’s Full Self-Driving software under investigation by federal safety regulator

The top U.S. automotive safety regulator has opened a new investigation into Tesla’s so-called “Full Self-Driving (Supervised)” software after four reported crashes in low-visibility situations — including one where a pedestrian was killed.

The National Highway Traffic Safety Administration’s (NHTSA) Office of Defects Investigation announced Friday that it is probing the driver assistance system to find out whether it can “detect and respond appropriately to reduced roadway visibility conditions,” such as “sun glare, fog, or airborne dust.” The agency also wants to know if other crashes have occurred in these conditions beyond the ones that were reported.

The investigation comes just one week after Tesla CEO Elon Musk revealed the prototype of his company’s “CyberCab,” a two-seater car that he said is meant to serve as a robotaxi, following years of unmet promises. Musk also claimed at the event that Tesla’s Model 3 sedan and Model Y SUV would be able to operate without supervision in California and Texas at some point in 2025, though he did not offer any details about how that would happen.

In April, NHTSA closed a nearly three-year probe into Autopilot, the less-capable driver assistance software that Tesla offers, after investigating almost 500 crashes where the system was active. The agency found 13 of those crashes were fatal. At the same time that it closed that probe, NHTSA opened a new investigation into the recall fix that Tesla had issued to address problems with Autopilot.

Tesla’s software also faces other legal threats. The Department of Justice is investigating the claims Tesla has made about its driver-assistance features, and the California Department of Motor Vehicles has accused Tesla of inflating the software’s capabilities.

The company also faces a number of lawsuits over Autopilot crashes. It settled one of the highest-profile of those cases, which had been set to go to trial earlier this year. The company has said in the past that it makes drivers aware they are supposed to constantly monitor Full Self-Driving and Autopilot and be ready to take over at any moment.

The new investigation announced Friday specifically calls out four crashes where the Full Self-Driving (Supervised) system was active, all occurring between November 2023 and May 2024.

The November 2023 crash happened in Rimrock, Arizona, where a Model Y struck and killed a pedestrian. In January 2024, a Model 3 crashed into another car on the highway during a dust storm in Nipton, California. In March 2024, a Model 3 crashed into another car on the highway in Red Mills, Virginia, during cloudy conditions. And in May 2024, a Model 3 crashed into a stationary object on a rural road in Collinsville, Ohio, in foggy conditions.

NHTSA’s defects investigations team splits its probes into four levels: Defect Petition, Preliminary Evaluation, Recall Query and Engineering Analysis. The agency classified this new investigation as a preliminary evaluation. NHTSA normally tries to complete these types of probes within eight months.

This story has been updated with more details from NHTSA’s filings.
