The National Highway Traffic Safety Administration (NHTSA) on Friday said it is investigating 2.4 million Tesla vehicles with the Full Self-Driving (FSD) software after four collisions were reported, including a fatal crash.
The auto safety regulator opened the preliminary evaluation after receiving four reports of crashes in which Tesla’s FSD software was engaged at times when there was reduced roadway visibility, such as sun glare, fog or airborne dust.
The NHTSA said that in one crash “the Tesla vehicle fatally struck a pedestrian. One additional crash in these conditions involved a reported injury.”
The Tesla vehicles that are the focus of NHTSA’s probe include the 2016-2024 Model S and X vehicles with the optional system as well as 2017-2024 Model 3, 2020-2024 Model Y and 2023-2024 Cybertruck vehicles.
The NHTSA’s preliminary evaluation is the first step in a process that could see the agency seek to recall the vehicles if it believes they pose an unreasonable risk to safety.
The agency’s review of whether FSD’s engineering controls can “detect and respond appropriately to reduced roadway visibility conditions” will include a look into whether similar FSD crashes have occurred in such conditions, as well as whether any updates or modifications Tesla has made to FSD have affected its performance when visibility is reduced.
The NHTSA said the “review will assess the timing, purpose, and capabilities of any such updates, as well as Tesla’s assessment of their safety impact.”
Tesla CEO Elon Musk has sought to sharpen the electric vehicle (EV) maker’s focus on self-driving technology and robotaxis as it faces tough competition and weak consumer demand in the EV market.
Tesla’s FSD technology has been in development for years and ultimately aims for a high level of automation, in which the vehicle can perform most driving tasks without human intervention. However, it has faced legal scrutiny stemming from at least two fatal accidents, including an incident in April when a Model S in FSD mode hit and killed a motorcyclist in the Seattle area.
Tesla explains on its website that FSD and its Autopilot feature are intended to be used by an attentive driver who can intervene and take control as needed.
“Autopilot and Full Self-Driving (Supervised) are intended for use with a fully attentive driver, who has their hands on the wheel and is prepared to take over at any moment. While these features are designed to become more capable over time, the currently enabled features do not make the vehicle autonomous,” Tesla wrote.
Reuters contributed to this report.