The US government has launched a formal investigation into Tesla's Autopilot advanced driver assistance system after a series of crashes involving Tesla cars and parked emergency vehicles.
The investigation by the US National Highway Traffic Safety Administration (NHTSA) follows 11 crashes, in which a total of 17 people were injured and one killed, and potentially affects 765,000 cars.
The notice issued by the NHTSA covers virtually every Tesla sold in the US since 2014, including the Model S, Model X, Model 3 and Model Y.
The NHTSA says that its Office of Defects Investigation (ODI) has identified 11 crashes involving Tesla cars that occurred when they encountered "first responder scenes" being attended to by emergency services vehicles.
The body says that most of the incidents took place after dark and that the crash scenes included control measures such as emergency vehicle lights, illuminated road signage boards and traffic cones.
According to the NHTSA, every Tesla involved in the crashes had either its Autopilot or Traffic-Aware Cruise Control advanced driver assistance systems (ADAS) enabled on its approach to the accident scene. The cars involved subsequently struck one or more vehicles involved in the first responder scenes.
Autopilot is a level-two ADAS system, meaning it can control both the vehicle’s steering and speed, although the NHTSA notes in its statement that the driver retains “primary responsibility for Object and Event Detection and Response (OEDR)”.
The NHTSA said that its investigation will “assess the technologies and methods used to monitor, assist and enforce the driver’s engagement with the dynamic driving task during Autopilot mode”. It will also look into any contributing circumstances for the crashes.
Autopilot has previously been investigated by the US National Transportation Safety Board (NTSB), which has recommended that the NHTSA require Tesla to introduce a better system to ensure drivers are paying attention when Autopilot is engaged.
In a report published last year into a 2018 crash, the NTSB determined that Tesla hadn’t done enough to prevent misuse of the system, but also that the NHTSA’s hands-off approach to regulating ADAS and related technology had overlooked the risks of such systems.
Join the debate
Currently, while the system is perfectly able to detect a stationary vehicle partially on the road, it isn't allowed to steer the car into a lane of oncoming traffic to go around it (even though it is perfectly capable of sensing the oncoming cars). The FSD Beta can deal with this.
Suspect that this will just be dealt with using a warning bong, a disconnection and autobrake, all done with an over-the-air update.
Of course this news has knocked $30 billion off Tesla's value despite being a $1 million fix.
It's basically a machine; the key thing that's missing is thinking like a human, and at the moment that is why an autonomous system fails.