Tesla Motors is under the regulatory scanner again, this time over the safety of its semi-automated driving system.
What Happened: The National Highway Traffic Safety Administration (NHTSA) has opened a formal investigation into Tesla's Autopilot over its difficulty recognizing parked emergency vehicles, the Associated Press reported, citing a posting on the agency's website.
The agency reportedly said it has identified 11 crashes since 2018 in which a Tesla on Autopilot or Traffic Aware Cruise Control has struck vehicles displaying flashing lights, flares, an illuminated arrow board or cones warning of hazards.
The investigation covers Tesla's Model Y, Model X, Model S and Model 3 vehicles from the 2014 through 2021 model years.
Autopilot is often misused by Tesla drivers, some of whom have driven under the influence of alcohol or, in rare cases, even sat in the back seat while the system was engaged.
Data released by the NHTSA showed that of the 31 crashes involving partially automated driver-assist systems investigated since June 2016, 25 involved Tesla Autopilot, and 10 deaths were reported in those crashes, the report said.
Why It's Important: Safety has emerged as a key issue for EVs equipped with automated driver-assistance systems.
Critics of the technology have alleged that Tesla misled car owners about the system's capabilities, leading them to believe they could turn their attention away from the road while it was engaged.
Even as its semi-autonomous driving technology remains fraught with issues, Tesla is now promoting its Full Self-Driving (FSD) software as a subscription offering.
Following the release of the ninth FSD beta in July, Consumer Reports said it was concerned that Tesla had released the software directly to owners to test on public roads, which could put others on the road at risk.
Incidentally, Tesla CEO Elon Musk announced in a tweet Sunday that the release of the next FSD software update has been delayed.
TSLA Price Action: Tesla shares were down 4.69% at $683.52 Monday morning.