
Dashboard view of Tesla’s Autopilot screen. Image Credits: Tesla


The National Highway Traffic Safety Administration has closed a long-standing investigation into Tesla’s Autopilot driver assistance system after reviewing hundreds of crashes involving its misuse, including 13 that were fatal and “many more involving serious injuries.”

At the same time, NHTSA is opening a new probe to evaluate whether the Autopilot recall fix that Tesla implemented in December is effective enough.

NHTSA’s Office of Defects Investigation said in documents released Friday that it completed “an extensive body of work” which turned up evidence that “Tesla’s weak driver engagement system was not appropriate for Autopilot’s permissive operating capabilities.”

“This mismatch resulted in a critical safety gap between drivers’ expectations of [Autopilot’s] operating capabilities and the system’s true capabilities,” the agency wrote. “This gap led to foreseeable misuse and avoidable crashes.”

The closing of the initial investigation, which began in 2021, marks the end of one of the most visible efforts by the government to scrutinize Tesla’s Autopilot software. Tesla is still feeling the pressure of multiple other probes, though.

The Department of Justice is also investigating the company’s claims about the technology, and the California Department of Motor Vehicles has accused Tesla of falsely advertising the capabilities of Autopilot and the more advanced Full Self-Driving beta software. The company is also facing multiple lawsuits regarding Autopilot. Tesla, meanwhile, is now going “balls to the wall for autonomy,” according to CEO Elon Musk.

NHTSA said its probe reviewed 956 reported crashes up until August 30, 2023. In roughly half (489) of those, the agency said either there “was insufficient data to make an assessment,” the other vehicle was at fault, Autopilot was found to not be in use or the crash was otherwise unrelated to the investigation.


NHTSA said the remaining 467 crashes fell into three buckets. There were many (211) crashes where “the frontal plane of the Tesla struck another vehicle or obstacle with adequate time for an attentive driver to respond to avoid or mitigate the crash.” It said 145 crashes involved “roadway departures in low traction conditions such as wet roadways.” And it said 111 of the crashes involved “roadway departures where Autosteer was inadvertently disengaged by the driver’s inputs.”

These crashes “are often severe because neither the system nor the driver reacts appropriately, resulting in high-speed differential and high energy crash outcomes,” the agency wrote.

Tesla tells drivers they need to pay attention to the road and keep their hands on the wheel while using Autopilot, which it measures via a torque sensor and, in its newer cars, the in-cabin camera. But NHTSA, and other safety groups, have said that these warnings and checks do not go far enough. In December, NHTSA said these measures were “insufficient to prevent misuse.”

Tesla agreed to issue a recall via a software update that would theoretically increase driver monitoring. But that update did not really appear to change Autopilot much, a notion NHTSA seems to agree with.

Parts of that recall fix require the “owner to opt in,” and Tesla allows a driver to “readily reverse” some of the safeguards, according to NHTSA.

NHTSA spent nearly three years working on the investigation into Autopilot, and contacted or interacted with Tesla numerous times throughout the process. It performed many direct examinations of the crashes, and relied on the company to provide data about them as well.

But the agency criticized Tesla’s data in one of the supporting documents.

“Gaps in Tesla’s telematic data create uncertainty regarding the actual rate at which vehicles operating with Autopilot engaged are involved in crashes. Tesla is not aware of every crash involving Autopilot even for severe crashes because of gaps in telematic reporting,” NHTSA wrote. According to the agency, Tesla “largely receives data from crashes only with pyrotechnic deployment,” meaning when airbags, seat belt pre-tensioners or the pedestrian impact mitigation feature of the car’s hood are triggered.

NHTSA said that this limitation means Tesla is only collecting data on around 18% of crashes that are reported to police. As a result, NHTSA wrote that the probe uncovered crashes for which Autopilot was engaged that Tesla was not notified of via telematics.