A Timeline of the NHTSA Investigation Into Tesla Autopilot and Full Self-Driving Technology
Tesla's Level 2 self-driving system has been linked to crashes and deaths since 2016.
Tesla has touted its suite of self-driving software as one of the chief benefits of buying its electric sedans or SUVs. Known as Autopilot and Full Self-Driving, the two systems are the face of Tesla's advanced technologies — but also the subject of controversy and safety inquiries by federal regulators.
Here's a look at how the company's Autopilot and Full Self-Driving systems work, along with a timeline of the events that drew regulatory scrutiny and led to high-profile recalls, including a February 2023 recall affecting more than 360,000 Tesla vehicles.
What Are Tesla Full Self-Driving and Autopilot?
Autopilot and Full Self-Driving are Level 2 semi-autonomous driving systems that are available across the Tesla lineup. Each uses sophisticated software to process information collected by a network of cameras that continuously monitor a 360-degree area around the vehicle.
Autopilot comes standard with every Tesla. It's roughly analogous to a combination of the adaptive cruise control and lane-keeping features offered by other automakers. It's designed to allow the vehicle to match the speed of traffic and steer itself to stay within road markings.
Full Self-Driving is a step above Autopilot in terms of both features and capabilities. It's intended to allow the driver to set a destination in the navigation system and then have the car "drive itself" to that location, automatically changing lanes, moving through traffic, and entering and exiting highways as needed. With Full Self-Driving, the vehicle also is meant to be able to recognize traffic lights and park itself.
When Did Tesla's Problems Begin?
Tesla's website stresses that neither Autopilot nor Full Self-Driving makes the company's vehicles fully autonomous. A human driver must be present and paying attention at all times to intervene should either system fail to operate safely.
The problem of driver over-reliance on Autopilot gained widespread attention after a fatal 2016 crash in Florida. Commenting on the National Transportation Safety Board's investigation of that crash, NTSB Chairman Robert Sumwalt said that "[i]n this crash, Tesla's system worked as designed." He added that Tesla's system "was designed to perform limited tasks in a limited range of environments. The system gave far too much leeway to the driver to divert his attention to something other than driving."
The Tesla Model S involved in that crash was exceeding the speed limit and didn't slow down at all before colliding with a truck crossing its path. That incident was followed by other high-profile fatal crashes in California in 2018, Florida in 2019, and Texas in 2021, all of which involved high-speed impacts with other vehicles or stationary objects.
Nevertheless, the implications of a marketing term like "Full Self-Driving," along with the company's frequent claims about its self-driving prowess, have led to numerous instances of Tesla drivers diverting their attention entirely and relying solely on their vehicles' tech to drive for them. There are videos online, for example, of Tesla owners asleep at the wheel or filming themselves with their hands completely off the steering wheel in complex driving situations.
When Did NHTSA Start Its Investigation?
Between 2016 and 2023, the National Highway Traffic Safety Administration (NHTSA) launched 41 special crash investigations into incidents suspected of being related to the use of Tesla's Autopilot system. Nineteen people were killed in those crashes.
In August 2021, NHTSA escalated its inquiry into a preliminary evaluation of the Tesla technology, focusing on 11 crashes in which Autopilot-controlled vehicles struck parked emergency vehicles with their lights flashing. Those incidents resulted in one death and 17 injuries. NHTSA also asked 13 automakers for data on crashes involving Level 2 self-driving systems.
The results of that outreach were made public less than a year later: NHTSA found that 392 crashes involving Level 2 driver-assistance features were reported between July 1, 2021, and May 15, 2022. Tesla's Autopilot or Full Self-Driving systems were involved in 273 of those incidents, which accounted for five of the six reported deaths.
Shortly before this announcement, NHTSA upgraded its preliminary evaluation of Tesla's Autopilot into an engineering analysis, which is viewed as a precursor to the announcement of a safety recall. At this point, NHTSA clarified that its investigation was focused on 35 specific crashes — responsible for 14 deaths — in which Autopilot was active.
In February 2023, NHTSA announced a recall of 362,758 Tesla vehicles equipped with the Full Self-Driving beta system. In its recall notice, the agency wrote that the system can cause Tesla automobiles to "act unsafe around intersections" by failing to come to a complete stop (Tesla internally refers to this as "rolling stops"). Further, it said, the recalled vehicles may not respect posted speed limits, and on city streets they could present an "unreasonable risk to motor vehicle safety based on insufficient adherence to traffic safety laws."
Tesla Recalls Roll Into 2023
This was not the first recall related to Tesla's Full Self-Driving system. In November 2021, the company recalled nearly 12,000 vehicles over a software glitch that could trigger unexpected braking or false forward-collision warnings.
That corrective software was distributed through an over-the-air update. In February 2022, the company also recalled about 54,000 vehicles because the Full Self-Driving system allowed the same "rolling stops" cited in the 2023 recall notice.
The 2023 recall was likewise addressed with an over-the-air software update, covering Model S, Model X, Model 3, and Model Y vehicles built between 2016 and 2023.
NHTSA has not yet completed its investigation of Tesla's Autopilot system, which involves more than 830,000 vehicles. The safety agency characterizes the probe as "open and active."
Benjamin Hunting is a writer and podcast host who contributes to a number of newspapers, automotive magazines, and online publications. More than a decade into his career, he enjoys keeping the shiny side up during track days and always has one too many classic vehicle projects partially disassembled in his garage. Remember, if it's not leaking, it's probably empty.