Tillman Mitchell, 17, had just stepped off a school bus on North Carolina Highway 561 in March when he was struck by a Tesla Model Y, according to a police report. The bus was displaying its stop sign and flashing its red warning lights at the time. The car, which was allegedly in Autopilot mode, never slowed down; it hit Mitchell at 45 mph, throwing him into the windshield. He flew into the air and landed face down in the road.

The crash was one of 736 U.S. crashes since 2019 involving Teslas in Autopilot mode, according to a Washington Post analysis of National Highway Traffic Safety Administration data. The number of such crashes has surged over the past four years, reflecting the dangers that come with increasingly widespread use of Tesla’s futuristic driver-assistance technology, as well as the growing presence of the cars on the nation’s roadways.

The number of deaths and serious injuries associated with Autopilot has also grown significantly. When authorities first released a partial accounting of accidents involving Autopilot in June 2022, they counted only three deaths definitively linked to the technology. The most recent data includes at least 17 fatal incidents, 11 of them since last May, and five serious injuries.

Tesla CEO Elon Musk has said that cars operating in Tesla’s Autopilot mode are safer than those piloted solely by human drivers. He has pushed the carmaker to develop and deploy features programmed to handle hazards such as stopped school buses, fire engines, stop signs and pedestrians, arguing that the technology will usher in a safer, virtually accident-free future. While it is impossible to say how many crashes may have been averted, the data shows clear flaws in the technology being tested in real time on America’s highways.

Tesla’s 17 fatal crashes reveal distinct patterns, The Post found: Four involved a motorcycle. Another involved an emergency vehicle. Meanwhile, some of Musk’s decisions — such as widely expanding the availability of the features and stripping the vehicles of radar sensors — appear to have contributed to the reported uptick in incidents, according to experts who spoke with The Post.

Tesla and Elon Musk did not respond to a request for comment.

NHTSA said a report of a crash involving driver-assistance technology does not itself imply that the technology was the cause. “NHTSA has an active investigation into Tesla Autopilot, including Full Self-Driving,” spokeswoman Veronica Morales said, noting the agency doesn’t comment on open investigations. “NHTSA reminds the public that all advanced driver assistance systems require the human driver to be in control and fully engaged in the driving task at all times. Accordingly, all state laws hold the human driver responsible for the operation of their vehicles.”

Musk has repeatedly defended his decision to push driver-assistance technologies to Tesla owners, arguing that the benefit outweighs the harm.

“Once you believe that implementing autonomy will decrease the number of injuries and deaths, it becomes a moral obligation to deploy it, even if it means facing lawsuits and criticism from many people,” Musk said last year. “Because those whose lives were saved will never know it. However, those who die or get injured will definitely know, or their families will.”

Missy Cummings, a former senior safety adviser at NHTSA and a professor at George Mason University’s College of Engineering and Computing, expressed concern over the increase in severe and fatal crashes involving Teslas. She attributed the rise to the expanded rollout of Full Self-Driving, which provides driver assistance on city and residential streets and is now available to anyone.

Cummings also pointed to the rate of fatalities relative to overall crashes as a worrying trend.

It is unclear whether the data captures every crash involving Tesla’s driver-assistance systems. NHTSA’s records also include some incidents, among them three fatalities, in which it is unknown whether Autopilot or Full Self-Driving was in use.

NHTSA began collecting the data in 2021, after a federal order required automakers to report crashes involving driver-assistance technology. While the total number of such crashes is small compared with all road incidents, Tesla has been involved in the vast majority of the 807 automation-related crashes reported since the requirement took effect, and in almost all of the associated deaths.

Subaru ranks a distant second, with 23 reported crashes since 2019. The gap likely reflects the wider deployment and use of automation across Tesla’s fleet of vehicles, as well as the broader range of circumstances in which Tesla drivers are encouraged to use Autopilot.

Autopilot, which Tesla introduced in 2014, is a suite of features that enables the car to drive itself from highway on-ramp to off-ramp, following lane lines and maintaining speed and distance behind other vehicles. Tesla offers it as a standard feature, and more than 800,000 of its vehicles on U.S. roads are equipped with it. Full Self-Driving, an experimental feature that customers must purchase, allows Teslas to drive from point A to point B by following turn-by-turn directions, stopping for stop signs and traffic lights, and responding to hazards along the way. With either system, Tesla says, drivers must monitor the road and intervene when necessary.

The increase in collisions corresponds with Tesla's aggressive deployment of Full Self-Driving, which has grown from approximately 12,000 users to almost 400,000 in just over a year. Nearly two-thirds of all driver-assistance accidents that Tesla has reported to NHTSA occurred in the past year.

Philip Koopman, a professor at Carnegie Mellon University who has studied autonomous vehicle safety for 25 years, said the high number raises important questions.

"A significantly higher number is definitely a cause for concern," he said. "We need to understand whether it's due to worse accidents or if there's another factor, such as a dramatically larger number of miles being driven with Autopilot on."

In February, Tesla recalled more than 360,000 vehicles equipped with Full Self-Driving because the software prompted the cars to ignore traffic lights, stop signs, and speed limits.

According to documents published by the safety agency, the flouting of traffic laws "could increase the risk of a collision if the driver does not intervene." Tesla said it resolved the problems with an over-the-air software update.

Even as Tesla continuously modified its driver-assistance software, it took the unprecedented step of removing radar sensors from new vehicles and disabling them in cars already on the road, depriving the systems of a critical sensor. Musk had advocated a simpler hardware set amid the global computer chip shortage. "Only very high-resolution radar is relevant," Musk said last year.

Tesla has recently taken steps to reintroduce radar sensors, according to government filings first reported by Electrek.

In a March presentation, Tesla claimed that Full Self-Driving crashes at least five times less often than vehicles in normal driving, measured in miles driven per collision. Without access to Tesla's detailed data, it is impossible to verify that assertion or Musk's characterization of Autopilot as "unequivocally safer."

Autopilot, which operates mainly on highways, functions in a less complex environment than the full range of situations encountered by a typical road user.

It is unclear which system was being used in the fatal accidents: Tesla has asked NHTSA not to reveal that information. In the section of the NHTSA data that specifies the software version, Tesla's incidents are listed in all capital letters as "redacted, may contain confidential business information."

Both Autopilot and Full Self-Driving have come under scrutiny in recent years. Transportation Secretary Pete Buttigieg told the Associated Press last month that Autopilot is not an appropriate name "when the fine print says you need to have your hands on the wheel and eyes on the road at all times."

NHTSA has launched numerous investigations into Tesla's accidents and other problems with its driver-assistance software. One inquiry focused on "phantom braking," a phenomenon in which vehicles suddenly slow down for imaginary hazards.

In a case last year detailed by The Intercept, a Tesla Model S allegedly using driver-assistance technology braked suddenly in traffic on the San Francisco Bay Bridge, causing an eight-car pileup that injured nine people, including a 2-year-old.

In other complaints filed with NHTSA, owners claim that the cars slam on the brakes when they encounter semi-trucks in oncoming lanes.

Many crashes occur in similar settings and conditions. For example, NHTSA has received more than a dozen reports of Teslas slamming into parked emergency vehicles while in Autopilot. Last year, NHTSA upgraded its investigation of those incidents to an "engineering analysis."

Last year, NHTSA also launched two consecutive special investigations into fatal crashes involving Tesla vehicles and motorcyclists. One occurred in Utah, where a Harley-Davidson rider was traveling in a high-occupancy lane on Interstate 15 outside Salt Lake City just after 1 a.m. when a Tesla in Autopilot mode collided with the motorcycle, according to authorities.