DETROIT — The U.S. government’s road safety agency is investigating Tesla’s “Full Self-Driving” system after receiving reports of crashes in low-visibility conditions, including one that killed a pedestrian.
The National Highway Traffic Safety Administration said it opened the investigation Thursday after the company reported four crashes in which Tesla vehicles encountered sun glare, fog or airborne dust.
In addition to the pedestrian’s death, another of the crashes involved an injury, the agency said.
Investigators will look into the ability of “Full Self-Driving” to “detect and respond appropriately to reduced roadway visibility conditions” and, if so, the circumstances that contributed to these crashes.
The investigation covers roughly 2.4 million Teslas from the 2016 through 2024 model years.
A message was left Friday seeking comment from Tesla, which has repeatedly said the system cannot drive itself and that human drivers must be ready to intervene at all times.
Last week, Tesla held an event at a Hollywood studio to unveil a robotaxi without a steering wheel or pedals. CEO Elon Musk, who has promised self-driving vehicles before, said the company plans to have autonomous Model Ys and Model 3s operating without human drivers next year. He said robotaxis without steering wheels would be available in California and Texas in 2026.
It’s unclear what impact the investigation will have on Tesla’s self-driving ambitions. NHTSA would have to approve any robotaxi without pedals or a steering wheel, and that is unlikely to happen while the investigation is in progress. But if the company tries to make its existing models drive themselves, that would likely fall to state regulators. There are no federal regulations specifically for self-driving vehicles, although they must meet broader safety rules.
NHTSA also said it would seek information on whether other similar crashes involving “Full Self-Driving” have occurred in low-visibility conditions, and whether any software updates affected the system’s performance in those conditions.
“In particular, this review will assess the timing, purpose and capabilities of any such updates, as well as Tesla’s assessment of their safety impact,” the documents said.
Tesla reported the four crashes to NHTSA under an agency order that covers all automakers. An agency database says a pedestrian was killed in Rimrock, Arizona, in November 2023 after being struck by a 2021 Tesla Model Y. Rimrock is about 100 miles (161 kilometers) north of Phoenix.
The crash occurred on Interstate 17 just after 5 p.m. on Nov. 27, the Arizona Department of Public Safety said in a statement. Two vehicles collided on the highway, blocking the left lane. A Toyota 4Runner stopped and two people got out and helped direct traffic. A red Tesla Model Y then struck the 4Runner and one of the people who got out of it. A 71-year-old woman from Mesa, Arizona, was pronounced dead at the scene.
The Tesla driver was not cited because sun glare impaired his vision, department spokesman Raul Garcia said, adding that sun glare also contributed to the first collision.
Tesla has twice recalled “Full Self-Driving” under pressure from NHTSA, which in July sought information from law enforcement and the company after a Tesla using the system struck and killed a motorcyclist near Seattle.
The recalls were issued because the system was programmed to run stop signs at slow speeds and because it violated other traffic laws. Both problems were to be fixed with online software updates.
Critics say Tesla’s system, which relies only on cameras to spot hazards, lacks the sensors needed for full self-driving. Nearly every other company working on self-driving vehicles uses radar and laser sensors in addition to cameras to see better in darkness and poor-visibility conditions.
Musk has said that because humans drive with eyesight alone, cars should be able to drive using only cameras. He has called lidar (light detection and ranging), which uses lasers to detect objects, a “fool’s errand.”
The “Full Self-Driving” recalls came after a three-year investigation into Teslas using the less-sophisticated Autopilot system crashing into emergency vehicles and other vehicles parked on highways, many with warning lights flashing.
That investigation was closed last April after the agency pressured Tesla into recalling vehicles to bolster a weak system that ensures drivers are paying attention. A few weeks after the recall, NHTSA began investigating whether the fix was working.
NHTSA began its Autopilot crash investigation in 2021 after receiving 11 reports that Teslas using Autopilot had struck parked emergency vehicles. In documents explaining why the investigation was closed, NHTSA said it ultimately found 467 crashes involving Autopilot, resulting in 54 injuries and 14 deaths. Autopilot is essentially a fancy version of cruise control, while Musk has billed “Full Self-Driving” as capable of driving without human intervention.
The investigation opened Thursday enters new territory for NHTSA, which previously viewed Tesla’s systems as assisting drivers rather than driving themselves. The new probe focuses on the capabilities of “Full Self-Driving” rather than simply on whether drivers are paying attention.
Michael Brooks, executive director of the nonprofit Center for Auto Safety, said the earlier Autopilot investigation did not look at why Teslas were failing to see and stop for emergency vehicles.
“Previously, the burden was placed on the driver rather than the car,” he said. “Here they’re saying these systems are not capable of appropriately detecting safety hazards, whether the driver is paying attention or not.”