Tesla recalling more than 2 million vehicles to fix Autopilot safety problem
Detroit — Tesla is recalling more than 2 million vehicles across its model lineup to fix a defective system that's supposed to ensure drivers are paying attention when they use Autopilot.
Documents posted Wednesday by U.S. safety regulators say the company will send out a software update to fix the problem.
The recall comes after a two-year investigation by the National Highway Traffic Safety Administration into a series of crashes that happened while the Autopilot partially automated driving system was in use. Some were deadly.
An agency spokesperson said in a statement to CBS News that its investigation found Autopilot's method of ensuring that drivers are paying attention can be inadequate and "can lead to foreseeable misuse of the system."
The recall covers nearly all of the vehicles Tesla sold in the U.S. and includes models Y, S, 3 and X produced between Oct. 5, 2012, and Dec. 7 of this year.
The software update includes additional controls and alerts "to further encourage the driver to adhere to their continuous driving responsibility," the documents said.
The software update was sent to owners of certain affected vehicles on Tuesday, with the rest getting it at a later date, the documents said.
"Automated technology holds great promise for improving safety, but only when it is deployed responsibly. Today's action is an example of improving automated systems by prioritizing safety," the NHTSA spokesperson said.
Autopilot includes features called Autosteer and Traffic Aware Cruise Control, with Autosteer intended for use on limited-access freeways when it's not operating with a more sophisticated feature called Autosteer on City Streets.
The software update apparently will limit where Autosteer can be used.
"If the driver attempts to engage Autosteer when conditions are not met for engagement, the feature will alert the driver it is unavailable through visual and audible alerts, and Autosteer will not engage," the recall documents say.
The documents say agency investigators met with Tesla starting in October to explain "tentative conclusions" about fixing the monitoring system. Tesla did not agree with the agency's analysis, the documents said, but agreed to the recall on Dec. 5 in an effort to resolve the investigation.
Auto safety advocates for years have been calling for stronger regulation of the driver monitoring system, which mainly detects whether a driver's hands are on the steering wheel.
Autopilot can steer, accelerate and brake automatically in its lane, but is a driver-assist system and cannot drive itself despite its name. Independent tests have found that the monitoring system is easy to fool, so much so that drivers have been caught while driving drunk or even sitting in the back seat.
In its defect report filed with the safety agency, Tesla said Autopilot's controls "may not be sufficient to prevent driver misuse."
A message was left early Wednesday seeking further comment from the Austin, Texas, company.
Tesla says on its website that Autopilot and a more sophisticated Full Self Driving system cannot drive autonomously, that they are meant to help drivers, and that drivers have to be ready to intervene at all times. Full Self Driving is being tested by Tesla owners on public roads.
In a statement posted Monday on X, formerly Twitter, Tesla said safety is stronger when Autopilot is engaged.
NHTSA has dispatched investigators to 35 Tesla crashes since 2016 in which the agency suspects the vehicles were running on an automated system. At least 17 people have been killed.
The investigations are part of a larger NHTSA probe into multiple instances of Teslas using Autopilot crashing into parked emergency vehicles that were tending to other crashes. NHTSA has become more aggressive in pursuing safety problems with Teslas in the past year, announcing multiple recalls and investigations, including a recall of Full Self Driving software.
In May, Transportation Secretary Pete Buttigieg, whose department includes the NHTSA, said Tesla shouldn't be calling the system Autopilot because it can't drive itself.
In its statement Wednesday, NHTSA said the Tesla investigation remains open "as we monitor the efficacy of Tesla's remedies and continue to work with the automaker to ensure the highest level of safety."