Federal regulators say Tesla’s software violates traffic laws in dangerous ways. The company is updating its “Full Self-Driving” program after pressure from regulators.
LEILA FADEL, HOST:
Federal regulators say Tesla’s software violates traffic laws in dangerous ways.
ASMA KHALID, HOST:
KHALID: So the company is fixing its Full Self-Driving feature with a recall announced yesterday. The software is controversial. In fact, depending on where you watched the Super Bowl, you may have seen an ad showing a Tesla mowing down child-sized mannequins.
(SOUNDBITE OF AD)
UNIDENTIFIED PERSON: Tesla's Full Self-Driving is endangering the public with deceptive marketing and woefully poor engineering. Ninety percent agree it should be banned immediately. Why does NHTSA allow Tesla Full Self-Driving?
FADEL: NHTSA is the federal highway safety regulator. NPR's Camila Domonoske joins us to talk about this recall. Good morning, Camila.
CAMILA DOMONOSKE, BYLINE: Good morning.
FADEL: So what does this recall mean for Tesla owners?
DOMONOSKE: Well, it will only affect people who have Full Self-Driving, which is an expensive option. But more than 360,000 people do have this software. And they'll get a software update over the air, so they don't have to go anywhere, and it'll change the behavior of Full Self-Driving. So to be clear, these cars are still on the road, and drivers can still use the software. The fix will roll out in the coming weeks.
FADEL: Okay. So what was wrong with the software?
DOMONOSKE: Well, federal regulators say that after examining cars driving with Full Self-Driving turned on, they zeroed in on four specific things the cars were doing. One was going straight through an intersection while in a turn-only lane. Another was not responding correctly when the speed limit changed. Stop signs - sometimes the software would not come to a complete stop at a stop sign. And the last one was going through yellow lights unsafely. Tesla disagreed with the regulators' analysis but agreed to push out a software fix. And like I said, it's coming soon.
FADEL: OK. So the software being updated is Full Self-Driving, if you could talk about what exactly that is. And with this fix, is it safe?
DOMONOSKE: Yeah. You know, I don't think this fix settles the bigger argument over Full Self-Driving's safety. I mean, if you ask Elon Musk, this is both a safety feature that's safer than a human driver and absolutely critical to Tesla's future as a company. If you talk to many safety experts, they would say this is a dangerous experiment being carried out on public roads. That's the underlying dispute. Either way, this software is unique to Tesla, right? Full Self-Driving is a misleading name because the person behind the wheel still has to supervise what the car is doing. That's really important. But the software will steer, accelerate and brake not just on highways but also on city streets with pedestrians, cyclists and traffic lights - the whole shebang. Sometimes it performs really impressively. Sometimes it makes mistakes - it doesn't happen all the time, but that's why people need to watch what their car is doing so they can take over. The other thing is that it's technically still in beta. It receives constant updates, but at the same time it's being used by hundreds of thousands of drivers.
FADEL: OK. What does the future hold for Full Self-Driving?
DOMONOSKE: Well, this is not the end of the conversation. There's more oversight coming from regulators. There are also lawsuits over this and related technologies that are set to unfold in the coming months.
FADEL: NPR's Camila Domonoske. Thank you so much, Camila.
DOMONOSKE: Thank you.
(SOUNDBITE OF DUO SIRC’S “REPLICATION”)
NPR transcripts are created on a rush deadline by an NPR contractor. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR's programming is the audio record.