People tell me that they don’t like the idea of self-driving cars. I mean, who is to say that they wouldn’t screw up? I understand the concern, but it is probably uncalled for. At one time, I was employed writing software to follow a given car from a video feed. It was a difficult problem. And I don’t think anyone has ever perfected it. But the truth is that even though the computer program wasn’t perfect, it was better than I was. So the question shouldn’t be whether self-driving cars are perfect; it should be whether they are better than us humans.
A bigger concern is that our cars could turn against us. I don’t mean in the way we saw in Terminator 3: Rise of the Machines. I mean in the way that your computer will sometimes turn against you by doing things you don’t want it to do and not doing the things you want it to. I’m talking about viruses. I’m talking about hacking. Over at Vox this afternoon, Timothy B. Lee wrote a fairly long article about our future, “The Next Frontier of Hacking: Your Car.” He noted that coming technologies will make driving better in numerous ways, “But it’s also going to make it easier for tech-savvy troublemakers to cause serious harm — or even car crashes.”
This isn’t complicated. If you want a car that allows you to call up the manufacturer and get your car remotely unlocked when you’ve locked your keys inside, you are accepting the risk that some hacker will figure out a way to use that same technology to steal your car. Or more frighteningly, if you want a car that can automatically drive you to your destination, you are accepting the risk that some hacker will figure out a way to use that technology to kill you. Sounds ominous, doesn’t it? Well, it doesn’t even have to be like that. You may remember “Roswell That Ends Well,” the episode of Futurama where the crew goes back to 1940s Earth. The lack of GPS causes them to crash-land. Be careful what technologies you depend upon!
Lee summarized a recent study that shows just how vulnerable modern cars are:
I’ll admit, I don’t like cars or driving. So from my standpoint, this all sounds pretty cool. Not that I would ever want to harm anyone. But the idea that a car could be hacked by giving someone a free copy of that new Taylor Swift album is the kind of whiz-bang stuff that techies just love. Other people skydive or, you know, have friends. Of course, it’s important to have people around who find this stuff fascinating. Because it is deadly serious:
Once hackers gain control over a car using any of these methods, they can do a lot of damage. They can activate the vehicle’s internal microphone and eavesdrop on conversations that take place inside. They can unlock the doors and disable the vehicle’s security mechanisms, making car theft easy.
Worst of all, attackers could cause the car to crash. A couple of years ago, for example, security researchers Charlie Miller and Chris Valasek demonstrated the ability to use the internal network of a Ford Escape to disable the brakes. They were also able to violently jerk the steering wheel of a Toyota Prius. If an attacker did these things while someone was driving down the highway, it could get people killed.
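For the technically inclined, it’s worth spelling out why attacks like that are even possible. The internal network Miller and Valasek used, the CAN bus, was designed on the assumption that every component talking on it could be trusted: frames carry no sender authentication, so anything that gets onto the bus can issue commands and the rest of the car simply obeys. Here is a minimal sketch of that idea using the python-can library against a Linux virtual CAN interface; the message ID and payload are invented for illustration and don’t correspond to any real vehicle.

```python
# Toy illustration of why CAN-bus attacks are possible: classic CAN frames
# carry no sender authentication, so any node on the bus can transmit any
# message ID and the receiving controllers simply trust it.
# Uses python-can (4.x) with a Linux virtual CAN interface, created with:
#   sudo ip link add dev vcan0 type vcan && sudo ip link set up vcan0
# The arbitration ID and payload are made up for illustration only.
import can

bus = can.interface.Bus(channel="vcan0", interface="socketcan")

# Build a frame. Nothing on a classic CAN bus verifies who sent it;
# receivers act on the arbitration ID and data alone.
msg = can.Message(
    arbitration_id=0x123,           # hypothetical message ID
    data=[0x01, 0x00, 0x00, 0x00],  # hypothetical payload
    is_extended_id=False,
)

bus.send(msg)
print("Frame sent; no authentication was required.")
bus.shutdown()
```

That’s the whole trick: once something malicious is talking on that bus, the car listens. The hard part for an attacker is getting onto the bus in the first place, which is where remote features like the door-unlock service mentioned above come in.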
It’s important not to freak out. At least not yet. At this point, software in cars is not standardized, so hacker attacks would have to be custom-made for a particular car. But the trend is toward making cars more and more like computers, just as happened to phones before them. And the car companies are slowly learning that they have to take this kind of stuff as seriously as they do crash tests. But ultimately, the changes that are coming will make cars better and safer. It’s important to remember that. Cars are infinitely safer than they used to be, and they will continue getting safer. But it would still suck if some Ukrainian hacker made you drive into a ditch.