Sunday, October 22, 2017

The Computer Threat Bigger than Skynet

People keep saying that driverless cars are the future. Like this guy in the New York Times. I beg to differ.

Driverless cars are a much bigger threat than an evil AI like Skynet from the Terminator movies, which nuked humanity into near-extinction. Mostly because if Skynet nuked us, it would also destroy the people, the power and communications grids, and the manufacturing infrastructure that Skynet itself would need to exist.

As a programmer, I know that all programs have bugs. The programs that guide driverless cars will have hundreds of millions of lines. They will depend on other systems -- navigation, radar sensors, computer vision -- that also have hundreds of millions of lines. That means there will be millions of bugs. Bugs that won't get shaken out in testing because it's impossible to test every pathway through a program. Different combinations of inputs that never occurred during testing will inevitably uncover bugs that will cause fatalities.
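To see why exhaustive testing is hopeless, consider a rough sketch: every independent if/else decision in a program can double the number of distinct execution paths. The branch counts below are illustrative assumptions, not measurements of any real automotive codebase.

```python
# Illustrative sketch: with n independent branch points, a program
# can have up to 2**n distinct execution paths. The inputs here are
# made-up numbers for illustration only.

def max_paths(branch_points):
    """Upper bound on execution paths through n independent branches."""
    return 2 ** branch_points

print(max_paths(10))    # a toy module: 1024 paths
print(max_paths(100))   # still tiny by real-software standards,
                        # yet already ~1.27e30 paths -- untestable
```

A codebase with hundreds of millions of lines has vastly more than 100 branch points, so testers can only ever sample a sliver of the possible paths.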

It's impossible to keep the maps the cars need in sync with the real world. There will always be errors in the maps, and these errors will kill people. This isn't an idle fear; it has already happened: a woman died when her husband drove off a bridge because his GPS map was outdated. Yes, he was an idiot for blindly following his GPS. But he was still smarter than your car's computer will be.

Then there's hacking. Since these cars will of necessity be connected to the Internet for updates, maps, traffic reports, etc., they will be vulnerable to hacking. If a hacker finds a way to do something simple but quite deadly -- like issue a shutdown command to every car in Los Angeles -- tens of thousands of people could die in a matter of seconds.

Or someone could threaten to do this to a Tesla or a Google, demanding billions of dollars in ransom, along the lines of those hackers who lock up people's hard drives and demand Bitcoin payments. Or that army of Russian hackers could turn their attention away from goading Americans on Facebook into hating each other and attack our driverless cars.

Then there's GPS jamming. The key to all driverless cars is the GPS receiver and the maps built into the navigation system. GPS signals come from satellites in Earth orbit. Radio signals weaken with the inverse square of the distance: a signal twice as far away is four times weaker, and a signal a thousand times farther away is a million times weaker. GPS satellites orbit at roughly 12,000 miles.
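The inverse-square arithmetic is easy to check. This back-of-the-envelope sketch compares a satellite 12,000 miles away with a hypothetical jammer one mile from the car; both distances and the equal-transmit-power assumption are illustrative simplifications (real GPS satellites transmit far more power than a handheld jammer, but nowhere near enough to close a gap this large).

```python
# Back-of-the-envelope inverse-square comparison. Received power
# scales as 1/d^2, so two transmitters of equal power at distances
# d1 and d2 differ in received strength by a factor of (d2/d1)**2.
# The distances below are assumptions for illustration.

import math

def relative_power_advantage(near_miles, far_miles):
    """Power ratio of a near transmitter over a far one (inverse-square law)."""
    return (far_miles / near_miles) ** 2

SATELLITE_MILES = 12_000   # approximate GPS orbital distance
JAMMER_MILES = 1           # hypothetical jammer one mile away

ratio = relative_power_advantage(JAMMER_MILES, SATELLITE_MILES)
ratio_db = 10 * math.log10(ratio)

print(f"Jammer power advantage: {ratio:.2e}x ({ratio_db:.0f} dB)")
# -> Jammer power advantage: 1.44e+08x (82 dB)
```

A 144-million-to-one head start is why even a cheap local transmitter can drown out the satellite signal.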

GPS signals are therefore extremely weak. A few local jammers could completely blind the car's navigation system, making it forget where it is. How many people this would kill is impossible to know. How thoroughly will GPS signal loss be tested in these cars to make sure the system brings the car safely to a stop?

Also, radio waves have a hard time going through water and metal. Will driverless cars work in tunnels, especially ones that go under rivers? There's a company that says it has a fix for this, but it's another computer system that can be jammed or hacked.

And then there's GPS spoofing. The Russians are actively testing GPS spoofing, which makes your GPS think you're someplace you're not. These false inputs could kill thousands of people, making them drive off roads and bridges, or into buildings, crowds and bridge abutments.

And then there's the North Koreans. We know they've hacked Sony, stolen $81 million from Bangladesh's account at the New York Federal Reserve, and breached the South Korean military. They don't need nukes if they can hack our cars and bring our entire nation to a standstill.

Assistive automobile technology is a good thing -- sensors that let you know when there's someone in the lane you're switching into, or if you're going to run over a kid while backing up. But it's a mistake to surrender total control of your life to a computer, when there are so many bad actors out there who have more control over that computer than you do.