Today’s news highlights an issue with modern, highly computerized automobiles: That some of them can be hacked from the outside, and some of their functions can be disabled remotely.  The Wired article demonstrated this on a Jeep:

“Though I hadn’t touched the dashboard, the vents in the Jeep Cherokee started blasting cold air at the maximum setting, chilling the sweat on my back through the in-seat climate control system. Next the radio switched to the local hip hop station and began blaring Skee-lo at full volume. I spun the control knob left and hit the power button, to no avail. Then the windshield wipers turned on, and wiper fluid blurred the glass.

“As I tried to cope with all this, a picture of the two hackers performing these stunts appeared on the car’s digital display: Charlie Miller and Chris Valasek, wearing their trademark track suits.”

This comes at a particularly significant moment, when car and truck manufacturers are working on fully autonomous vehicles that could one day dominate our roads, and American drivers are facing the prospect of a world in which they have no manual control over the actions of their cars.

And it’s just another example of one of the greatest dangers of our time: That electronics and computers are far more vulnerable to malicious actions than they ought to be, and our day-to-day safety is at risk because of it.

Not only are there few vital and non-vital systems in this world that don’t have some semblance of electronic control applied to them, there are virtually none that aren’t capable of being hacked into by a determined, or even a casual, attacker. In fact, it has become one of the most familiar tropes of today’s society: the good or bad hacker who manages to access systems, computers and programs and change them for good or ill.  As more and more automation is applied around us, new reports of malicious hacking come in every day, and people are afraid of their personal data being compromised… or financial accounts being drained… or communications systems eavesdropping on personal messages and compromising their privacy… and of their planes, trains or automobiles being directed to crash.

This threat stems from a fundamental flaw in our electronics and programming systems: That their base design was built by scientists and programmers who worked together in good faith and expected other scientists and programmers to share their good intentions.  The idea was to allow one person to review and improve the core design created by another, and thereby improve the system.  This architecture of trust is the very foundation of our computers and electronics, the core on which everything is built.

But as electronics, computers and programming have evolved, they have given easy access to people outside the scientific and programming professions, people with their own agendas and no interest in others’ good intentions.  The same attributes that made computers and programs so attractive have also proved to make them vulnerable to unwanted manipulation… and because those attributes go so deep into the system, it has proven almost impossible to prevent hacking and attack.  And as we rapidly approach a world in which computers control more and more of our society, and may soon exhibit a high intelligence of their own, this is a threat we can ill afford.

Our computers and programs need to undergo a complete redesign, from the bottom up, hardware and software, around new paradigms: Verified, not Trusted; and Common Sense.  Systems that presently accept input from any source need sound firewalls that not only block unintended access, but subject all input to rigorous vetting to make sure it comes from a proper source… especially if it arrives from an unexpected or unique source.  Encryption must be the rule, not the exception, and must be involved in every step of a decision or input process.
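
To make that concrete, here is a minimal sketch, in Python, of what “Verified, not Trusted” might look like at the input layer. It assumes commands arrive as signed messages and that a device key has been provisioned; the key, the message format and the function names are hypothetical.

```python
import hmac
import hashlib
import json

# Hypothetical device key, provisioned at manufacture time.
SHARED_KEY = b"replace-with-provisioned-device-key"

def verify_command(raw_message: bytes, signature: str):
    """Accept a command only if its signature proves it came from a proper source."""
    expected = hmac.new(SHARED_KEY, raw_message, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, signature):
        return None  # unverified input is rejected outright, not merely logged
    return json.loads(raw_message)

# A command arriving with no valid signature never reaches the rest of the system.
message = b'{"target": "climate", "action": "set_fan", "level": 10}'
print(verify_command(message, "0" * 64))  # -> None: input failed vetting
```

Anything that fails verification is simply never parsed or acted upon.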

Possibly most important is to give common sense to programming.  Vital programs need an innate awareness of the inputs they receive, comparing them against the inputs they expect to get and fully understanding the ramifications of acting upon them before they are executed; and they must have the ability to ignore and even countermand inputs that they recognize to be bogus or dangerous.  Backup systems should provide real-time data so that programs can evaluate a set of inputs and orders against their surroundings, and determine whether an input or order poses a risk or contradicts their understood function.
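
A minimal sketch of that kind of cross-checking, assuming a hypothetical reported speed input and a GPS reading as the real-time backup source:

```python
# Cross-check a reported speed against a second, independent source.
# The sensor names and the tolerance value are hypothetical.
def validate_speed_input(reported_kph: float, gps_kph: float,
                         tolerance_kph: float = 15.0):
    if abs(reported_kph - gps_kph) > tolerance_kph:
        return None  # the two sources disagree too much; treat the input as bogus
    return reported_kph

print(validate_speed_input(reported_kph=0.0, gps_kph=104.0))    # -> None: rejected
print(validate_speed_input(reported_kph=101.0, gps_kph=104.0))  # -> 101.0: accepted
```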

Programming like this would have prevented the hack of the Jeep as described in Wired.  The vehicle, when presented with outside inputs to change radio settings, operate the wipers, and so on, would have been smart enough to ignore them, as it would have the common sense to know that no outside input should need to operate such equipment.  It might even advise the driver: “Someone seems to be trying to change the settings of X, Y and Z.  Do you want to okay this?”
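
Here is a rough sketch of how such a rule might be expressed, with hypothetical source and target names: a cabin command from anywhere other than the physical controls is held until the driver okays it.

```python
# Cabin equipment is only operated from the physical controls; anything else
# is surfaced to the driver for approval. All names are hypothetical.
LOCAL_SOURCES = {"dashboard", "steering_wheel_controls"}
CABIN_TARGETS = {"radio", "wipers", "climate"}

def handle_command(command: dict, ask_driver) -> bool:
    if command["target"] in CABIN_TARGETS and command["source"] not in LOCAL_SOURCES:
        prompt = (f"Someone seems to be trying to change the settings of "
                  f"{command['target']}. Do you want to okay this?")
        return ask_driver(prompt)  # execute only with explicit driver approval
    return True

# A remote attempt to take over the radio is held for confirmation, not executed.
remote_cmd = {"source": "telematics_unit", "target": "radio", "action": "volume_max"}
print(handle_command(remote_cmd, ask_driver=lambda prompt: False))  # -> False: blocked
```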

And the article further describes a moment when the outsiders hacked into the engine operation, essentially killing the vehicle’s engine while in traffic… an intelligent program would have been able to check its surroundings and determine that an instruction from any source to alter the operation of the engine while in motion was a fundamentally bad idea.  It would then have the option of alerting the driver that there was a problem requiring him to pull off the road, activating emergency signals, and even assisting in pulling the car over if the driver did not respond to queries.
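
A sketch of that scenario, with hypothetical names for the driver alert and hazard-light functions: a shutdown request is honored only when the vehicle is effectively stopped.

```python
# A shutdown request is refused while the vehicle is moving; the driver is
# alerted and the hazard lights are activated. Function names are hypothetical.
def on_engine_shutdown_request(speed_kph: float, alert_driver, activate_hazards) -> bool:
    if speed_kph > 1.0:
        alert_driver("A problem requires you to pull off the road.")
        activate_hazards()
        return False  # the command is countermanded, not obeyed
    return True  # stationary: a shutdown request can be allowed through

allowed = on_engine_shutdown_request(
    speed_kph=96.0,
    alert_driver=print,
    activate_hazards=lambda: print("Hazard lights on."),
)
print(allowed)  # -> False: the engine keeps running
```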

It’s not enough to add these improvements to existing programs; hardware and software have been designed with too many base-level “back doors” that hackers can exploit, ways to bypass basic security settings and protocols to insert commands.  Those back doors cannot be allowed to exist… they need to be completely removed, and the entire wall reconstructed to leave no trace of its former openings.  Everything must go through front doors, with strong security and common-sense awareness at each stop and crossroad.  Programs must be designed to err on the side of caution, to pause when an instruction doesn’t check all the proper boxes, to double-check against other data, and to reject commands that aren’t right.
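
In code, that fail-closed “front door” might look something like the sketch below, with the individual checks left as hypothetical placeholders: a command executes only if every check passes.

```python
# A fail-closed gateway: a command runs only if every check passes.
# The individual checks here are hypothetical placeholders.
def front_door(command: dict, checks) -> bool:
    for check in checks:
        if not check(command):
            return False  # err on the side of caution: one failed check rejects it
    return True

checks = [
    lambda c: c.get("signature_valid") is True,  # verified, not trusted
    lambda c: c.get("source") == "dashboard",    # known, expected origin
    lambda c: c.get("target") != "engine",       # no remote engine control at all
]
cmd = {"signature_valid": True, "source": "telematics_unit", "target": "radio"}
print(front_door(cmd, checks))  # -> False: it didn't check all the proper boxes
```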

The world is becoming more intertwined and interdependent with each passing day, and our computers have become vital tools for the daily and future operations of our world.  But with the need to apply the latest in computer systems to fixing the complex ills that confront us come legitimate concerns and threats that must be addressed.  Until we finally do a bottom-up redesign of our electronics and computer hardware and software, we will never accomplish the improvements society needs to survive.
