Voting and Mix-And-Match Software

AUTHOR
Wade L. Robison

ABSTRACT

The most important concern with the integrity of voting machines that use software is the integrity of the software itself: the assurance that it is correctly recording votes and, at the end, correctly tallying them. We in the United States have had a number of situations in which problems traceable to the software in voting machines have tainted the voting process.

Yet concentration on this problem should not obscure the importance of another that can equally taint the voting process. The design of the Palm Beach paper ballot is now famous, or infamous, for misdirecting voters so that even the most intelligent, well-trained, and most highly motivated would make mistakes, such as voting for Pat Buchanan instead of Al Gore, a mistake apparently made by upwards of 23,000 voters, more than enough to have changed the outcome of the election, and with it our subsequent history. That ballot illustrates well what I call an error-provocative design.

Error-provocative designs are those that provoke mistakes on the part of the most intelligent, well-trained, and most highly motivated users. The software in the autopilot of the airliner that flew into a mountainside near Cali, Colombia, in 1995 was error-provocative. The pilot's job was to key in the initial letter of the beacon for the airport where the plane was to land. The autopilot would then pick the top of the five listed options that appeared and land the plane. The default was that the closest beacon was at the top of the list, unless the pilot keyed in "R," in which case the software selected Bogota. The plane was to land at Cali, and its beacon began with the letter "R," as did the beacon for Bogota. When the pilot keyed in "R," the plane turned towards Bogota. The pilots did not realize there was a problem until it was too late: 159 people were killed when the plane flew into a mountainside near Cali.
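The two conflicting defaults just described can be sketched in a few lines of Python. This is only an illustration of the logic the account above attributes to the autopilot, not the actual flight-management software; the beacon names, cities, and distances are invented stand-ins.

```python
# Sketch of two conflicting defaults: candidates are normally ranked
# nearest-first, but one keyed-in letter silently bypasses that ranking.

def rank_by_distance(beacons):
    """The normal default: list candidate beacons nearest-first."""
    return sorted(beacons, key=lambda b: b["distance_nm"])

def select_beacon(key, beacons):
    """Return the beacon the autopilot would fly toward for a keyed-in letter."""
    # The anomalous second default: "R" is hard-wired to the Bogota
    # beacon, regardless of which "R..." beacon is actually closest.
    if key == "R":
        return next(b for b in beacons if b["city"] == "Bogota")
    matches = [b for b in beacons if b["name"].startswith(key)]
    return rank_by_distance(matches)[0]

# Invented data: both beacon names begin with "R," as in the account above.
beacons = [
    {"name": "Rozo",  "city": "Cali",   "distance_nm": 10},   # nearby
    {"name": "Romeo", "city": "Bogota", "distance_nm": 130},  # far away
]

# The nearest-first default would put the Cali beacon on top, but keying
# in "R" selects Bogota instead.
chosen = select_beacon("R", beacons)
```

A pilot relying on the nearest-first rule has no visible cue that this one input is governed by a different rule, which is exactly what makes the design error-provocative.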

Not even the most intelligent, well-trained, and highly motivated of pilots is likely always to avoid the error that led to that disaster. Putting two defaults in the autopilot software was a recipe for disaster, but it was a self-contained recipe: everything occurred on a screen. We often have the same kind of error-provocative design when an image is produced on a screen set within a frame. The frame has buttons to push, for instance, that trigger the next item on the menu. ATMs often work this way, with the software producing choices, "Checking" or "Savings," for the operator to choose between by pushing a button to the right of the arrow following each. All too often, the arrows on the screen do not line up directly with the buttons, just as in the Palm Beach ballot, and the operator must guess whether it is the button above or below the arrow that is to be pushed. My bank's ATM works this way, with a perverse twist: I am always to push the button above the arrow except once, when I am to push the button below it. If I do not then push the lower button, I am shunted out of the system, informed that no transaction has occurred, and then, curiously enough, thanked for making a transaction. I get a receipt with nothing on it except the date, the time, and the name of the bank.

The operator is given no warning that the next item in the menu requires pushing the lower button, and though you can learn how to use the machine, it is easy to forget if you do not use it often. It is also quite exasperating: if you do not pay very careful attention, you will not remember where in the menu things went wrong.
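The ATM's inconsistency can be made concrete with a small sketch. The menu steps and outcomes below are invented for illustration; the point is only that a single unannounced exception to an otherwise uniform mapping from arrows to buttons defeats the user who has learned the rule.

```python
# Sketch of a framed menu where the on-screen arrow normally maps to the
# button ABOVE it, except at one step that silently expects the button BELOW.

MENU = ["account_type", "amount", "receipt", "confirm"]  # invented steps

# The anomalous exception: one step inverts the convention, with no warning.
EXPECTS_LOWER_BUTTON = {"receipt"}

def correct_button(step):
    """Which physical button actually advances this menu step."""
    return "lower" if step in EXPECTS_LOWER_BUTTON else "upper"

def run_session(presses):
    """Walk the menu with a sequence of button presses; return the outcome."""
    for step, press in zip(MENU, presses):
        if press != correct_button(step):
            # The machine shunts the user out with an empty receipt.
            return "no transaction"
    return "transaction complete"

# A user who has learned "always push the upper button" fails at the one
# anomalous step; only someone who has memorized the exception succeeds.
learned_rule = run_session(["upper", "upper", "upper", "upper"])
memorized_exception = run_session(["upper", "upper", "lower", "upper"])
```

Nothing on the screen distinguishes the anomalous step from the others, so the failure is discoverable only after the session has already been aborted.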

This example illustrates a general truth for software engineers and for those using software in such framed settings: on voting machines, for instance, as well as on ATMs or airport check-in kiosks, to give just two more examples. Software engineers are morally obligated, at a minimum, to avoid error-provocative designs. Such a design would be the weapon of choice for an evil genius of an engineer determined to create engineering artifacts that would cause great harm. Just imagine an error-provocative design in the software that runs a nuclear plant, a subway system, the air traffic control system, or electronic voting machines. Engineers are morally obligated to avoid such designs precisely because they can cause great harm.

Yet as my bank's ATM illustrates, software engineers may not think much about how their software will be displayed, particularly when the screen is framed. A design that would cause no undue harm were the menu self-contained on a screen can cause great harm once framed. It is not sufficient for a software engineer to say, "That's not my problem; I just do software." Part of the problem lies in an inconsistency in the ATM software itself: the code contains an anomalous exception to its own convention of pushing the upper button to proceed. But, just as important, software engineers are responsible for ensuring that their software works when it is in place. That means ensuring that it works with whatever other software may already be there, but it also means ensuring that it works if the screen is framed.