Evaluating Electronic Voting Systems Equipped With Voter-Verified Paper Records
Nirwan Ansari, Pitipatana Sakarindr, Ehsan Haghani, Chao Zhang, Aridaman K. Jain, and Yun Q. Shi
New Jersey Institute of Technology

Governments around the world are increasingly replacing traditional paper ballots and punch cards with electronic voting systems such as the now 20-year-old direct-record electronic (DRE) system.1 In such electronic voting (e-voting) systems, vendor-specific software controls system functionality, records and counts electronic votes, and generates the final results. Although e-voting systems are subject to federal and state certification, election officials and public advocates have raised many questions about their privacy, security, and performance. Before certifying a system, election officials must evaluate its hardware and software performance and its source code. However, simple lab testing is inadequate for detecting errors at all levels,1,2 and undetected flaws can have devastating consequences in elections. Furthermore, weak requirements are fundamentally insufficient for ensuring voting system performance and security.3 Thus, each state must examine and evaluate its own requirements to ensure the voting system's functionality, security, durability, and accessibility,4 as well as the voting results' secrecy, privacy, and accuracy.

To increase voter confidence, some states have proposed, and in some cases mandated, the addition of printers to voting machines.4,5 This lets voters verify their voting selections on paper records; officials then couple the electronic record of each vote with a printed paper record. Using DREs with voter-verified paper-record systems (VVPRSs) should instill full public confidence in the electoral process. To certify such a system, however, analysts must carefully evaluate a printer's performance and its integration with the overall voting system.

Federal and state election commissions have made different recommendations for evaluating and certifying e-voting systems.4-8 In the US, states have developed different requirements that target their particular needs. The Attorney General's Office of New Jersey issued criteria for e-voting machines equipped with printers4 and asked the New Jersey Institute of Technology to test the various systems against these criteria.9-12 As we discuss here, we encountered several issues of concern in the testing and analysis process and formulated recommendations for addressing some of them.

System requirements
A DRE voting machine with VVPRS capability includes a ballot display unit, a voting panel, internal and external memories, a printer, a paper-record display unit, and a paper-record storage unit. The voting systems we tested all use thermal printers and adopt one of two methods: in cut-and-drop VVPRS, the individual printed paper records are cut and dropped into a storage unit; in continuous-spool VVPRS, the vote selections are printed on a paper roll that rolls continuously from one spool to another.

As Figure 1 shows, New Jersey's criteria define the system's component functionalities.

Privacy requirements
Voter privacy requirements are as follows:
Security requirements
Security requirements exist for the DRE system, the VVPRS, and the vote records.
times. Election officials must also have the opportunity to compare the electronic and paper records after the election for audit and recount purposes. Therefore, the electronic and paper records must be linked through a unique identifier.
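To make that audit step concrete, here's a minimal Python sketch of such a reconciliation, assuming (our assumption, not any vendor's format) that each record carries the unique identifier and a tuple of selections:

```python
# Minimal reconciliation sketch: pair electronic and paper records by a
# shared unique identifier and flag any discrepancy for the audit.
def reconcile(electronic_records, paper_records):
    """Each record is a dict with 'uid' and 'selections' keys (assumed)."""
    e_by_uid = {r["uid"]: r for r in electronic_records}
    p_by_uid = {r["uid"]: r for r in paper_records}
    issues = []
    for uid in sorted(e_by_uid.keys() | p_by_uid.keys()):
        e, p = e_by_uid.get(uid), p_by_uid.get(uid)
        if e is None:
            issues.append((uid, "paper record has no electronic match"))
        elif p is None:
            issues.append((uid, "electronic record has no paper match"))
        elif e["selections"] != p["selections"]:
            issues.append((uid, "selections differ between the two records"))
    return issues
```

An empty result means the two record sets agree vote for vote; anything else is grounds for a hand recount.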
Integrity requirements
There are separate integrity requirements for the paper and electronic records. The paper record must include every contest the voter casts on the DRE system, including write-ins and undervotes. It must also identify the election, the election district, and the voting machine. Moreover, the paper record's contents must be machine readable (using barcodes, for example) in case a recount and audit is needed. As noted earlier, the paper record must contain error-correcting codes to detect read errors. Finally, election officials must be able to distinguish between an accepted and a rejected paper record.

Electronic records must include all votes cast by the voter, including write-ins and undervotes. The electronic record can include some security identities, such as digital signatures for an individual record and for the election's entire set of electronic records.
Format, design, and functionality requirements
Developers must create a voting machine that works with minimum disruption on Election Day. The machine must be provisioned with a sufficient amount of voting supplies, such as paper and ink. If the DRE runs low on paper, it must let the current voter fully verify all of his or her vote selections and successfully cast the ballot without disruption before a paper replacement. Developers must design the VVPRS to function only as a printer; it must not be capable of networking or communicating externally with any system other than the DRE system. Finally, the electronic-ballot-image record's format must be publicly available and nonproprietary; in addition, the ballot's barcode must use an industry-standard format and be readable with any commercial barcode reader.

Documentation and certification requirements
The vendor must supply documentation for election worker training, voting instructions, and malfunction recovery. The vendor must also submit all federally certified Independent Testing Authority reports on the DRE with VVPRS.

Examination requirements
The VVPRS must be subject to the State Voting Machine Examination Committee's scrutiny. In addition, the vendor must provide the state with the DRE and VVPRS source code for independent testing.

Testing techniques
We designed and conducted four testing approaches to examine the VVPRS against certain state requirements: a single test, a 1,200-vote simulated test, a 14-hour test, and a 52-vote test. For all four, we used accepted scientific practices and methodologies. We recruited students with different backgrounds, ranging from undergraduates to PhD candidates, to act as "mock voters." Mock voters cast votes in various voting scenarios, each of which represented particular selections of an election's contest positions. We printed the scenarios on cards, which the testing team shuffled to achieve randomization before giving them to the mock voters. Each voter made selections as indicated on each scenario card under the testing team's close supervision.

Ballots
We adopted two ballot types: one long, one short. As Figure 2 shows, the long ballot, which we used for the 14-hour and 52-vote tests, contained 19 items to vote on. We designed 12 voting scenarios to represent all possible choices, including eight party voting scenarios that were completely balanced (two parties for seven contests, seven "yes" or "no" questions, and 10 candidates listed for the charter study commission). In the eight voting scenarios, each position got four Democratic (D) and four Republican (R) candidate votes, and each question got four "yes" votes and four "no" votes. We also designed four supplementary voting scenarios to test possibilities not covered by the eight scenarios. Finally, we considered two additional cases from among the 12 scenarios to test whether voters could reject and recast their ballot during the 14-hour test. In the first case, voters voided their first set of selections (one of the 12 scenarios) and recast their votes for the second set (another of the 12 scenarios). In the second case, voters voided their first two sets of selections and recast their votes for the third and final selection.

We used a short ballot in the 1,200-vote test; this ballot featured the same 12 voting scenarios as the long ballot, but omitted the charter study commission and had fewer questions. The ballot contained eight party voting scenarios (again completely balanced, with two parties for five positions and "yes" or "no" votes for four questions) and four supplementary voting scenarios.
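As a quick sanity check on this balanced design, a script along the following lines confirms that each two-party contest receives exactly four D and four R votes across the eight scenarios (the row encoding is our illustrative stand-in for the actual scenario cards):

```python
# Eight hypothetical party scenarios for seven two-party contests, one
# character per contest; the check verifies the 4 D / 4 R balance that
# the design promises for every contest column.
scenarios = [
    "DDDDDDD", "DDDRRRR", "DRRDDRR", "DRRRRDD",
    "RDRDRDR", "RDRRDRD", "RRDDRRD", "RRDRDDR",
]

for contest in range(7):
    column = [s[contest] for s in scenarios]
    assert column.count("D") == 4 and column.count("R") == 4
print("all seven contests balanced")
```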
For all volume tests, we retained summaries of the following records:

• tabulation of final paper records,
• tabulation of the final paper records' scanned barcodes,
• electronic records, and
• the closed poll's tally report.

Each summary gave the vote counts for each contest candidate (including the questions).
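Such summaries are most useful when they can be compared mechanically. Here is a sketch, assuming (our assumption) that every record source can be normalized to (contest, choice) pairs:

```python
from collections import Counter

# Build a per-(contest, choice) tally for one record source, then check
# that all four summaries (paper records, scanned barcodes, electronic
# records, and the poll tally report) agree exactly.
def tally(records):
    counts = Counter()
    for record in records:
        for contest, choice in record:
            counts[(contest, choice)] += 1
    return counts

def summaries_agree(*record_sources) -> bool:
    tallies = [tally(source) for source in record_sources]
    return all(t == tallies[0] for t in tallies[1:])
```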
Figure 2. The long ballot. The 12 voting scenarios, eight major party and four supplementary, represent all possible choices. "R" and "D" stand for a vote for a Republican or Democratic name, respectively. A blank space indicates a "no" vote for that position. For the charter study commission, N1, N2, ..., N10 indicate a vote for Name1, Name2, ..., Name10, respectively. W1, W2, and W3 are the three write-in names for the charter study commission.
Single test
In the single test, we ran a one-time examination of specific criteria using different testing methods. For example, the test might be a physical inspection of various DRE and VVPRS components. In many cases, we retrieved, studied, and compared paper records, electronic records, and barcodes. For instance, we verified deployment of error-correction codes and digital signatures by closely examining these records. In some cases, we forced incidental and procedural hindrances, such as a paper jam, to observe the effect. We also closely examined all vendor documentation.

14-hour test
Our 14-hour test emulated actual physical voting situations over a 14-hour period (representing an entire election day). Each mock voter cast votes over a one- to two-hour time slot using the long ballot. We gave each mock voter a set of shuffled scenario cards derived from eight sets of eight major party voting scenarios and one set of four supplementary voting scenarios. We also randomly inserted questionnaire cards that asked voters questions about the voting scenario.

1,200-vote simulated test
The state's criteria recommend that each machine be capable of processing 750 ballots; we designed this test to investigate the voting system's response to a larger than expected number of ballots, which tends to overload the system's capability. Using a short ballot and a scripted program, we ran a simulated test in which each machine continuously generated 1,200 votes. To reach the 1,200-vote total, the test generated each of the eight party-voting scenarios 125 times and each of the four supplementary voting scenarios 50 times.
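Where a machine supported scripted input, a driver along these lines (a sketch; the scenario labels are hypothetical) could generate the 1,200-vote schedule:

```python
import random

# Build the 1,200-vote schedule: each of the eight party-voting scenarios
# appears 125 times and each of the four supplementary scenarios 50 times
# (8 * 125 + 4 * 50 = 1,200), in shuffled order for realism.
PARTY = [f"P{i}" for i in range(1, 9)]          # hypothetical labels
SUPPLEMENTARY = [f"S{i}" for i in range(1, 5)]  # hypothetical labels

def build_schedule(seed: int = 0) -> list:
    schedule = PARTY * 125 + SUPPLEMENTARY * 50
    random.Random(seed).shuffle(schedule)
    assert len(schedule) == 1200
    return schedule
```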
Table 1. Features of the four tested machine types.

Ballot-activation device: (Type 1) an RFID smart card with a card encoder on an official panel and a button at the back of the DRE; (Type 2) an activation button; (Type 3) an RFID smart card with a card encoder; (Type 4) an Infrared Data Association (IrDA) proprietary device with an encoding cradle.

Electronic-record storage device: (Type 1) built-in memory, flash drive, DVD; (Type 2) built-in memories and a proprietary device designed by the Personal Computer Memory Card International Association (PCMCIA); (Type 3) built-in memories and a proprietary PCMCIA device; (Type 4) three built-in memories, a compact flash card, and a proprietary IrDA-designed device.

Audio-assisted voting interface: (Type 1) a modified keyboard with four different button shapes and a headset; (Type 2) a proprietary four-button panel and a headset; (Type 3) a proprietary four-button panel and a headset; (Type 4) four buttons with different shapes on the DRE and a headset.

Interfaces/adapters (observed): (Type 1) 2 Personal System/2 (PS/2) ports (a PS/2-to-USB adapter also provided), 4 USB ports, 1 IEEE 1284, 2 recommended standard (RS-232) ports, 1 super video graphics array (SVGA), 1 registered jack (RJ-45), 1 Ethernet, 1 RFID slot, and audio; (Type 2) 1 IEEE 1284, 1 RJ-45, and 2 PCMCIA slots; (Type 3) 1 IEEE 1284, 1 RJ-45, and 2 PCMCIA slots; (Type 4) 1 RS-232, 1 IrDA slot, 1 compact flash slot, and audio.

VVPRS component
Printer type: (Types 1 and 2) cut-and-drop; (Types 3 and 4) continuous spool.
In cases in which the machines lacked the script capability to automate this test, we had mock voters cast the 1,200 votes manually.

52-vote test
Finally, we designed a 52-vote test to investigate the special case in which the paper record extends to multiple pages. This criterion applies only to VVPRSs using the cut-and-drop method (in this case, machine types 1 and 2). We ran this test using the long ballot and mock voters.

Problems and criteria exceptions
We tested four machine types with different configurations from different manufacturers. Table 1 shows the machines' features; to maintain confidentiality, we don't disclose the vendors' identities, the machines' models and serial numbers, or other proprietary information. As Table 2 summarizes, the systems didn't comply with all of the state's criteria during our tests. The problems and criteria exceptions fell into several general categories, as follows.

Voter privacy
Visually impaired voters couldn't independently verify the vote selections displayed on the DRE screen. Consequently, they'd likely seek assistance from a poll worker, who might see vote selections (displayed on the DRE screen and in the paper-record display unit), thus violating visually impaired voters' privacy.

Record privacy
We found several violations of the state's paper and electronic record privacy requirements (Figure 1, subcategory 1.2). First, regarding the creation and store requirements, the type 4 machine recorded the electronic record when voters approved their ballots on the DRE screen, rather than after they approved the paper record.

Second, regarding the linkage between paper and electronic records, the machines did use a unique identifier to link the two records. However, in machine types 3 and 4, the reconciliation process was time consuming and difficult given a large vote volume. With the type 2 machine, it would likely be impossible to reconcile the two records if one or more paper records were lost.
Table 2. Compliance with the state's criteria (Type 1 / Type 2 / Type 3 / Type 4).

6.1.3 (VIII.D) Documentation (generating a ballot-image record log): not tested, all types
6.2.1 (IV.C.1.a) Instructions for election worker training: ✓ / ✓ / ✓ / ✓
6.2.2 (IV.C.1.b, IV.C.1.d) Instructions for voters: not tested, all types
6.2.3 (V.J, V.K, V.L, VIII.H) Documentation (VVPRS malfunction recovery): ✓ / ✓ / ✓ / ✓
6.2.4 (VI.C.1, VI.C.2, VI.G) Certification: ✗ / ✗ / ✗ / ✓
6.3.1 (V.D.2) Cryptographic certification: not tested, all types
6.3.2 (V.J, V.K, VIII.G) Documentation (DRE malfunction recovery): ✓ / ✓ / ✓ / ✓
7.1 (VI.B, VI.C, VII.A, VIII.A) Hardware and software examination: tested VVPRS-related hardware only, all types
7.2 (VI.E) Source code examination: not tested, all types

A check mark (✓) indicates no problems during testing; a cross mark (✗) indicates problems related to the criteria requirements. Criteria that weren't tested are marked "not tested."
Third, regarding error detection and notification, when we disconnected the type 1 machine's printer cable, the VVPRS didn't send a signal to the official. The voter could continue voting and cast the vote, but the machine failed to print a paper record. With the type 3 machine, a VVPRS mechanical error or malfunction didn't prompt any error message or warning signal; it simply froze the system. The type 4 machine's VVPRS couldn't detect a paper jam; the voter could cast votes, but the printer kept printing over the same area on the paper, making it illegible. Moreover, if the machine's printer cable was disconnected after the voter pressed the "cast vote" button, the machine recorded the electronic record but didn't print a barcode on the paper record.
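The missing notification needn't be elaborate. This hypothetical watchdog sketch (the printer-status methods are invented for illustration; real devices would expose something comparable) shows the kind of polling that would have caught both the disconnected cable and the paper jam:

```python
# Hypothetical error-notification watchdog: poll the VVPRS printer while
# a voting session is open and alert the election official rather than
# letting voting continue silently.
class PrinterWatchdog:
    def __init__(self, printer, alert):
        self.printer = printer  # assumed to expose is_connected() and has_jam()
        self.alert = alert      # callback that signals the official's console

    def check(self) -> bool:
        """Return True if voting may proceed; alert and block otherwise."""
        if not self.printer.is_connected():
            self.alert("VVPRS printer disconnected; suspend voting")
            return False
        if self.printer.has_jam():
            self.alert("VVPRS paper jam; paper record may be illegible")
            return False
        return True
```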
Vote record security mechanisms
All four machines violated vote record security requirements (Figure 1, subcategory 2.3) in relation to digital signatures. The type 1 machine generated electronic records' digital signatures based on the vendor's proprietary scheme, rather than on the required one-to-one scheme (that is, one digital signature for each electronic record). The type 2 and 3 machines also failed to generate individual digital signatures for each electronic record. Thus, all three machines calculated digital signatures for the entire set of electronic records using only the electronic records, not their corresponding digital signatures. The type 4 machine didn't generate a digital signature for individual electronic records or for the entire set of electronic records.
Paper-record verification
We found three violations of the requirements in this category (Figure 1, subcategory 3.1). First, after voters on type 2 and 3 machines rejected their first two paper records, the system wouldn't let them adequately verify their third paper record (although it printed, it displayed for only a few seconds before spooling). The type 4 machine printed only one paper record per voter; voters could review and verify subsequent ballots on the DRE screen, but not on the paper record. Once they cast their ballots, the machine printed the paper record, but it rapidly advanced to the spool.

Second, the type 4 machine let voters review and modify each vote selection one by one an unlimited number of times. It also immediately printed each modification (that is, each selection, deselection, or change) line by line. However, it didn't print undervotes in the line-by-line printing, and thus voters couldn't verify undervotes on the paper record before casting.
software. An independent testing agency must be able to evaluate this proprietary software to ensure the proper protection of electronic records. The agency should also rigorously evaluate all DRE system source code to identify and reduce code errors (bugs and vulnerable code) as well as any malicious code (such as backdoors).

We hope our work will instill public confidence in using DREs with VVPRSs. Although our testing applied only to voting systems subject to the State of New Jersey's approval, we believe that other states can apply and tailor our analysis, methodologies, testing scenarios, and solutions for their different requirements and needs.
References
10. Sequoia AVC Advantage Voter-verified Paper Record Assessment, report to NJ Attorney General, NJ Institute of Technology, 18 July 2007; www.nj.gov/oag/elections/Hearing-Reports-7.07/NJIT-Advantage-report-7.07.pdf.
11. Sequoia AVC Edge Voter-verified Paper Record Assessment, report to NJ Attorney General, NJ Institute of Technology, 18 July 2007; www.nj.gov/oag/elections/Hearing-Reports-7.07/NJIT-Edge-report-7.07.pdf.
12. Election Systems & Software iVotronic w/RTAL Voter-verified Paper Record Assessment, report to NJ Attorney General, NJ Institute of Technology, 26 Sept. 2007; www.

Nirwan Ansari is a professor in the Department of Electrical and Computer Engineering at the New Jersey Institute of Technology. His research interests include various aspects of broadband networks and multimedia communications. Ansari has a PhD in electrical engineering from Purdue University. Contact him at ansari@njit.edu.