
Introduction

As the announcements for 2004 candidacy become more frequent, it may be time to revisit the challenges of usability within the voting process. It is estimated that 4 to 6 million votes were "lost" in the controversial 2000 Presidential election (What Is; What Could Be – Fast Facts, CalTech/MIT Voting Technology Report, July 2001). Of those, an estimated 1.5 to 3 million votes were lost to registration mix-ups, 1.5 to 2 million votes were lost to faulty polling equipment and confusing ballot design, and up to one million votes were lost to polling station policy problems. An unknown number of absentee ballots were lost or mishandled.

Deconstructing the voting process

All of these losses can be directly linked to usability problems:

  • Registration mix-ups

    Polling workers had difficulty mapping voter IDs to voter status, or data collected or entered incorrectly at the voter registration stage caused confusion at the voter check-in stage.
  • Faulty polling equipment

    Poorly constructed or worded instructions resulted in problems such as improper machine setup and errors in tallying or reporting votes.
  • Confusing ballot design

    Ballots are often inappropriately designed because individuals without design or usability training are tasked with this important job. County voting officials – albeit with good intentions – prioritized typographic aesthetics or efficient use of white space over basic cognitive principles such as gestalt grouping tendencies. Remember the infamous Butterfly Ballot?

    Add to these unfortunate design decisions the social pressure to vote independently and quickly. (Some states have laws about how quickly you must complete your vote.) As such, voters – particularly older voters – may feel self-conscious about asking for help.
  • Polling station policies

    Individuals attempting to exercise their right to vote were prevented from doing so because poorly constructed or ambiguously written standards of practice resulted in too much variation in identification requirements.

While much ink has been spilled arguing the need for usability testing of specific voting apparatus, there is also a need to reassess the whole voting system: to evaluate and improve the various task processes, the human-voting machine interaction, and both human-machine and human-human error recovery strategies.

Leading edge technology: one step forward, two steps back...

The most public problems in the 2000 voting process occurred with the interpretation of the punch card ballots in Florida. Many have recommended that the outdated balloting systems be replaced with more up-to-date Direct Recording Electronic devices (DREs). DREs are electronic voting machines that look like tablet PCs. To the technologically savvy they appear more usable because they support more direct manipulation, such as touch screens. The flexibility of on-screen presentation allows for significant improvements in voting accessibility: typographic elements such as font size can be modified on a by-need, by-voter basis. DREs may also reduce the cognitive load of the voting process by segmenting election decisions into a series of discrete, sequential decisions (or screens) rather than the current practice of presenting all of the election propositions on a single, somewhat overwhelming ballot. Roth (1998) observed that DRE voting was more systematic, following the traditional Western reading order from left to right and top to bottom, compared to a random pattern on the mechanical lever ballot.

The adoption of new voting technology appears to be driven by cost and perceived advancement, so DREs are rapidly replacing mechanical and optical voting apparatus: four times as many voters faced electronic voting machines in 2000 as in 1980. The American public is only now really embracing electronic commerce. Are we ready for electronic voting?

Promise versus reality and the need for usability testing

The promise of DREs is great. However, the usability pressures on newer technology are also great: the detailed design of the actual voting screens will have a significant effect on how accurately these new devices capture the voters' intentions. Proactive usability analysis and testing become even more important as counties and precincts move to adopt the more advanced technologies. To date, the few published studies that address ease and accuracy of voting are not encouraging.

In the wake of the 2000 election, Information Design Journal reprinted Roth's (1998) now-seminal voting interaction studies. She reports two observational studies with participant voters of different ages, using a touch screen versus a lever-mechanism voting machine (Study 1) and various combinations of punch cards, paper ballots and touch screens (Study 2). The studies did not collect voter intention; instead, voting success was equated with submitting a complete ballot (that is, voting on all possible candidates/issues). Several factors influenced voting success:

  • Height of display,
  • Ballot illumination,
  • Task progress cues,
  • Information organization and grouping,
  • Type size,
  • Use of "Plain English",
  • Time pressure (increased pressure undermined the intention-action match).

Reads a lot like a basic usability factors checklist, doesn't it?

Who did YOU (mean to) vote for?

More recently, researchers at the University of Maryland evaluated the DRE that is to be adopted by a number of counties in Maryland (Bederson, 2003). In that report, Bederson and colleagues present findings of both an expert review and a (limited) field study evaluating the usability of the voting apparatus. Their converging evaluations uncovered several usability problems that significantly undermine the likelihood of voters casting a complete and intended ballot. They identify issues such as general system failure, voter-card interaction failure, layout, affordances, text size, task flow, error recovery, visual design hierarchy and monitor glare (a particular problem for aging voters – one of the nation's most active voting constituencies).

Voters' confidence in the interaction is also an issue: exit polls reflect that voters are not universally confident that the machine records the vote they intended or attempted to cast. In exit interviews in Bederson's study, ten percent of participants stated that they did not feel confident that the ballot they had cast reflected their intended vote. In the state of Maryland, that would translate to 171,706 actual voters who were not confident about the accuracy of their cast ballot (Maryland State Board of Elections). Fully 8% of participants reported that they only somewhat trusted, or did not trust, the voting system. Based on their expert review of the technology and the field studies, Bederson's research team suggested that polling places should:

"...provide the user with a printed record of the votes electronically recorded. Before leaving the polling place, the voter would be required to certify the contents of the paper record and place it into a ballot box..."
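The 171,706 figure above can be sanity-checked with simple arithmetic. The turnout value below is inferred from the article's own numbers rather than quoted directly from the Board of Elections report:

```python
# Back-of-envelope check of the Maryland figure cited above.
# Assumption: the 171,706 figure is 10% of the turnout reported for the
# 2002 gubernatorial general election by the Maryland State Board of
# Elections, implying a turnout of roughly 1,717,060 voters.
turnout = 1_717_060          # inferred, not taken directly from the report
not_confident_rate = 0.10    # 10% of Bederson's exit-interview participants
print(round(turnout * not_confident_rate))  # -> 171706
```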

The University of Georgia's Carl Vinson Institute of Government quarterly Peach State Poll asked Georgia voters how confident they felt about the electronic voting systems now in place in Georgia. Press reporting of the survey focuses on the increase in voters' confidence in the machines: in 2001, 56% of voters were Very Confident or Somewhat Confident that their vote was accurately counted; by December 2002, 93% of voters reported that confidence level. While that change may seem encouraging, the reverse of that statistic – that 7% of voters were Not Very Confident or Not at All Confident that their vote was accurately counted – should also give readers pause: roughly 88,400 active voters across the state of Georgia did not feel confident about the accuracy of their vote.
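The same kind of arithmetic works in reverse for the Georgia estimate. Assuming the ~88,400 figure represents 7% of the state's active voters, it implies an active-voter base of roughly 1.26 million:

```python
# Reverse-engineering the Georgia estimate cited above.
# Assumption: ~88,400 voters correspond to the 7% who were Not Very
# Confident or Not at All Confident.
not_confident_voters = 88_400
not_confident_rate = 0.07
active_voters = not_confident_voters / not_confident_rate
print(round(active_voters))  # -> 1262857 (implied active voters)
```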

Looking closer, voter confidence varies widely along racial lines. While 79% of Caucasian voters were Very Confident that their vote had been recorded properly by the new electronic voting system, only 40% of African-American voters felt that way. This disparity in voter confidence may reflect a variety of factors (e.g., confidence in the interface interaction, the impact of the digital divide, etc.). Critical to the human factors community, this disparity along racial lines highlights the importance of recruiting test participants who truly reflect and represent the end-user population in order to get accurate and meaningful test results.

In 1960, JFK beat Nixon by less than one vote per precinct...

Studies assessing voting efficacy, defined as casting a complete ballot that reflects the intended choices, are critical to establishing the usability of the various voting tools. Is it truly efficacious to embrace the emerging technology? Work by the CalTech/MIT Voting Project suggests that migrating to electronic balloting may be somewhat premature.

Their work reports longitudinal trends in residual voting errors. Residual voting errors occur when a ballot fails to register a vote for any reason: failure to indicate a selection, indicating multiple selections or, in the case of paper ballots, stray marks that mean the ballot cannot be processed. They report residual voting error statistics for a range of voting technologies, including paper ballots, lever machines, punch cards, bubble (optically scanned) ballots and DREs. In their analysis, the residual voting rate of punch card methods and electronic devices is 50% higher than the rate for manually counted paper ballots, lever machines and optically scanned ballots.
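As a concrete illustration of that definition, the residual vote rate is simply the share of ballots cast that fail to register a countable vote. The counts below are illustrative, not drawn from the CalTech/MIT data:

```python
# Residual vote rate = ballots registering no countable vote (blank,
# overvoted, or otherwise spoiled) divided by total ballots cast.
# Illustrative counts only - not figures from the CalTech/MIT report.
ballots_cast = 10_000
blank = 120        # no selection indicated
overvoted = 60     # multiple selections indicated
spoiled = 40       # stray marks etc. (paper ballots)
residual_rate = (blank + overvoted + spoiled) / ballots_cast
print(f"{residual_rate:.1%}")  # -> 2.2%
```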

Of particular interest is their additional analysis of the incidence of residual voting errors for counties that have recently switched voting technologies. Here the researchers are provided a unique naturalistic observation opportunity to compare residual error rates of the newly adopted technology with those of the replaced technology. This allowed them to determine if updating increased the likelihood of successful voting.

Comparing the new technology to the residual voting error baseline of lever voting machines (the most prevalent equipment), they derive the change scores shown below. In this table, positive numbers indicate that the newer technology results in a higher residual error incidence than lever machines.

Projected Change in Residual Voting Error by Machine Type
(percentage points, relative to lever machines)

  Paper ballot             -0.55
  Punch card (Votomatic)   +1.32
  Punch card (DataVote)    +1.24
  Bubble ballot            +0.11
  Electronic (DRE)         +0.90

Note that the prediction algorithm takes into account the fact that DREs protect voters from over-voting: unlike paper-and-pencil ballots, DREs do not register over-votes. Based on the findings of the CalTech/MIT Voting Project report, only a switch to traditional paper-and-pencil ballots reduced the residual voting error. In contrast, counties that switch from levers to DREs can expect a significant increase in residual voting error: approximately 1% of all ballots cast will be unusable.
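The change scores in the table can be applied to a county's current lever-machine performance to project its post-switch residual error. The 2.3% lever-machine baseline below is a hypothetical county figure, not one from the report:

```python
# Projecting post-switch residual voting error from the table's change
# scores (percentage points relative to lever machines).
baseline_lever = 2.3  # percent; hypothetical county baseline
change_vs_levers = {
    "paper ballot": -0.55,
    "punch card (Votomatic)": 1.32,
    "punch card (DataVote)": 1.24,
    "bubble ballot": 0.11,
    "electronic (DRE)": 0.90,
}
for tech, delta in change_vs_levers.items():
    print(f"{tech}: {baseline_lever + delta:.2f}%")
# e.g. switching to DREs projects 3.20% residual error vs. 2.30% on levers
```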

Their projection that DRE adoption increases residual voting error is validated by actual county return data: Precincts adopting DRE technology reported an increase in their residual voting error. Precincts that kept their lever technology or adopted bubble (optical) ballots reduced their residual voting error.

The authors suggest several possible causes for this increase in uncountable ballots. They include the technology learning curve, the voter's learning curve, inadequate administrative attention to care and maintenance of voting apparatus and the intimidation factor for less technologically sophisticated voters.

So what now?

All in all, these findings suggest that even if America actually IS ready for an electronic polling place, the technology to support that migration clearly is not. Basic interface and task-flow usability problems that directly undermine voting efficacy surface in both the hardware and interface design of the publicly evaluated DREs. Many of these reflect screen-design challenges unique to DREs.

As the task of voting is made simpler through technology, it is important that the county officials responsible for selecting and implementing voting tools become familiar with usability issues and evaluation techniques. Cognitive walkthroughs, heuristic reviews and – most critically – usability testing with representative users should be employed to evaluate and iterate on the designs before the next election.

Lawmakers take action on voting issue

On May 22, 2003, Mr. Holt submitted the Voter Confidence and Increased Accessibility Act of 2003 (H.R. 2239) to the 108th session of the House of Representatives. Among other things, this bill stipulates in Section B that:

  1. The voting system shall produce a permanent paper record, each individual paper record of which shall be made available for inspection and verification by the voter at the time the vote is cast, and preserved within the polling place in the manner in which all other paper ballots are preserved within the polling place on Election Day for later use in any manual audit.
  2. The voting system shall provide the voter with an opportunity to correct any error made by the system before the permanent record is preserved for use in any manual audit.

and then later...

All software and hardware used in any electronic voting system shall be certified by laboratories accredited by the Commission as meeting the requirements of clauses (i) and (ii).

A post hoc feedback/error-checking method is not quite the solution we, as usability professionals, would recommend. However, this legislation does indicate that the stakeholders recognize there is a problem...which is typically the first step toward the right track.


References

Bederson, B., Herrnson, P., and Niemi, R. (2003). Electronic Voting System Usability Issues. ACM Computer Human Interaction Conference (CHI 2003), Ft. Lauderdale, Florida, April 5-10.

Roth, S. (1998). Disenfranchised by Design. Information Design Journal, 9(1). Reprinted electronically after the 2000 election.

CalTech/MIT Voting Technology Project Report: Residual Votes Attributable to Technology (Version 2), March 30, 2001.

Maryland State Board of Elections, 2002 Gubernatorial General Election - Voter Turnout - Statewide.

Georgians Express Confidence in New Electronic Voting System, Carl Vinson Institute of Government, University of Georgia.

Credit for Voting Report, Georgia Secretary of State Elections Office.
