July | August 2017

States Using Open Data to Make Elections Run More Smoothly

By Kamanzi G. Kalisa, CSG Director, Overseas Voting Initiative
While it has always been possible to share information openly, the potential of open data that can be easily shared and analyzed has developed slowly in the United States.
At the turn of the 19th century, the Lewis and Clark expedition to the newly acquired western portion of the United States was an early example of information gathering and sharing. By the 1970s, the scale of data being collected and the systems to distribute that data by public and private institutions began to increase rapidly. Today, the global economy increasingly operates in an open data world, with constant streams of information tracking human behavior—from where people are shopping to what TV shows they are watching.
The private sector has been actively engaged in the open data space, producing large amounts of real-time data about its operations.
Election industry policymakers and administrators are catching up.
Ahead of the 2016 election cycle, Caltech and MIT’s collaborative Voting Technology Project began offering open data tools to help states improve the election process for election administrators and voters. The project’s Election Management Toolkit helps state and local election administrators calculate how many poll workers they will need, manage polling place lines, and optimize voting equipment, online voter registration and mapping tools. The toolkit also offers analytics that sift through massive amounts of real-time and historical information to track patterns, identify problems and improve efficiency.
Cliff Tatum, executive director of the Washington, D.C., Board of Elections, said the open data tools available through the Election Management Toolkit have helped improve the efficiency and administration of elections in the District of Columbia.
“The toolkit’s calculator optimization tools use queueing theory to calculate the minimal number of service stations and voting machines necessary to process voters throughout the entire voting process,” he said. “This information was unavailable a generation ago and in many ways makes the planning process for my team much easier.”
The optimization tools require only a few inputs: state, machine type and the number of registered voters in the precinct. The tool then estimates potential voter wait times based on projected turnout, the average time to vote a ballot, the election equipment in use and the average time required to check in each voter.
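For readers curious about the math, the sketch below shows the kind of queueing calculation such a calculator might perform. It assumes a simple M/M/c queueing model (the Erlang C formula) and hypothetical precinct figures; it is not the Voting Technology Project’s actual code or methodology.

import math

def erlang_c_wait(arrival_rate, service_time, stations):
    """Average wait in line (minutes) for an M/M/c queue.

    arrival_rate -- voters arriving per minute
    service_time -- average minutes to check in (or vote) one voter
    stations     -- number of check-in stations or voting machines
    """
    load = arrival_rate * service_time      # offered load in Erlangs
    if stations <= load:                    # too few stations: the line grows without bound
        return float("inf")
    rho = load / stations                   # utilization of each station
    # Erlang C: probability that an arriving voter has to wait
    waiting_term = load**stations / math.factorial(stations) / (1 - rho)
    p_wait = waiting_term / (
        sum(load**k / math.factorial(k) for k in range(stations)) + waiting_term
    )
    # Mean wait in the queue, in minutes
    return p_wait * service_time / (stations * (1 - rho))

def min_stations(arrival_rate, service_time, max_wait):
    """Smallest number of stations that keeps the average wait under max_wait minutes."""
    stations = 1
    while erlang_c_wait(arrival_rate, service_time, stations) > max_wait:
        stations += 1
    return stations

# Hypothetical precinct: 2,400 voters spread evenly over a 12-hour day,
# 1.5 minutes per check-in, and a 30-minute average-wait target.
voters_per_minute = 2400 / (12 * 60)
print(min_stations(voters_per_minute, service_time=1.5, max_wait=30))  # -> 6

In practice, a calculator like the toolkit’s would layer the state- and equipment-specific inputs described above—turnout projections, ballot length, machine type—on top of a model along these lines.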
Using these types of data tools can make Election Day run much more smoothly, experts say.
“Election officials can use these tools to ensure that they minimize the likelihood of problems throughout the voting process—from voter registration and early voting through Election Day,” said Thad Hall, co-author of the book “Evaluating Elections: A Handbook of Methods and Standards” and a collaborator on the Voting Technology Project.
“Using data to identify potential problem spots is not difficult and is time well-spent,” said Hall.