National Academies Wades Into The Future of Voting Technology Question

The esteemed National Academies of Sciences, Engineering, and Medicine recently announced the formation of the “Committee on the Future of Voting: Accessible, Reliable, Verifiable Technology.” This is an ad hoc committee, under the auspices of the Committee on Science, Technology, and Law and the Computer Science and Telecommunications Board. The committee is chartered to conduct a Study that will:

  1. Document the current state of play in terms of technology, standards, and resources for voting technologies;
  2. Examine challenges arising out of the 2016 federal election;
  3. Evaluate advances in technology currently (and soon to be) available that may improve voting; and
  4. Offer recommendations that provide a vision of voting that is easier, more accessible, more reliable, and more verifiable. 

The committee will issue a final report at the conclusion of the Study.

As you might imagine, we are very excited to see this, because every one of those four elements is a topic we've been deeply engaged in and is core to our nonprofit mission, including considerable volunteer work our executives performed on item 2 on behalf of the prior Administration and members of Congress.

A distinguished team was announced during the kick-off workshop on April 4th and 5th, and the roster is available on the Project Page.  Our executives know several of the members (eight of the eleven, to be precise) and have worked with many of them.  Of course, it would have been awesome to see my manager, our CTO John Sebes, selected for this Team, as he has been completely immersed in this subject matter for 10 years—roughly a third of his professional career in computer systems science and information security.

Kick-off Meeting Recap

The Committee held a kick-off meeting in Washington D.C., and I had the privilege of attending in my capacity as a Sr. Policy Analyst in the Office of the CTO here at the OSET Institute.  I took notes, which are available for download (see below), and recap the meeting here.

David Baltimore (California Institute of Technology) and David S. Tatel (U.S. Court of Appeals for the District of Columbia Circuit) opened the meeting and offered some details and initial remarks.

Among their points, they noted that this CSTL Study targets the intersection of technology and constitutional law. The goal of the Study is to publicize ideas for election technologies that can go forward.  The premise is that voting should be as easy for old folks as it is for young, new voters.  Good idea, eh? ;-)

The Study is supported from funding by the Hewlett Foundation and the Carnegie Corporation, and its mission is four-fold:

  1. Determine standards for election infrastructure
  2. Review challenges in 2016 election
  3. Survey election technology landscape
  4. Develop vision for reliable and verifiable voting

All very important aspects and the core of OSET's work.  Several presentations were made by a number of domain experts, including:

  • Thad Hall (Fors Marsh Group);
  • Brian Newby & Jessica Myers (U.S. EAC);
  • Geoffrey Hale (DHS);
  • Hon. Alex Padilla (Secretary of State, California);
  • Hon. Matt Masterson (Chair, U.S. EAC)

And those presentations are available for download at the Project Page.

There were robust question and answer sessions with each speaker, and I offer some highlights here, with a summary thought in conclusion.

Thad was asked a variety of questions about his presentation on the U.S. election process in general. He noted that in 2012 dollars, the replacement cost for election infrastructure (all equipment) for the entire nation (3,300 counties and over 10,000 jurisdictions) was somewhere between $1 billion and $2 billion.  (This calibrates with our finding of a $3 billion replacement cost in 2016 dollars; there is more detail in the recently released Wharton-OSET industry study, “The Business of Voting.”)  Thad also noted that one technology trend in particular is the emergence of “ballot on demand.” 

Co-Chair Lee Bollinger asked, “What if we doubled money for election infrastructure; what happens?”  Thad responded that we would expect to see better security and a better user experience, although he noted that necessary reforms extend beyond the platform to policies such as weekend voting, early voting, etc.

In the midst of a discussion about experiments and forward-thinking projects like Travis County’s STAR-Vote, Thad noted that the fear driving projects like this is of a “race to the bottom” if there is not a healthy, robust, competitive vendor environment.  “Today, for sake of shareholder value, vendors work to develop minimally viable technology at the highest possible margin.”  Thad believes that, to this end, usability standards for voting systems will be a key to avoiding this phenomenon.

Brian Newby and Jessica Myers of the EAC presented an overview of voting technologies today and the impact of their work on the Voluntary Voting Systems Guidelines, version 2.0.

Another part of this work will surely include an examination of the notion of “criticality” of election infrastructure, and whether and how it could or should be covered by the Federal definition of critical infrastructure as managed by DHS.  Geoffrey Hale was on hand to field several questions about this, which could consume a blog post of its own. OSET has been working on a White Paper about this very question, and should release it soon.

Some notes taken during the event are available for download here and here (Day 2).  For me, two key points that came out of the Q&A sessions were:

  1. The degree of cybersecurity preparedness and protection varies widely from state to state; and
  2. The most common vulnerability DHS saw in working with over 30 States in the lead-up to the ’16 election was unsupported legacy software and the need to update that software (and in many instances, entire systems).

A unanimous opinion at this early stage of the research project is that funding for the update and upgrade of this infrastructure is essential.  Whether that comes (preferably) at the State level or through some creative means at the Federal level (e.g., innovation research dollars), the funding to bring about innovation is the most pressing issue.

Summary Thoughts

This is clearly an important Study and project.  I presume that because it is the Academies undertaking this Study, a higher degree of credibility will be afforded the outcome than many other commentaries and works we've seen (e.g., more on the level of the PCEA Study from three years ago).  And assuming the work product of this Study matches or exceeds the caliber of other Academies work, which it clearly should given the strength of the research team roster, this incredibly important topic will continue to earn the level of national discourse it deserves.

Hopefully, the 10-years of research and development work the OSET Institute has invested in this arena will not be overlooked in this endeavor.  Hopefully, our several domain experts will be invited to offer perspective, insights, and content from the work of the Institute and the TrustTheVote Project.  And hopefully, our CTO’s domain expertise will be called upon, given that he has worked with the majority of this Team.

Similarly, we recently completed a year-long industry study in close collaboration with, and led by, the Wharton School and its Public Policy Initiative. Our COO and co-founder Gregory Miller, also a 10-year veteran of this unique arena of government I.T., was a key participant alongside a strong research team led by principal investigator Dr. Lorin Hitt of Wharton. That Report can be downloaded through links in this article covering that work.  Here too, this amounts to substantive content that OSET (and Wharton) can offer the Academies.

Our hope in general is that the Academies’ CSTL will leverage what our 501(c)(3) nonprofit election technology research institute can offer, to avoid reinventing the wheel.  Our contributions and content would help the CSTL advance their Study into the deeper nuances of this important subject matter regarding the operational continuity and preservation of our democracy.  I can attest that every single element and aspect of the scope of work as explained to us so far, and of the discussions over that day and a half, is subject matter the OSET Institute has been working on since 2007.

Our mission is to increase confidence in elections and their outcomes through the R&D of innovations in election technology that increase integrity, lower costs, improve usability, and ease participation.  Our Operating Principles (a manifesto of sorts for how we do this) sustain our work and seem to position us perfectly to help the Academies’ CSTL on their Future of Voting Study.  I hope that can be leveraged to the benefit of this important new effort.

From a couple of conversations, I have a sense they will. Now, fingers crossed.

-Patrick Reed
