Perspectives from the U.S. Election Assistance Commission Public Hearing in Memphis
On April 10, 2019, at the historic Peabody Hotel in Memphis, Tennessee, I had the honor of presenting public testimony on behalf of the OSET Institute at a Public Hearing of the United States Election Assistance Commission (EAC). The topic was the latest version (still pending) of federal voting system standards: the Voluntary Voting System Guidelines (VVSG), Version 2.0. (At present, the content of Version 2.0 consists only of principles and guidelines for the ultimate finished version.)
The promise of VVSG 2.0 represents a pivotal point in ensuring high-confidence elections in a rapidly changing global environment. It’s been nearly two decades since the Help America Vote Act (HAVA) was passed and the EAC was created, and since then the global technology environment has changed significantly. In the years since, the greater elections community has had ample opportunity to observe how the federal certification program works, and how it impacts the flow of election technology to our nation’s election officials and voters. Meetings like this Public Hearing are a critical part of assessing what’s working, what’s not, and what needs to change in the future.
Below are my observations about themes that emerged from the Public Hearing.
The elections community is working very hard and takes seriously the goal of bolstering the nation’s election infrastructure and ensuring high-confidence elections. If there’s one overarching takeaway that I have from this meeting, it’s that the broad community of election administration stakeholders cares deeply about the goals and outcomes that it’s their charge to protect. It was especially gratifying to see such a diverse group of people in attendance – members of the EAC Standards Board (state officials and county clerks), staff members dedicated to testing and certification, research organizations, and representatives of citizen groups. Although everyone who attended the Public Hearing can rightly be called an “elections geek” in some way, there’s no doubt that meetings like this are greater than the sum of their parts. It’s important and valuable to get many voices in the same room, talking to each other, and tackling the same issues -- each from a slightly different subjective vantage point.
Conscious efforts to be more collaborative are a valuable step forward. While the process of change is slow and difficult, many at the Public Hearing commented in some way about the importance of breaking down silos, and trying to get ahead of challenges by consulting with a broad spectrum of expert voices. Mary Brady, Program Manager of the National Institute of Standards and Technology (NIST) Voting Program, captured this point when she highlighted the broad-based dialogue that has generated the VVSG 2.0 Principles, and the still-in-progress Functional Requirements and Test Assertions for Voting System Test Laboratories (VSTLs). Rather than being devised in a vacuum, these outputs reflect years of collaboration between experts in usability, accessibility, security, election administration, and interoperability. And of course those efforts have been guided by real-world expertise from the EAC Standards Board and election administrators that must implement and live with the implications of VVSG-compliant voting systems, in live elections.
Even as they praised the EAC’s work, state and local officials continue to send the message that time is of the essence in completing the VVSG 2.0 program in its entirety. Based on comments from Ryan Macias, Acting Director of Testing and Certification, and Mary Brady, NIST Voting Program Manager, many in attendance are concerned that the VVSG 2.0 program may not provide the details required for manufacturers to begin developing compliant systems until late 2020. (On that timeline, manufacturers might not complete development and certification of 2.0-compliant systems until 2021 or even 2023.) Although the VVSG 2.0 Principles and Guidelines are currently out for public comment, there was broad understanding that the 5-page document, which contains only fifteen high-level principles and no Functional Requirements or VSTL Test Assertions, does not give voting system manufacturers enough detail to know what capabilities must be included in future voting systems. Mark Goins, Coordinator of Elections for host state Tennessee, expressed the opinion that “late 2020” for the availability of the completed requirements (pending completion and review by NIST) was a nonstarter.
Looking ahead, state and local elections officials expressed repeated concerns about being “stuck” in the middle of political quagmires that might impede the EAC’s effectiveness. It’s clear that the fits and starts, and related uncertainties, associated with the EAC repeatedly losing a quorum over the past decade have left election administrators nervous. More specifically, attendees repeatedly worried that a future lack of a quorum of commissioners might prevent the EAC from continuing to update and adopt new standards. For the same reason, more than one election official emphasized the importance of devising programmatic mechanisms that will allow requirements to continue to be developed, updated, and adopted even in the absence of a quorum of commissioners.
The elections community is depending on the voting system manufacturers. Many expectations about the future hinge upon what the elections community can expect from vendors. There was much discussion, for example, about how to get the vendor community to “move” toward building newer systems; others wondered why voting system manufacturers have not produced any systems compliant with the current latest standards (VVSG 1.1). In short, there was a palpable sense of dependency in the room. Unfortunately, those questions were prefaced more than once with the words “I can’t speak for the vendors…” because there were no representatives from the voting system manufacturers in attendance at the Public Hearing. I know that vendors care about our nation’s election infrastructure, and I know they are engaged with the EAC in many ways; but it would benefit them and the rest of the elections community to be in the room, hearing diverse voices from a broad range of stakeholders – as well as sharing vendor perspectives on the challenges they face in the federal certification program, and how it might be improved.
It appears that all parties, including the EAC and state and local officials, recognize that in the future the federal certification program must be more agile, given the pace of technology change and the rising global threat environment. I was most heartened by the unmistakable feeling that everyone in attendance recognized that we’re a long way from 2002, when HAVA passed and the EAC was created. Similarly, I heard a common theme that in order for election administrators to meet future challenges, the EAC must continue to evolve and critically re-assess the existing federal certification program. The greatest concern was ensuring that high-quality voting technology can be certified at a faster pace, with reasonable costs that are affordable for large and small jurisdictions alike. The recognition that today’s needs are different from the past is a good sign. Continuous improvement and laying the groundwork for future success cannot happen unless stakeholders identify the problems to be tackled and speak candidly about them.
The devil is in the details, a.k.a. Functional Requirements. The EAC has been listening to concerns like these, by taking steps toward a potentially more flexible VVSG structure. Specifically, the EAC has tried to offer some protections against inertia by making a conscious distinction in VVSG 2.0 between the high-level, relatively non-controversial “Principles and Guidelines,” which are less likely to change frequently, and the detailed “Functional Requirements” that will shape the design of election technology in the future. Attendees at the Public Hearing seemed to appreciate this attempt to move in a new direction. However, much uncertainty -- if not an outright lack of consensus -- remains on how to ensure that requirements can adapt quickly to new needs in the future.
Programmatically, this issue turns on the question of whether requirements constitute “policies” that require the approval of EAC Commissioners in order to be adopted, or not. EAC Chairwoman Christy McCormick was especially interested in this issue, and repeatedly asked different panelists,
Do you believe that functional requirements constitute policy, or not?
In the same vein, she also asked, if EAC Commissioners are not part of ongoing approvals for potential requirements changes, then what becomes of the EAC? Would it even be necessary?
Although some state and local election officials indicated to Chairwoman McCormick (in what appeared to be a conciliatory way) that they do believe requirements constitute “policy,” my personal impression is that their responses were intended mainly to affirm the importance of the EAC’s institutional role, rather than to address the more consequential and “legalistic” question of whether, in principle, functional requirements could change or evolve based on approval only from EAC testing and certification staff, without requiring approval from the EAC Commissioners.
I did not have the opportunity to answer the question, “Do you believe that functional requirements constitute policy or not?” because no Commissioner posed that question to me specifically. Had I been given the opportunity to answer, I would have provided the less conventional (and perhaps less popular) answer: No, I do not believe that functional requirements constitute policy statements. Based on my years of experience in the discipline of product management, the distinction between “Principles and Guidelines” (which I believe do constitute policy) and “Requirements” (which do not) is akin to the classic distinction in product management between what and how. In product management, product managers specify a desired outcome for technology (or in this case, what the technology must “be”), without specifying or constraining the manner by which that outcome shall be generated. And based on those product goals, engineers devise technical requirements that specify how to implement features that will deliver the desired outcome.
This analogy maps onto the challenge at hand: how to maintain flexibility in the federal certification program. VVSG 2.0 “Principles and Guidelines” constitute the “what”; they are policy statements about capabilities that voting systems must deliver (e.g., security, usability, auditability, interoperability, and so forth). Those principles and guidelines are precisely the sort of policy decisions that should require the approval of EAC Commissioners. However, as technology changes, the agility and adaptability of the federal certification program should not depend on the Commissioners’ availability to weigh in on the details of how policies are achieved. Suppose, for example, that EAC certification staff determined that a new International Organization for Standardization (ISO) specification for the format of date-time stamps was preferable for purposes of voting system audit logs; the decision of whether to adopt that specification as a new or updated functional requirement in the VVSG is merely a “how” for achieving the principle (or “policy”) of auditability. In other words, requirements are simply statements of how to “operationalize” the achievement of a policy outcome. My own past experience of leading a team that commercialized a voting system and achieved four EAC certifications of it leaves me no doubt that making the flow of technology dependent on EAC Commissioner approvals of potentially arcane functional specifications would be bad for the certification program, and hence bad for election officials; changes to requirements for such functional capabilities should be approvable by EAC technology experts and staff.
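To make the timestamp hypothetical concrete, here is a minimal sketch (not drawn from any actual VVSG text; the function names and log fields are my own invention for illustration). The “policy” is simply that every audit-log entry be timestamped; a “functional requirement” pins down the “how” -- for instance, an ISO 8601 date-time string in UTC:

```python
from datetime import datetime, timezone

# Hypothetical illustration only. The *principle* ("policy") says merely
# that every audit-log entry must carry a timestamp. The *functional
# requirement* specifies the "how": an ISO 8601 date-time string in UTC.

def make_audit_entry(event: str) -> dict:
    """Create an audit-log entry whose timestamp satisfies the
    hypothetical ISO 8601 / UTC functional requirement."""
    return {
        "event": event,
        "timestamp": datetime.now(timezone.utc).isoformat(timespec="seconds"),
    }

def satisfies_requirement(entry: dict) -> bool:
    """Check the 'how': the timestamp parses as ISO 8601 and carries
    a UTC (zero) offset."""
    try:
        ts = datetime.fromisoformat(entry["timestamp"])
    except (KeyError, ValueError):
        return False
    offset = ts.utcoffset()
    return offset is not None and offset.total_seconds() == 0

entry = make_audit_entry("ballot_cast")
print(satisfies_requirement(entry))  # True
```

If staff later preferred a different encoding, only `satisfies_requirement` (the “how”) would change; the policy that entries must be timestamped would not.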
I dwell on this point because I believe this understanding of Principles (i.e., policies, or the “what”) versus Requirements (the “how”) is vital to the future success of VVSG 2.0 in maintaining its relevance. In a rapidly changing global environment, adaptability and agility in the federal certification program will be paramount. The EAC’s decision to make a distinction between high-level “Principles and Guidelines” vs. “Functional Requirements” was a great first step in creating the conditions for such agility; but requiring Commissioner approval before any changes in requirements can be adopted would undercut that very same promise.
Finally, the OSET Institute’s recommendations were very much aligned with state and local officials’ emphasis on agility and adaptability in the federal certification program. It was gratifying to hear how the concerns voiced by election administrators throughout the day dovetailed with the Institute’s own perspective that VVSG 2.0 presents valuable opportunities to introduce more innovation, diversity, and agility into the federal certification process. My testimony focused on three topics:
Definition of “voting system.” Although past certification campaigns have been focused on “total” system configurations that include a comprehensive minimum set of end-to-end functions, there are alternative ways of thinking of a “voting system” in a manner that could still be consistent with HAVA’s definition, and that could allow more flexibility in the testing program.
Component-level certification. In conjunction with VVSG 2.0 requirements to support NIST Common Data Formats, the ability for manufacturers to develop, test and seek certification for individual portions of a voting system, rather than being required to submit only entire systems for certification, could introduce greater diversity and agility in the voting system marketplace.
The Cyber-threat landscape. Because the cyber-threat landscape is changing quickly, the VVSG 2.0 federal certification program must support relatively rapid changes to voting technology in the future, at a pace faster than the last two decades have seen.
In closing, I was honored and fortunate to be a part of this ongoing dialogue with a community of stakeholders all devoted to achieving the same goal: free and fair elections, where ballots are counted as cast, and where confidence is high in the outcomes; or, more succinctly: protecting our democracy.
My complete written testimony is available here.