It is not without a sense of schadenfreude that I see the surge of interest in continuity planning at many of our colleges and universities: pleasure that its value is now so painfully recognized, sadness in what it took to arrive there. Collectively, we are undergoing an unprecedented continuity drill. Our campuses are giant real-time tabletop exercises with predictable outcomes: in some areas we flounder, and in others we sustain. In this way, we are microcosms of society writ large.
And in a few instances, we are, in fact, heroic. As at no other time in recent memory, the “higher” in higher education has proven apt. A dear friend who’s a CIO said to me in the early days of helping her campus respond to the coronavirus crisis, “I’ve never felt more noble about what we’re accomplishing.”
But for anyone who works in or even pays attention to issues of privacy, the current pandemic has pushed us to teeter on an uncomfortable edge. All of us are wrestling with the challenge of effective and timely contact tracing as part of our efforts to reopen campuses. Contact tracing, along with the loosely coupled monitoring of social distancing, embodies our worst privacy nightmares.
Mr. Schine: Let me ask you a question. Will you give the committee at this time the names of some Communist Party members whom you know?
Mr. Hughes: I do not know anyone to be a member of the Communist Party, sir. I have never seen anyone’s party card.
As we contemplate creating an edifice that will allow our every location, every human interaction, every “transmission moment” to be identified, we are rendering moot questions such as that asked by G. David Schine during the testimony of poet Langston Hughes during a U.S. Senate hearing in 1953. A robust contact tracing technology — most likely leveraging the location information from our phones — eliminates any need to ask someone with whom they have associated. We have the data.
As my own institution, the University of California, San Diego, dives deeply into testing and contact tracing, I find myself proud to be even peripherally involved. Our efforts truly revolve around the goal of ensuring the health and well-being of our community, and my respect has only grown for the academics leading our testing and tracing effort and the audacity of our program. If privacy within a higher education context is foremost a question of the appropriate use of personal data, I have trouble thinking of a more completely appropriate use than personal and public safety.
Despite that, our institutions’ race to embrace mass testing and contact tracing has me deeply uncomfortable. It is worth exploring that discomfort. I am not concerned that our community will abuse the contact tracing information: the challenges to privacy within higher education strike me as fundamentally distinct from those we typically read about. Unlike the commercial sphere, within our institutions, an individual’s data isn’t raw currency. It’s neither bartered nor sold; it’s exclusively used for the enhancement of pedagogy and student success. If your university is selling student or alumni data for profit, well, shame on you.
Yet contact tracing, as a regional if not national challenge, will necessarily go far beyond our institutional control. Colleges and universities are small cities nested within larger governmental structures, and as such, our data will spread as expanding ripples in a pond, from a local point to national shorelines. The practical requirements of effective contact tracing will assuredly draw our data out and, in the process, erode its confidentiality protections.
This is not the cynicism of a fringe libertarian but reasonable skepticism born of a persistent pattern of abuse — from the handling of phone metadata after Sept. 11 to the 2017 attempt to collect data for political machinations. Yet as a child of the Apollo era, I remain starry-eyed about the nobility and greatness of our social and scientific mission. Most of the deeper problems we face require the kind of resources only a nation can deliver. Protecting the personal information that enables contact tracing is surely one of those problems.
So the hard question remains: How are we to permit the use of data for a noble purpose when the mechanisms that enable that purpose are so easily abused? Is this a technological problem? Is it a legal or regulatory problem? Or has the ship already sailed: Has the practice of abusing our personal information been entirely reified with the triumph of capitalism over democracy?
Perhaps another slightly different example would be helpful — or increase our collective discomfort. We can reframe this question to place it within the context of another long-standing pandemic. More than one in six women in the United States will be the victim of an attempted or completed sexual assault in her lifetime, according to some estimates. Yet as many as 200,000 rape kits remain unprocessed across the country. Perhaps we should propose that every man in the United States undergo a DNA test and the results be entered into CODIS, the national DNA database used by the FBI. Or we could perform a DNA analysis of every COVID-19 test and use that. Surely, mass processing of rape kits and mass DNA testing would bring justice to hundreds of thousands, if not millions, of victims of sexual assault.
These examples liberate us to engage in a host of nuanced questions. Is your privacy violated if your DNA results end up in CODIS despite the lack of a match to any extant rape kit? Alternatively, what are the ethical boundaries for the use of DNA information where no accusation of a crime exists? Fundamentally, privacy is merely a proxy for ethics when exploring these types of issues.
Can we quantify privacy itself? We are used to the disclosure of our general location through area codes when making a phone call. How fine does the resolution of our geographical location need to be to count as a privacy violation? What if your GPS coordinates were also displayed when calling someone?
In our first scenario, digital contact tracing requires a resolution of mere feet. In the second scenario, would you be comfortable losing the confidentiality of your health data if 100 rapists were caught? What about 100,000? Can we even imagine the calculus of a privacy loss against the experience of millions of women for whom justice is denied?
Clearly, “trust” could be the sword that cuts this Gordian knot. I want my COVID-19 results to be used to keep me safe and to help ensure those around me, loved ones or total strangers, are also safe. Do I trust that my data won’t be used for other purposes? Do I trust that a false positive won’t mark me forever as a rapist? Do I trust that my every contact won’t be provided to extrajudicial political committees? Will this information be used to deprive me, or those I know, of any fundamental rights?
Trust is particularly challenging at the moment due to our political climate. One result of the current global pandemic has been to codify the political principles of governmental mistrust. Some people are resigned to seeing citizens die rather than having our governmental leaders succeed in their most basic function: the protection of the health and safety of the body politic.
As a technologist, I want to turn to technical solutions. One can imagine a highly redundant system for managing cryptographic tokens that authorize both data decryption and access. Such an infrastructural approach, pursued as a national initiative to build a data protection infrastructure, might closely resemble nascent efforts to provide digital “wallets” for managing tokens of achievement and training, such as transcripts. Using modern encryption techniques, we can create an ecosystem that allows individuals to control who can access their information, and when.
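To make the idea concrete, here is a minimal sketch of that token model using only the Python standard library. Everything here is an illustrative assumption — the class name, the grant format, and the API are invented for this example, not taken from any actual wallet initiative, and a real system would use asymmetric keys and hardened infrastructure rather than a single in-memory HMAC key. The point is only the shape of the control: the individual holds the key, issues short-lived grants bound to a specific audience, and no record is released without a grant that verifies.

```python
import base64
import hashlib
import hmac
import json
import secrets
import time


class PersonalDataWallet:
    """Hypothetical sketch: an individual holds a secret key and issues
    time-limited, audience-bound, HMAC-signed grants; the custodian
    releases a record only against a grant that verifies."""

    def __init__(self):
        self._key = secrets.token_bytes(32)  # known only to the individual
        self._records = {}                   # record_id -> personal data

    def store(self, record_id: str, data: str) -> None:
        self._records[record_id] = data

    def issue_grant(self, record_id: str, audience: str, ttl_seconds: int) -> str:
        """Create a signed, short-lived authorization to read one record."""
        claims = {"record": record_id, "aud": audience,
                  "exp": time.time() + ttl_seconds}
        payload = json.dumps(claims, sort_keys=True).encode()
        sig = hmac.new(self._key, payload, hashlib.sha256).digest()
        # payload and signature are base64url-encoded, joined by "."
        return (base64.urlsafe_b64encode(payload).decode() + "." +
                base64.urlsafe_b64encode(sig).decode())

    def redeem(self, grant: str, audience: str) -> str:
        """Release a record only if the grant is authentic, unexpired,
        and presented by the audience it was issued to."""
        p64, s64 = grant.split(".")
        payload = base64.urlsafe_b64decode(p64)
        sig = base64.urlsafe_b64decode(s64)
        expected = hmac.new(self._key, payload, hashlib.sha256).digest()
        if not hmac.compare_digest(sig, expected):
            raise PermissionError("invalid signature")
        claims = json.loads(payload)
        if claims["aud"] != audience:
            raise PermissionError("grant issued to a different audience")
        if time.time() > claims["exp"]:
            raise PermissionError("grant expired")
        return self._records[claims["record"]]


# A student could, for instance, grant campus health one hour of access
# to a test result; anyone else presenting the same grant is refused.
wallet = PersonalDataWallet()
wallet.store("covid-test-2020-06-01", "negative")
grant = wallet.issue_grant("covid-test-2020-06-01", "campus-health",
                           ttl_seconds=3600)
result = wallet.redeem(grant, "campus-health")  # released: "negative"
```

The design choice worth noticing is that access flows from the individual outward: a contact tracer, an employer, or a committee gets nothing by default, and every disclosure is scoped to one record, one recipient, and one window of time.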
Of course, this is not a problem mere technology can solve; in fact, the technology is often not the difficult part. Any solution will face both operational and regulatory challenges. Yet if nothing else, the klaxon we are hearing should mobilize higher education to call for a large-scale consortial system for the curation and management of personal information, one that puts individuals in direct and immediate control of their own data. By joining our academic and technological prowess to a consortial approach to problem solving, higher education may be distinctly positioned to tackle this challenge. I would encourage presidents and chancellors, as well as academic thought leaders, to recognize that addressing privacy beyond higher education can be part of our mission: a chance for the academy to place its stamp on the world and enter the battle for civil rights.
Right now, academe plays essentially no role in the public discourse on the nature, description or limits of privacy. For us it remains a purely academic exercise. Instead we watch governments negotiate half-measure regulations and laughable fines with the great data brokers who treat our data as currency. It’s time to recognize that for the sake of those we love, and for a future that is for once not dystopian, we need to get in the game.