Review: Tech Refactored Ep. 6 - Student Privacy in the Era of Zoom School

Wed, 02/17/2021

This post is a summary of Episode 6 of The Nebraska Governance & Technology Center’s Podcast Series, Tech Refactored. Hosts Gus Hurwitz, Menard Director of the Nebraska Governance and Technology Center, and Elana Zeide, Assistant Professor of Law at the University of Nebraska-Lincoln, were joined by Anisha Reddy, Policy Counsel with the Future of Privacy Forum; Jim Siegl, Senior Technologist, also with the Future of Privacy Forum; Bill Fitzgerald, Privacy Researcher for Consumer Reports; and Chris Gilliard, Professor at Macomb Community College and Visiting Fellow at the Harvard Kennedy School Shorenstein Center on Media, Politics and Public Policy.

The global pandemic has transformed our relationship with technology in ways that most of us couldn’t have imagined. For parents, students, and educators in particular, among the most critical issues these changes have raised is protecting student privacy in the era of remote learning, or ‘Zoom School.’ And as our panelists discuss, when viewed through the lenses of race, class, and data-mining, anxieties about technologies that intrude into students’ homes couldn’t be more urgent. Hosts Gus Hurwitz and Elana Zeide were joined by Anisha Reddy, Jim Siegl, Bill Fitzgerald, and Chris Gilliard to discuss the extent to which the pandemic has highlighted existing imbalances between tech vendors and school districts, misuses of student data by developers that have already come to light, the ramifications of law enforcement gaining expanded access to student data, remote proctoring tools, and what concerned stakeholders can do to advocate for students.

As school districts around the country rushed to adopt new technologies to enable widespread remote learning in the wake of Covid-19, most were unprepared and under-resourced to fully grapple with the ways in which they were entrusting technology companies with enhanced access to student data, often with little more than a promise that those companies would manage that data responsibly. As Reddy notes, even prior to the pandemic, “schools didn’t have enough resources to fully understand or stand up methods of vetting technology and making sure that technology is appropriate and appropriately used in the classroom.” But as the scope of the learning space has moved into the home, the current practice of putting the onus on schools to detect what vendors are doing with student data is, in Fitzgerald’s words, “an unfair continuation of an imbalance of power that ultimately impacts students most of all.”

There is evidence that some companies have already begun leveraging their access to student data in ways that are harmful to students, and which school districts are ill-equipped to prevent, or even monitor. Many contracts between districts and vendors contain clauses that allow for the use of student data for “product improvement,” which, according to Fitzgerald, some companies have interpreted as a broad license to use student facial recognition data in their ongoing product development. In one case in particular, this resulted in student data actually being hard-coded into the vendor’s software, a fact that didn’t come to light until two students independently made the discovery and reported it to their schools. “The fact that students are doing the work that adults have failed to do for over five years is absurd,” Fitzgerald notes.

The misuse of student information can become particularly disturbing when law enforcement becomes involved. Of particular concern to the panel was recent investigative reporting in the Tampa Bay Times that found that the Pasco County Sheriff’s Department was using student “data and personal information” provided by the school district and combining it with its own internal records to create a list of students believed to be at higher risk of criminality. “It’s Minority Report, basically,” says Reddy. For Siegl, the Florida investigation highlighted a larger dynamic playing out between school districts and third parties with access to school-furnished student data. “[The schools] have the accountability and the responsibility, but do they have the ability to monitor everything that a third party, in this case a sheriff’s office, is doing?”

Remote proctoring tools, software that uses facial recognition or monitors eye movements to detect whether a student is cheating, have also raised concerns. Gilliard in particular makes two points: first, that flaws in proctoring software have been shown to render it both ableist and racist, and second, that there is very little in the way of peer-reviewed evidence to back up developers’ assertions that the software can detect cheating as claimed. Fitzgerald sees this as emblematic of problems with edtech that has been adopted in the wake of the pandemic. “I think there needs to be some attention in terms of looking at what gets adopted, given the fact that these things really have zero evidence base behind them.”

Given all these concerns, is there anything that a student, parent, or other concerned stakeholder can do to help address these problems at the local level? “I am not an optimist,” says Gilliard, “but one of the things I’ve been really encouraged by is the amount of student pushback on these products. And we’ve seen in a lot of cases that it is the students and parents pushing back against these things that actually moves the needle.” And the panel urges those concerned not to go it alone; finding allies is important. “Often an individual voice is easy to ignore or marginalize,” says Fitzgerald, “but multiple voices are much more difficult.”

Tags: Tech Refactored Review