When the world was forced to stay home during the COVID-19 pandemic, remote learning became the new normal for millions of students. Educational technology tools like Google Classroom, Nearpod, and Kahoot! allowed schools to transition the learning environment completely online for more than an entire year.
While this $85 billion market provided a convenient way for teachers and administrators to keep track of students’ progress, it also opened the door for data collection by other companies.
A group of researchers from the University of Chicago and New York University studied online learning and shared their findings in a paper that explored how educational technologies get into schools and what privacy risks these technologies pose to students. The paper, which will be presented at the upcoming ACM CHI Conference on Human Factors in Computing Systems, reveals that many of these technologies went unvetted before being used with students, potentially creating critical data security risks.
Advised by faculty members Marshini Chetty (UChicago) and Danny Yuxing Huang (NYU), UChicago second-year PhD student Jake Chanenson co-led the work with NYU’s Brandon Sloane along with a team of undergraduate students and a high schooler.
“EdTech use isn’t going away,” Chetty said. “There might have been an increased usage of certain technologies during COVID, but just because the pandemic is over doesn’t mean these technologies aren’t persisting. That means privacy issues are going to remain as well.”
The team conducted the research in two ways. First, they interviewed 18 stakeholders to determine what types of EdTech were being used and how those stakeholders managed student privacy and security around this technology. Then, they examined more than 15,000 public schools' websites to see which EdTech sites were linked there and to analyze the privacy risks associated with those education-focused sites.
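The article describes the website analysis only at a high level. Purely as an illustration of that kind of link audit (this is not the team's actual pipeline, and the domain watchlist below is a made-up example), the core step of checking which EdTech sites a school page links to might look like this:

```python
# Illustrative sketch only -- NOT the study's actual crawler or methodology.
# Given a page's HTML, report which known EdTech domains it links to.
import re
from urllib.parse import urlparse

# Hypothetical watchlist of EdTech domains (an assumption for this example)
EDTECH_DOMAINS = {"classroom.google.com", "nearpod.com", "kahoot.com"}

def edtech_links(html: str) -> set[str]:
    """Return the watchlisted EdTech domains referenced by links in the HTML."""
    hrefs = re.findall(r'href=["\'](https?://[^"\']+)', html)
    found = set()
    for url in hrefs:
        host = urlparse(url).netloc.lower()
        # Match the domain itself or any subdomain of it
        for dom in EDTECH_DOMAINS:
            if host == dom or host.endswith("." + dom):
                found.add(dom)
    return found

# Usage with inline HTML standing in for a fetched school homepage:
sample = ('<a href="https://nearpod.com/login">Nearpod</a> '
          '<a href="https://example.org">PTA page</a>')
print(edtech_links(sample))  # -> {'nearpod.com'}
```

A real audit at this scale would also need to fetch pages politely (robots.txt, rate limits) and handle embedded scripts and trackers, not just anchor tags, but the link-matching idea is the same.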
“We essentially looked at how educational technologies get into classrooms and in front of students,” stated Chanenson. “We’re starting to see more invasive EdTech products that use learning analytics to help tailor an educational experience to each student, but there hasn’t been much research into the privacy implications of these tools. They may be great pedagogically, but maybe not so great from a privacy perspective.”
Between 2016 and 2020, the Government Accountability Office found that there were 99 student data breaches affecting hundreds of school districts. In January 2022, a data breach at EdTech company Illuminate Education compromised highly sensitive information, such as free-lunch and special-education status, for 820,000 current and former students in New York City's public schools.
For a minor, this type of security risk could have longer-lasting negative effects that aren't as easy to identify. When an adult falls victim to a data breach, the consequences are usually immediate and direct, ranging from exposed bank accounts to damaged credit scores.
However, someone who is underage might not understand or be aware of how their personal information could be used against them. A future employer could gain access to a student's academic record, classroom behaviors, and disability status, potentially creating a hiring bias based on confidential information. If hackers used a student's Social Security number to take out a line of credit, the fraud could go undetected for years.
What Chanenson and the team brought to light was that not only did schools lack resources to handle privacy and security incidents, but staff also had low data privacy awareness. Teachers play a major role in deciding what technology to bring into their classroom and usually rely on peer recommendation or vendor reputation. Typically, IT personnel would be responsible for vetting the EdTech tools, but neither IT nor the teachers have enough time or training to properly do so. As a result, they tend to assess the pedagogical benefits instead of the potential risks.
“You should have somebody whose job it is to decide what [educational technology tools] are appropriate,” stated Chanenson. “All of the schools we talked to are genuinely interested in keeping their students safe. But if you are bringing unvetted technology into the classroom without awareness that there might be potential privacy and security issues, then you don’t know what to look out for when trying to mitigate the issue.”
Another part of the problem was non-uniform or non-existent protocols across the school districts. Some policies required teachers to notify parents or contact law enforcement if data security or privacy incidents concerning student data occurred. Others needed a “wake-up call” before putting a plan in place.
The solutions to these problems can be multifaceted and complex, but Chanenson believes it comes down to legislation, awareness, and accountability.
“Privacy laws in the United States are a very tricky issue because we have a sectoral privacy system. So, our health privacy laws are distinct from our education privacy laws and also from our major general consumer privacy laws. Because they don’t overlap, it means that your health data has robust protections, but your education data does not.”
“In our study, we found that the school districts we looked at in Connecticut and Illinois, two states with specific student privacy laws, had better privacy practices because they were held to a higher standard. Unfortunately, there just isn’t enough awareness about these issues for there to be sufficient legislation on the floor to protect students. From the EdTech side, we need to call on these companies to do better. They should deliver a good product that is held to a higher standard because these are students, and they deserve privacy as they try to learn, grow, and make mistakes within an educational environment.”
Chanenson is continuing his work by studying what types of data are commonly being extracted by EdTech tools and how often that data is being collected. He is working with an expanded team to create a browser extension for schools that will make tracking this information easier. To learn more about the team’s research, visit the Amyoli Internet Research Lab page.
—This story originally appeared on the Department of Computer Science site.