A study by Northeastern University researchers suggests that schools are safer than they were in the 1990s. But headlines about cyberbullying and school shootings have communities understandably on edge. One in three parents surveyed in a 2018 PDK Poll said they feared for their children's safety at school, nearly three times the percentage in 2013.
With students' online and physical safety a top priority, schools are leaning on a variety of digital services. Many use web filtering tools that blacklist unsavory websites and monitor what students write and search for online. Others track behavior and disciplinary issues. And a few have started adding facial recognition technology to security cameras.
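For readers curious how the first of those approaches works under the hood, here is a toy sketch of a blocklist-style filter in Python. The domains, student ID and log format are all invented for illustration; commercial filters maintain far larger, professionally curated lists and sit inside network proxies rather than simple scripts.

```python
# Toy sketch of a blocklist-style web filter (illustrative only).
# The domains and log format are hypothetical examples.
from urllib.parse import urlparse

BLOCKED_DOMAINS = {"example-gambling.test", "example-adult.test"}

def is_blocked(url: str) -> bool:
    """Return True if the URL's host is, or sits under, a blocked domain."""
    host = urlparse(url).hostname or ""
    return any(host == d or host.endswith("." + d) for d in BLOCKED_DOMAINS)

def handle_request(student_id: str, url: str) -> str:
    """Block listed sites and log the attempt for later adult review."""
    if is_blocked(url):
        print(f"FLAG student={student_id} url={url}")  # stand-in for a review queue
        return "blocked"
    return "allowed"

if __name__ == "__main__":
    print(handle_request("s1024", "https://example-gambling.test/bets"))
    print(handle_request("s1024", "https://en.wikipedia.org/wiki/Privacy"))
```

Note that even this toy version does two distinct things: it blocks, and it records. The second function is where the privacy questions in this story begin.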
Yet as schools adopt technologies that can collect and monitor data in unprecedented ways, questions have emerged over how far surveillance should go, and whether educators, parents and students are aware of the implications and possible consequences. Specifically, how should schools balance what's needed to ensure students' safety against what's needed to protect their privacy?
That was the guiding question for a panel discussion at SXSW EDU, where privacy advocates joined a school administrator and a school safety software product manager to offer their perspectives.
“For me, it’s been difficult,” says Stephanie Cerda, an assistant principal at Austin Independent School District. “We have the responsibility on a daily basis to protect all of the kids who walk in our building from actual violence to online harm. We have to protect them from themselves, but there’s also the issue of protecting their data. We’re caught in the middle.”
Navigating that fine line between security and privacy is especially tricky when it comes to newer surveillance technologies available to schools. Last year, RealNetworks, a Seattle-based company, offered its facial recognition software to schools, and a few have begun piloting the tool.
The increasing availability of these kinds of tools raises concerns and questions for Doug Levin, founder of EdTech Strategies. For many of these providers, he notes, facial recognition is often processed in the cloud, via a service licensed from a third party that may have other use cases, like law enforcement, in mind. “Once those faces are uploaded, I have lots of questions about how long they’re stored, how they’re used over time and the terms of services around the ownership of that data.”
This is still an emerging technology in dire need of refinement, he adds. And there are troubling reports that it can generate false positives and negatives. Facial-recognition police tools have been decried as “staggeringly inaccurate.”
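Some back-of-the-envelope arithmetic shows why accuracy matters so much at school scale. The numbers below are hypothetical, chosen only for illustration, but the pattern holds: even a small false-positive rate, applied to every person entering a building every school day, produces a steady stream of false alarms.

```python
# Hypothetical numbers to show how false positives scale (not vendor data).
daily_scans = 1500            # students and staff entering a school each day
false_positive_rate = 0.01    # assume 1% of scans wrongly match a watchlist face

false_alarms_per_day = daily_scans * false_positive_rate
false_alarms_per_year = false_alarms_per_day * 180  # roughly 180 school days

print(f"Expected false alarms per day:  {false_alarms_per_day:.0f}")   # 15
print(f"Expected false alarms per year: {false_alarms_per_year:.0f}")  # 2700
```

At those assumed rates, a single campus would field thousands of mistaken matches a year, each one a student or staff member wrongly singled out.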
Improving these systems requires data, so it comes as no surprise to Bill Fitzgerald, a researcher at New Knowledge, that some companies are offering such technology gratis. “When a company is offering a free facial recognition service, what they’re really doing is harvesting free training data so they can improve their models,” he says. It harkens back to the old mantra: if the product you’re using is free, then you’re probably the product.
Filters Are Not Failsafe
When it comes to digital and online safety, schools often rely on web and social media monitoring tools. Over the years, Impero Software, a U.K.-based provider of such services, has developed a library of 20,000 keywords that can be flagged for adults to review. What’s important, says Courtney Goodsell, a product owner there, is giving educators a chance to understand the context behind a student's choice of words before making a decision.
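A minimal sketch of that flag-plus-context pattern might look like the following. The watch terms, context window and output format are invented for illustration and are not drawn from Impero's actual keyword library.

```python
# Toy sketch of keyword flagging that preserves surrounding context
# for a human reviewer (terms and window size are hypothetical).
import re

WATCH_TERMS = {"shoot", "hurt myself"}  # stand-ins for a real keyword library

def flag_with_context(text: str, window: int = 40) -> list:
    """Return each keyword hit along with nearby text, for adult review."""
    hits = []
    for term in WATCH_TERMS:
        for m in re.finditer(re.escape(term), text, re.IGNORECASE):
            start = max(0, m.start() - window)
            end = min(len(text), m.end() + window)
            hits.append({"term": term, "context": text[start:end]})
    return hits

sample = "We're going to shoot hoops after school if anyone wants to come."
for hit in flag_with_context(sample):
    print(hit)  # a reviewer sees 'shoot' here is about basketball, not a threat
```

The surrounding text is the whole point: “shoot” flagged in isolation looks alarming, while “shoot hoops after school” plainly is not. That judgment still has to be made by a person.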
“It’s a responsibility for anyone providing tools in education to offer proper support and training to help educators use tools not just effectively and efficiently, but with privacy in mind,” she says.
Over-relying on these tools to flag behavior risks “teaching bad operational security” to students, says Fitzgerald, “because you’re teaching them that it’s actually okay if someone is looking through what they’re doing day to day.”
School web filters can also impact low-income families inequitably, he adds, especially those that use school-issued devices at home. “If you have money within your family to get your own device, you can access an unfiltered internet. If you don’t, well, there you are.”
Social-Emotional Learning: The New Surveillance?
While most school security software takes a preventive approach (by watching for red flags), Levin sees the emergence of a new class of tools that attempt to proactively shape positive behavior.
These include disciplinary and behavior-management tools, which are commonplace. (Think digital demerits and gold stars.) Others pitch themselves as social-emotional learning tools that record students’ feelings through their communications and self-reflections. In doing so, Levin suggests, these tools essentially create student profiles that can be monitored and used to shape future behavior.
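To make the concern concrete, here is a hypothetical sketch of the kind of record such a tool could accumulate over time. Every field name below is invented for illustration rather than taken from any specific product.

```python
# Hypothetical shape of a student profile such a tool might build up;
# all field names are invented for illustration.
from dataclasses import dataclass, field

@dataclass
class MoodEntry:
    date: str
    self_reported_mood: str      # e.g. from a daily check-in widget
    flagged_phrases: list

@dataclass
class StudentProfile:
    student_id: str
    demerits: int = 0
    gold_stars: int = 0
    mood_history: list = field(default_factory=list)

# A few entries per day, kept over years, becomes a longitudinal
# behavioral dossier rather than a snapshot.
profile = StudentProfile(student_id="s1024")
profile.mood_history.append(MoodEntry("2019-03-05", "anxious", []))
```

Seen this way, a “gold star” system and a surveillance system differ less in structure than in how long the data is kept and who gets to read it.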
Using data to profile students—even in attempts to reinforce positive behaviors—has Cerda concerned, especially in schools serving diverse demographics.
If there are assumptions about how students ought to behave, she asks, “are those culturally responsive?”
“Think about the cultures that our students are coming from,” she urges. “Are we pushing all kids to act in a certain way?”
More Safety, Less Scary
As in the insurance industry, much of the impetus (and many of the sales pitches) in the school and online safety market is driven by fear. But voicing such concerns and red flags can also steer stakeholders toward dialogue and collaboration. The hope is that if and when these tools are implemented, they're used appropriately, not naively.
“The vast majority of people in this space are doing it for the right reasons. The intentions are good, and ultimately those reasons are doing what’s right by kids,” says Fitzgerald. Unfortunately, he adds, “a lot of these tools are less about what’s better for kids, and more about what’s easier for adults.”
Tools like Impero’s are by no means a silver bullet, says Goodsell. Rather, they present an opportunity to revisit safety policies, shape curriculum around digital citizenship and safety, and weave good online practices into everyday instruction. “There are good intentions behind these tools,” she notes. But they should also create opportunities “to promote a culture of safety, [and] to have that one counselor or teacher that students can go to.”
Cerda offers this advice for companies developing these tools: “I just ask that you keep equity and social justice in mind, especially for diverse campuses like mine.”