In a quiet, residential neighborhood in Seattle, situated within the red-brick walls of a Catholic church, is a small, long-running pre-K-8 school called St. Therese Catholic Academy.
St. Therese is not unlike a lot of other schools in the U.S. It has a basketball team, a science lab and a dedicated time for recess. And, increasingly, as one tragic school event follows another in the news, it’s experimenting with a new approach to safety. The spate of school shootings has rattled the most idyllic communities and spurred school leaders to take extra measures—and spend extra money—to deter similar tragedies at home.
It’s not that St. Therese had no security system in place at all. Up until last fall, a small camera at the entrance fed into a tiny, square-inch screen that the office staff checked to determine whether to buzz someone into the school building, says Matt DeBoer, the principal.
But after last year’s school shootings in Parkland, Fla., and Santa Fe, Texas, St. Therese overhauled its system, affixing several high-resolution cameras to the red-brick exterior of the church and equipping them with facial recognition software. “So we went from nothing, to state-of-the-art,” DeBoer says.
What, if any, difference that makes from a safety standpoint is contested and unclear. But St. Therese is not the only school beefing up its lines of defense. The education sector spent an estimated $2.7 billion on security equipment and services in 2017, a number that’s expected to grow year over year, according to IHS Markit, a market research firm.
Despite questions over its effectiveness at deterring school violence, the new high-tech tool fulfills its purpose in at least one important area of school operations: making staff feel safer.
DeBoer himself acknowledges that “there’s an emotive reality and a data reality” when it comes to school safety. The latter is that violent deaths in schools have stayed relatively constant over the last 30 years, according to data from the National Center for Education Statistics. But then there’s the emotive reality, which is that every time another event like Sandy Hook or Parkland occurs, many educators and students feel they are in peril when they go to school. The technology, at least, offers a crucial sense of relief to staff, for whom “the feeling” of safety is “real,” DeBoer says.
Take the school’s office manager, who’s been with St. Therese for over 30 years and is the first person visitors see when they enter the school. With the facial recognition software in place to ensure that no unauthorized adults come into the building, “she says she’s never felt safer or more comfortable in her job,” DeBoer adds. “It’s that peace of mind. Something is greater than nothing.”
A Safer Solution?
St. Therese is a relatively sleepy school, serving just over 150 students and 30 staff. So while DeBoer says he, like every other school administrator in the country, thinks often about the safety and security of his students and staff, that wasn’t the only reason he became interested in facial recognition technology. “We were open to innovative ideas, doing things unconventionally,” he says. He wanted to be among the first to try out this new software.
By spring 2018, DeBoer was hearing a lot about RealNetworks, a Seattle-based software company that was popular in the 1990s for its audio and video streaming services but has since expanded to offer other tools, including SAFR (Secure, Accurate Facial Recognition), its AI-supported facial recognition software. Over the summer, when St. Therese was already in talks with RealNetworks, the company announced it would make SAFR free to all K-12 schools.
While the software was free, DeBoer still had to upgrade the school’s infrastructure to support SAFR. After installing new security cameras, purchasing a few Apple devices and upgrading the school’s Wi-Fi, St. Therese was looking at a $24,000 technology tab.
When the private school officially began using SAFR at the start of the 2018-2019 school year in September, it was indeed one of the first K-12 schools to try it. St. Therese outfitted two of its five entryways with the facial recognition cameras and registered all staff—but no students—in the system, a simple process that involves scanning a face and typing a name into a computer. The software is programmed to allow authorized users into the building with a smile. “You go up, give it a neutral look so it recognizes who you are, then smile and [the door] opens,” DeBoer says. Other visitors require individual approval to be granted access.
The technology was an instant hit among staff, DeBoer says. “‘Cool’ is the adjective I hear most.” But it goes beyond the sleek sci-fi allure. “We don’t want to think about safety and security, but we kind of have to,” he says. “This technology has taken that worry and thinking away from classroom teachers to free them up to teach.”
About a dozen other schools and districts have gotten SAFR up and running, and hundreds more have expressed interest in it, says Mike Vance, senior director of product management at RealNetworks.
But this technology cannot guarantee that schools will become any safer or more secure, privacy experts tell EdSurge. Facial recognition is most effective in very specific use cases, like a high school that opens its doors to the public for community events and needs to keep certain “known threats,” like a parent with a restraining order or a registered sex offender, out of the building. SAFR’s technology doesn’t reach its full potential when used simply to identify who comes and goes from every building, says Sara Collins, policy counsel at the Future of Privacy Forum (FPF), a Washington, D.C.-based think tank.
“Facial recognition isn’t a panacea. It is just a tool,” says Collins, who focuses on education privacy issues. “If you have a problem this tool is useful for solving, it may be a good investment. But if it’s just this nebulous, we’re-making-our-school-safer idea, you won’t get much utility out of it.”
For one, Collins says, facial recognition databases require a lot of maintenance to stay accurate. Students come and go, as do school staff. There are plenty of reasons some names would have to be added or removed. “You have to have someone whose job, or at least part of it, is dedicated to maintaining this,” she explains, because having an out-of-date database can be as unhelpful as having none at all.
At a school as small and tight-knit as St. Therese, maintaining a current database may not be a problem. For now, only the 30 staff at the school are registered in SAFR, DeBoer says, with an option for parents to join coming in the near future. Those will be easy enough to keep track of. But DeBoer is considering adding outside visitors and volunteers to the database—“regulars” at St. Therese, he calls them. This includes the lunch caterer, the milk delivery person and the mail carrier. “Being able to know those people by name makes us stronger and increases that sense of safety,” DeBoer says. “There’s power in calling people by their name.”
However well-intentioned, that’s a “really bad idea,” says Brenda Leong, senior counsel and director of strategy at FPF, who specializes in AI and biometrics, including facial recognition. The mail carrier could be fired at any time, she points out. Would the school be notified? What if that person were fired for child abuse and still had unrestricted access to the school?
“That’s part of the problem of implementing a system like this,” Leong says.
‘SAFR,’ Not ‘SAFE’
Another part of the problem with tools like SAFR, she adds, is that they provide a false sense of security.
With a high-tech security system in place, people tend to let their guard down, as DeBoer noted about his office manager. But that can actually work against schools looking to bolster their security.
“If the technology is not maintained and run well, if you’re overly reliant on it, you’ll make things worse,” Leong says.
Vance, the executive at RealNetworks, says the company is blunt and transparent with its customers about the limitations of facial recognition technology. “We don’t call it SAFE, we call it SAFR,” he says. “It’s not a guarantee something isn’t going to happen. We don’t read minds or guess behavior.”
What RealNetworks promises schools is software that can track who is entering and leaving a building and can alert the right people when someone who shouldn’t be there attempts to get in. Even then, the responsibility is on the school staff, who must register the appropriate people and maintain the database.
“We know this is not a preventative solution to the tragedies that have been occurring,” DeBoer acknowledges. “What it does do is increase our awareness and make it easier for us to think about our response.”
So far, to Vance’s knowledge, schools have left their students out of the system. A few schools have mentioned to him that it’d be a convenient tool for taking attendance, but on the whole, he says, they recognize the benefits aren’t worth the costs.
Still, regardless of whether students are directly entered into the system, they’re part of the equation when schools weigh whether to adopt facial recognition technology. In Leong’s view, schools shouldn’t be considering this option at all—at least not now.
“The very best facial recognition systems that exist today are excellent,” she says. “There are some really good, really powerful, really accurate systems used in places where high threats exist—in the military, for example, or at border crossings. Like any technology, there are all levels of quality out there, including extremely high quality; the ones schools are most likely to buy are probably not those,” at least for the time being.
Leong, a parent of school-aged children herself, says she understands the desperation and urgency that parents, teachers, administrators and other school staff feel in addressing safety issues. But those fears don’t justify putting undue stock in one product or system.
“All of us are ultimately terrified of the worst case scenario. We want to prevent that because it’s viscerally terrifying,” she says. “But that’s not a problem facial recognition technology is going to prevent, and it’s not going to prevent a lot of other safety and security problems either.”