During the first two months of 2020, there were at least 26 incidents of gunfire on school grounds in the United States, part of a devastating nationwide epidemic that shows no signs of letting up. While legislators, parents, and students themselves are tirelessly fighting to keep kids safe at school, their tactics vary in efficacy and enthusiasm: Bulletproof shields, clear backpacks, and even proposals to arm kindergarten teachers haven’t proven to actually lessen gun violence. Nor do they get to the root of the issue: gun control. So the makers of facial recognition software, which can identify a person and track an individual’s movement in real time, say they are attempting to keep students safe without interfering in their daily lives. And many schools have jumped at the opportunity to use the new technology.
A preliminary search by MTV News shows that at least 16 schools across the country have implemented or are considering implementing facial recognition cameras on their campuses. But problems with the tech keep popping up: Research has shown it has adverse effects on students of color by misidentifying them. It could also perpetuate the school-to-prison pipeline, a trend that, with the increase of on-campus policing, sees students funneled into the criminal justice system. The technology isn’t federally regulated, and we don’t know the full scope of how many schools are using it.
But here is what we do know: Students and their allies, many of whom worry that their personal data could be stolen, are fighting back.
WHAT HAPPENED IN LOCKPORT
Earlier this year, the controversial technology was put to use in a high school in Lockport, New York, a town of about 20,000 people located just south of the U.S.-Canada border. That decision by the school district sparked intense community backlash and national headlines.
In an attempt to deter crime and particularly mass shootings in schools, Lockport City High School implemented facial recognition cameras that collect and store facial data on January 2. Multiple students told MTV News they didn’t know the system was going to be turned on until nearly a month after it was already active.
This specific technology, a suite called AEGIS created by SN Technologies, a company based out of Ontario, Canada, that specifically advertises to schools, can scan the face of every person who walks into a camera’s view. Theoretically, if that person isn’t supposed to be on campus — for example, if the system identifies someone listed as a registered sex offender — an alert will be sent to school administrators. AEGIS also has object detection software that will send an alert if it recognizes a gun that is out in the open, and can then track that person through the school even if they discard the weapon. The tech also has a “forensic search engine” that allows users to search its entire history for specific people.
The district is spending most of its $4 million “Smart School” grant on these and other enhancements to its security systems, according to the Niagara Gazette.
“We can’t let our guard down,” Lockport Superintendent Michelle T. Bradley told the Buffalo News after signing the contract with SN Technologies in 2018. “For the Board of Education and the Lockport City School District, this is the number one priority: school security.”
But the software’s object detection system could be one of AEGIS’s most useless features, according to Connor Hoffman, the 25-year-old journalist whose local coverage of the school’s system for the Lockport Union-Sun & Journal has fueled national conversation around the use of facial recognition in schools for months. “If you have a system that’s detecting a gun, how much more helpful is that than someone seeing a gun and screaming?” he said to MTV News. But to the district, that object detection, along with AEGIS’s other capabilities, made the system worth its multimillion-dollar price tag.
Hoffman first reported on the program while working the school beat at the Union-Sun & Journal. He was preparing for a meeting when he saw a peculiar note on the agenda: a standardization resolution looking at cutting energy costs and updating surveillance systems in the Lockport City High School (LCHS). So he began asking questions and learned that the district planned to implement facial recognition software inside the schools. His report quickly turned the story into a national issue. Before implementing the system, the district had to fight battles with the New York Civil Liberties Union, local press, and the state of New York.
But the students said they didn’t feel they had been adequately informed about the technology before the system was activated. “Something this big should have been properly told to us,” Mariana Schultz, a 17-year-old junior at LCHS, told MTV News. Discussion about the facial recognition technology had been rolling around her classroom for months, and appeared to stall when, in May 2019, the New York State Department of Education intervened in response to public backlash and told the Lockport City School District to delay its plans to activate the cameras. The school decided it wouldn’t collect any student data and would only use the cameras to detect guns and unwanted people in the schools. But advocates and students were still nervous about the system being turned on, especially given that the software’s forensic search engine can collect data retroactively, which SN Technologies confirmed to MTV News.
“The District will not [enter students] into the database … except upon notification to and consultation with the Chief Privacy Officer of the New York State Education Department, and in accordance with all applicable law,” the school’s facial recognition technology policy reads.
Students said they were not officially told about the school’s decision to turn the system on the day they returned from winter break this year. At least five of Schultz’s classmates said they found out the day they were interviewed by MTV News, over a month after the system was turned on. “We’re in the school every day,” Schultz said. “We should know that there are facial recognition cameras in our school.”
When asked for additional information, Lockport Superintendent Michelle Bradley pointed to Policy 5685 Operation and Use of Security Systems/Privacy Protections and said that “the system became operational on January 2, 2020, students are not included in the system, and no student data is generated or maintained in the system.”
The students in Lockport aren’t alone in their distrust of this technology: According to a survey from YouGov, a majority of young people aged 18 to 24 years old said they don’t want facial recognition technology to collect and store their image while on public school grounds, even if its intention is to deter crime.
And one glaring question remains for many students at LCHS: If the technology doesn’t include students or most alumni, how can this expensive technology actually stop shootings? The perpetrators of all of the major U.S. school shootings in the last five years have been current or former students, according to The Intercept — but students don’t want to be looked at suspiciously for the crimes of other people, either.
DOES FACIAL RECOGNITION TECHNOLOGY WORK?
We don’t actually know if this facial recognition technology would work to stop a school shooting. The difference between facial recognition technology and traditional surveillance systems, like cameras and security guards, is that this tech tracks people’s biometric data. But while it might be easier to spot a potential danger and call for help if a person is holding a weapon or if that person’s image was previously entered into the database, SN Technologies told MTV News that the system then requires human verification before alerts are distributed. Even machines still require human judgment to make the call.
And critics at the New York Civil Liberties Union say it isn’t apparent that this software actually works as advertised, and that it’s just one part of a police-state atmosphere that has taken over K-12 schools, despite there being little to no proof that any of these techniques actually keep students safe. That includes loading schools up with security guards or a heightened police presence, which can make students feel watched and even trigger PTSD for some.
The bottom line is that there’s no real guarantee that facial recognition of any kind will actually make students safer. The real change, advocates, including those at Everytown For Gun Safety, say, has to come at the hands of stricter gun control laws.
WHAT IS EVERYONE SO CONCERNED ABOUT?
The goal of this kind of technology is to deter crimes. But students fear that its ability to allow school administrators to search through recorded videos and scan for specific faces could lead people in power to use it for discipline, too.
“I think that these cameras can end up being an invasion of our privacy,” Schultz said. “Because the various times whenever a teacher has brought it up to my class, the way that they’ve described it has also been that they’re going to use the cameras not just for our protection, but as a way to discipline people.”
The LCHS students who spoke to MTV News are worried about who has access to that footage, and what might happen if that footage — of hundreds of underage students — gets hacked. SN Technologies told MTV News that it is confident its system cannot be hacked and that “every technical security precaution has been incorporated into our engineering.” Other critics worry that this biometric data could be shared with law enforcement or agencies like Immigration and Customs Enforcement (ICE).
Another fear, particularly for students of color, is that they will be falsely identified by the system. SN Technologies says that its “product is very accurate (as verified by third parties) in terms of facial recognition and firearm detection” and that its accuracy is “well beyond 99 percent regardless of race, age or sex.” Those studies are not public, “as they contain confidential information,” according to SN Technologies.
However, in December 2019, the National Institute of Standards and Technology released a study of multiple kinds of facial recognition software and found that false-positive rates were highest for people of color and lowest for people with light skin tones. A study from MIT and University of Toronto researchers found that Amazon’s facial recognition software mistook darker-skinned women for men 31 percent of the time; lighter-skinned women were misidentified just 7 percent of the time, and men of any skin tone had low to no misidentification.
Stefanie Coyle, the deputy director of the education policy center at the New York Civil Liberties Union, is worried that such misidentification may perpetuate the school-to-prison pipeline, which is the “flow of students from schools and into the criminal and legal system,” she said at a recent Lockport town hall. “A lot of times this happens when there are police officers stationed in the schools and things where people used to be sent to the principal’s office can then result in a trip to a police precinct.”
“This [facial recognition] system can amplify that,” Coyle said. “And what we wouldn’t want to happen is that a student is falsely identified by this system, an alarm is triggered, and then we have an interaction between our young people and police.”
IS FACIAL RECOGNITION TECHNOLOGY IN SCHOOLS NEW?
It’s unclear which school was the first to implement the technology, but this tech is nothing new. Facial recognition surveillance in schools started as early as 2003: According to Gov Tech, Royal Palms Middle School in Phoenix, Arizona, was one of the first schools in the country, if not the first, to use facial recognition to check the faces of people entering the school against a database of registered sex offenders.
The setup was uniquely prison-to-school: That year, Joe Arpaio, then the sheriff of Maricopa County, and his team tested Phoenix-based Hummingbird Defense Systems’ facial recognition technology to verify the identities of people booked into county jails. Arpaio’s office and the company then decided to try the technology out in schools in an effort to keep sex offenders off campus. (All of this was years before Arpaio was convicted of criminal contempt, and later pardoned by President Donald Trump, in connection with his racist habit of detaining people he suspected of being undocumented immigrants.)
Times have changed since then, and it seems the primary use of facial recognition technology has shifted from an effort to deter sex offenders from being on campus to curbing mass shootings. There is no proof that such technology has ever successfully stopped either kind of incident. SN Technologies told MTV News that “luckily” their technology has never stopped a school shooting, “although the system has been tested by police in school buildings and it works very well.”
HOW MANY SCHOOLS ARE USING THIS TECHNOLOGY?
It’s not clear how many schools are using this technology without the broader public’s knowledge, because there is no requirement to report such systems to a national database. A report from Wired in November 2019 found at least eight K-12 schools in the U.S. that have considered using, or are currently using, this technology. Another report by Recode found that “about two dozen” schools, including private, public, and collegiate institutions, are using facial recognition technology to count school attendance.
As of March 2018, the local high school and junior high school in Magnolia, Arkansas, were planning to install an expansive surveillance system including more than 200 cameras, infrared monitoring, and facial recognition technology accessible to law enforcement agencies, according to the American Civil Liberties Union. In Missouri, West Platte School District installed 95 facial recognition cameras into its buildings, which serve approximately 700 students, in April 2019.
The Sun Sentinel reported in January 2019 that the Broward County School District in Florida was planning to install a “$621,000 surveillance system that includes technology that can recognize people,” though the system is not technically facial recognition. That district includes Marjory Stoneman Douglas High School, where a former student killed 17 people in February 2018.
“How is this computer going to make a decision on what’s the right and wrong thing in a school with over 3,000 kids?” Kimberly Krawczyk, a teacher at Marjory Stoneman Douglas High School, who was at school during the shooting, told the Washington Post of the new technology. “We have cameras now every two feet, but you can’t get a machine to do everything a human can do. You can’t automate the school. What are we turning these schools into?”
It’s not clear that any of the systems that have previously been or are currently deployed at these schools have actually stopped a violent event.
WHO IS TRYING TO REGULATE IT?
There simply aren’t many laws that clearly apply to facial recognition. SN Technologies told MTV News that the New York State Department of Education views all video as school records, which are protected by state law. But the New York Civil Liberties Union argues that “the system will inevitably implicate student data” because the biometric data of students will exist inside the system.
Federal regulation governing facial recognition does not exist, and the U.S. Department of Education told Recode that it has not issued any guidance regarding the technology. What’s more, students can’t opt out of the system once they’re in it, which has prompted many states to consider banning facial recognition, or at least regulating it.
“Face recognition technology has a history of being far less accurate in identifying the faces of women, young people, and dark-skinned people,” an in-progress Massachusetts State Senate bill that seeks to regulate facial recognition technology statewide reads. “Such inaccuracies lead to harmful ‘false positive’ identifications.”
This tech is being used on people in schools, in summer camps, and in concert venues, whether they’re minors or consenting adults. As a result of public backlash, city-wide bans went into effect in Oakland, San Francisco, and Cambridge, and many states are looking to fall in line with bans or restrictions, including Indiana, New Jersey, South Carolina, Washington state, New Hampshire, Michigan, and Vermont. Florida, Illinois, and Texas have biometric information privacy laws that could require consent before using facial recognition.
If you’re a student or faculty member with information about facial recognition being explored by your school, send us a tip.