Is the rush to bring personalized software into schools putting student data at risk?
Researchers from the National Education Policy Center (NEPC) say yes. Earlier this week, they released an extensive report asking lawmakers to put up legal barriers to protect student information.
“Technologies are being harnessed to amplify corporate marketing and profit-making, extending the reach of commercializing activities into every aspect of students’ school lives,” reads the report, authored by Faith Boninger, Alex Molnar, and Kevin Murray of the University of Colorado Boulder.
The document points out that the heavy emphasis on technology in the classroom not only makes students susceptible to brand marketing from technology companies but also socializes them to accept online surveillance as normal. Noting the risks specific to personalization technology, which relies on algorithms and logic that educators and policymakers have a difficult time tracking, the group calls for "commercialization in schools" to be monitored through six policy recommendations:
- Prohibit schools from collecting student personal data unless rigorous, easily understood safeguards for the appropriate use, protection, and final disposition of those data are in place.
- Hold schools, districts, and companies with access to student data accountable for violations of student privacy.
- Require the algorithms powering educational software to be openly available for examination by educators and researchers.
- Prohibit adoption of educational software applications that rely on algorithms unless a disinterested third party has examined the algorithms for bias and error and valid data have shown that the algorithms produce the intended results.
- Require independent third-party assessments of the validity and utility of technologies, and the potential threats they pose to students’ well-being, to be conducted and addressed prior to adoption.
- In addition, parents, teachers, and administrators, as individuals and through their organizations, should work to publicize both the threats that unregulated educational technologies pose to children and the importance of allowing access to the algorithms powering educational software.
NEPC is not the first group to show concern over the collection of student data in schools. Parent-advocate groups such as the Parent Coalition for Student Privacy have been vocal opponents of what they describe as the unfettered collection of student data. However, with the movement to personalize classrooms growing faster than the political will to govern anything in schools, this call for regulating personalized learning environments appears to face an uphill battle.