If the Ancient Mariner were sailing on the internet’s open seas, he might conclude there’s information everywhere, but nary a drop to drink.
That’s how many college students feel, anyway. A new report published this week about undergraduates’ impressions of internet algorithms reveals students are skeptical of and unnerved by tools that track their digital travels and serve them personalized content like advertisements and social media posts.
And some students feel like they’ve largely been left to navigate the internet’s murky waters alone, without adequate guidance from teachers and professors.
The study was conducted by Project Information Literacy, a nonprofit research institute that explores how college students find, evaluate and use information. It was commissioned by the John S. and James L. Knight Foundation and the Harvard Graduate School of Education.
Researchers set out to learn “how aware students are about their information being manipulated, gathered and interacted with,” said Alison Head, founder and director of Project Information Literacy, in an interview with EdSurge. “Where does that awareness drop off?”
To find out, Head and fellow investigators Barbara Fister and Margy MacMillan conducted focus groups and interviews with 103 undergraduates and 37 faculty members from eight U.S. colleges.
They found that many students not only have personal concerns about how algorithms compromise their own data privacy but also recognize the broader, possibly negative implications of tools that segment and customize search results and news feeds.
“They felt very strongly and could fill in the blanks of how we see separate realities because of filtered information,” Head said. “A lot of the students were able to take issues of social justice and translate that into algorithmic justice—the idea of, how do these systems work and how do they impact people in society?”
However, students seemed less knowledgeable about the use of data collection and algorithms in education. When focus group discussions turned to the surveillance conducted by learning management systems, such as Canvas, some students were surprised and “indignant,” the report says, especially since students are rarely able to opt out of using such tools.
Some faculty participants, too, seemed unaware of the potential of digital education tools to serve as large-scale tracking devices, Head added. And although some professors reported teaching critical thinking skills through lectures and assignments, fewer than a third of the study participants could offer examples of how their courses address the questions raised by algorithmic platforms.
In light of this, the report argues that “the information practices students develop to manage college assignments, according to our research, do little to equip them for an information environment that increasingly relies on manipulating large data sets to select and shape what they see.”
A dearth of relevant classroom instruction may help explain why students reported relying largely on their peers, or their own judgment, to learn about how algorithms affect information.
“Students had dismissed faculty from being a source in helping them do that kind of work,” Head said.
To better equip students for the modern information environment, the report recommends that faculty teach algorithm literacy in their classrooms. And given students’ reliance on learning from their peers when it comes to technology, the authors also suggest that students help co-design these learning experiences.