An online discussion platform called Piazza has taken off among faculty at the University of California at Berkeley in recent years. Professors can easily integrate the platform into an LMS, students can engage with one another and with course material online, and faculty see data on how students are learning. Even more attractive: It’s free.
But there’s a catch. To make money, Piazza sells student data to third parties via a revenue model it calls “Piazza Careers,” a recruiting service that allows companies to find potential employees (i.e., students) based on their information.
Technology staff at UC Berkeley said they didn’t know that was happening, though, since teachers were finding the tool on their own instead of going through a university contract. So in 2016, after discovering that student data was being sold, the university (among others) decided it was time for things to change.
Jenn Stringer, assistant vice chancellor for teaching and learning at UC Berkeley, told the story to a crowd of about 50 educators and IT professionals on Wednesday at the 2017 Educause national conference. In the session, titled “The Wicked Problem of Learning Data Privacy,” educators asked questions and shared lessons learned about using learning analytics in higher education.
Education writer Phil Hill covered the privacy issue and college objections to it last year: “Faculty assign usage of Piazza, often as required course tool. Students sign up and almost none of them take the step to opt-out (and it is opt-out and not opt-in, as users have to take an action to uncheck the box). And thus Piazza sells student profile data, including courses taken and general course performance, to corporate recruiters.”
Stringer said UC Berkeley’s technology services staff members were not in the negotiations with Piazza to prevent that from happening early on, since the company’s model is to work with individual professors. Those busy faculty members may quickly click through terms of service without understanding the privacy implications. For students in courses that require using the platform, it’s likely they will skip through the fine print as well.
“It’s difficult for a vendor to see why you're important if you’re not a part of their revenue generation,” said Stringer.
Following Hill’s post, the company reached out to UC Berkeley to work out a new arrangement. Stringer says, “we had a conversation with the vendor about how important it was to get [data protections] in the contract.” Piazza responded and made several adjustments, including making opting out of data-sharing the default for students using Piazza for class.
“It’s good we are talking about this today, but we are a little late to the game,” said Jim Williamson, another speaker in the session.
Williamson, who is the director of campus educational technology systems and administration at the University of California at Los Angeles, called the issue around learning-data privacy a “wicked problem.” By that he means there is no definitive formulation of the problem, no hard and fast rule, no agreed solutions and no ultimate test of the solution.
“What happens if we install a policy that allows data to be deleted but that data has been used in a research paper?” he asks. “There are no binary solutions.”
Learning-analytics enthusiasts, including academic advisors who may use personal student data to “predict” if a student needs support, may be “well-meaning,” he said. But data gaps could inhibit some technologies from achieving their aims. “A solution working in a perfect world might not work in a world where we lack the data to arrive at meaningful solutions,” said Williamson.
The UCLA director also touched on another widely criticized and researched side of the issue: “What happens when students get stereotyped [based on their data] and how do you handle outliers?”
Many in the audience said their campuses haven’t bought a learning-analytics system—but are thinking about it, or at least curious to learn more.
Domi Enders, associate director of learning resources and academic technologies at Columbia University, said her department is still in the early, “embryonic” stages of implementing data-informed teaching and learning. Among challenges, she said, “everyone is scared to release data because there is no policy” to guide its use.
Another audience member asked why Piazza’s practice was not a violation of FERPA, the federal student-privacy law. Stringer explained that by some interpretations the company complied with the law because students had the option to opt out, even if they may not have understood the choice they were making.
Others attending the session cited concerns about the cost of maintaining and sustaining such tools over time, returns on investment and looming uncertainty over how and where data will be stored.
To deal with some of these questions internally, the University of California now has a system-wide committee that focuses on its edtech initiatives, said Mary-Ellen Kreher, another session speaker, who is director of course design and development at the UC Office of the President.
The group “agreed we needed to work together to tackle the problem that our data was being collected and we were not in control,” Kreher said. “If you recognize that your data is being collected and you’re not in control, you need to take action because it’s only going to get worse.”
At UC Berkeley in particular, Stringer said the university also pulled together a student panel to get their opinions on the tools faculty and advisors may be using. One student, according to Stringer, was not happy to find out that his attendance and other personal habits, like how often he wrote on a discussion board, could be shared with his advisor.
Quoting the student, Stringer said: “I don’t want anyone to go all ‘mom’ on me. I’m here to fail and learn on my own.”
“Student success tools can be invasive in advising, and some [people] will have a very different opinion about how these tools should be used,” said Stringer. “This is something we have to negotiate.”