When students interact with digital learning environments, they leave behind trails of data. The desire to understand and improve learning has led many educators to consider the value and utility of this information. But what can these data traces actually tell us about the students who left them, and how do we translate them into meaningful action?
When we hear about scaling innovation in higher education, the emphasis is almost always on quantity: increasing the production of post-secondary credentials. That emphasis, however, should not detract from the primary mission of educational institutions: to ethically develop and realize both individual and socio-cultural potentialities. The use of data should support each of these aims, but that can only happen when students are involved in making sense of their own data.
To whatever extent an educational technology is considered neutral, there are better and worse ways of putting it into practice. In this respect, Georgia State University is exemplary. When the university first developed and deployed a new advising system, it also invested in people: the campus hired 42 new advisors to bring its student-to-advisor ratio down from roughly 700 to 1 to the national average of 300 to 1.
At the same time, GSU invested in more computer monitors. Now, when a student is flagged as being ‘at risk,’ they are invited in for a conversation with their academic advisor. Having two monitors means that both the student and the advisor can easily look at the same performance data, interpret the information together, and agree on a course of action.
What’s striking about GSU’s approach to proactive advisement is the way it treats students as agents who are responsible for their academic journeys. An advisor, like a teacher, serves as an important guide, but a student must take personal responsibility for his or her decisions in a broader social context. Since going live with a predictive analytics system in support of high-quality proactive advisement, GSU has seen freshman retention rise from 50 to 55 percent, and graduating seniors are taking fewer credits to earn their degrees.
Respect for student agency is also central to the academic field of learning analytics, where researchers are exploring how data can be used to reflect on teaching and learning practices. These kinds of practices are difficult to scale, however, because teaching approaches vary widely.
Embedded learning analytics are not like rotisserie ovens: you can’t set them and forget them. As in the case of proactive advising, the real impact of learning analytics comes less from the technology itself than from the relationships it makes possible and the dialogue it provokes. Institutions like Indian River State College, for example, are seeing tremendous gains in student success through faculty development programs that highlight the importance of learning analytics in informing teaching and learning practices. Between 2014 and 2016, online enrollment at the institution increased by 56 percent, and yet average grades in online courses lagged behind those in traditional face-to-face courses by 7.5 percentage points. Since it began using LMS data in support of academic advising and training faculty to use analytics to identify challenges and opportunities, IRSC has eliminated the achievement gap between online and in-person courses and increased online baccalaureate success rates by 11 percent.
Certainly, there are cases where the use of student data can be problematic. Cluster analysis, for example, can be a useful statistical method for grouping individuals based on shared characteristics. However, when race, ethnicity, gender, or other demographic identifiers are used as variables, institutions must be cognizant of factors beyond simple performance metrics or predictive statistics, especially when the groupings themselves are produced by algorithms that may encode hidden biases.
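To make the concern concrete, the minimal sketch below clusters students on engagement and performance metrics only, deliberately leaving a demographic field out of the feature set. The data, column names, and choice of three clusters are hypothetical illustrations, not drawn from GSU, IRSC, or any particular advising system.

    # Minimal sketch: clustering students on performance metrics only.
    # All data and column names are hypothetical; a real analysis would
    # require careful feature selection and auditing for bias.
    import pandas as pd
    from sklearn.preprocessing import StandardScaler
    from sklearn.cluster import KMeans

    # Hypothetical LMS export: one row per student.
    students = pd.DataFrame({
        "logins_per_week":  [1, 5, 3, 0, 7, 2],
        "avg_quiz_score":   [62, 88, 75, 55, 91, 70],
        "assignments_late": [4, 0, 1, 6, 0, 2],
        "ethnicity":        ["A", "B", "A", "C", "B", "A"],  # demographic field
    })

    # Deliberately exclude the demographic identifier from the features;
    # clustering on it risks reproducing the stereotypes discussed above.
    features = students[["logins_per_week", "avg_quiz_score", "assignments_late"]]
    scaled = StandardScaler().fit_transform(features)

    # Group students into three engagement profiles (k=3 is illustrative).
    students["cluster"] = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scaled)
    print(students[["cluster"] + list(features.columns)])

Even a simple grouping like this produces labels that invite interpretation, which is precisely why the resulting clusters should be discussed with students rather than treated as verdicts about them.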
This, though, is an important argument for using analytics not simply as a tool for understanding learning environments, but as a tool that generates artifacts to be interpreted carefully with students. Whether we are talking about academic advising or the classroom, it is important to engage students in conversations about what the data mean and about the implications of different interpretations. It should also be underlined that algorithms have power only to the extent that they are rendered meaningful, and that students, faculty, and advisors should all participate in that meaning-making process. This kind of approach can go a long way toward mitigating concerns about the use of predictive analytics to pigeonhole students into stereotypes.
Using data to drive interventions is fundamentally an ethical enterprise. Instructors and administrators should embrace practices that ensure that they are using data and analytics to support student progress while also considering the broader implications of their approaches to teaching, learning and student success.