Companies that make learning software now gather unprecedented amounts of data on student behavior as students read online textbooks or study for tests with digital review tools. But if online learning aids can study students, could that give professors new ways to help learners? And how far is too far in putting such student-activity data to use?
Those are the deep questions raised by today's announcement from Cerego, a company that makes a software platform designed to let students review and quiz themselves on class materials to prepare for exams. The company’s latest feature, called Cerego Insights, promises to score each learner’s behavior in the system, rating how they compare to other users going through the same material. So far the software attempts to measure three aspects of each student’s “cognitive and behavioral profile”: agility, diligence and knowledge.
Andrew Smith Lewis, co-founder and CEO of Cerego, says the analysis gives teachers (or, in the case of corporate training, business leaders) a “new model for understanding talent.”
“You and I might be at the very same level” as far as the grade, Smith Lewis says. “But one of us might be gritty, and one of us might be agile, and we can determine that by the data we have.”
Smith Lewis says the software uses an artificial-intelligence algorithm that weighs factors such as how quickly students respond relative to how accurately they answer, how well they retain material over time, whether they stay on track and how closely they follow instructions.
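Cerego has not published the details of that algorithm, but as a rough illustration, a minimal sketch of how such behavioral signals might be combined into 0–100 scores could look like the following. Every feature name, weight and formula here is an invented assumption, not Cerego's actual method:

```python
# Hypothetical sketch only: Cerego has not published its algorithm, so the
# feature names, weights and formulas below are illustrative assumptions.
from dataclasses import dataclass


@dataclass
class StudySession:
    response_time_s: float  # average seconds per answer
    accuracy: float         # fraction of items answered correctly (0-1)
    retention: float        # fraction recalled after a delay (0-1)
    on_track: float         # fraction of scheduled reviews completed (0-1)


def agility_score(session: StudySession, peer_median_time_s: float) -> int:
    """Rate speed relative to peers, discounted by accuracy (assumed formula)."""
    # Faster-than-median responses push the ratio above 1.0.
    speed_ratio = peer_median_time_s / max(session.response_time_s, 0.1)
    # Weight speed by accuracy so that fast guessing does not inflate the score.
    raw = min(speed_ratio * session.accuracy, 2.0)
    return round(raw / 2.0 * 100)  # clamp and scale to 0-100


def diligence_score(session: StudySession) -> int:
    """Blend retention and schedule adherence with assumed equal weights."""
    return round((0.5 * session.retention + 0.5 * session.on_track) * 100)


session = StudySession(response_time_s=6.0, accuracy=0.8,
                       retention=0.7, on_track=0.9)
print(agility_score(session, peer_median_time_s=5.0))  # 33
print(diligence_score(session))                        # 80
```

One reason a real system might weight speed by accuracy, as this sketch does, is to keep rapid guessing from registering as agility.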
Because the software is designed to deliver questions that push the learner forward, it already estimates how difficult to make the next problem it presents. “Agile learners are always ahead of the estimates in the machine-learning algorithm,” Smith Lewis says.
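Again as a hedged illustration rather than Cerego's actual model, the pattern Smith Lewis describes, a learner who consistently outperforms the system's predictions, could be captured with a simple running estimate of success:

```python
# Illustrative sketch, not Cerego's published model: maintain a running
# prediction of whether the learner will answer the next item correctly,
# and track how far actual performance runs ahead of that prediction.

def update_prediction(predicted: float, correct: bool, rate: float = 0.2) -> float:
    """Nudge the predicted success probability toward the observed outcome."""
    outcome = 1.0 if correct else 0.0
    return predicted + rate * (outcome - predicted)


predicted = 0.5   # start with no information about the learner
surplus = 0.0     # cumulative gap between outcomes and predictions

for correct in [True, True, False, True, True, True]:  # sample answer stream
    surplus += (1.0 if correct else 0.0) - predicted
    predicted = update_prediction(predicted, correct)

# A persistently positive surplus is one way to read a learner as being
# "ahead of the estimates" in the sense Smith Lewis describes.
print(f"predicted success rate: {predicted:.2f}, surplus: {surplus:.2f}")
```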
He argues that the approach is more effective than a Myers–Briggs personality test, a tool often used in the corporate sector, since the Cerego software gathers so many more data points, and over a longer period, than the personality test, which is a one-time multiple-choice assessment.
Arizona State University, which uses Cerego in online courses as part of its Global Freshman Academy, says it plans to pilot the Cerego Insights tool this fall.
Philip Regier, university dean for educational initiatives and CEO of EdPlus at Arizona State University, stressed that he imagines professors will use the tool to improve course materials and help students, rather than to assess academic performance.
“I’m not going to change anybody’s grade because they’re more diligent or less diligent,” he says. “At the same time, we know that diligence and agility are factors that are associated with success in moving through the university.” He says having the scores helps set a baseline for understanding student study habits, and helps professors “begin developing interventions that would help students become more diligent at least in particular courses.”
He also points out that the Cerego tool is optional in ASU courses, offered as a study aid for those who are interested. The hope is that students who use the software will get better grades than those who don’t, he says, adding that showing students their diligence scores may motivate them to study more.
Jack Suess, vice president of information technology at the University of Maryland Baltimore County, sees promise in using data from learning software to help instructors improve courses, but he worries that educators “should be careful” when classifying students, even if such rankings aren’t used in grading. He says he is “uncomfortable” with the notion that a system like Cerego could promise to rate personality characteristics like agility, and he asks whether any third party has checked the validity of the algorithm, or how else it has been validated.
Smith Lewis argues that the tool does not treat agility or diligence as fixed traits of students; it simply measures their behavior in the specific context of a study session.
He set up a demo for EdSurge, and I spent a few minutes at various points one day this week learning and reviewing material from a sociology course. My “agility” wasn’t great, at 42 out of 100. Even though I’m not actually enrolled in a sociology course, I found myself wondering how the system decided that, and I admit to feeling a bit defensive. I wondered whether one question I missed had dragged my score down, and I remembered thinking that the question was confusingly worded. That raises the issue both Suess and Regier pointed out: when students score low on agility, is that a reflection of the quality of the student or the quality of the material? And how will instructors know?
For what it’s worth, my diligence score was 82 out of 100. I’ll confess that since I knew my behavior was being measured, I went through more review questions than I would have otherwise.
The new Insights feature is being announced at the Amazon Web Services Public Sector Summit in Washington, DC. Cerego has also made its software compatible with Amazon Alexa, so that professors can access data on their students with voice commands.