How AI Can Help Educators Test Whether Their Teaching Materials Work

EdSurge Podcast

By Jeffrey R. Young     Jul 11, 2023

This article is part of the collection: Artificial Intelligence Holds Promise for Education — and Creates Problems.

Companies like Amazon and Facebook have systems that continually respond to how users interact with their apps to make the user experience easier. What if educators could use the same strategy of “adaptive experimentation” to regularly improve their teaching materials?

That’s the question posed by a group of researchers who developed a free tool they call the Adaptive Experimentation Accelerator. The system, which harnesses AI, recently won first place in the annual XPRIZE Digital Learning Challenge, which boasts a purse of $1 million split among winners.

“At Amazon and Facebook, they're rapidly adjusting conditions and changing what their viewers are seeing to try to quickly better understand what small changes are more effective, and then providing more of those changes out to the audience,” says Norman Bier, director of the Open Learning Initiative at Carnegie Mellon University, who worked on the project. “When you think about that in an educational context, it … really opens up the opportunity to give more students the kinds of things that are better supporting their learning.”

Bier and others involved in the project say that they are testing the approach in a variety of educational settings, including public and private K-12 schools, community colleges and four-year colleges.

EdSurge sat down with Bier and another researcher on the project, Steven Moore, a doctoral candidate at Carnegie Mellon’s Human-Computer Interaction Institute, to hear more about their bid to win the XPRIZE for education and what they see as the challenges and opportunities for harnessing AI in the classroom.

The discussion took place at the recent ISTE Live conference in Philadelphia in front of a live audience. (EdSurge is an independent newsroom that shares a parent organization with ISTE. Learn more about EdSurge ethics and policies here and supporters here.)

Listen to the episode on Apple Podcasts, Overcast, Spotify or wherever you get your podcasts, or use the player on this page. Or read a partial transcript below, lightly edited for clarity.

EdSurge: The app you developed helps teachers test out their learning materials to see if they’re effective. What’s new in your approach?

Norman Bier: If you think about standard A/B tests [for testing webpages], they're usually working off of averages. If we're going to average out everything, we're going to have student populations for whom the intervention that's good for everybody isn't good for them individually. One of the real benefits of adaptive experimentation is that we can start to identify, ‘Who are these subgroups of students?’ and ‘What are the specific kinds of interventions that are better for them?’ Then we can deliver those interventions and, in real time, keep giving each group the intervention that's better for it. So there's a real opportunity, we think, to better serve students and really address the notion of experimentation more equitably.
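To make the idea concrete, here is a minimal sketch of per-subgroup adaptive assignment using Thompson sampling. This is an illustration of the general approach Bier describes, not the Adaptive Experimentation Accelerator's actual algorithm; the subgroup names, conditions and outcome measure are all hypothetical.

```python
# Minimal sketch of per-subgroup adaptive experimentation (Thompson sampling).
# Assumes a simple yes/no outcome, e.g. "did the student get the next question right?"
# All names below are hypothetical, not from the Accelerator itself.
import random
from collections import defaultdict

CONDITIONS = ["hint_text", "worked_example"]  # hypothetical interventions

# Beta(successes + 1, failures + 1) posterior per (subgroup, condition)
successes = defaultdict(int)
failures = defaultdict(int)

def assign_condition(subgroup: str) -> str:
    """Sample each condition's success rate from its posterior; pick the best draw."""
    draws = {
        c: random.betavariate(successes[(subgroup, c)] + 1,
                              failures[(subgroup, c)] + 1)
        for c in CONDITIONS
    }
    return max(draws, key=draws.get)

def record_outcome(subgroup: str, condition: str, succeeded: bool) -> None:
    """Update the posterior once we observe how the student did."""
    if succeeded:
        successes[(subgroup, condition)] += 1
    else:
        failures[(subgroup, condition)] += 1

# Example: assign a condition to one (hypothetical) student, then log the result.
condition = assign_condition("little_prior_stats_exposure")
record_outcome("little_prior_stats_exposure", condition, succeeded=True)
```

Unlike a fixed 50/50 A/B split, the allocation here drifts toward whichever condition is working better for each subgroup as evidence accumulates, which is the “keep giving them the intervention that's better for them” behavior Bier describes.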

I understand that one aspect of this is something called ‘learner sourcing.’ What is that?

Steven Moore: The concept of learner sourcing is akin to crowdsourcing, where a large number of people chime in. Think of the game show ‘Who Wants to Be a Millionaire?’ when contestants poll the audience. They ask the audience, ‘Hey, there are four options here. I don't know which one I should pick.’ And the audience says, ‘Oh, go with choice A.’ That's an example of crowdsourcing and the wisdom of the crowd. All these great minds come together to try to get a solution.

So learner sourcing is a take on that, where we actually take all this data from students in courses — in these massive open online courses — and we collect their data and get them to actually do something for us that we can then throw back into the course.

One example in particular is getting students who are taking, say, an online chemistry course to create a multiple-choice question for us. And so if you have a course with 5,000 students in it, and everyone elects to create a multiple-choice question, you now have 5,000 new multiple-choice questions for that chemistry course.

But you might be thinking, how's the quality of those? And honestly, it can vary a lot. But with this whole wave of ChatGPT and all these large language models and natural language processing, we're now able to process these 5,000 questions, improve them and find out which ones are the best, so we can actually take those and use them in our course instead of just throwing them all back in blindly.
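As a rough illustration of the kind of screening Moore describes, the sketch below scores learner-written questions with a language model and keeps only the highest-rated ones for review. The `ask_llm` function is a placeholder rather than a real API call, and the rubric and threshold are assumptions, not details from the project.

```python
# Rough sketch: screen learner-sourced multiple-choice questions with an LLM.
# `ask_llm` is a placeholder for whatever model API you use; the rubric and
# threshold are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class MCQ:
    stem: str
    options: list[str]
    answer_index: int

RUBRIC = (
    "Rate this multiple-choice question from 1-5 on clarity, on having exactly "
    "one defensible correct answer, and on plausible distractors. "
    "Reply with three integers separated by spaces."
)

def ask_llm(prompt: str) -> str:
    """Placeholder: call your language-model API of choice here."""
    raise NotImplementedError

def score_question(q: MCQ) -> float:
    prompt = (f"{RUBRIC}\n\nQuestion: {q.stem}\nOptions: {q.options}\n"
              f"Keyed answer: {q.options[q.answer_index]}")
    clarity, single_answer, distractors = (int(x) for x in ask_llm(prompt).split())
    return (clarity + single_answer + distractors) / 3

def keep_best(questions: list[MCQ], threshold: float = 4.0) -> list[MCQ]:
    """Return the questions worth sending to an instructor for review."""
    return [q for q in questions if score_question(q) >= threshold]
```

In practice, the surviving questions would still go to an instructor before being added to a course, keeping a human in the loop — a point Bier returns to below.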

Bier: We're asking students to write these questions not because we're looking for free labor, but because we think it's actually going to be helpful for them as they develop their own knowledge. Also, the kinds of questions and feedback that they're giving us is helping us better improve the course materials. We've got a sense from lots and lots of research that a novice perspective is actually really important, particularly in these lower-level courses. And so pretty implicit in this approach is the idea that we're taking advantage of that novice perspective that students are bringing, and that we all lose as we gain expertise.

How much does AI play a role in your approach?

Moore: In our XPRIZE work, we definitely had a few algorithms powering the backend that take all the student data and run an analysis to say, ‘Hey, should we give this intervention to student X?’ So AI was definitely a big part of it.

What is a scenario of how a teacher in a classroom would use your tool?

Bier: The Open Learning Initiative has a statistics course. It's an adaptive course — think of it as an interactive high-tech textbook. And so we've got thousands of students at a university in Georgia who are using this stats course instead of a textbook. Students are reading, watching videos, but more importantly they're jumping in, answering questions and getting targeted feedback. And so into this environment, we're able to introduce these learner sourcing questions as well as some approaches to try to motivate students to write their own questions.

Moore: I have a good example from one of our pilot tests for the project. We wanted to see how we could engage students in optional activities. We have all these great activities in this OLI system, and we want students to do extra stats problems and whatnot, but no one really wants to. And so we wanted to see if a motivational message could help, something like, ‘Hey, keep going, five more problems and you'll learn more and do better on these exams and tests.’ How can we tailor these motivational messages to get students to participate in these optional activities, whether it be learner sourcing or just answering some multiple-choice questions?

And for this XPRIZE competition, in our pilot test, we had a few motivational phrases. But one of them involved a meme because we thought maybe some undergrad students in this particular course would like that. So we put in a picture of a capybara — it's kind of like a large hamster or guinea pig — sitting at a computer with headphones on and glasses, no text. We're like, ‘Let's just throw this in and see if it gets students to do it.’ And out of five different conditions, the picture of just the capybara with headphones at a computer led to the most students participating in the activities that followed. Maybe it made them chuckle, who knows the exact reason. But compared to all those motivational messages, it had the best effect in that particular class.
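For illustration only, here is how one might tally which message condition led to the most follow-through. The condition names and counts below are invented, not data from the pilot.

```python
# Made-up example: compare participation rates across motivational conditions.
shown = {"capybara_meme": 212, "exam_benefit_msg": 205, "keep_going_msg": 198}
followed_through = {"capybara_meme": 61, "exam_benefit_msg": 38, "keep_going_msg": 41}

rates = {c: followed_through[c] / shown[c] for c in shown}
for condition, rate in sorted(rates.items(), key=lambda kv: -kv[1]):
    print(f"{condition}: {rate:.1%} participation")
print("Best-performing condition:", max(rates, key=rates.get))
```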

There’s lots of excitement and concern about ChatGPT and the latest generative AI tools in education. Where are you both on that continuum?

Moore: I definitely play both sides, where I see there are a lot of cool advancements going on, but you should definitely be super hesitant. I would say you always need human eyes on the output of whatever generative AI you're using. Never just blindly trust what is being given out to you — always put some human eyes on it.

I'd also like to throw out that plagiarism detectors for ChatGPT are terrible right now. Do not use those, please. They're not fair [because of false positives].

Bier: This notion of the human in the loop is really a hallmark of the work we do at CMU, and we've been thinking strategically about how to keep that human in the loop. And that's a little bit at odds with some of the current hype. There are folks who are just rushing out to say, ‘What we really need is to build a magic tutor that all of our students can access directly and ask questions of.’ There are a lot of problems with that. We're all familiar with the technology's tendency to hallucinate, which gets compounded by the fact that lots and lots of learning research tells us we like things that confirm our misconceptions. Our students are the least likely to challenge this bot if it's telling them things that they already believe.

So we've been trying to think about what the deeper applications of this are, and what ways we can use those applications while keeping a human being in the loop. And there's a lot of stuff that we can be doing. There are aspects of developing content for things like adaptive systems that human beings are very good at but hate doing. As someone who builds courseware, I know my faculty authors hate writing questions with good feedback. That’s just not a thing they want to spend their time doing. So providing ways that these tools can start giving them first drafts that are still reviewed is something we're excited about.

Listen to the full conversation on this week's EdSurge Podcast.
