Instructors Rush to Do ‘Assignment Makeovers’ to Respond to ChatGPT

By Jeffrey R. Young     Jul 27, 2023

Since the release of ChatGPT less than a year ago, students have quickly figured out how to get the free AI chatbot to do their homework for them. That has sparked a burst of activity by teachers at schools and colleges to change their assignments to make them harder to game with this new tech — and hopefully more human in the process.

But pulling off these “assignment makeovers,” as some instructors are calling them, turns out to be challenging, and what works differs significantly depending on the subject matter and type of assignment.

EdSurge talked with professors in a variety of disciplines to dig into what they’re trying as they teach summer classes or prepare for the fall. The race to outsmart artificial intelligence is on as educators try to prevent the coming semester from devolving into, as one professor put it, a “homework apocalypse.”

A large number of K-12 teachers and college professors have decided to simply ban the use of ChatGPT and other new AI chatbots on assignments. Some of those instructors are using tools that attempt to detect text written by bots, such as GPTZero and a new tool from Turnitin. But even the makers of those detection tools admit they don't always work, and they can falsely flag human-written assignments as AI-generated. Some schools have also tried to block AI chatbots on their networks and devices, but experts say that doing so is essentially impossible, since students can easily reach the tech from their smartphones or through the many services that have integrated AI but aren't on lists of banned tools.

But plenty of educators are game to try working with AI rather than simply wishing it didn't exist. A recent survey of 1,000 K-12 teachers found that 61 percent predicted that ChatGPT will have "legitimate educational uses that we cannot ignore."

Adding Authenticity

Some teaching experts see AI as a spark to motivate instructors to make assignments more interesting and more “authentic,” as Bonni Stachowiak, dean of teaching and learning at Vanguard University of Southern California, argued on a recent EdSurge Podcast.

When Tim Bajkiewicz heard that, though, he said he felt unfairly criticized — because to him, that advice is harder to follow than many might realize. For one thing, Bajkiewicz, a broadcast journalism professor at Virginia Commonwealth University, teaches more than 200 students per class. And he teaches those courses online and asynchronously, meaning students go through the material at their own pace rather than ever meeting at the same time and place. In other words, there's not even a Zoom classroom where they gather.

All that makes it challenging for him to get to know students in ways that would be easier if he taught, say, 20 students at a time in person. And he can't simply turn assignments into one-on-one discussions with students to see if they're keeping up with the material, or have students do their writing in class while he watches them work.

Bajkiewicz says he is spending time trying to adapt his assignments for an introductory mass communication course he teaches, since he believes some of his students already use ChatGPT to get out of doing the work themselves.

For instance, some of the homework that came in on a recent assignment didn't sound like the student work he was used to seeing. So he ran those assignments through an AI-detection tool, which determined that they were likely bot-written.

“Getting students to write something has always been such a solid form of assessment — probably one of the bigger tools we have in our toolkit,” he says. “We have to seriously now ask ourselves, when does it make sense to have students writing?”

In response, Bajkiewicz gave students the option of turning in an assignment as an audio recording, using a tool the campus already had a license for, hoping that would make the assignment harder to game and make it easier to tell whether students were doing their own work.

The assignment was to give a summary of and response to a film they had been assigned, the pioneering 1922 documentary "Nanook of the North." But because the film is a classic, ChatGPT and other tools have plenty of information about it, since those tools are trained on vast amounts of internet text.

“Some of them sounded really scripted,” Bajkiewicz says of the audio assignments he got, and he wonders if some students simply requested an answer from a chatbot that they then read aloud. “Was that something that came out of AI? I don’t know,” he adds.

In other words, the assignment designed to be more authentic is in some ways more difficult to check with an AI-detection tool.

What About Writing Classes?

Many college classes are designed to fulfill a writing requirement, meaning they are meant to teach students to put their ideas in written form, in part to prepare them for communicating in the workplace.

Derek Bruff, a consultant and a visiting associate director at the Center for Excellence in Teaching and Learning at the University of Mississippi, recently blogged about his attempts to update an assignment for a writing class to respond to the presence of ChatGPT. (Bruff may have coined the term “Assignment Makeovers” with his series of blog posts inspired by watching the TV show “Extreme Makeover: Home Edition.”)

The assignment he revised was from a course he taught in 2012 about the history of mathematics and cryptography that fulfilled a campus writing requirement. For the assignment, he asked students to write about the origin and impact of a code or cipher system of their choice, to form their answer as a blog post for the academic blog Wonders & Marvels, and to submit it to the blog for possible publication. At the time, he told students: “The technical side of your post is the closest you’ll come to the kind of writing that mathematicians do, so be sure to be clear, precise, and concise.”

Looking at the assignment today, though, he realizes that technical writing is something ChatGPT and other AI tools are particularly good at. And he notes that students could even submit the drafts he required along the way that had been improved not by the students themselves but by prompting the tool to clarify some point or other.

The fact that students get to choose which code or cipher system to write about gives them some intrinsic motivation to actually do the assignment themselves, he argues. "But," he wrote, "for students who want an easy way to complete the assignment, AI certainly provides that."

One surprising thing Bruff discovered in trying to give the assignment a makeover, and in talking with colleagues, he said in a recent interview with EdSurge, is that the extra effort he put into the assignment's instructions — explaining what kind of work was required to get a good grade — might make it easier for students to cheat in this era of ChatGPT. Giving clear rubrics and expectations is meant to make grading more transparent and fair, and groups including the Transparency in Learning & Teaching project advocate for the practice. But, Bruff says, "the more transparent I am in the assignment description, the easier it is to paste that description into ChatGPT to have it do the work for you. There's a deep irony there."

One possible makeover, he says, is to ask students to compose their assignment in a tool like Google Docs and then share the document with the professor, who can look at the revision history to see whether the text was composed gradually or pasted in all at once.

But he says there are tradeoffs to that approach, including issues of student privacy. Also, he adds, “If I knew my prof was standing over my shoulder as I wrote, I think I might freeze up.”

The Challenge of Teaching Coding

Perhaps the most challenging assignment makeovers will come in courses on computer coding.

Sam Lau, who is starting a job as an assistant teaching professor in data science at the University of California at San Diego this fall, is excited about AI, but he admits that teaching his course about introductory computing will be “pretty tough.”

To help him prepare, he recently co-wrote a post for O’Reilly’s Radar blog about “teaching programming in the age of ChatGPT.” For the post, he and a colleague interviewed 20 computing professors to hear how they were giving their assignments a makeover.

He says he knows that programmers increasingly use AI tools like GitHub Copilot to have a bot write code. But he wonders how students will ever learn the basics if they never write code themselves.

Lau is optimistic, though. He says his theory is that even if students use tools to help them write code, they will still learn the basics by having to craft the code for the assignment and “think through what needs to be programmed.”

Still, he knows that some computer-science professors want their intro students to learn to code without AI support. For those, he recommends an assignment he learned about from Zachary Dodds, a computer science professor at Harvey Mudd College.

The assignment asks students to write computer code for a random "walk" along a number line. Then students are asked to program a second random walker that is on a collision course with the first. Part of the assignment is for students to make up a story about these two characters and why they are on the path. For instance, a student might say that they are two ants on a log and one is telling the other where the food is, or that they are two friends trying to get to the grocery store. The idea is to inject an element of playfulness into an otherwise mundane coding task.
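For readers who want a concrete picture, here is a minimal sketch of that kind of exercise in Python. The function names, starting positions and ant storyline are made up for illustration; this is a rough approximation, not Dodds' actual assignment.

```python
import random


def random_step(position):
    """Move one step left or right along the number line at random."""
    return position + random.choice([-1, 1])


def walk_until_meeting(start_a=0, start_b=10, max_steps=1000):
    """Two random walkers wander until they land on the same spot.

    Story (made up for illustration): two ants on a log, one wandering
    out to tell the other where the food is.

    Note: both walkers move every step, so they can only land on the
    same spot if their starting positions have the same parity.
    """
    ant_a, ant_b = start_a, start_b
    for step in range(1, max_steps + 1):
        ant_a = random_step(ant_a)
        ant_b = random_step(ant_b)
        if ant_a == ant_b:
            print(f"The ants meet at position {ant_a} after {step} steps.")
            return step
    print("The ants never met within the step limit.")
    return None


if __name__ == "__main__":
    walk_until_meeting()
```

A real intro assignment would likely have students build this up piece by piece and print or plot each step, but even this much shows where the storytelling hook fits into the code.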

Could AI essentially be used to make up both the story and the code?

Well, yes, Lau admits. “At some point as an instructor there’s the question of how far students are going to go” to cheat, he says. “If they’re willing to go that far, we don’t think nor believe we should try to spend time getting these students to do their assignments.”

A Balancing Act

So perhaps the best instructors can do is make their assignments so interesting or unusual that, even though students could still cheat, doing so would take significant effort. After all, most locks on houses could conceivably be picked, but at some point we accept a balance between how easily the homeowner can get into the house and how hard it would be for a bad actor to break in.

Ethan Mollick, an associate professor of management at the University of Pennsylvania, is the one who coined the term "homework apocalypse." One of his major recommendations: try a flipped classroom, where students watch lectures on video and spend class time on active learning exercises.

“There is light at the end of the AI tunnel for educators, but it will require experiments and adjustment,” he writes in his newsletter, One Useful Thing. “In the meantime, we need to be realistic about how many things are about to change in the near future, and start to plan now for what we will do in response to the Homework Apocalypse.”

Bruff, the teaching consultant, says his advice to any teacher is not to have an “us against them mentality” with students. Instead, he suggests, instructors should admit that they are still figuring out strategies and boundaries for new AI tools as well, and should work with students to develop ground rules for how much or how little tools like ChatGPT can be used to complete homework.

What Do Students Think?

Johnny Chang, an incoming graduate student at Stanford University, is organizing an upcoming online conference on AI in education in hopes of infusing more student voice into conversations about teaching and AI.

He suggests that whatever instructors do with their assignments to adapt to ChatGPT and other tools, they should be asking students for input — and be ready to keep revising their assignments, because the tech is so fast-moving.

“What you design currently might become outdated as soon as students hop on and find some loophole around it,” he says.
