Although the phrase “constructive feedback” is slowly being relegated to a euphemism for what’s articulated on voice-themed reality shows, the importance of providing accurate information about performance is understood by educators and learning scientists. One reason startups that make use of blended learning generate so much excitement is that adaptive learning platforms have the potential to provide each student with detailed feedback tailored to their unique strengths and weaknesses.
Yet exactly what kind of feedback is most effective, including its level of detail, tone, and timing, remains the subject of much research.
For example, while researchers have established that it’s important for feedback on a problem to contain the correct answer, they’ve failed to conclusively show that further elaboration is beneficial. A group of psychologists led by Duke’s Andrew Butler believed these counter-intuitive results could be explained by the fact that the same questions tend to be asked on the pre- and post-tests. If participants simply had to retain the correct answer, any additional information would be superfluous.
Butler's team decided to design an experiment where the post-test included new questions that required transferring knowledge gleaned from the initial feedback to a new context. The results, which will appear in a forthcoming issue of the Journal of Educational Psychology, suggest their suspicions were accurate. When participants couldn’t merely memorize the pre-test answer, the extra explanatory feedback led to better performance. It appears that, up to a point, more is in fact better.
Of course, not all feedback needs to be detailed to be effective. In a study published in the January issue of Computers & Education, a team of researchers from the University of Canterbury and the University of Illinois at Chicago attempted to isolate the impact of positive feedback: relatively nondescript mid-exercise comments such as “good job” and “exactly.”
The experiments used a program designed to teach the database query language SQL. One group used the baseline version of the program, which offered only negative feedback after incorrect answers. A second group used a modified version that also offered positive feedback, delivered when a student entered a correct answer but was likely to be uncertain about it.
When the researchers analyzed the performance of the two groups, they found no significant difference in the number of problems solved. However, the group that received positive feedback solved the problems in half the time it took the group that received only negative feedback. The results suggest that, by reducing uncertainty, the positive feedback accelerated the learning process. If future research confirms that positive feedback works best in moments of uncertainty, then identifying these moments ahead of time could become an increasingly important part of developing computer-based lessons.
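To make the idea concrete, here is a minimal sketch, in Python and purely illustrative, of how a lesson might deliver brief positive remarks only when an answer is correct and the learner was probably unsure of it. This is not the researchers' actual tutoring system; the SkillHistory class, the feedback function, and the 0.4 threshold are assumptions made for the example, standing in for a proper student model.

```python
from dataclasses import dataclass, field


@dataclass
class SkillHistory:
    """Tracks a learner's recent attempts on one skill (e.g. writing a JOIN)."""
    attempts: list = field(default_factory=list)  # True = correct, False = incorrect

    def record(self, correct: bool) -> None:
        self.attempts.append(correct)

    def estimated_uncertainty(self) -> float:
        """Crude proxy for uncertainty: share of the last few attempts that were wrong."""
        recent = self.attempts[-5:]
        if not recent:
            return 1.0  # no history yet: treat the learner as maximally uncertain
        return 1 - sum(recent) / len(recent)


def feedback(correct: bool, history: SkillHistory, uncertainty_threshold: float = 0.4) -> str:
    """Return a feedback message: positive remarks only at likely moments of uncertainty."""
    was_uncertain = history.estimated_uncertainty() >= uncertainty_threshold
    history.record(correct)
    if not correct:
        return "Not quite. Check your query against the problem statement."  # baseline-style negative feedback
    if was_uncertain:
        return "Exactly! Good job."  # brief positive feedback when the correct answer was probably a guess
    return ""  # correct and likely confident: stay quiet


if __name__ == "__main__":
    joins = SkillHistory()
    print(feedback(False, joins))  # wrong answer -> corrective hint
    print(feedback(True, joins))   # correct after an error -> "Exactly! Good job."
    print(feedback(True, joins))   # still shaky history -> positive feedback again
    print(feedback(True, joins))   # estimated confidence now high -> no comment
```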
One perpetual problem with computer feedback is that it lacks the emotional element of human interaction. But even if computers never match the kind words of a teacher, might there be ways to move beyond cold text on a screen? A 2010 study involving over 170 sixth graders found that when a program used a human voice recording for feedback instead of static emoticons, students reported a stronger sense of social presence, showed more intrinsic motivation, and performed better on math problems. A recorded voice still can’t match the presence of a person, but it is promising that the voice appeared to lift outcomes above an inferior, entirely impersonal baseline. The door is clearly open for future researchers to explore ways that sound and video can make feedback more emotionally engaging.
Computer-based instruction still has a way to go, but research on feedback illustrates its potential. The ability to provide elaboration and positive encouragement at all the right moments would give computers an instructional advantage over human teachers. And who knows, perhaps science fiction writers were prophetic and someday we’ll find a way for computers to generate copious amounts of humanity (for better or worse).