Posted on 2022-05-01. Authored by Lena Hildenbrand.
Many studies have demonstrated that practice testing on to-be-learned materials can be an effective learning activity. Most of the literature on practice testing concerns retention; work on improving comprehension has focused primarily on open-ended activities such as short-answer quizzes or having students write self-explanations. Recently, two studies demonstrated that closed-ended practice tests consisting of inference questions can improve students’ comprehension outcomes. However, the effects were modest and suggested that closed-ended testing formats may differ in effectiveness. Because feedback is known to augment testing effects, adding it may be one way to promote optimal learning from practice tests and further enhance their effectiveness. The goal of the present study was to test whether adding feedback to closed-ended practice tests (multiple-choice, true-false) would improve learning. A secondary aim was to test whether feedback would influence learners’ metacomprehension accuracy (the accuracy of their judgments of how well they understand a given text). Results demonstrated that presenting practice test questions in a multiple-choice format improved comprehension relative to asking the same questions in a true-false format. However, feedback did not improve learning from the practice tests, and elaborative feedback increased the error in participants’ judgments of how well they understood the texts. One explanation for the absence of feedback benefits is that students may have processed the feedback only superficially. The present study suggests that adding feedback to practice tests consisting of closed-ended inference questions may harm students’ understanding of how well they have learned without actually improving learning.