Utilizing and Assessing Peer Instructor-Generated Formative Self-Assessments Using ExamSoft™
Using emerging technologies, such as online programs and course management systems, as learning resources is popular among students.(1) Self-assessments are a type of learning resource conducive to use with emerging technologies. As formative assessments, self-assessments are study tools that help students identify areas of weakness that require focus. They are linked to increases in course exam scores, overall academic performance, and long-term retention of course material.(2) Self-testing with online quizzes is particularly popular among students and is effective at increasing exam scores.(3) Despite its usefulness, this type of additional learning resource can be time-consuming for faculty members to create and grade.
To help address this limitation, peer instructors can serve as resources for faculty and students alike. Peer instructors can significantly increase student performance while benefiting from the program themselves.(4) It is therefore natural to seek peer-instructor assistance in implementing a self-assessment program in a course. At the Wegmans School of Pharmacy, peer instructors were being underutilized, so the administration developed a program in which peer instructors helped develop weekly online quizzes using ExamSoft. ExamSoft is well suited to developing a self-testing student resource, as it allows self-assessments to be easily created, distributed, and analyzed. It also allows peer instructors to be directly involved in writing and editing questions. This document provides an overview of the steps involved in creating peer instructor-generated online self-assessment tools with ExamSoft.
Description of Self-Assessments
Peer instructors write weekly quizzes of 10 questions each. One benefit of using ExamSoft for the quizzes is their similarity to the students' testing environment; the quiz format was deliberately developed to mirror course exams as closely as possible and consists of multiple-choice and fill-in-the-blank questions. Quizzes are also secured, but unlike exams, they are not time-restricted. All quizzes are optional, and students can take each one up to two times. After a quiz is made available, students have access to it until the date of the corresponding exam. After completing each quiz, students can see their scores and review the full questions and the rationales for each answer. Students can also track their performance on quizzes throughout the semester.
Development of a Self-Assessment Program
When using ExamSoft to develop peer instructor-created formative assessments, three general phases are employed: training, implementation, and assessment (Figure 1). Both faculty and peer instructors are involved in the training and implementation phases of the self-assessment program. The first step is to instruct both groups on using the software for this purpose. It is also necessary to train peer instructors to write quality questions. Faculty are responsible for reviewing questions written by peer instructors and for launching quizzes to students. In addition to writing the initial questions and rationales, peer instructors revise them based on faculty feedback. Finally, the program is assessed to determine its use and impact: data are collected on how heavily the tool is used as well as on its effect on both students and peer instructors.
Both faculty and peer instructors are involved in the training phase. At the start of the semester, quiz number, format, and schedule for release are determined by faculty and communicated to peer instructors. Quizzes should be short in length and available at frequent intervals so students benefit most from the resource. In the pilot program at Wegmans School of Pharmacy, quizzes were 10 questions long and were made available each week.
Within ExamSoft, the infrastructure for the program must be set up first. Separate folders are required for questions and assessments, and class lists must be populated. In the pilot program, all questions, assessments, and class lists were kept apart from actual courses to prevent confusion and to give peer instructors access only to quiz questions. Faculty must also create dummy accounts for peer instructors with limited permissions. Peer instructors are given access to the relevant question folders but cannot see other folders, create exams, or map questions. Because data from quiz questions can muddy longitudinal student data, quizzes should not be mapped to any existing categories.
Peer instructors are trained in question writing and in using ExamSoft. At the beginning of the semester, they are provided with short printed resources addressing question writing, along with guidelines from faculty members concerning the number and types of questions to write. In the pilot program, the peer instructors had not yet had the opportunity to author content within ExamSoft, so they were given resources similar to the faculty training materials but edited to focus on question creation only.
With restricted access in ExamSoft, peer instructors are able to enter questions directly into the test bank each week. Along with questions, peer instructors provide the rationales for the correct and incorrect answers. After entering questions, the faculty members teaching the related content review the questions and provide feedback directly within the question, utilizing the internal comments feature. Using this mechanism, faculty members are able to ensure that quizzes and exams are consistent in difficulty, coverage of course learning objectives, and format. Feedback through the internal comments box also allows peer instructors to easily edit content based on the comments and to learn from the continuous faculty feedback. Throughout the semester, peer-instructor questions increase in quality based on previous feedback.
When questions are complete, faculty members assemble a weekly quiz and launch it to the class. After the quiz is made available, students have access to it until the date of the corresponding exam. Faculty members make each quiz available at least twice to allow students multiple attempts to learn the material. If data is being collected for analysis, copies are launched as separate quizzes so data can be collected from all attempts. Feedback for students is also important, and a secure review should be available for all quizzes. Students are given their quiz scores immediately upon exiting and are then able to review the questions, answers, and rationales for all quiz items.
The benefits of quizzes can be assessed by a number of methods. Data collection can be broken into two main categories: use and impact. Data concerning use of the quizzes are easy to gather from the basic reporting features in ExamSoft. In the pilot semester, all quizzes were highly utilized by students (Table 1).
The impact of quizzes is twofold: the impact on students and the impact on peer instructors. To assess the impact on student quiz takers, data from assessments and individual students can be easily collected through the ExamSoft platform. Additionally, since exams are also conducted in ExamSoft, quiz and exam performance can be compared. To measure impact directly, the following information can be collected for both course exams and tutoring quizzes: average class performance on the assessment, individual scores for each student, and the percentage of students who answered each question correctly. Using these data, two simple comparisons can be made. First, individual student performance on quizzes can be compared to performance on related exams. In the pilot semester, exam performance was 2.4% to 26.7% higher than quiz performance (Table 2). This may indicate that practice with the material through the quizzes better prepared students for exams.
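As a rough illustration of the first comparison, the sketch below computes each student's average quiz score, average exam score, and the gap between them. It assumes two score reports exported as CSV files with `student_id` and `score` columns; the filenames, column names, and layout are hypothetical conveniences for the example, not an actual ExamSoft export format.

```python
# Hypothetical sketch: per-student quiz vs. exam averages.
# Assumes two CSV files, each with columns "student_id" and "score"
# (scores as percentages, 0-100). Column names are illustrative only.
import csv
from collections import defaultdict

def mean_scores_by_student(path):
    """Average all scores per student from one CSV export."""
    scores = defaultdict(list)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            scores[row["student_id"]].append(float(row["score"]))
    return {sid: sum(vals) / len(vals) for sid, vals in scores.items()}

def quiz_exam_gaps(quiz_path, exam_path):
    """Exam average minus quiz average for each student in both files.

    A positive gap means the student scored higher on exams than on the
    practice quizzes, as observed in the pilot semester.
    """
    quiz_avg = mean_scores_by_student(quiz_path)
    exam_avg = mean_scores_by_student(exam_path)
    return {sid: exam_avg[sid] - quiz_avg[sid]
            for sid in quiz_avg if sid in exam_avg}
```

A spreadsheet accomplishes the same comparison; the point is simply that both score sets come from the same platform, so matching students across quizzes and exams is straightforward.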
It is also possible to group students as quiz takers or non-quiz takers based on the number of quizzes each student used during the semester. Ideally, one would measure the minimum number of quizzes necessary to affect course grades, but a simpler approach is to set a cut-off. In this case, since two to three quizzes are available for each exam, students who take at least 50% of the quizzes for a given exam are considered quiz takers, and those taking fewer than 50% are grouped as non-quiz takers. Exam scores are then compared between the groups. Analysis reveals that for a majority of course exams, quiz takers perform better than non-quiz takers (Table 3). The same is true for the semester average: quiz takers score an average of 4.5 points higher on exams. These data emphasize the benefit of the quizzes for students who use them extensively.
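The 50% cut-off grouping could be sketched as follows. The per-student quiz counts and exam scores are assumed to be available as simple mappings (e.g., pulled from the exports above), and the function name is illustrative.

```python
# Hypothetical sketch of the 50% cut-off grouping: students attempting at
# least half of the available quizzes for an exam count as "quiz takers".
def compare_groups(quizzes_taken, exam_scores, quizzes_available):
    """Return (mean exam score of quiz takers, mean of non-quiz takers).

    quizzes_taken:     student_id -> number of quizzes attempted
    exam_scores:       student_id -> exam score
    quizzes_available: number of quizzes offered for this exam (2-3 in
                       the pilot program)
    """
    cutoff = 0.5 * quizzes_available
    takers, non_takers = [], []
    for sid, score in exam_scores.items():
        if quizzes_taken.get(sid, 0) >= cutoff:
            takers.append(score)
        else:
            non_takers.append(score)
    mean = lambda xs: sum(xs) / len(xs) if xs else float("nan")
    return mean(takers), mean(non_takers)
```

A fixed cut-off keeps the grouping simple and transparent; a more rigorous analysis could instead regress exam scores on the number of quizzes taken to estimate a dose-response effect.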
The impact of quizzes can also be measured through surveys of both students and peer instructors. As seen in Table 4, students and peer instructors feel the quizzes are beneficial learning tools. They help increase confidence and perceived performance on exams. Surveys also reveal that peer instructors feel a direct benefit from learning to write quiz questions and reviewing related material.
- Self-assessments are valuable learning tools for students; engaging peer instructors in their creation provides new learning opportunities for the peer instructors and frees faculty time and effort.
- Creating the infrastructure for self-assessments in ExamSoft is similar to that of in-class assessments and only requires a few minor modifications.
- Quiz and exam performance indicate that students benefited from self-assessments.
- Survey feedback demonstrates that students and peer instructors perceive a benefit from self-assessments.
1. Karaksha, A., Grant, G., Anoopkumar-Dukie, S., Nirthana, S. N., and Davey, A. K. “Student engagement in pharmacology courses using online learning tools.” American Journal of Pharmaceutical Education 77, no. 6 (2013).
2. Stewart, D., Panus, P., Hagemeier, N., Thigpen, J., and Brooks, L. “Pharmacy student self-testing as a predictor of examination performance.” American Journal of Pharmaceutical Education 78, no. 2 (2014); West, C., and Sadoski, M. “Do study strategies predict academic performance in medical school?” Medical Education 45, no. 7 (2011): 696–703; Nevid, J. S., and Mahon, K. “Mastery quizzing as a signaling device to cue attention to lecture material.” Teaching of Psychology 36, no. 1 (2009): 29–32.
3. Stewart, D., Panus, P., Hagemeier, N., Thigpen, J., and Brooks, L. “Pharmacy student self-testing as a predictor of examination performance.” American Journal of Pharmaceutical Education 78, no. 2 (2014).
4. Santee, J., and Garavalia, L. “Peer tutoring programs in health professions schools.” American Journal of Pharmaceutical Education 70 (2006); Haist, S. A., Wilson, J. F., Brigham, N. L., Fosson, S. E., and Blue, A. V. “Comparing fourth-year medical students with faculty in the teaching of physical examination skills to first year students.” Academic Medicine 3 (1998): 190–200.