Responding to Student Feedback on ExamSoft

August 20, 2015 Carla Hernandez

Each semester in higher education is full of demands on faculty and administrators. Recurring challenges, coupled with ironing out the logistics of a new computer-based testing program, can make the semester seem even busier. Regardless of the pressures they encounter, key administrators and faculty must not overlook the fact that exam takers are also integral to a successful ExamSoft implementation. Perfecting the logistics and coordination behind the program may seem most important, but recognizing the importance of the exam takers’ experience is just as crucial. During our institution’s recent first semester of ExamSoft use, exam takers from two academic programs were vocal about their preferences from the first introduction of computer-based testing, so we made sure to listen.

As many schools likely find when introducing computer-based testing, there was resistance to manage and tough feedback to hear. Mock exams were first administered to introduce exam takers to the new software. We observed some noticeable frustration, but the mock exams also gave key administrators an opportunity to gauge other initial concerns. Any issues experienced by exam takers at the mock exams and subsequent exams were documented. We tracked how frequently repeated problems occurred and added our institution to any corresponding open tickets through the community or our ExamSoft account manager. Surveys were disseminated to all exam takers during that semester, and the responses made it clear that the transition from traditional paper exams to computer-based testing was not wholeheartedly accepted. Regardless of whether their reactions were positive or negative, students were encouraged to share their feedback and constructive criticism throughout the semester.

The surveys confirmed what we suspected: a majority of exam takers from both academic programs using ExamSoft were not comfortable with computer-based testing. However, the surveys also revealed details we would not have known otherwise. A majority of the students surveyed in one course did not agree that the student reports identified gaps in their learning, which alerted the course’s instructor that the categorization process needed some modification. On a more positive note, 70% of the same group of students said they would be interested in seeing more reports on their performance. Responses to other questions revealed specific exam taker preferences, such as the formatting of certain questions, the display of attachments, and the structure of exams. Students also said they would have preferred a more detailed orientation to ExamSoft’s interface and features.

Based on the feedback from our pilot period’s exam taker groups, we learned that providing students with resources early on is essential. For one of our allied health programs, a more comprehensive ExamSoft segment will be presented during the orientation of incoming students. A concise ExamSoft orientation for exam takers in other academic programs is also under discussion. A condensed student resources manual has also been shared with exam takers to answer the most frequent questions we have received thus far.

Following the pilot period, we learned that as intuitive as technology seems to be for students, extra support is still needed for users new to computer-based testing. Exams are stressful by nature, and we do not want to exacerbate that stress by sending students into a new testing environment feeling any more apprehensive than they already are. We will continue to monitor student feedback as we expand our use of different question types and as more courses begin to utilize ExamSoft. As difficult as it may be to receive harsh criticism, the only way to continually improve our model of computer-based testing is to be aware of the processes that work best for all stakeholders and to remain cognizant of the processes that could be improved.


About the Author

Carla Hernandez

Carla Hernandez is a Data Administrator with the Office of Assessment and a Key ExamSoft Administrator in the College of Pharmacy and Health Sciences at St. John’s University in Queens, New York. Carla facilitates data management and exam administration analysis for the Health Sciences programs in the Department of Clinical Pharmacy Practice. Her research interests include higher education assessment, improvements in student learning, and quantitative statistical analysis of student learning.
