Soon students will be back for the start of another academic term. For the past 20 years I have tried to schedule an advising meeting with each of my students in our healthcare graduate program. Typically, these advising sessions involve pleasant conversation about what they did over the holiday break, how the previous semester went, and their goals and concerns for the upcoming semester. The level of engagement, for the student and for me, depends on how much time we have and how committed we are to the overall advising process. Most of my actual “advising” involves asking them to self-assess their learning from the previous semester, how prepared they were for the final exams and how they performed on them, and what, if anything, they plan to do differently to improve their learning process in the semester ahead. Any advice I give them is anecdotal, and many times the session amounts to informal discussion, commentary, and pleasant, if unimportant, chat.
However, I am beginning to envision a new paradigm for what these advising sessions could be, now that we are starting to use our new assessment platform, ExamSoft, within our program. As we categorize our first batch of questions by Bloom’s taxonomy level, by foundational vs. applied vs. management-level content, and by curricular thread, it may soon be possible to hold advising sessions built around data-driven, learner-centered reports that give students and advisors a new focus on learning.
Looking at case studies from academic programs that have been using this type of assessment software for some time provides a glimpse of what is possible. In the future, my vague question of “how did you do on your final exams?” can be replaced with a discussion grounded in the student’s learning metrics from the previous semester: data-driven feedback on performance in comparison to other students, in relation to program learning objectives, and against the competencies needed to successfully become a well-prepared practicing physical therapist in the not-too-distant future. When I can see that a student’s performance is above average on lower-level recall questions but well below average on higher-level integration questions, I can be a much more effective advisor and active mentor in that student’s learning. The program is also in a much better position to examine aggregate data that can inform the curriculum committee’s priorities for faculty development related to assessment and curricular planning.
In short, so many other fields have moved to data-driven decision making and performance feedback, and higher education sorely needs to follow. If we are being asked to document student learning, tools like this seem critical for making that happen.
The start of the academic term is upon us. Are we ready for a new approach to learning this year?