Assessment Resources for Medical Education

August 19, 2015 Kristen Hicks

No matter how much you mean to explore the research and ideas to improve your courses each semester, we know how hard it can be to just sit down and do it. To make the process a little easier on you, we’ve collected some of the best resources we’ve got for medical educators all in one place. The beginning of the new semester is close, but for now you still have a little time to learn.

Tips and Resources to Improve Assessment in Medical Schools

Webinar: The Value-Add of Outcome Measures: Processes and Tools for Assessing Student, Course/Module, and Program Effectiveness

Of course, you want to consistently produce positive results for your students, but with administrative and accreditation pressures, you need to be able to show evidence of those results as well. Measuring your progress as you go with the help of assessment tools can help you achieve both goals. You’ll do a better job of recognizing how well your students are reaching your learning objectives (and gain a clearer picture of what to do if they’re not), and you’ll have the assessment data to prove it.

Webinar: Using Learner Data for Curricular Quality Improvement

A lot of work goes into developing your program’s curriculum. By creating a coding structure to better track how well the curriculum accomplishes your program’s learning goals, you can make sure the time you spend on curriculum planning is well spent.

White Paper: Design of a Tagged Electronic Database of Exam Questions (TEDEQ) as a Tool for Assessment Management within an Undergraduate Medical Curriculum

The primary goal of an exam is to find out how well students understand the material. With exam blueprinting, you also achieve a secondary goal: to track how successful specific lectures, assignments, and units of class time were at helping students understand the material. This white paper digs into the benefits of tagging your exam questions to track the successes (and weaknesses) of your curriculum.

Blog Post: Using Rubrics to Measure Affective Learning

Most educators hear “assessment” but think “test.” So much of what students need to learn in medical school, however, can’t be easily covered on an exam. With rubrics, you can extend assessment beyond the topics that easily translate to the question-and-answer format and measure the outcomes of more complex skills, like how well medical students handle delivering difficult news or their level of professionalism.

Case Study: Improving Student Retention and Enhancing Learning Outcomes with Detailed Performance Reports

Nobody likes to see a medical school dropout. The lost potential and wasted time and money are a shame for all involved. The Touro College of Osteopathic Medicine was committed to improving their retention rates so that more of the gifted medical students in their program could go on to realize their potential.

To achieve this, they focused on providing better, more detailed feedback through automated performance reports. By spending a little extra time before each exam tagging questions with the learning objectives they were designed to measure, teachers gained a detailed, easy-to-understand report showing where students were doing well and where they were struggling. The improved feedback translated to higher retention rates for the program.
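At its core, this kind of report is a group-and-average over tagged exam results. As a rough illustration only (the objective tags and scores below are invented for the example, and this is not how ExamSoft itself works internally), the idea looks something like this:

```python
# Sketch of tag-based performance reporting: each exam question carries a
# learning-objective tag, and averaging results per tag shows which
# objectives (and the teaching behind them) need attention.
from collections import defaultdict

def performance_by_objective(results):
    """results: list of (objective_tag, fraction_of_students_correct)."""
    scores_by_tag = defaultdict(list)
    for tag, pct_correct in results:
        scores_by_tag[tag].append(pct_correct)
    # Average the question-level results within each objective.
    return {tag: sum(s) / len(s) for tag, s in scores_by_tag.items()}

# Hypothetical exam data: two questions per objective.
exam = [
    ("cardiac-physiology", 0.92),
    ("cardiac-physiology", 0.88),
    ("renal-pharmacology", 0.54),
    ("renal-pharmacology", 0.61),
]

report = performance_by_objective(exam)
for tag, avg in sorted(report.items()):
    print(f"{tag}: {avg:.0%}")
```

A low average for an objective flags the lectures and assignments tied to it for review, which is the feedback loop the case study describes.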

All educators know that there’s always room to learn and improve. Take a little time before the semester gets going to learn from the experiences of some of your colleagues. You and your students can reap the benefits of their trial and error.

