A Journey to Assessment Success

December 1, 2015 Brad Marcum

In the late 1980s, I was a young recipient of a graduate degree in English. I was eager to introduce students to the joys of writers such as Shakespeare, Samuel Beckett, Alice Walker, and Doris Lessing. I was prepared to show my soon-to-be students what a fine eye I had for the nuances of great literature, an eye that would transform their lives and help them become great scholars themselves.

I interviewed for a variety of teaching positions and discovered the wayward traveling life of an adjunct professor. I crossed state lines, adjusting my class times for time zones and daylight saving time, armed with a giant briefcase full of developmental English textbooks: a modern literary hobo, patching together meager paychecks to keep my small apartment. I was nowhere near the great literature; I was waist-deep in developmental English courses. It was during this period of my teaching life that I first started reading articles about a movement to assess how effective we were at educating students. I had a good relationship with the English chair at one of the community colleges where I was an adjunct, so I asked for her thoughts on the subject, but she made it clear that she did not think much of the idea. My mentor echoed her; he did not like the idea of people telling him that his questions were not effective. I can still hear him:

“Who are these people to tell me about my questions? I have used them for years. They’ve always worked well because I know what I am doing!”

I had been a stats fan for years, though. I played a number of sports, and my buddies always teased me about my fascination with statistics; in a way, that fascination was preparing me for the assessment revolution that was coming to higher ed. I became intrigued by how we could document how effective we were at the business of educating students.

At this point, I became an English faculty member at a small public university in the Midwest. This was my first full-time appointment, and I was eager to dive in and show my colleagues that they had made a good decision in hiring me. I was excited to have my own office (even if, for the first few weeks, it was in a closet), and I worked hard to revive the writing center that had been shut down.

It took three years before we hired a trustworthy assessment director. I had unsatisfactory experiences with her two predecessors, and my distrust was obvious as the new director held her first meeting with the faculty. But then I noticed that she was changing her entire presentation on the fly as she realized her audience was different from what she had expected. I discovered that this was someone who actually knew what assessment was and how it could be used to help us become better instructors.

I left that school to teach at a private liberal arts school in the South. I worked for the undergraduate program in a number of roles, including WebCT coordinator, English instructor, and webmaster, along with so many other positions that my colleagues claimed I was trying to set a Guinness World Record for the most titles ever held at one school. I referenced my baseball past when we introduced ourselves on faculty committees: "I'm Brad Marcum, utility infielder." If the school needed something done, I was usually called in to help.

I became part of Academic Affairs in the medical school. One of my first tasks was to evaluate question-bank programs. I suggested that the school move to computerized testing, since computer-based testing was already used for board exams. Around that time, the people who facilitated the board exams came to campus to walk us through some upcoming changes. They started with their PowerPoint presentation but quickly showed us a video of a person walking with a rather awkward gait, then displayed the question and answer choices a student would use to identify the condition that gait demonstrated. I leaned over to the academic dean and said, "We can't do that on paper." The next question started with heart sounds resonating through the auditorium. Again, they displayed the question and answer choices for students to identify the symptom being demonstrated, and again I leaned over to the academic dean and said that we could not duplicate that on paper.

We had our own examples at about the same time. Our exams included dermatology questions, and we used an overworked copier we called Godzilla to run the images. I do not know how our students made out the dermatology images on those murky black-and-white copies. We also held a series of post-exam reviews with the course director and students so that both sides could discuss the class experience. In one of these, the students mentioned that they had trouble reading the chest X-rays because the copies were very cloudy. The course director became agitated and said, "I gave them an extra hour to look at the X-rays!" My question was, "Did the images get any better on that black-and-white page during that hour?"

Not long after that, I was tasked with evaluating computer-based testing programs. I saw a webinar from ExamSoft and was floored by the reports and statistical information it could produce. We quickly partnered with ExamSoft, and it has dramatically changed how we teach. We had never really used data to make curricular or programmatic decisions at the med school; we gathered large amounts of data, but we did not do much with it. Now we had a wealth of reports we could use to show what we were doing.

That also opened other doors for us. Our paper budget nearly disappeared: we used to print every PowerPoint for every lecture and hand the copies to the students. We switched to ebooks and gave our students iPads so that they could access all of their material there instead of lugging around the large med school textbooks. We also changed how we handled the challenge sessions we held after exams. Previously, when students challenged a question's validity, the course director, and sometimes the associate deans, would make a decision just by looking at the challenge and the question.

I can still remember our first exam with ExamSoft. Our exams ran to more than 400 questions, with all of the courses tested in the same exam. I was in my office, running the reports in ExamSoft to break out the scores for each course, and when that first set of statistical reports opened on my computer screen, I had associate deans slapping my back. They were excited because they could easily see the statistical breakdowns, and now they could analyze the data to make better decisions.

We still had pushback from the faculty. We did computer-based testing only with the incoming first-year students. Until then, the students had taken tests much as they would in an undergraduate program, with each instructor giving a test in the classroom. We switched to one combined exam covering blocks of courses, with the questions grouped five at a time by course. The students who took those exams complained all the way to graduation day about how that testing change ruined their lives. One of our second-year faculty members told me that she would retire before she would work with computer-based testing. I told her, "Well, you have a year then." Sure enough, she resigned the next summer.

We continue to look for ways to use the data more in our decisions. We are still in the early stages in some areas, since our next graduating class will be the first to have used ExamSoft throughout, but we have taken our first steps toward predicting how students will perform in the second year and identifying which first-year courses best feed that model. We are also using the data to project how students will perform on their board exams.
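For readers who like to see the mechanics, here is a rough sketch of what such a first-year-to-second-year model could look like. The course names, the tiny data set, and the use of ordinary least squares are all my illustrative assumptions, not a description of our actual model or of any ExamSoft feature:

```python
# A minimal sketch of a first-year -> second-year performance model.
# Assumptions: the course names, the sample data, and the choice of
# ordinary least squares are illustrative only.
import numpy as np

# Each row: one student's first-year course averages
# [Anatomy, Biochemistry, Physiology]
first_year = np.array([
    [88.0, 91.0, 85.0],
    [76.0, 80.0, 79.0],
    [92.0, 89.0, 94.0],
    [69.0, 74.0, 71.0],
    [84.0, 82.0, 88.0],
])

# Each student's overall second-year exam average.
second_year = np.array([87.0, 78.0, 93.0, 70.0, 85.0])

# Fit an ordinary-least-squares model with an intercept term.
X = np.column_stack([np.ones(len(first_year)), first_year])
coefs, *_ = np.linalg.lstsq(X, second_year, rcond=None)

print("intercept:", round(coefs[0], 2))
for name, c in zip(["Anatomy", "Biochemistry", "Physiology"], coefs[1:]):
    print(f"{name} weight: {c:.2f}")

# Project a second-year average for a hypothetical incoming student.
new_student = np.array([1.0, 81.0, 85.0, 80.0])  # leading 1.0 = intercept
print("predicted second-year average:", round(float(new_student @ coefs), 1))
```

The course weights in a model like this are also what would point to which first-year courses carry the most predictive signal.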

We are also looking to do a better job of using the data as an early warning system so that we can reach struggling students more quickly. We are giving our students more information about how they performed on the exams through the available reports, something that has not always been comfortable for us.

We have had discussions about using the data alone to decide which questions to keep, retire, or modify for future exams. That one is still being debated: letting the numbers decide takes out the ego factor, but many faculty still want the human element to remain. The main thing for me is that we are discussing these ideas and listening to each other.
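Since that debate ultimately comes down to a couple of numbers per question, here is a rough sketch of the kind of item statistics involved: difficulty (the proportion of students answering correctly) and point-biserial discrimination (how well an item separates strong from weak test-takers). The sample data and the flagging thresholds below are my illustrative assumptions, not ExamSoft's rules or our school's policy:

```python
# A minimal sketch of the item statistics behind "keep, retire, or revise"
# decisions: difficulty and point-biserial discrimination.
# The response data and the thresholds are illustrative assumptions.
import numpy as np

# Rows = students, columns = items; 1 = correct, 0 = incorrect.
responses = np.array([
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
    [1, 1, 0, 1],
    [0, 0, 1, 1],
])

total_scores = responses.sum(axis=1)

for item in range(responses.shape[1]):
    item_scores = responses[:, item]
    difficulty = item_scores.mean()  # proportion who answered correctly

    # Point-biserial: correlation between this item and the rest-of-test
    # score (the item itself is excluded so it cannot inflate the result).
    rest = total_scores - item_scores
    if item_scores.std() == 0 or rest.std() == 0:
        discrimination = float("nan")  # no variance: statistic undefined
    else:
        discrimination = np.corrcoef(item_scores, rest)[0, 1]

    # Hypothetical flagging rule: very easy, very hard, or weakly
    # discriminating items get set aside for human review.
    too_extreme = difficulty > 0.95 or difficulty < 0.25
    flag = "review" if too_extreme or discrimination < 0.15 else "keep"
    print(f"item {item + 1}: difficulty={difficulty:.2f} "
          f"discrimination={discrimination:.2f} -> {flag}")
```

In practice the "review" flag is exactly where the human element the faculty want would come back in: the numbers nominate, and people decide.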

It is still hard for me to believe that I now travel around the country giving presentations on how to use statistical information to make changes in programs. It is hard to think that the Shakespeare in me is telling people how to use statistics. It has been a long journey that started out roughly. Despite that, I have seen the light of assessment, and it is no longer a train rushing at me as it was early in my professional career. Now, I let that light shine on me.

There are still groups out there that are hesitant about assessment in higher ed. We are keeping a close eye on the politicians, who keep muttering about stepping in to show us how we should assess ourselves. I also see the lines of demarcation when my colleague and I present: when I mention that I come from a faculty perspective, the faculty in the room perk up and the assessment people tune out, and the reactions reverse when my assessment-director colleague talks about her background. Still, I see more people listening and asking questions, and fewer people complaining and asking whether assessment is necessary.

 

About the Author

Brad Marcum

Mr. Brad Marcum taught English for 16 years at Wright State University and Central State University in Ohio before moving to Pikeville, KY, in 2002. Currently, he serves as Director of Academic Data Services at the University of Pikeville Kentucky College of Osteopathic Medicine, where he works closely with the school’s faculty and students in administering online learning. On the side, he enjoys coaching basketball and baseball.
