My Students Cheated and That’s OK

By Kali M. VanLangen, PharmD, BCPS

We all talk to our students about academic dishonesty. Whether on the first day of professional school or during course orientation, it's been discussed. Students know not to cheat, but do they understand what that means? It seems intuitive to faculty, but it may not be so clear to students.

We recently held our end-of-semester skills-based assessments. With approximately 150 students, not everyone is able to complete the assessment simultaneously. The assessment runs in a 4-hour time frame, but each student needs only 60 minutes to complete all stations, which leaves opportunities for students to share information. Midway through the assessment, I received an e-mail about students discussing the assessment in a common area. As the assessment progressed, more reports of academic dishonesty rolled in.

Why does this happen?
This experience brought to light that many students don't view sharing case information from a skills-based assessment as cheating. I reached out to colleagues on AACP Connect and received several responses indicating that others have had similar experiences with these types of assessments. Despite the fact that we call our assessments Objective Structured Clinical Exams (OSCEs), students tend not to view these activities as exams. Following the assessment, we met with each student who was reported, and several commented that they don't see OSCEs as exams and didn't realize they were cheating until it was brought to their attention.

National graduate survey results from 2017 indicate that 98% of students are aware of expected behaviors with respect to professionalism and academic dishonesty, yet we still have issues.1 In a survey of pharmacy students on academic dishonesty, approximately 53% of students either agreed or strongly agreed that cheating occurs on every pharmacy exam while 54.4% of respondents either agreed or strongly agreed that cheating is normal.2

Does it matter?
As faculty, we get frustrated with this type of behavior. These are professional students; shouldn't they know not to discuss the cases with others? After this experience, I took the time to look more closely at student performance to see if there was a change from the beginning of the assessment to the end. Anecdotal reports suggested a perceived difference, but I wanted to know more.

The analysis showed that of the 30 students who needed to remediate, only 36.7% were from the beginning groups. The remaining 63.3% of remediation students came from the groups that began the OSCE after the first group finished. Average scores on each station also did not show much of a difference between the first groups and the last groups.

                        Graded Station #1*   Graded Station #2*   Graded Station #3*
Student Groups 1–7            8.38                 7.70                 9.32
Student Groups 8–20           8.60                 7.61                 8.84

*Each graded station is worth 10 points.

These results are similar to OSCE performance seen in third-year medical students, but they do differ from a simulated OSCE breach in which scores improved when detailed information was provided.3,4 So perhaps it doesn't matter that they talked. After all, this is a SKILLS-based assessment. Are we measuring students' ability to perform a skill rather than asking them to regurgitate information as on a multiple-choice exam? For my assessment, the students are given a patient case, and the objective is to identify and help the patient overcome a barrier. There is a lot that goes into that process, and communication skills are key. Communication skills don't improve significantly just because a student knows the topic of the case in advance. Students still need to empathize with the patient, use motivational interviewing skills to help the patient overcome the barrier, and ultimately make an appropriate plan with the patient. Having knowledge of the case may impact their ability to develop a plan, but it doesn't necessarily impact their ability to communicate that plan clearly and effectively.

Steps We Can Take
Academic dishonesty should still be discouraged. As faculty, we should help students realize that these assessments are exams and that discussing scenarios is academic dishonesty. There are further steps we can take to limit this behavior, including:

  • Require students to sign an honor code annually and/or prior to each assessment
  • Quarantine students to prevent communication between groups
  • Remind students to refrain from sharing information
  • Use multiple rotating cases

Final Thoughts
Skills-based assessments are vital to assessing our curricula, and maintaining their integrity is critical. We must also remember that we are evaluating skills, and in most cases the students already know which skills they are being evaluated on. The patient case is the vehicle that puts the skill into context. Students sharing patient case information may not be as big of a deal as we make it out to be.

References

  1. American Association of Colleges of Pharmacy. Graduating Student Survey 2017 National Summary Report. July 2017. https://www.aacp.org/sites/default/files/2017-10/2017_GSS_National%20Summary%20Report.pdf
  2. Rabi SM, Patton LR, Fjortoft N, et al. Characteristics, prevalence, attitudes, and perceptions of academic dishonesty among pharmacy students. Am J Pharm Educ. 2006;70(4):Article 73.
  3. Parks R, Warren PM, Boyd KM, et al. The objective structured clinical examination and student collusion: marks do not tell the whole truth. J Med Ethics. 2006;32:734-738.
  4. Gotzmann A, De Champlain A, Homayra F, et al. Cheating in OSCEs: the impact of simulated security breaches on OSCE performance. Teach Learn Med. 2017;29(1):52-58.


Kali M. VanLangen is an Associate Professor of Pharmacy Practice at Ferris State University College of Pharmacy. Her research interests include APPE readiness and the use of electronic health records in the laboratory setting. In her free time, Kali enjoys spending time outdoors with her husband and two daughters.


Pulses is a scholarly blog supported by a team of pharmacy education scholars.
