Clinical examination in the OSCE era: are we maintaining the balance between OS and CE?
Alexandre Lafleur,1 Jimmie Leppink,2 Luc Côté3

1 Department of Medicine, Laval University Faculty of Medicine, Quebec City, Quebec, Canada
2 School of Health Professions Education, Maastricht University, Maastricht, Limburg, The Netherlands
3 Department of Family and Emergency Medicine, Laval University Faculty of Medicine, Quebec City, Quebec, Canada

Correspondence to Dr Alexandre Lafleur, Faculté de médecine, Pavillon Ferdinand-Vandry, bureau 2207C, Université Laval, 1050, avenue de la Médecine, Québec City, Québec, Canada G1V 0A6; alexandre.lafleur@fmed.ulaval.ca


After more than three decades of assessing clinical competence through Objective Structured Clinical Examinations (OSCEs), are medical trainees viewing clinical examination through the lens of an assessment method? Has the structure (the OS) become more important than the skill itself (the CE)? For many students, the standards required to pass an OSCE have become the skills expected for clinical competence. Most recent textbooks on clinical examination are based on the structure needed to succeed in an OSCE, yet OSCEs were not designed to teach an ideal framework for performing a clinical examination, and most medical encounters require a more flexible, patient-centred approach. How can we ensure that medical students focus less on the OS and more on learning ‘the art’ of the CE?

The OSCE remains a very important method for assessing students in many medical training programmes and in national examinations. The task demands of OSCEs and the importance of the examination exert a strong influence on students.1,2 Therefore, the pre- and post-assessment effects of OSCEs on students' learning may take precedence over the demands of other assessment methods. As recently stated by Harden,3 more than 1600 articles have been published on OSCEs, and one quarter of those have been published since 2011. However, very few studies have focused on the impact of the ‘OSCE era’ on how students learn and perform clinical examinations.1,2,4 In agreement with Hodges,5 we believe that OSCEs may have shaped the roles doctors play in clinical practice, an area worth investigating.

In our recent studies focusing on pre-assessment effects, fifth-year medical students were given 25 min to study in pairs how to perform an articular examination in an OSCE. Interestingly, even in fully equipped rooms, students practised for less than 6 min and intentionally devoted their time to discussions, which they perceived as more effective for learning about the structure of OSCEs.6 This resonates with the findings of Rudland et al,4 who asked fifth-year medical students why preparing for OSCEs is helpful: nearly 80% of students referred to the structural aspects of OSCEs and only 7% to improving examination and communication skills. Worryingly, what students refer to as ‘practising for OSCEs’ could really mean discussing a structured routine outside of the clinical environment rather than engaging in deliberate practice.4,6 Depending on the type of OSCE, it is therefore not uncommon for a student to perform the technique of CE well yet be unable to explain their diagnostic reasoning to the observer, since the OS was the focus of their preparation.4,6,7

As most medical students adopt a strategic approach when studying for OSCEs, assessors need to be equally strategic to ensure meaningful learning from OSCEs. Students will observe the task demands of an OSCE and whether the emphasis is on the structure (eg, conducting a normal shoulder examination; a part task) or on achieving a genuine global performance (eg, examining a painful shoulder to justify their diagnosis; a whole task).6,7 Whereas part-task assessments fragment the CE, whole-task assessments remain meaningful and coherent, reflecting how the CE is performed in the workplace. We demonstrated that this perception of the task demands influences the use of diagnostic reasoning and study strategies when learning how to conduct a physical examination.6,7 In other words, if assessors notice that students are rehearsing a routine, OSCEs can be redesigned to assess a meaningful whole task. The introduction of global rating scales and increasing the weight of discriminating or evidence-based items in assessment checklists are promising in that regard.8 The hypothesis-driven physical examination, in which students are asked to elicit and interpret the findings of a physical examination in order to reach a diagnosis, provides an inspiring framework.9

As part of a programmatic approach, other assessment methods should be strategically combined with OSCEs to assess clinical competence.10 For instance, mini-clinical evaluation exercises (mini-CEX), entrustable professional activities (EPAs) and video recordings of actual practice are valid methods currently used in the workplace. Pending clarification studies, we can expect students to redirect their focus from the OS to the CE in these whole-task assessments.1,6,7 Consequently, a combination of assessments in which trainees actually perform a clinical examination, instead of only ‘showing how’, will help maintain the balance between OS and CE.10


Footnotes

  • Competing interests None declared.

  • Provenance and peer review Not commissioned; internally peer reviewed.
