FRCS first pass variance: deanery and specialty contrariety
Chris Brown,1 Tarig Abdelrahman,1 John Pollitt,1 Mark Holt,1 Wyn G Lewis1,2

  1. Wales Post Graduate Medical and Dental Education Deanery, School of Surgery, Cardiff, UK
  2. Upper GI Surgery, University Hospital of Wales, Cardiff, UK

Correspondence to Professor Wyn G Lewis, Wales Deanery Specialty School of Surgery, University Hospital of Wales, Heath Park, Cardiff CF14 4XW, UK; wyn.lewis4@wales.nhs.uk

Abstract

Background FRCS exit examination success may be interpreted as a surrogate marker for UK Deanery-related training quality. The aim of this study was to evaluate relative FRCS examination pass rates related to Deanery and Surgical Specialty.

Methods Examination first attempt pass rates published by the Joint Committee on Surgical Training were scrutinised for type 1 higher surgical trainees, and outcomes were compared by deanery and surgical specialty.

Results Of 9363 FRCS first attempts, 3974 were successful (42.4%). Median and mean pass rates related to Deanery were 42.1% and 30.7%, respectively, and ranged from 26.7% to 45.6%. Median (range) pass rates by specialty were urology 76.3% (60%–100%), trauma and orthopaedic surgery 74.7% (58.2%–100%), general surgery 70.0% (63.1%–86%), ENT 62.5% (50%–100%), cardiothoracic surgery 50.0% (25%–100%), oral and maxillofacial surgery 50% (40.0%–100%), neurosurgery 50% (22.7%–100%), plastic surgery 47.6% (30.0%–100%) and paediatric surgery 25% (16.7%–100%). Significant variance was observed across all specialties and deaneries (p=0.001).

Conclusion Up to threefold variance exists in FRCS examination first attempt success related to deanery; trainees should be aware of this spectrum when preferencing deaneries during national selection.

  • Surgery

Introduction

The arena of surgical education, assessment and published works has a long and distinguished history, originating with Galen of Pergamon (AD 131–201), the accomplished Greek physician and scholar practising in Roman antiquity. The west’s senior surgical college, The Royal College of Surgeons of Edinburgh, has enjoyed a continuous existence since 1505, when the Barber Surgeons of Edinburgh were formally incorporated as a Craft Guild of the Burgh. The associated Charter of Privileges imposed critical duties: every master surgeon was to have full knowledge of anatomy and surgical procedures, all apprentices were to be literate, and thorough testing was to be performed at the end of apprenticeship. From its earliest origins as the Incorporation of Surgeons (renamed RCS City of Edinburgh in 1778), RCS Ed. was therefore an examining body concerned with setting and maintaining professional standards and, during its first two centuries, admitted to membership apprentices trained for 6 years by master surgeons.1

The Royal College of Surgeons of England similarly had a vested interest in its members furnishing proof of training, and in the days of the Company of Surgeons (renamed RCS Eng. 1800), it was customary for candidates to appear before the Court of Examiners after a 7-year apprenticeship to prove their competence. The first published curriculum appeared in 1819, in keeping with the Apothecaries Act (1815), aimed at raising the standard of medical practice, and in 1838, an intercollegiate conference (Edinburgh, Dublin and London) attempted to standardise the UK surgical curriculum with agreed minimum requirements. Finally, in 1884, an accord was reached and the FRCS examination subsequently flourished.2

All of the above duties remain relevant to contemporary surgical practice, and with the development of formal training programmes, examinations are mapped to curricula that are continually revised and updated. Yet, if trainees from different deaneries perform differently in high-stakes evaluations, questions arise regarding the causes of such variation, the most pressing being the extent to which differences in training provision may cause differential performance. Moreover, modern-day FRCS examination success may be interpreted as a surrogate marker for deanery-related training quality. The aim of this study was therefore to evaluate FRCS first attempt pass rates related to UK deanery and surgical specialty, and to determine overall comparative success rates.

Methods

Joint Committee on Intercollegiate Examinations (JCIE) data, related to specialty-specific intercollegiate Fellowship of the Royal College of Surgeons (FRCS) examination results, were accessed from the JCIE website3 for all Section 2 sittings between January 2007 and February 2017. Data for candidates without a national training number were excluded from the analysis. The numbers of candidates successful at their first attempt and the total number of candidates sitting the examination were identified, and percentage first-time pass rates were calculated by region and specialty. Regional deaneries were subsequently ranked by all-specialty and by individual-specialty examination performance.
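
As a point of clarity for readers, the sketch below shows, with invented records and assumed column names, how percentage first-time pass rates and the resulting deanery rankings can be derived from data of this kind; it is a minimal illustration, not the authors' actual workflow.

```python
# Minimal sketch (invented data, assumed column names), illustrating how
# first-attempt pass rates and deanery rankings can be derived.
import pandas as pd

# One row per first FRCS Section 2 attempt by a national training number holder
attempts = pd.DataFrame({
    "deanery":   ["Wales", "Wales", "London", "London", "Scotland East"],
    "specialty": ["General Surgery", "Urology", "General Surgery",
                  "Neurosurgery", "General Surgery"],
    "passed":    [1, 0, 1, 1, 0],
})

# Percentage first-time pass rate by deanery, and by deanery and specialty
pass_rate_by_deanery = attempts.groupby("deanery")["passed"].mean().mul(100).round(1)
pass_rate_by_cell = (attempts.groupby(["deanery", "specialty"])["passed"]
                     .mean().mul(100).round(1))

# Rank deaneries by all-specialty pass rate (1 = best), as in table 1
deanery_rank = pass_rate_by_deanery.rank(ascending=False, method="min")
print(pass_rate_by_deanery, pass_rate_by_cell, deanery_rank, sep="\n\n")
```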

Statistical analysis

Statistical analysis appropriate for non-parametric data was performed using SPSS Statistics for Macintosh V.23.0 (IBM, Armonk, New York, USA).
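
The Results section names Friedman's two-way analysis of variance by ranks as the non-parametric test used; as a hedged illustration only, the sketch below applies the equivalent SciPy routine to an invented deanery-by-specialty pass-rate matrix (it does not reproduce the published SPSS analysis).

```python
# Illustrative only: Friedman's two-way analysis of variance by ranks applied
# to an invented pass-rate matrix (rows = deaneries as blocks, one list per
# specialty as the repeated measure). Not the published SPSS analysis.
from scipy.stats import friedmanchisquare

general_surgery = [70.0, 63.1, 86.0, 72.5]   # four hypothetical deaneries
urology         = [76.3, 60.0, 100.0, 80.0]
neurosurgery    = [50.0, 22.7, 100.0, 45.0]

statistic, p_value = friedmanchisquare(general_surgery, urology, neurosurgery)
print(f"Friedman chi-square = {statistic:.2f}, p = {p_value:.3f}")
```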

Results

Data for all 20 regions and all nine specialties were available. Deanery ranking for pan-specialty examination performance is shown in table 1.

Table 1

Deanery ranking by FRCS examination performance across all specialties (cardiothoracic surgery, general surgery, neurosurgery, oral and maxillofacial surgery, otolaryngology, paediatric surgery, plastic surgery, trauma and orthopaedic surgery, and urology)

Median and mean first attempt pass rates by deanery were 42.1% and 30.7%, respectively, ranging from 26.7% to 45.6%. First attempt pass rates related to specialty are shown in figure 1 (overall median 50%, range 25%–76.3%). Regional ranking by surgical specialty is shown in table 2.

Figure 1

Boxplot of FRCS first attempt pass rate by specialty. CTS, cardiothoracic surgery; ENT, otolaryngology; GS, general surgery; NS, neurosurgery; OMFS, oral and maxillofacial surgery; PAN, pan-specialty; PL, plastic surgery; PS, paediatric surgery; TO, trauma and orthopaedics; Ur, urology.

Table 2

Regional ranking related to specialty

Table 3 demonstrates regional specialty examination performance ranking by quartiles, and table 4 demonstrates regional performance in terms of the proportion of deanery appearances in respective quartiles based on individual specialty examination pass rates. Statistically significant variability was observed globally across all specialties and regions (p=0.001), but not when assessed by individual specialty (p=0.457, Friedman’s two-way analysis of variance).
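
One plausible construction of the quartile groupings underlying tables 3 and 4 is sketched below, using invented deanery names and pass rates; the authors' exact binning and labelling convention may differ.

```python
# Hypothetical example: assign each deanery's specialty pass rate to a quartile,
# then summarise the proportion of appearances per quartile (cf. tables 3 and 4).
import pandas as pd

pass_rates = pd.Series({
    "Deanery A": 82.0, "Deanery B": 64.0, "Deanery C": 55.0, "Deanery D": 71.0,
    "Deanery E": 48.0, "Deanery F": 90.0, "Deanery G": 60.0, "Deanery H": 75.0,
})

# Four equal-frequency bins, lowest to highest pass rate
quartile = pd.qcut(pass_rates, q=4, labels=["lower", "mid-low", "mid-high", "upper"])
print(quartile.value_counts(normalize=True).mul(100).round(1))
```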

Table 3

Regional specialty examination performance ranking by quartiles (Q3, upper quartile; Q1, lower quartile)

Table 4

Regional performance by combined, all-specialty pass rate quartile distribution (all values are in percentages)

Discussion

This is the first study to demonstrate regional and specialty-specific examination-performance-based ranking of UK higher surgical training programmes. Any reasonable observer would be surprised that successful outcomes of such a high-stakes professional evaluation as the FRCS examination varied to such a degree. It was evident from the data that some regions performed more strongly in certain specialties than others, with pass rates ranging from 0% to 100% across all specialties. The biggest variation was observed in the smaller surgical specialties: paediatric surgery was associated with the largest, fivefold, variance (pass rate 16.7%–100%), followed by neurosurgery (fourfold variance, 22.7%–100%) and plastic surgery (threefold variance, 30%–100%). The specialties with the least variation were general surgery (1.3-fold, 63.1%–86%) and trauma and orthopaedics (1.6-fold, 58.2%–94.9%).
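
For clarity, the fold variance quoted above is simply the ratio of the highest to the lowest deanery pass rate within a specialty, as in this minimal check using the ranges given in the text.

```python
# Fold variance = highest deanery pass rate / lowest deanery pass rate
print(round(86.0 / 63.1, 2))   # general surgery: ~1.36-fold
print(round(94.9 / 58.2, 2))   # trauma and orthopaedics: ~1.63-fold
```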

Fitzgerald and Giddings of The Royal Marsden, London,4 reported similar findings in 2011 related to the MRCS examination, citing first attempt pass rates ranging from 54% to 94%. The responsible influences were acknowledged to be likely multifactorial, and without prior knowledge of the academic and clinical acumen of candidates it was impossible to ascertain how much regional initiatives or prior undergraduate education may have contributed to the examination outcomes. This theory has previously been examined in respective papers by McManus et al and Bowhay et al, who described the significant influence that variation in undergraduate medical school attendance had on Membership of the Royal College of Physicians5 (MRCP UK) and Fellowship of the Royal College of Anaesthetists6 (FRCA) examination performance, respectively. While these are valid points, the exact influence that both undergraduate and early postgraduate training may have on Higher Surgical Training (HST) exit examination performance is unclear, given the variation in trainee demographics and the range of foundation and core surgical training programmes that exist. In contrast, after SpR appointment to an HST programme, successful Certificate of Completion of Training (CCT) award typically takes a minimum of 6 years, likely the longest period a trainee will have spent in a single region, and consequently it would seem reasonable to interpret FRCS success as a surrogate marker for deanery-related training quality.

Contemporary national selection systems, which operate for most UK HST programmes, make it possible for a trainee to apply to a region in which they have never trained, or which they have never even visited; consequently, any information available with regard to training quality is high-value currency. The publication of annual academic achievement league tables has been employed by the Universities of Oxford and Cambridge since 1963 and 1981, respectively, as a means of ranking colleges by undergraduate examination performance.7 More recently, ranking of both national and international performance in higher education has become common,6 with three annual UK university rankings published and aimed at assisting prospective student choice. Despite their popularity, league tables must be interpreted appropriately and with caution. Their obvious advantage is that they can provide valuable consumer information with regard to quality, a crucial factor for prospective candidates to weigh when choosing training locations. League tables also provide a framework that demonstrates accountability, accomplishment and quality assurance, and may contribute to improved quality by stimulating interinstitutional competition.8

Ideally, ranking should not be based on a single factor, but compiled using a range of credible factors, with weighting underwritten by theory. Sadly, this is frequently where ranking systems fail, risking being methodologically unsound, unfair or only selectively measuring an institution’s achievements.9 Bowden, in criticising British university rankings, warned of the risk that ranking fails to provide students with the critical information needed to make informed choices; evaluation of institutions’ relevant strengths and approaches to teaching and learning provides better measures than league-table position.10 In industry, where quality control is allied to success, a different system termed statistical process control is employed, which, rather than judging the finished article, examines the process as a whole.11 12 This model employs control charts and recognises that the outputs of even the most perfectly tuned production system inevitably show some variation, and that even under ideal conditions similar providers (eg, regions or doctors) will seldom match each other’s performance exactly.13
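
To make the statistical process control idea concrete, the sketch below computes simple three-sigma binomial control limits (a p-chart) around the overall first-attempt pass rate for a few invented deaneries; a deanery would only be flagged if it fell outside the limits, that is, beyond the variation expected even among similar providers. This is an illustrative assumption, not an analysis performed in the study.

```python
# Illustrative p-chart: three-sigma binomial control limits around the overall
# first-attempt pass rate. Deanery names and counts are invented.
import math

overall_rate = 0.424                    # overall first-attempt pass rate reported above
deaneries = {"Deanery A": (130, 300),   # (first-attempt passes, first attempts)
             "Deanery B": (95, 310),
             "Deanery C": (150, 320)}

for name, (passes, n) in deaneries.items():
    observed = passes / n
    sigma = math.sqrt(overall_rate * (1 - overall_rate) / n)
    lower, upper = overall_rate - 3 * sigma, overall_rate + 3 * sigma
    status = "within" if lower <= observed <= upper else "outside"
    print(f"{name}: {observed:.1%} ({status} control limits {lower:.1%}-{upper:.1%})")
```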

The data presented in this review are in the raw format reported by the JCIE, and arguably the deaneries of Malta and the Armed Forces could be excluded from the analysis. Although not strictly part of the UK national training scheme, Malta’s FRCS results are published in parallel with those of the remainder of the UK, but the small number of candidates means the data must be interpreted with caution. Results from the Armed Forces are recorded as a separate deanery, yet this does not correspond to a geographical area: the Ministry of Defence allocates trainees to individual deaneries based on trainee need and preference, and attributing performance and training quality to the Ministry of Defence per se would be misleading. Nevertheless, the high ranking reported in trauma and orthopaedic surgery and plastic surgery may arguably be attributed to the additional training opportunities offered by association with the Ministry of Defence.

There are a number of inherent and potential limitations of this study, which fall in line with the general limitations of league tables described above. The regional and specialty rankings presented are based on a single factor, namely FRCS examination performance (pass or fail), and have therefore been analysed on a binary rather than a continuous scale. The weight given to this as a representation of training quality is clearly a confounding variable. On the other hand, all of the data are publicly available from the JCIE website and must be assumed to be accurate, spanning a decade of examination sittings, with results subject to rigorous quality control.

Conclusion

In the current, unsettled climate of training reconfiguration,14 15 robust, demonstrable training quality assurance is vital. No nationally agreed consensus for the demonstration of overall training quality or regional strengths by deanery currently exists, and it cannot be argued that a league table based on FRCS results alone provides accountability. Measures to reduce inter-deanery variability should be considered, likely related to more uniform setting of curriculum-related global objectives, professionalising the role of surgical trainers, and an enhanced and revised ARCP process, as suggested in the forthcoming Improving Surgical Training pilot.16 Formulation of an annually published, regional training quality assurance system based on the JCST CCT certification guidelines would be advantageous and welcome; this should facilitate the development and maintenance of sustainable high-quality surgical training.

Main messages

  • In the current, unsettled climate of training reconfiguration, robust demonstrable training quality assurance is vital.

  • Trainees should be aware of the regional spectrum of FRCS examination first attempt success rates when preferencing deaneries during national selection.

  • Formulation of an annually published, regional training quality assurance system based on the Joint Committee on Surgical Training CCT certification guidelines would be advantageous and welcome.

Current research questions

  • Should there be a nationally agreed consensus for the demonstration of overall training quality or regional strengths by deanery or region?

  • Why does such large regional variation in FRCS examination first attempt success rates exist despite a nationally agreed curriculum?

  • Would ranking of regions by training outcomes prove to be a beneficial step towards global improvement in training quality and outcomes?

  • What is the best method of defining quality in training?

References

Footnotes

  • Contributors CB: data collection, write-up and submission. TA: data collection, data analysis. JP: critical revision of the article. MH: critical revision of the article. WL: oversaw project, final approval prior to submission.

  • Competing interests None declared.

  • Provenance and peer review Not commissioned; externally peer reviewed.
