

Communication skills assessment in the final postgraduate years to established practice: a systematic review
  Amy E Gillis1, Marie C Morris2, Paul F Ridgway2,3
  1Department of Surgery, Tallaght Hospital, Tallaght, Dublin, Ireland
  2Education Division, School of Medicine, University of Dublin, Trinity College, Dublin, Ireland
  3Department of Surgery, University of Dublin, Trinity College, Tallaght Hospital Campus, Dublin, Ireland
  Correspondence to AE Gillis, Rm 1.36 Trinity Centre for Health Sciences, Tallaght Hospital, Dublin 24, Ireland; agillis{at}


Introduction Communication breakdown is a factor in the majority of instances of medical error. Despite its importance, relatively little time is invested in communication skills in postgraduate curricula. Our objective was to systematically review the literature to identify the tools currently used to assess communication skills in postgraduate trainees in the latter 2 years of training and in established practice.

Methods Two reviewers independently reviewed the literature published between January 1990 and 15 August 2014 to identify communication skills assessment tools used with postgraduate trainees in the latter 2 years of training and in established practice, following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses framework and predefined inclusion/exclusion criteria. Databases searched: PubMed/CINAHL/ERIC/EMBASE/PsycInfo/Psyc Articles/Cochrane.

Results 222 articles were identified for abstract review; after review, 34 articles fulfilled the criteria for complete evaluation. The majority (26) had a high level of evidence, scoring 3 or greater on the Best Evidence Medical Education guide. 22 articles used objective structured clinical examination/standardised patient (SP)-based formats in an assessment or training capacity. Evaluation tools included author-developed questionnaires and validated instruments. Nineteen articles described an educational initiative.

Conclusions The reviewed literature is heterogeneous in its objectives and measurement techniques for communication. Observed interactions, with patients or SPs, are the currently favoured method of evaluation, usually scored with author-developed questionnaires. The value of self-evaluation of skill level is questioned. The need for a validated assessment tool for communication skills is highlighted.

  • assessment
  • independent practice
  • postgraduate




Communication breakdown is a factor in >50% of all postoperative complications, 70% of all medication errors and 80% of the delays in treatment that result in death or permanent loss of function for patients.1 Poor communication can arise at any level of patient and healthcare worker interaction, and most errors occur in verbal communication.2 Awareness of adverse events in healthcare is increasing, and many such events are attributed to behavioural failures rather than failures of technical expertise.3 Litigation for medical error is also increasing; physicians who provide information, spend more time defining patient expectations and solicit patient opinion tend to have fewer malpractice claims and higher patient satisfaction.4 Communication skills and behavioural interactions have long been recognised as important in other high-risk industries such as aviation, where protocols for training communication skills, specifically for management teams during stressful events, have been developed to address these behaviours.5

In medicine, communication skills are thought to be acquired primarily by observation and modelling, generally without formal feedback or assessment. Integrating communication skills into programmes has been attempted in many forms: single lectures, lecture series, workshops and simulations have all been described.6–9 Communication is multidimensional, and assessment needs to include less tangible aspects such as establishing the physician–patient relationship, empathy and non-verbal interaction.6

The Kalamazoo conference10 identified three methods for evaluating communication skills, namely checklists of observed behaviours, patient experience surveys and oral or written examinations. Checklists, objective structured clinical examination (OSCE) examinations with standardised patients (SPs) and observed interactions with real patients (whether recorded or observed in real time) are all methods that can be used to develop and test these skills within undergraduate and postgraduate training programmes.

It is recognised that medical education must function on a continuum from medical school onwards throughout a career. At different points, individuals progress through transitions (eg, from postgraduate training to subspecialty training) that require additional support and a rebalance in educational needs and responsibilities.11 Learning habits shift abruptly when exiting training programmes from an enforced curriculum to a self-directed one, for which trainees may be unprepared.11

Communication skills are emphasised in the undergraduate curriculum; however, attitudes towards communication skills development tend to diminish as medical students graduate to postgraduate training programmes.12 New postgraduate trainees arrive fresh from medical school with recent experience of a formalised curriculum that includes communication skills training. As they progress through postgraduate training, other necessary skills take priority over communication skills in the learning programme. A decline, or erosion, of communication skills over time has been noted even in undergraduate students.13

In a review of risk communication in postgraduate programmes in the USA, a median of only 12 h was allotted to communication skills development.14 Similarly, a survey of accredited postgraduate oncology fellowship programmes in the USA showed that only 30% had some form of formal communication skills training.7 Though communication is recognised as a core competency in medical training,15 barriers such as lack of faculty time and expertise persist. Imposing a mandatory core competency assessment is equally hampered by the lack of an accepted, validated method of assessment.7 With new graduates entering the workforce and physicians from many countries migrating to new healthcare systems, it is necessary to ensure that communication skills are, at minimum, adequate to meet standards for the safe treatment of patients.

The objective of this study was to identify the methods used to evaluate communication skills in postgraduate training and in established practice. The focus is on the tools used in the latter 2 years of a primary postgraduate training programme, the point at which graduates are potentially transitioning to practice, and on the evaluation methods used for those in established practice. The methodological rigour of these studies was also assessed.


Search strategy

A systematic review of the literature was conducted by two independent reviewers (AEG and MCM) following Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines.16 A comprehensive review of the following electronic databases was undertaken—PubMed/CINAHL/ERIC/EMBASE/PsycInfo/Psyc Articles/Cochrane—using the search terms ‘communication skills assessment’, ‘postgraduate’, ‘surgery’, ‘medicine’ and all related derivations.

Inclusion criteria

All studies published in the English language, from January 1990 to 15 August 2014, were included. Studies required involvement of medical practitioners, physicians or surgeons within 2 years of completion of a primary postgraduate training programme or established practising consultants.

Exclusion criteria

Studies in which the participants were more than 2 years from completion of postgraduate training (ie, early within the training scheme) were excluded. It was recognised that different training schemes have different end points (eg, family medicine programmes can be 2–3 years in length, while surgical programmes can extend from 5 to 8 years or more depending on specialty and training scheme). Care was taken to include all articles that might fit the inclusion criteria; where postgraduate level was ambiguous, we erred on the side of inclusion. No exclusion criteria were placed on assessment methods; any form of communication and any method of evaluation could be included.

Criteria for assessing the quality of studies

Each paper was assessed against the inclusion/exclusion criteria and for quality of methodology by two independent reviewers (AEG and MCM) using the Best Evidence Medical Education (BEME) guide.17 A third reviewer (PFR) arbitrated any disputes. Each article was assessed and graded for research design, clarity of aims and use of a control group. The use of objective evaluation methods and of self-assessment tools was noted, specifically the rigour of testing that preceded their use: instruments that had been validated and published with appropriate reliability evaluation were rated higher than author-developed questionnaires without accompanying validation, and articles that relied solely on participants' self-evaluation of confidence in their ability scored lower than studies that used objective evaluation with validated tools.


Systematic review process

A total of 2514 articles were identified through the systematic search. Of these, 2292 were rejected on initial review (the majority did not involve medical personnel (1156) or communication skills assessment (728)), leaving 222 articles for abstract review. Following abstract review, 160 articles did not meet the inclusion criteria: the majority did not include communication skills assessment (122), 9 were excluded for junior postgraduate year (PGY) status and 5 for undergraduate status. Sixty-two articles remained for full review; 11 further articles were identified by hand search of references, leaving 73 articles for full-text review. After full-text review, 39 articles were excluded for lack of communication skills assessment (21), non-medical or other medical personnel (eg, nursing) (3), early or unclear PGY status not meeting criteria (3), or availability in abstract form only (8). Thirty-four articles met the inclusion and exclusion criteria (figure 1).

Figure 1

Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) flow chart for article selection. PGY, postgraduate year.
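As a quick internal consistency check, the stage totals in the screening flow above can be reproduced arithmetically. This is a minimal sketch using the counts reported in the text; the variable names are ours, not the authors':

```python
# PRISMA screening stage totals as reported in the review text.
identified = 2514          # records from database search
rejected_initial = 2292    # rejected at initial (title) review
abstract_review = identified - rejected_initial       # 222 abstracts screened

excluded_abstract = 160    # excluded after abstract review
full_review = abstract_review - excluded_abstract     # 62 for full review

hand_search = 11           # additional records from reference hand search
full_text_pool = full_review + hand_search            # 73 full texts assessed

excluded_full_text = 39    # excluded after full-text review
included = full_text_pool - excluded_full_text        # 34 studies included

print(abstract_review, full_review, full_text_pool, included)
```

Each stage total matches the figures reported in the paragraph above and in figure 1.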

Study findings


In total, 22 of the 34 articles were based in the USA, 4 in Canada, 3 in the UK, 3 in continental Europe (2 Belgium, 1 Netherlands), 1 in Israel and 1 in India. Eleven articles involved family/general practice/internal medicine, 8 involved surgical residents (orthopaedic, plastic surgery, general surgery), 5 involved oncology practitioners, 4 involved paediatricians, 3 involved sampling from multiple disciplines and 3 involved radiology residents, geriatric and palliative care fellows, and critical care fellows, respectively. A total of 1355 participants within these papers met the inclusion criterion of being within 2 years of completion of a training programme or in consultant practice (table 1).

Table 1

Studies included in review

Communication assessment tools

Twenty-two articles used standardised patients;6 18 20 24–29 31 33–35 37–39 42–47 9 involved an OSCE or objective structured performance evaluation.6 20 24–26 43–47 The majority (n=21)6 18–22 25 27 29–38 42 44 46 involved self-assessment by questionnaire, usually an author-developed, content-specific questionnaire. Seven articles used self-assessment measures only,19 21 22 29 31 36 37 while the remaining 27 used objective assessment by faculty/coworker and/or patient/standardised patient, either exclusively or in conjunction with self-assessment (table 2). Nineteen articles used an educational initiative to teach communication skills.6 9 19 21 27–31 33–38 41 42 45 49 The initiatives varied from lecture-based content and training programmes to simulated patient scenarios (table 3).

Table 2

Methods of assessment of communication skills

Table 3

Educational initiatives

Evaluation of methodological analysis

The quality of the studies was variable, though the majority (n=26) were graded 3 or above by the BEME quality rating system.17 A grade of 3 indicates that conclusions can probably be based on the results; grades of 4 or 5 indicate that conclusions are likely to be true, given the methodological rigour of the studies. The main weaknesses of the studies were the quality of the assessment tools and the lack of objective assessment. The majority of the assessment tools (questionnaires) used were developed by the author or institution and had been neither validated in the literature nor piloted prior to use. Studies that scored <3 generally relied on participants' self-assessment of ability, or of confidence in ability; the extent to which self-assessment correlates with actual ability is limited. As the majority of studies had a reasonable BEME rating, conclusions can probably be based on the results found (table 2).


Communication within the healthcare system is of vital importance and remains one of the main factors of potential error threatening patient safety. Though communication has been recognised by medical professional governance as a core competency, variability in how this skill set is taught and subsequently evaluated66 67 leads to the heterogeneous assessment tools highlighted in this review.

Observed interactions with either SPs or actual patients are the most frequently used technique for communication skills assessment. The most robust studies conducted objective evaluations using audio or video recording or real-time observation, with evaluation by expert facilitators or blinded coders using a previously validated checklist. Among these studies, there is no commonality in checklist selection: author-developed checklists were the most commonly used tools, and the majority are unvalidated. There is also no standardisation of participant groups or scenarios, making comparison between methodologies difficult. Self-evaluation is also common, despite several authors noting that it lacks correlation with objective assessment.30 32 46 No standardised format is available, nor were the criteria for what constituted improved communication skills standardised: patient satisfaction,2 4 9 11 19 increased utterances of empathy,5 19 appropriate responses4 5 8 13–15 19 20 22 and increased use of non-verbal gestures9 19 were all used as endpoints constituting improved communication. This review highlights a gap in the literature: no single objective assessment tool, or series of tools, has been widely used and evaluated.

The same issues apply to the tools used in the development and evaluation of educational programmes. Though this review was not designed to assess educational programmes, a number of tools were identified that used communication skill assessment techniques. Educational initiatives to teach communication skills were noted in 19 of the gathered articles. These initiatives were varied, involving lecture-based teaching, group workshops and/or role-play/scenario teaching; the majority used role-play in small groups with standardised or actual patients. The length of the programmes was unstandardised, ranging from several hours to several days. Three computerised programs for analysing utterances within audio data were noted as objective measures of communication skill; these have been validated and used in many studies.28 35 48 49 Self-assessment was again a common technique for evaluating the effectiveness of a programme by demonstrating a change in communication skills. Self-evaluation is inherently inaccurate: pre-existing confidence in one's ability is not necessarily an accurate measure of competence. It has been pointed out: “…the worst self-assessment accuracy appears to be among physicians who are, at the same time, the least skilled and the most confident.”68 Across the 19 studies, there was no consensus on whether teaching communication skills actually improves communication, whether evaluation was purely objective or included self-evaluation. Again, the studies used different measures of what constituted improved communication, making direct comparison difficult. The studies that reported some objective improvement in skill sets27 28 34 38 43–45 vary in technique, duration and evaluation, leaving the ideal scheme for communication skills development as yet unidentified.

Two minor issues were noted that could be of interest in future studies of communication skills. The study of Horwitz et al22 highlighted a potential use for the Social Skills Inventory (SSI). SSI is a previously validated and refined tool for generating a profile of social skills and has been extensively used within a variety of literature bases. Horwitz et al22 found that an imbalance in composite social skill profile could be elicited, indicating a potential communication skills deficit. This raises the question of whether this tool could be used to identify trainees and physicians with an imbalance of social skill profile. It may be a useful tool in establishing correlative communication deficiencies that can be targeted for improvement.

The second issue is the question of whether communication skills are acquired or deteriorate over years of training; several papers reached conflicting conclusions.21 24 26 47 Two articles24 25 noted worsening communication skills with increasing PGY level and suggested that this implies communication skills are inherent rather than taught. Two further articles26 47 showed a positive association between advancing PGY level and communication skills. Four other groups9 22 32 46 found no significant difference between communication skill and PGY level or years of experience. The variations in assessment, and in what was stipulated to constitute an improvement in communication skill, would likely account for these differences, so no conclusion can be drawn on whether learning or inherent skill is the primary driver of communication fluency. However, it can be argued that providing insight into a deficit (of interpersonal communication skill), together with a structured corrective programme of exposure and practice, could compensate for discrepancies in natural skill.


This review is limited by the small number of studies meeting the inclusion criteria and by the heterogeneity of the published papers. Each of the reviewed studies is affected by a number of limitations and biases. The majority recruited volunteers, creating a bias towards those interested in improving or assessing their own communication skills; such participants show an insight into, and interest in, communication skills that is often absent in those whose communication skills are poor. The ‘Hawthorne effect’,69 whereby a participant's behaviour can improve or change in response to being observed, is a prevalent confounder in the many studies involving external observation. The halo effect, or examiner bias, may also be present, as a number of evaluations were performed within residency programmes where examiners are likely to have had some prior knowledge of the participants.

Also, the search terms focused specifically on communication skills assessment; associated concepts such as empathy, teamwork, professionalism and leadership were not included, although they may have communication as a unifying theme and would therefore not be represented within the search reported here.

This review highlights the heterogeneity within the literature surrounding communication skills and their assessment at the postgraduate level of training and beyond. It has highlighted the more common tools and methods of evaluation used in the latter stages of training and in practice. Many tools exist, though no single method or set of methods has become the standard means of assessing this skill set. Much work remains: a standardised format for the evaluation of communication skills needs to be developed, and the ideal educational programme, its duration, method and conduct, needs to be defined. This review has also identified the need for an agreed standard of what constitutes acceptable communication skills.

The quality of the studies and the innovative and novel techniques demonstrated by the studies included in this review indicate that much promise exists in finding an accurate means of furthering this necessary competency in medical education.

Main messages

  • No definitive, validated tool to evaluate communication skills exists at the postgraduate level.

  • Development of a validated tool is needed to progress standardised communication skills assessment at this level.

  • The benefit of educational initiatives for improving communication skills cannot be demonstrated from the available literature.

  • An agreed standard of what constitutes acceptable communication skills needs to be defined.

Current research questions

  • What is the ideal educational programme to teach communication skills?

  • Does the use of a social skills tool assist in identifying people in need of further communication skill training?

  • Do communication skills improve or deteriorate with advancing postgraduate level?

Key references

  • Roter D, Rosenbaum J, de Negri B, et al. The effects of a continuing medical education program in interpersonal communication skills on doctor practice and patient satisfaction in Trinidad and Tobago. Med Educ 1998;32:181–9.

  • Ponton-Carss A, Hutchison C, Violato C. Assessment of communication, professionalism and surgical skills in an objective structured performance-related examination (OSPRE): A psychometric study. Am J Surg 2011;202:433–40.

  • Razavi D, Merckaert I, Marchal S, et al. How to optimize physicians’ communication skills in cancer care: results of a randomized study assessing the usefulness of post training consolidation workshops. J Clin Oncol 2003;21:3141–9.

  • Levinson W, Roter D. The effects of two continuing medical education programs on communication skills of practicing primary care physicians. J Gen Intern Med 1993;8:318–24.




  • Contributors MCM and AEG conducted the development and implementation of the project design, acquired and interpreted the data. PFR assisted on design of the study and on analysis and interpretation of the data. All authors contributed to either writing or re-drafting this paper. All authors have reviewed this paper in its final version prior to submission with approval.

  • Competing interests None.

  • Provenance and peer review Not commissioned; externally peer reviewed.
