
The Foundation Programme assessment tools: An opportunity to enhance feedback to trainees?
  1. S Carr
  1. Correspondence to:
 Dr S Carr
 University Hospitals of Leicester NHS Trust, Leicester General Hospital, Gwendolen Road, Leicester LE5 4PW, UK; sue.carr{at}


The recent change in the working patterns of doctors in training means that the traditional systems of education are under increasing pressure and that there is a need to maximise new opportunities for learning. One new opportunity may arise from the introduction of the mandatory assessment systems (Mini-CEX, DOPS, multi-source feedback, and case based discussion) in the Foundation Programmes. In this review the new assessment procedures for the Foundation Programmes are outlined and the potential of these assessments (using the Mini-CEX as the main example) as an opportunity to give feedback to trainees is discussed. The importance of feedback in professional development and some of the techniques available for giving feedback are described. The Foundation Programme assessments will occupy a significant amount of trainees’ and trainers’ time and it is important that the opportunity for feedback and learning is maximised.

  • feedback
  • postgraduate medical education
  • modernising medical careers
  • Mini-CEX


The introduction of the European Working Time Directive and the New Deal document1 has had a profound effect on the working patterns of doctors in training. There has been a change from a traditional on-call pattern to a shift system of working, which has inevitably reduced the quantity of time available for learning. As a result of these changes in working practices, some authors have reported a deterioration in the quality of learning opportunities.2,3 The reduction in hours worked has increased work intensity and reduced the opportunity for personal reflection and feedback from colleagues (that is, consultants, registrars, and fellow senior house officers).

In addition, important changes in the structure of doctors’ training have recently occurred with the introduction of Modernising Medical Careers4,5 and the commencement of Foundation Programmes for all doctors graduating from medical school in the UK.

The programmes consist of a two year planned programme of training and assessment:

  • Foundation year 1—equating to previous pre-registration house officer training

  • Foundation year 2 (post-registration year)—will incorporate a generic first year of training.6

The tools used to assess competency in the Foundation Programme6,7 are the Mini-CEX assessment (Clinical Evaluation Exercise),8 Direct Observation of Procedural Skills (DOPS), case based discussion (CbD), and multi-source feedback (MSF). These assessment methods aim to assess trainees’ performance in a real clinical setting.

The Mini-CEX assessment (Clinical Evaluation Exercise)

The Mini-CEX was developed, piloted, and evaluated in the USA and is now widely used to assess doctors on American residency programmes.9–11

The Mini-CEX assessment entails direct observation by an educational supervisor of a trainee’s performance in a real clinical situation (15–20 minutes) and is designed to assess skills such as history taking, clinical examination, communication skills, diagnosis, and clinical management. The assessment is repeated on multiple occasions and can occur in various clinical settings—that is, clinics, ward rounds, GP surgeries, etc. The method has been shown to be reliable, to have construct validity,11 and to be a good method of education as well as an assessment tool. The Mini-CEX has also been evaluated in the assessment of clinical skills in medical students in the USA.12

Direct Observation of Procedural Skills (DOPS)

Historically, competence in practical procedures has been assessed using log books and the opinion of educational supervisors. The Royal College of Physicians developed the DOPS tool and reports that directly observed performance is likely to be more valid and reliable than the previous log book based system.13–15

Case based discussion (CbD)

Case based discussion focuses on the evaluation of clinical reasoning by reviewing a case and the trainee’s entries in the patient’s case notes. This assessment tool was developed from the General Medical Council’s performance procedures and its use has previously been described in primary care.16

Multi-source feedback (MSF)

This method uses questionnaire data from eight colleagues, medical and non-medical, assessing aspects of performance. MSF has been used mainly in industry and business13,17–21 to assess performance and as a means of providing feedback. The mini peer assessment tool (Mini-PAT) is a multi-source feedback tool that collates the views of a range of clinical colleagues and compares them with the trainee’s self assessment of performance. The ratings and free text comments from the eight assessors are then fed back to the trainee by the educational supervisor.15

The Mini-CEX and other assessment tools used in Foundation Programmes will take trainees and assessors a significant amount of time to perform. It is therefore essential that, in addition to assessment, we maximise their potential for education, especially in light of the problems presented by the change in working patterns and limited contact with trainee doctors.

The Mini-CEX and other assessment tools entail direct observation of trainees, and as such the assessments offer an opportunity for regular contact between trainees and trainers in clinic, on ward rounds, etc, that may help to provide meaningful and timely feedback to trainees about clinical performance. By such means we may help redress the perceived reduction in feedback and mentorship that has arisen since the introduction of shifts and new working patterns in hospitals.

In the author’s opinion, some of the assessments are easier to facilitate than others. The DOPS, CbD, and MSF seem comparatively easy to accommodate into the working day, but the Mini-CEX is more complex, requiring more planning and scheduling into either clinic or ward round time. Implementing the assessment tools will have significant effects on clinical service, and it is therefore important that we use the time with trainees effectively and negotiate adequate time to undertake the assessments.


In the Foundation Programme curriculum6 the importance of giving feedback to trainees after each assessment is emphasised.

Providing good quality and timely feedback has an essential role in learning and professional development in medicine.

In clinical medicine feedback refers to the giving of information describing a doctor’s performance in an observed clinical situation. The trainee is given specific, subjective comments on their observed performance in a way that is useful for them to consider and use to improve their future performance.

Feedback presents information and is not intended to be judgmental, although there is almost inevitably some judgement attached (boxes 1 and 2).

Box 1 Pendleton’s rules of feedback22

  • Observer clarifies matters of fact

  • Trainee identifies what went well

  • Trainer highlights what they observed went well

  • Trainee discusses what did not go well and how they could improve this aspect of performance

  • Trainer identifies observed areas for improvement

  • Trainer and trainee agree areas for improvement and formulate an action plan

Box 2 Useful rules for giving feedback23

  • Clarity—be clear about what you want to say

  • Be specific—avoid general comments

  • Ownership of feedback (use “I” or “the assessors” type statements)

  • Emphasise the positive, be constructive

  • Comment on behaviour that can be changed, not personality

  • Be descriptive rather than evaluative

  • Be careful with advice—help the person come to a better understanding of their issue and how they can identify actions to address the issue more effectively

  • Timing and environment—agree a time and place

There are several methods described to help teachers provide feedback to trainees.21–26 One of the older but more commonly used feedback techniques in clinical medicine is that described by Pendleton.22

This technique delivers feedback to the trainee in a structured way and aims to be non-evaluative.

Pendleton’s series of questions gives the trainee the opportunity to make observations about their own performance and to set goals for the future. However, there can be difficulties delivering feedback to trainees using this rather old fashioned and rigid structure. The strict format of the feedback can become predictable and may inhibit spontaneous discussion of points as they occur to the trainee and trainer. In addition, because the technique contrasts “what was done well” with “what could be done differently”, it is difficult to avoid the perception that the feedback is contrasting “good points” with “bad points”. The doctor may feel the opening comments are predictable and insincere and brace themselves for the anticipated criticism that will follow. The trainee may become defensive and the learning potential of the feedback will be reduced.

There are many other feedback techniques described in the literature.26 For example, Silverman et al24,25 described the ALOBA (or “SETGO”) technique (agenda led, outcome based) of feedback that uses the structure shown below:

  1. What I (observer) saw—descriptive, specific, non-judgmental feedback by observer

  2. What else did you the learner see?

  3. What does learner think?

  4. What goals are we trying to achieve?

  5. Any offers of how we should get there?

One possible advantage is that this method focuses more quickly on the trainee’s areas of concern and, by acknowledging difficulties, may reduce the trainee’s defensiveness and be less evaluative. A further potential advantage over the Pendleton’s rules method is that the trainee is an active participant rather than a passive recipient of feedback from the facilitators and other group members. There are other established models of giving feedback, including the SCOPME and Chicago models, which are described further in a recent review article.26 It is important that a variety of different techniques are used and that the approach is varied each time so the experience does not become predictable. The methods described above are quite dated and hierarchical; other, newer methods provide a more real-life and multiprofessional approach, such as 360 degree type appraisals. In the Foundation Programmes trainees are now involved in MSF with assessment from eight raters, both medical and non-medical, which is relevant to assessing performance in a multidisciplinary workplace.

Feedback is an important part of the process of improving clinical skills and trainees usually appreciate feedback.23 Giving feedback shows concern and regard for the person and their professional development, and as a result may also help the motivation and satisfaction of trainees. Most clinicians are familiar with the concept and principles of giving feedback, but feedback is often underused as a teaching tool.23 In the past, very little attention was given to providing trainees with feedback. A study of house physicians27 reported that house officers received almost no feedback and developed their own systems of self validation to compensate for the lack of external feedback. In such situations some trainees may develop a lack of confidence while others may develop a misguided sense of clinical competence.

Feedback has been underused as an educational tool in clinical medicine for a number of reasons. Firstly, feedback requires direct observation of the trainee’s performance, an opportunity curtailed by changes in working practices but perhaps refreshed by the Foundation Programme assessments. Secondly, the teacher may be concerned about the impact of negative feedback upon the trainee and upon the trainee-trainer relationship. The MSF assessment in Foundation Programmes will be potentially very useful here, as the technique incorporates feedback from eight assessors and not just the educational supervisor who presents the feedback.

In addition, it is essential to ensure that trainers are properly taught the techniques of adult learning and how to give feedback to trainees. Trainers should preferably be observed when they give advice and feedback to trainees as part of training to be an educational supervisor.

We need to continue to use and develop our skills in the use of feedback in clinical medicine. Without adequate feedback, good performance is not acknowledged and problems with clinical competence go uncorrected for long periods of time. We have moved on from the past decade when, in hospital medicine, no feedback indicated satisfactory progress and negative feedback came indirectly in the form of a poor reference and difficulty obtaining a new post.



Three publications have described the Mini-CEX as a feedback tool. Holmboe et al28 reported on the feedback given after 107 audiotaped Mini-CEX sessions. In 80% of the sessions the supervisor made at least one recommendation to the trainee for improvement. The assessor allowed the trainee to react to the feedback in 61% of sessions, but only 34% of assessors asked for the trainee’s self assessment of the encounter. After the assessment only 8% of trainers and trainees formulated an action plan. The authors concluded that the educational supervisors were using the encounter to provide feedback and recommendations but were underusing the opportunity for other interactive feedback methods, including trainee self assessment and action planning.

Two other studies reported on feedback to medical students after Mini-CEX assessments. Kogan et al12 found that after an average of 21 minutes’ assessment, feedback was given for a mean of eight minutes. Similarly, Hauer et al29 studied 30 minute Mini-CEX assessments involving 22 medical students; the feedback given after observation lasted on average 15 minutes. Neither study assessed the quality or usefulness of the feedback.

Multisource feedback (MSF) and feedback using other assessment tools

Many studies have reported the use of MSF in business and industry and concluded that feedback from MSF generally results in improvements in overall performance.20 However, a number of factors influencing the success of feedback using these tools have been identified. Negative feedback can provoke a variety of responses that may not be beneficial: people who received feedback discrepant from their own ratings of themselves tended to believe the feedback was unhelpful and were likely to react negatively. Another important finding was that negative feedback can lead to disillusionment and failure to achieve goals; on the other hand, positive feedback may lead to overconfidence and reduced effort. Cynicism and negative attitudes to the MSF process (among both ratees and raters) also influenced whether people were likely to change after the feedback.

There are numerous examples of feedback using these tools in medicine. In surgery, Violato et al18 found MSF to be useful in producing changes in practice, but another study reported no impact of MSF on surgical practice.19 A study of general practitioners reported that the physicians’ perceptions of the feedback process were most important and that feedback perceived as negative had no value or a negative impact.20 A recent study involving physicians concluded that “when interpersonal, communication, professionalism, or teamwork behaviors need to be assessed and guidance given, MSF is one of the better tools that may be adopted and implemented to provide feedback and guide performance”.19

There is very little information in the literature at present regarding feedback after DOPS or case based discussion, and further work will be needed to assess the value of the feedback given to trainees after these assessments.


Interactive feedback is important in helping doctors improve and develop professionally. In the light of recent changes in medical working patterns and in the structure of junior doctors’ training, we need to use new opportunities to observe trainees and provide good quality, timely feedback to facilitate learning. The Mini-CEX and other assessment tools involved in the Foundation Programmes present an opportunity to observe trainees and to provide immediate and relevant feedback. The training of educational supervisors in the use of the assessment tools and feedback techniques is important to maximise this new opportunity for feedback to trainees.

The Foundation Programme represents an important change in postgraduate medical education in the UK. The provision of appropriate time and recognition for educational supervision and assessment in consultant job plans and liaison with trusts regarding the implications for clinical service will be essential.



  • Funding: none.

  • Conflicts of interest: none.