Use of spaced education to deliver a curriculum in quality, safety and value for postgraduate medical trainees: trainee satisfaction and knowledge
  1. Jeffrey Bruckel1,
  2. Victoria Carballo2,
  3. Orinta Kalibatas2,
  4. Michael Soule3,
  5. Kathryn E Wynne4,
  6. Megan P Ryan5,
  7. Tim Shaw6,
  8. John Patrick T Co4,7
  1. 1Edward Lawrence Center for Quality and Safety, Massachusetts General Hospital, Boston, Massachusetts, USA
  2. 2Quality, Safety, and Value Division, Partners HealthCare, Boston, Massachusetts, USA
  3. 3Department of Psychiatry, Massachusetts General Hospital, Boston, Massachusetts, USA
  4. 4Department of Pediatrics, Massachusetts General Hospital for Children, Boston, Massachusetts, USA
  5. 5Central Coast Local Health District, University of Sydney, Sydney, Australia
  6. 6Workforce Education and Development Group, University of Sydney, Sydney, Australia
  7. 7Graduate Medical Education, Partners HealthCare, Boston, Massachusetts, USA
  1. Correspondence to Dr John Patrick T Co, Partners HealthCare, Director of Graduate Medical Education, 7 Whittier Place, Suite 108, Boston, MA 02114, USA; jco{at}


Purpose Quality, patient safety and value are important topics for graduate medical education (GME). Spaced education delivers case-based content in a structured longitudinal experience. Use of spaced education to deliver quality and safety education in GME at an institutional level has not been previously evaluated.

Objectives To implement a spaced education course in quality, safety and value; to assess learner satisfaction; and to describe trainee knowledge in these areas.

Methods We developed a case-based spaced education course addressing learning objectives related to quality, safety and value. The course was offered to residents and fellows about two-thirds of the way through the academic year (March 2014) and to new trainees during orientation (June 2014). We assessed learner satisfaction by reviewing the course completion rate and a postcourse survey, and trainee knowledge by the per cent of correct responses.

Results The course was offered to 1950 trainees. A total of 305 (15.6%) enrolled in the course; 265/305 (86.9%) answered at least one question, and 106/305 (34.8%) completed the course. Fewer participants completed the March programme compared with the orientation programme (42/177 (23.7%) vs 64/128 (50.0%), p<0.001). Completion rates differed by specialty, 80/199 (40.2%) in non-surgical specialties compared with 16/106 (24.5%) in surgical specialties (p=0.008). The proportion of questions answered correctly on the first attempt was 53.2% (95% CI 49.4% to 56.9%). Satisfaction among those completing the programme was high.

Conclusions Spaced education can help deliver and assess learners’ understanding of quality, safety and value principles. Offering a voluntary course may result in low completion. Learners were satisfied with their experience and were introduced to new concepts.



Delivering quality care to patients is a primary aim of all healthcare providers, health systems and payers. However, patient safety and quality of care continue to be major public health issues.1 ,2 Sustainable improvements in care quality will require that practising providers are able to assess and refine the care systems in which they practise. Fluency in quality and safety principles will be critical for healthcare providers as the healthcare system evolves to become more patient-oriented and value-oriented. Early introduction of the principles of quality improvement, patient safety and value can promote incorporation of these principles into clinical practice both during and after training.

Although the incorporation of quality and safety education into graduate medical education (GME) is increasing, many trainees still do not receive formal education in these areas as part of their GME curriculum.3–8 This need has been highlighted by multiple organisations throughout the world, including the United States Accreditation Council for Graduate Medical Education (ACGME) Clinical Learning Environment Review Program, the European Commission Patient Safety and Quality of Care Working Group, and the United Kingdom's General Medical Council.9–11 These organisations have called for increased integration of quality and safety education into graduate medical training. Best practices must be identified and disseminated for broader implementation.

Spaced education is a teaching strategy that delivers educational content electronically over several weeks or months (spaced in time) in a case-based or quiz format.12 ,13 This design is based on the educational psychology research findings of the spacing and testing effects (see box 1).14–19 Spaced education and other web-based tools have been shown to effectively teach material to trainees.20 These programmes have been effective in traditional medical education topics, as well as for improving teaching skills and mentoring of students.21–25 Spaced education has also been demonstrated to be effective in improving patient safety behaviours of trainees and in trainee knowledge of clinical practice guidelines.26 ,27 It is not known whether spaced education for quality and safety would be widely accepted if offered broadly to GME programmes within an institution and whether there is a better time in training for offering this education.

Box 1

Description of spaced education

Spaced education

  • Spaced education is a strategy for delivering educational content over several weeks to months in a quiz or game-like format.12 ,13

  • Spaced education is based on theories developed from educational psychology research:13

    • Spacing effect—Knowledge retention is improved if content is presented and repeated in small increments over time.14

    • Testing effect—Knowledge retention is higher for material presented in a test-like format.15

Here, we report on an innovative curriculum designed to expose trainees to the concepts of quality, safety and value using a spaced education platform (see box 1). The goal of this programme was to facilitate education in these principles through an educational game-like competition. Study aims were to (1) implement a spaced education course in quality improvement, patient safety and value, designed for a general audience of postgraduate trainees; (2) assess learner satisfaction after completing the programme; and (3) describe trainee knowledge in these areas.


Spaced education course design and function

The spaced education platform provides a case-based, electronic educational experience over 8 weeks. After a participant enrols in the programme, an automated email delivers the first case scenario and question, and the participant is asked to select the best response. After answering, the participant is shown the intended response and a brief description of the issues addressed by the question, along with references for more in-depth learning. If the question is answered correctly on the first attempt, it is ‘retired’. If it is answered incorrectly, the participant receives it again three days later, and it repeats until answered correctly. Each week's module is designed to take ≤5 min to complete, and the programme moves on to the next case scenario and learning objective the following week. To encourage engagement through game-like competition, scores were displayed publicly, using pseudonyms to preserve anonymity; only the course managers and the individual participant had access to the pseudonyms.
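The retire-or-repeat logic described above can be sketched in a few lines of code. This is a minimal illustration only, not the vendor platform's implementation; the class and function names are our own.

```python
from dataclasses import dataclass, field
from datetime import date, timedelta

REPEAT_DELAY = timedelta(days=3)  # incorrectly answered questions return after 3 days

@dataclass
class Question:
    text: str
    retired: bool = False                              # set once answered correctly
    next_due: date = field(default_factory=date.today)  # when to (re)send the question

def record_answer(q: Question, correct: bool, today: date) -> None:
    """Retire the question on a correct answer; otherwise re-queue it 3 days out."""
    if correct:
        q.retired = True
    else:
        q.next_due = today + REPEAT_DELAY

def due_questions(questions: list[Question], today: date) -> list[Question]:
    """Questions still in play whose repeat date has arrived."""
    return [q for q in questions if not q.retired and q.next_due <= today]
```

A correctly answered question drops out of the rotation permanently, while an incorrect answer simply pushes `next_due` forward, producing the spaced repetition the platform relies on.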

Curriculum development

The curriculum was designed to focus on quality and safety topics of interest to trainees regardless of their specialty. Learning objectives were developed by the course authors, who included house staff (KEW, MS) enrolled in training programmes. The objectives were reviewed by leaders in quality and safety for relevance to the quality and safety objectives of the institution, as well as by trainees to assess their clinical relevance. The final set was narrowed to the eight highest-impact objectives, shown in table 1. Case scenarios addressing each learning objective were either authored by current trainees or developed with significant trainee input. Sources from the quality and safety literature were used to create each case, and cases were vetted with quality and patient safety leaders.11 ,16–40 Selected case scenarios and answer choices can be found in boxes 2–4, and the course in its entirety can be found in the online supplementary appendix.

Table 1

Learning objectives

Box 2

Selected cases—Quality Improvement: Writing a SMART Aim Statement

You are working with a process improvement team on a project to improve preoperative assessments and operating room scheduling. This project is being undertaken in response to a problem with wrong tests being performed in the preoperative area prior to procedures, resulting in additional phlebotomy being performed. The team has examined the process involved in the assessment area and is ready to write up their plans for intervention. Which of the following represents a well-written aim statement?

  • A. Improve the efficiency of the preoperative screening area while reducing the number of unnecessary tests performed by 50% and costs associated with unnecessary tests

  • B. Increase utilisation of the preoperative screening area by surgeons and anaesthesiologists

  • C. Reduce the number of missed laboratory draws from 5% of patients to <1% of patients within six months

  • D. Improve patient satisfaction with the preoperative testing area by 25%

Our answer

The only aim statement that is written in a SMART way is ‘Reduce the number of missed laboratory draws from 5% of patients to <1% of patients within six months’. The other aim statements are missing at least one element of a SMART aim.

Take-home message

Aim statements are an important part of any quality improvement project and help ensure that the improvement strategy is focused on a specific goal. One common acronym used to describe the elements of a well-written aim statement is SMART. (Note that there are different versions of the SMART acronym using varying terminology, but with similar overall meaning.)

  • Specific

  • Measurable

  • Achievable

  • Relevant

  • Time-bound
These elements ensure that the aim of the project is focused and that improvement has a clearly identifiable and attainable goal.

References and resources

New York State Department of Health: Model for Improvement—Review of Aim Statement Worksheet.

Box 3

Selected cases—Safety: Response to Error

You are the senior resident in cardiology. Neil, a 71-year-old man, was admitted 3 days ago for rapid atrial fibrillation. Neil is now taking 60 mg of short-acting diltiazem every 6 h with excellent rate control. In preparation for discharge, your team plans to convert Neil to 240 mg of long-acting diltiazem per day, which is ordered by Lucy, your intern. About 2 h later, you receive a page from Neil's nurse asking you to come to the bedside immediately. When you arrive, his heart rate is 30, his blood pressure is 80/40. After reviewing his medication list, it becomes clear that 240 mg of short-acting diltiazem has been given. After ensuring that Neil is being properly cared for, you approach Lucy in the team room to discuss the error. Which of the following should be your initial approach?

  • A. Ask Lucy to be more careful when writing orders because Neil could have died. Discuss the case with the attending physician and forward the issue to the programme office for consideration as a morbidity and mortality (M&M) case.

  • B. Take Lucy to a quiet place and ask if she is okay. Suggest that she work with you to file a safety report and participate in the root-cause analysis and improvement actions.

  • C. Ask Lucy to call the attending physician to discuss the case and acknowledge her error. Tell her that you hope nothing like this happens again in the future.

  • D. Take Lucy to discuss the problem with the bedside nurse and pharmacy to determine why the error occurred.

Our answer

Take Lucy to a quiet place and ask if she is doing okay. Suggest that she work with you to file a safety report and participate in the root-cause analysis and improvement actions.

What (might have/actually) happened?

The patient was admitted to the intensive care unit for vasopressor support until the diltiazem had been metabolised. Fortunately, he was ultimately discharged home without any sequelae.

Take-home message

It is essential to treat those involved in errors with respect and understanding, and to involve them in the investigation process, in order to promote a positive culture of safety. The minutes after an error is recognised are critical to developing a culture of safety within a hospital. The first priorities should of course be to ensure that the patient is being appropriately cared for and to notify the responsible attending that a patient-related event has occurred. The next priority is to ensure that those involved in the error are processing the event appropriately. The ‘second victims’ of such events are the staff involved in the error, who often feel shame, fear and sadness afterwards. These feelings can have a significant impact on professional careers, including change of career, depression and even suicide. Studies have shown that the most important factor in healing from this injury is the initial response of the clinician's supervisor. Direct involvement in the investigation process and improvement interventions may also be beneficial.

Punitive measures, such as asking the intern to own up to her error on her own, admonishing providers to be more careful, or discussing the error in a public area in front of colleagues, are unlikely to prevent future errors and also make it less likely that the intern will discuss errors and mistakes in the future. Discussion with the other involved providers (nursing and pharmacy) should occur through official channels and should not be punitive.

References and resources

McCay L, Wu AW. Medical error: the second victim. Br J Hosp Med (Lond) 2012;73:C146–8.

Ryan C, Ross S, Davey P, et al. Prevalence and causes of prescribing errors: the PRescribing Outcomes for Trainee Doctors Engaged in Clinical Training (PROTECT) study. PLoS ONE 2014;9:e79802.

Box 4

Selected cases—Safety: Near Miss

Eva, a 1-month-old girl, is being admitted for a fever. To meet standard of care, Eva receives her first dose of antibiotics, including gentamicin (to receive one dose every 24 h) at 06:00, while in the emergency department (ED). Sam, the ED nurse, is very busy and near the end of his morning shift and forgets to note in his nursing notes that the medications were given. Upon arriving to the floor an hour later, Eva's floor orders are entered by Joe, the intern.

On activation, the new orders in the system prompt Susan, the inpatient nurse, to give Eva her first dose of antibiotics at 08:00. When Susan approaches with the antibiotics, Eva's mother asks if it would be too much medication given she had received some antibiotics just a few hours ago, and the intern is called to the bedside. Joe notices the error after talking with Eva's family, cancels the order for antibiotics and discusses what happened with Eva's mother and the care team.

What is the most appropriate step Joe could take to help ensure that similar episodes do not occur again?

  • A. Look up the emergency department nurse's name and report him to the ED nursing supervisor.

  • B. Report the error to his senior resident and discuss the error with other interns so that they can avoid the same problem in the future.

  • C. Apologise to the family for the error and file a report in the hospital incident reporting system.

  • D. No action needed as no harm reached the patient.

Our answer

Apologise to the family for the error and file a report in the hospital incident reporting system.

What happened?

The patient narrowly avoided a potentially toxic dose of gentamicin.

Take-home message

Reporting of near-miss events can yield significant information about safety risks within a hospital. The nature of safety problems is such that events resulting in patient harm are quite rare. However, system problems can exist that make a process prone to error when certain circumstances arise. This is exemplified in James Reason's Swiss Cheese Model of Error, which shows that an error can occur when the holes in successive layers of the error-prevention system line up.12 Knowledge of near misses can help prevent these problems from reaching the patient.

The patient's family was responsible for discovering the error and is likely already aware that an error occurred. Discussion with the family is the appropriate and responsible thing to do. The family should be advised that the hospital will work to help prevent problems like this in the future. Work done at the University of Michigan on early disclosure and apology has been shown to reduce malpractice claims as well as promote a culture of safety.13 This stance has been adopted by many institutions in the State of Massachusetts.

References and resources

Reason J. Human error: models and management. Br Med J 2000;320:768–70.

Kachalia A, Kaufman SR, Boothman R, et al. Liability claims and costs before and after implementation of a medical error disclosure programme. Ann Intern Med 2010;153:213–21.

Study setting and context

Box 5

Study setting and context

  • Course designed and implemented by the Partners HealthCare (PHS) Offices of Graduate Medical Education (GME) and Quality, Safety and Value.

  • The PHS network (Massachusetts, USA) was founded by Massachusetts General Hospital (MGH) and Brigham and Women's Hospital (BWH).

  • The course was offered to interns, residents and fellows in GME programmes sponsored by MGH or BWH.

Programme directors of ACGME accredited residency and fellowship programmes sponsored by either Massachusetts General Hospital or Brigham and Women's Hospital were approached to gain their consent for offering the course to their trainees. Programme directors were offered the opportunity to take an abbreviated version of the course themselves to determine whether the programme was of value to their trainees and to provide feedback (they did not have to complete the course).


The course was offered three times during calendar year 2014:

  • In March 2014, to trainees enrolled in a residency or fellowship programme

  • In June 2014, to postgraduate year 1 (PGY-1) trainees at orientation

  • In July 2014, to advanced subspecialty residents and clinical fellows at orientation

Participants in the March 2014 course were drawn from trainees currently enrolled in training programmes. Participants in the June and July 2014 courses were drawn from trainees attending orientation. Participants were solicited via email invitation for the March 2014 course and via email and live invitation during orientation for the June/July 2014 courses. To promote participation, trainees who completed the course and exit survey were entered into a lottery to win a tablet computer (two winners per course).

Data collection

Learner satisfaction

We assessed learner satisfaction through two primary methods. The first was enrolment and completion rates, collected continuously throughout the course by the software program; continued participation was taken as a marker of satisfaction with the course content. The second was the comments that participants left in response to each week's case content and question. We tracked these comments and, on two occasions, made minor wording edits in real time. We also administered a postcourse survey to participants who completed the course. The survey had six questions addressing the course in general and eight questions addressing the individual cases, scored on a five-point Likert-style scale with the anchors strongly disagree, disagree, neutral, agree and strongly agree, with space for free-text comments. The programme administrators were able to determine when participants completed a survey (in order to select prize winners); however, responses to the survey questions were de-identified.

Trainee knowledge

We assessed trainee knowledge using the number of correct responses to each question out of the total number of responses.

Data analysis

Learner satisfaction

Course completion rates were assessed at the conclusion of the course, as the proportion of participants who started and who completed it. We analysed differences in satisfaction between course offerings (during the academic year vs during orientation), as well as by level of training and specialty. The postcourse satisfaction survey was analysed by course offering; because the survey data were de-identified, we were unable to assess further differences between participant responses. Survey responses were analysed using mean scores. Free-text comments were aggregated and reviewed; their number was too low to permit robust qualitative analysis, although some comments addressing specific themes were of interest.

Trainee knowledge

Question response data were analysed using the per cent of correct responses to each question. These data were analysed as continuous variables.

Statistical methods

Categorical variables were expressed as proportions, and comparisons were made using difference in proportions (with 95% CI for the difference in proportions based on the Wald CI). Statistical significance for the comparisons was determined using two-tailed Fisher's exact test or χ2 test if multiple levels were present (as specified in the results). Continuous variables were expressed as mean and 95% CI, and comparisons were made using t tests for comparisons of two groups or one-way analysis of variance (ANOVA) for comparisons of more than two groups (as specified in the results). Statistical significance was assessed at the p<0.05 level. Data were analysed using JMP 11 Pro statistical platform (SAS Institute, Cary, North Carolina, USA).
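As a worked illustration of the Wald interval described above, the comparison of March versus orientation completion rates (42/177 vs 64/128) can be computed as follows. This is our own sketch: the plain Wald interval shown here lands close to, but not exactly on, the published CI bounds, which may reflect a continuity correction or a score-based interval in the original analysis.

```python
from math import sqrt

def diff_proportions_wald(x1: int, n1: int, x2: int, n2: int, z: float = 1.96):
    """Difference in two proportions (p2 - p1) with a Wald 95% CI."""
    p1, p2 = x1 / n1, x2 / n2
    diff = p2 - p1
    se = sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)  # pooled-free Wald SE
    return diff, diff - z * se, diff + z * se

# March completion 42/177 vs orientation completion 64/128
diff, lo, hi = diff_proportions_wald(42, 177, 64, 128)
# diff ≈ 0.263, i.e. the reported 26.3% difference in completion rates
```

The point estimate reproduces the reported 26.3% difference; the Wald bounds come out near (15.6%, 37.0%), in the same range as the published (15.3%, 36.6%).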



Participants represented a broad range of training programmes, as shown in online supplementary table 1a. The course was offered to a total of 1950 trainees. A total of 305 (15.6%) trainees enrolled in the programme: 177 in the mid-year (March) programme, 91 PGY-1 trainees during the orientation programme and 37 advanced trainees during the orientation programme.

Figure 1

Flow chart of enrolment in the programme.

Course completion rates

Overall, 265/305 (86.9%) answered at least one question and 106/305 (34.8%) completed the programme. A higher proportion of participants in the March programme answered at least one question compared with the June/July orientation programme, with 177/177 (100%) vs 88/128 (68.8%) of those enrolled; difference 31.3% (95% CI 22.9% to 39.0%), exact p<0.001. However, a lower proportion of participants completed the March programme compared with the orientation programme, with 42/177 (23.7%) vs 64/128 (50.0%) of those enrolled; difference 26.3% (95% CI 15.3% to 36.6%), exact p<0.001.

There was no difference in programme participation or completion between PGY-1 trainees and advanced trainees during the orientation programmes. Of the PGY-1 trainees enrolled in the orientation programme, 64/91 (70.3%) answered at least one question compared with 24/37 (64.7%) of advanced trainees; difference –5.5% (95% CI −23.5% to 11.9%), exact p=0.54. Course completion rates were 43/91 (47.3%) for PGY-1 trainees compared with 21/37 (56.8%) for advanced trainees; difference 9.5% (95% CI −9.5% to 27.7%), exact p=0.44.

Among those who enrolled, there was a significant difference in completion rates by specialty (χ2=23.1, 13 DF, p=0.04). Completion rates differed between surgical and non-surgical specialties, with a completion rate of 80/199 (40.2%) in non-surgical specialties compared with 16/106 (24.5%) in surgical specialties; difference 15.7% (95% CI 4.7% to 25.9%, exact p=0.008). The specialty with the highest completion rate was emergency medicine with 4/5 (80%), and the programme with the lowest completion rate was radiology with a rate of 2/15 (13.3%).

Learner satisfaction survey

The response rate on the evaluative survey (for those who completed the course) was 27/42 (64%) for the March programme, 30/44 (68%) for the PGY-1 programme and 17/21 (81%) for the advanced trainee programme. Satisfaction in the programme was high, with mean responses ranging from 3.89 (95% CI 3.75 to 4.04) for overall enjoyment to 4.13 (95% CI 3.96 to 4.29) for enjoyment over other learning methods (on a scale of 1–5). Responses to each of the survey questions are presented in table 2. Themes identified in the comments of the survey included varying levels of prior exposure to the material, that the content was interesting and engaging, and the desire for more advanced presentation styles such as videos, other multimedia or a ‘choose your own adventure’ style programme.

Table 2

Exit survey scores, mean (95% CI)

Trainee knowledge

The mean percentage of correct answers on the first attempt was 53.2% (95% CI 49.4% to 56.9%). There was no difference in overall per cent correct across specialties (see online supplementary table 1a; overall ANOVA p=0.59) or between PGY-1 participants and advanced training programme participants (see online supplementary table 1b; t test p=0.64). There was no difference in overall performance between hospital sites (see online supplementary table 1c; overall ANOVA p=0.60). Among those who answered at least one question, the per cent correct on individual questions ranged from 44.9% to 96.5% in the March programme, 46.4% to 94.7% in the PGY-1 programme and 46.0% to 95.8% in the advanced trainee programme. Performance on each of the questions is shown in table 3.

Table 3

Case scenario per cent correct out of total attempts, n/n (%)

Question performance was not predictive of failure to complete the programme. Among those who answered at least one question, those who dropped out had a mean score of 52.7% (95% CI 47.9% to 57.5%) correct on the first attempt compared with 53.9% (95% CI 48.0% to 59.8%) correct among those who completed the course (t test p=0.75). There was no difference in rate of completion between the hospital sites (overall ANOVA p=0.68).


This innovative spaced education programme successfully delivered quality and safety concepts to a large number of postgraduate trainees. Course enrolment and completion rates were modest; however, satisfaction was high among those trainees who completed the course. We also found that among trainees who volunteered for the course, baseline knowledge of quality and safety concepts was relatively low, with wide variation in knowledge across the different concepts assessed in the cases.

While GME training programmes are required to educate trainees about quality and safety, it is unclear how to do so most effectively. We found that enrolment was modest, but significantly higher among PGY-1 trainees compared with advanced trainees at orientation, suggesting that orientation of incoming trainees may provide the best opportunity to offer a course with this content and design. Trainee enrolment may be low when a spaced education course is voluntary, even for a course as brief as ours. Institutional GME leadership should consider these findings when deciding whether to implement a similar curriculum and strategy. Despite the lower than anticipated completion rates, learner satisfaction among those completing the course was high. This suggests that this style of content presentation may appeal to some learners more than others and may not be an effective means of delivering content for all learners.

Although the programme was not designed to comprehensively evaluate learner knowledge, we were able to evaluate participants' baseline knowledge. Among trainees who volunteered to take the course, performance on the individual items suggested relatively poor baseline knowledge of the concepts: trainees answered questions correctly only about half the time on the first attempt. The performance of trainees who were offered the course during the academic year partly reflects the effectiveness of our GME programmes' education in these areas, while the performance of trainees offered the course at orientation reflects the effectiveness of their prior education. These data can be used at both the institutional and programme level to assess and refine curricula and educational strategies. They reinforce the idea that trainees need specific education in quality and safety principles, and that this education should come early in their careers. Similar spaced education programmes may help identify where the gaps are widest, as well as fill part of this knowledge gap.

Prior research has demonstrated that spaced education can successfully deliver quality-related and safety-related content, although most of these studies used randomised controlled trial designs.27 Prior studies have not evaluated implementation of spaced education at an institutional level in a real-world setting. Because many training programmes lack the infrastructure for formal, integrated quality and safety education, similar platforms may represent a resource by which GME training programmes can incorporate quality and safety principles into trainee education. Although the rate of enrolment in the programme was relatively low, about a third of those enrolled completed the course, and satisfaction among those who completed it was high. In general, learners preferred this style of learning to other similar teaching methods and reported learning new concepts and principles through the course.

Our study had several limitations. Because the programme was a voluntary pilot, initial enrolment was relatively low. Other than offering incentives, we did not aggressively solicit participants. The programme was not mandatory and represented an additional time commitment for trainees. Enrolment could be improved by making participation mandatory, though this may adversely affect learner engagement and satisfaction. Our satisfaction survey was not administered to enrolees who did not complete the programme, limiting our ability to assess satisfaction in this group of participants. We were likewise unable to assess the factors that prompted trainees to leave the programme without completing it (time constraints, lack of exposure to new material, perceived value). Our questions were designed to be applicable to a broad range of trainees; however, a programme more tailored to individual specialties might provide more value to learners and achieve a higher completion rate. We were also unable to assess improvement in knowledge after completion, as we did not administer a postcourse knowledge assessment. Our statistical methods did not account for multiple testing; therefore, some degree of type I error inflation may be present. Finally, our relatively small sample may not have provided sufficient power to detect small effect sizes.

This spaced educational programme provides an innovative way to introduce quality and safety principles to trainees or to reinforce concepts. Although the treatment of the topic is not in-depth, the focus on general concepts that are important to practising clinicians across a wide range of specialties provided value to our learners. This type of programme will not provide the learner with comprehensive knowledge, but could be an important part of a comprehensive strategy for quality and safety education during postgraduate training. Future studies should evaluate whether mandatory programmes would be more successful or whether spaced education is better suited as an adjunct to more comprehensive educational strategies in quality and safety (such as quality and safety councils for housestaff or more formal educational programmes). Ultimately, its role in improving care quality should be assessed.

Main messages

  • Spaced education can effectively deliver quality and safety content.

  • Voluntary spaced education programmes may have poor uptake.

  • Learners rated spaced education as an effective method of learning.

Current research questions

  • What are the factors that lead to learners not completing the programme?

  • Does quality and safety education improve reporting of safety events/incidents?

  • Could an online programme supplement an in-person comprehensive quality education programme?


This submission is not related to any previously published programmes and has not been presented as an abstract or poster presentation. Thanks to Tejal Gandhi, MD, MPH, and Cyrus Hopkins, MD, for assistance with conception of the programme and review of course material.


Supplementary materials

  • Supplementary Data

    This web only file has been produced by the BMJ Publishing Group from an electronic file supplied by the author(s) and has not been edited for content.


  • Contributors All authors contributed substantially to the research conception, design, authorship of questions, and authorship of the manuscript. All authors approved the final version of the manuscript submitted.

  • Funding Institutional funds from Partners HealthCare.

  • Competing interests None declared.

  • Ethics approval Partners HealthCare Institutional Review Board.

  • Provenance and peer review Not commissioned; externally peer reviewed.

  • Patient consent Participants were provided with study information sheets, and their participation in the programme constituted implied informed consent.
