
Impact of laboratory cost display on resident attitudes and knowledge about costs
  1. Theodore Long1,2,
  2. Tasce Bongiovanni1,
  3. Meir Dashevsky3,
  4. Andrea Halim4,
  5. Joseph S Ross5,
  6. Robert L Fogerty5,
  7. Mark T Silvestri1,6
  1. Robert Wood Johnson Clinical Scholars Program, Yale School of Medicine, New Haven, Connecticut, USA
  2. Department of Internal Medicine, Yale School of Medicine, New Haven, Connecticut, USA
  3. Emergency Medicine Residency Program, Yale School of Medicine, New Haven, Connecticut, USA
  4. Orthopaedics and Rehabilitation Residency Program, Yale School of Medicine, New Haven, Connecticut, USA
  5. Section of General Internal Medicine, Yale School of Medicine, New Haven, Connecticut, USA
  6. Department of Obstetrics and Gynecology, Yale School of Medicine, New Haven, Connecticut, USA

  Correspondence to Dr Theodore Long, Robert Wood Johnson Clinical Scholars Program, Yale School of Medicine, 333 Cedar Street, SHM IE-61, PO Box 208088, New Haven, CT 06520, USA; theodore.long{at}yale.edu

Abstract

Aim Cost awareness has been proposed as a strategy for curbing the continued rise of healthcare costs. However, most physicians are unaware of the cost of diagnostic tests, and interventions have had mixed results. We sought to assess resident physician cost awareness following sustained visual display of costs within electronic health record (EHR) order entry screens.

Study Design We completed a preintervention and postintervention web-based survey. Participants were physicians in internal medicine, paediatrics, combined medicine and paediatrics, obstetrics and gynaecology, emergency medicine, and orthopaedic surgery at one tertiary care academic medical centre. Costs were displayed in the EHR for 1032 unique laboratory orders. We measured attitudes towards costs and estimates of Medicare reimbursement rates for 11 common laboratory and imaging tests.

Results We received 209 survey responses during the preintervention period (response rate 71.1%) and 194 responses during the postintervention period (response rate 66.0%). The proportion of residents that agreed/strongly agreed that they knew the costs of tests they ordered increased after the cost display (8.6% vs 38.2%; p<0.001). Cost estimation accuracy among residents increased after the cost display from 24.0% to 52.4% for laboratory orders (p<0.001) and from 37.7% to 49.6% for imaging orders (p<0.001).

Conclusions Resident cost awareness and ability to accurately estimate laboratory order costs improved significantly after implementation of a comprehensive EHR cost display for all laboratory orders. The improvement in cost estimation accuracy for imaging orders, which did not have costs displayed, suggested a possible spillover effect generated by providing a cost context for residents.

  • Health care costs
  • electronic health records
  • medical education-systems based practice

Introduction

Providing better care at lower costs is the central recommendation of the Institute of Medicine (IOM) report on healthcare quality.1 However, instead of cost reduction, the USA has seen persistently rising healthcare costs. In 2002, the USA spent 15% of its gross domestic product on healthcare; by 2013, this figure had risen to 18%. By 2032, annual family health insurance premium costs are projected to exceed average household income.2 One strategy for curbing this rapid growth is to reduce unnecessary health spending, which the IOM estimated at $750 billion in 2009 alone.3 To ensure that young physicians are prepared to help achieve this goal, the Accreditation Council for Graduate Medical Education (ACGME) includes systems-based practice as one of its six competencies required in residency training. Cost awareness is a key component of this competency.4 However, despite this stated imperative, most physicians remain unaware of healthcare costs.5,6 Numerous educational curricula have been developed to remedy this problem.7–12 Unfortunately, most interventions have had mixed results in improving cost awareness or have not been sustainable.9,13 Further, successful studies have focused primarily on small groups of residents and have required substantial time investment on the part of the institution.

As an alternate strategy for improving cost awareness, calls have recently been made to increase cost transparency for physicians,14–16 and some institutions have started to take steps in this direction.17–19 It remains unknown, however, what the effect will be of cost transparency and whether it will achieve improvements in physician knowledge of healthcare costs. If cost transparency was found to be at least as effective as educational curricula at improving cost awareness, it could offer a more easily scalable strategy for instilling cost awareness across residency training programmes.

We therefore sought to implement a comprehensive cost display for laboratory orders at an academic institution and to study its effect on trainee cost awareness. We examined the change in resident physician attitudes and knowledge about costs following an administrative intervention that incorporated sustained visual display of costs for all laboratory tests into electronic health record (EHR) order entry screens. Because the cost display was implemented only for laboratory ordering, we evaluated awareness of imaging costs as a control, allowing us to disentangle the effect of the laboratory cost display from changes in cost awareness in general. We hypothesised that cost awareness would improve for laboratory orders but would remain unchanged for imaging orders.

Methods

Study design

We conducted a preintervention and postintervention web-based survey of all residents in internal medicine, paediatrics, combined medicine and paediatrics, obstetrics and gynaecology, emergency medicine and orthopaedic surgery at Yale-New Haven Hospital (YNHH), a tertiary care academic medical centre. In addition to surveying current residents, the preintervention survey was also administered to incoming interns matched into one of the six predefined residency programmes, who would later be eligible for the postintervention survey. Institutional Review Board exemption was obtained for this study.

Intervention

Beginning April 2014, Connecticut-specific Medicare reimbursement rates, taken from the 2014 Medicare Laboratory Fee Schedule,20 were displayed on laboratory order entry screens within the YNHH EHR for 1032 unique laboratory orders performed at YNHH. When providers searched for an order, the costs were passively displayed alongside the test name and were labelled as ‘Approx Cost’ (figure 1). Medicare reimbursement rates were used because they roughly approximate societal costs and have been used in recent studies evaluating the impact of cost displays on ordering.18,19,21 An information page explaining the source of the cost information was linked from the EHR login screen. Imaging costs were not displayed.

Figure 1

Display of cost on EHR order screen.

Data collection

We administered the preintervention survey in March 2014 and the postintervention survey in October 2014. The survey was created using Qualtrics Online Survey Software (Provo, Utah, USA) and distributed through the university-based email system. The survey used previously reported Likert-scale questions to examine attitudes and knowledge about healthcare costs (1=strongly disagree; 5=strongly agree).6,9,22 We also asked residents to estimate Medicare reimbursement rates for 11 commonly ordered laboratory and imaging tests, such as complete blood count with differential, basic metabolic panel, and non-contrast CT of the head. Radiology reimbursements were obtained by summing values from the Medicare Physician Fee Schedule (professional fees) and the Medicare Outpatient Prospective Payment System Fee Schedule (facility fees).23,24

Analysis

We compared attitudes before and after the cost display for all participants using Wilcoxon rank-sum tests. We used Student's t tests to compare cost estimation accuracy before and after the intervention for all participants. Based on previously described metrics used in a large meta-analysis of physician knowledge of cost,6 cost estimates were considered accurate if responses were between 50% and 200% of the Connecticut-specific Medicare reimbursement rate. All participants were assigned a unique study ID, enabling examination of residents who took both the presurvey and postsurvey using paired Wilcoxon signed-rank tests for attitude questions and paired t tests for cost estimation accuracy.
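The accuracy criterion above is a simple interval check. A minimal sketch in Python (the study's analyses were run in Stata; the function name and the illustrative dollar values are ours, not actual fee-schedule rates):

```python
def is_accurate(estimate, reference):
    """Classify a cost estimate as accurate if it falls between
    50% and 200% of the Medicare reimbursement rate (reference),
    taking the bounds as inclusive (an assumption)."""
    return 0.5 * reference <= estimate <= 2.0 * reference

# Illustrative values only, not actual fee-schedule rates:
print(is_accurate(12.0, 10.0))  # within 50%-200% of $10 -> True
print(is_accurate(25.0, 10.0))  # more than 200% of $10 -> False
```

Per-resident accuracy is then the proportion of the 11 test estimates classified accurate.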

We performed multivariable linear regression to test whether cost estimation accuracy was associated with specialty or postgraduate year (PGY). The dependent variable was estimation accuracy, and the explanatory variables were specialty, PGY level and preintervention/postintervention period. Finally, we performed a difference-in-differences analysis, with laboratory cost estimation accuracy as the treatment group and imaging cost estimation accuracy as the control group, since costs were displayed only for laboratory orders. Analyses were performed using Stata SE V.13.0 (StataCorp, College Station, Texas, USA).
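The logic of the difference-in-differences comparison is the change in the treatment group minus the change in the control group. A minimal arithmetic sketch (the study's actual estimate came from a Stata model, so this only illustrates the point-estimate logic, applied to the unpaired accuracy percentages later reported in the Results):

```python
def diff_in_diff(pre_treat, post_treat, pre_control, post_control):
    """Difference-in-differences point estimate: the pre-to-post change
    in the treatment group (laboratory accuracy) minus the change in
    the control group (imaging accuracy)."""
    return (post_treat - pre_treat) - (post_control - pre_control)

# Unpaired accuracy from the Results: laboratory 24.0% -> 52.4%,
# imaging 37.7% -> 49.6%
print(diff_in_diff(24.0, 52.4, 37.7, 49.6))  # 16.5 percentage points
```

A positive estimate indicates laboratory accuracy improved more than imaging accuracy, consistent with the display having targeted laboratory orders.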

Results

We received completed surveys from 209 of 294 eligible residents during the preintervention period (response rate 71.1%) and from 194 of 294 eligible residents during the postintervention period (response rate 66.0%); 138 residents completed both surveys. The specialties and PGY levels of respondents at the time of the postintervention survey are shown in table 1.

Table 1

Specialty and postgraduate year (PGY) level of residents

The proportion of all participating residents that agreed/strongly agreed that they knew the costs of tests they ordered increased substantially after the cost display (8.6% vs 38.2%; p<0.001), as did the proportion that agreed/strongly agreed they had adequate access to cost information (1.0% vs 24.8%; p<0.001) (table 2).

Table 2

Cost awareness attitudes among residents before and after visual display of laboratory costs

Cost estimation accuracy among all participating residents increased after the cost display from 24.0% to 52.4% for laboratory orders (p<0.001) and from 37.7% to 49.6% for imaging orders (p<0.001) (table 3).

Table 3

Laboratory and imaging order cost estimation accuracy among residents before and after visual display of laboratory costs

Resident specialty and postgraduate level were not independently associated with estimation accuracy in multivariable analysis. The changes in cost estimation accuracy for laboratory and imaging costs before and after the intervention are shown in figure 2.

Figure 2

Change in cost estimation accuracy after cost display.

When limiting our analyses to those residents who responded to both the presurveys and postsurveys, enabling paired evaluations, our findings were largely consistent. After the cost display intervention, the proportion of residents in the paired response group that agreed/strongly agreed that they knew the costs of the tests they ordered increased from 6.5% to 40.6% (p<0.001). The proportion of residents that agreed/strongly agreed that they had adequate access to cost information increased from 0.7% to 29.0% (p<0.001). Furthermore, the accuracy of laboratory order and imaging order cost estimation increased from 23.6% to 52.8% (p<0.001) and from 36.1% to 51.6% (p<0.001), respectively.

Using a difference-in-differences analysis, we found that laboratory order cost estimation accuracy increased significantly more than did imaging order accuracy, in both paired and unpaired analyses (p<0.001).

Discussion

We found that the accuracy of resident physician cost estimations improved substantially after 6 months of the laboratory cost display. Accuracy improved for laboratory costs and imaging costs, the latter of which were not displayed. In addition, residents of multiple specialties reported increased awareness of costs and access to cost information after laboratory cost displays went into effect. Our results suggest that cost display may be an effective and scalable intervention to increase cost awareness among physicians in training.

The recognition of cost reduction by the IOM and the inclusion of cost awareness as an ACGME requirement have created a mandate to enhance trainee awareness of healthcare costs.3,4 This has led to the development of multiple educational curricula.10,25,26 Several prior studies have demonstrated the immediate impact of didactic or audit-and-feedback interventions on physicians’ cost knowledge or ordering behaviour. For example, Post et al9 conducted a preintervention/postintervention study using audit and feedback among internal medicine residents and found improvement in knowledge and attitudes about costs of care. The long-term effectiveness of many educational interventions, however, has been poor.6,13,25,27 Additionally, curricula are often labour-intensive, which may limit their scalability. One prior study evaluating the impact of broad-scale cost display on physician cost knowledge did not show the intervention to be effective. In this 1990 report, which included both residents and attendings, physicians’ cost estimates did not improve after 6 months of cost display.27 That study, however, was performed before the recent increased emphasis on costs of care. Furthermore, it did show that physicians responded to cost information by changing their ordering patterns. With the substantial advancement in EHRs over the past 25 years, there is now a much greater opportunity to meaningfully implement a cost display intervention if it improves physician cost awareness and reduces order costs. In our study, improvements in cost awareness were similar to those previously reported for educational curricula, and the administrative cost display intervention was easily implemented for all house staff at our institution.

The improvement in residents’ cost knowledge for imaging orders in our study, for which costs were not displayed, was unexpected and counter to our hypothesis. We designed our study with imaging cost estimation as a control. While the difference-in-differences analysis did demonstrate a greater improvement in laboratory cost estimation accuracy compared with imaging cost estimation, the substantial improvement in imaging cost estimation suggests that these estimates were possibly influenced by the display of laboratory costs. We believe that displaying costs for laboratory orders may have provided a ‘cost context’ for physicians, leading to increased discussion or research about diagnostic test costs in general, and providing a reference point for other healthcare costs. Thus, the spillover effect of this ‘cost context’ generated by the laboratory cost display may have improved residents’ ability to accurately estimate imaging costs. Nevertheless, imaging tests, which were exposed only to the ‘cost context’ component of the intervention, did not see the same degree of improvement as laboratory tests, suggesting that actual ‘cost information’ may provide additional benefit over the influence of a ‘cost context’ alone. While cost estimation improvement could be attributed to trainee advancement, this seems unlikely because there were no preintervention or postintervention differences in cost estimation accuracy across postgraduate level. Although historical trends could theoretically have accounted for this finding, physician cost awareness in the USA has been low for many years, making it unlikely that knowledge spontaneously improved during our study period.6

Our study has several limitations. First, it was restricted to trainees at a single academic hospital, which limits generalisability to broader settings, such as outpatient practices or community hospitals, as well as to other academic centres. Second, our study was also limited to residents in only six specialties, which could limit generalisability to other disciplines. However, we purposefully surveyed trainees from a variety of medical and surgical specialties, and found no differences between the specialties included. Therefore, our study provides a proof of concept that cost display may be an effective intervention for improving cost awareness across training programmes. Third, our study is a pre–post analysis, not a randomised trial, which limits causal inference. However, our difference-in-differences analysis helps reduce the likelihood that our observed results were due to underlying historical factors. Finally, we used a range of 50%–200% of the cost to determine accuracy of estimates. While this range is supported by prior data, it is a wide range and may overestimate cost knowledge in both the preintervention and postintervention surveys. However, we applied this range both before and after the intervention, and believe that our results represent a true improvement in cost knowledge among trainees.

Based on our results, further studies are needed to determine the effect of this intervention and others on changing trainee behaviour. The optimal interventions to improve trainee knowledge of cost should be studied along with other methods of encouraging cost-saving behaviour among trainees. Potential interventions include giving trainee physicians feedback on their expenditures compared with those of their peers, or incentivising trainees to minimise costs.

In summary, our intervention of a comprehensive cost display for all laboratory orders at an academic institution resulted in substantial improvement in the accuracy of cost estimation for laboratory and imaging orders among resident physicians, even though imaging costs were not displayed as part of the intervention. This finding suggests a possible spillover effect generated by providing a cost context for residents placing orders. Our results support using cost display as an intervention to increase cost awareness among physicians in training. If found to be effective elsewhere, this straightforward administrative intervention could be easily scalable and may prove to be an effective strategy for fostering lasting cost awareness.

Main messages

  • Less than 10% of trainees believed that they had an adequate knowledge of medical costs prior to the implementation of price displays within the electronic medical record.

  • Trainee knowledge of medical costs increased following implementation of price display within the electronic medical record system.

  • Trainee estimation of cost became more accurate even for orders that were not displayed in the electronic medical record following price display implementation, suggesting a ‘cost context’ effect.

Current research questions

  • What method of educational intervention is the most effective at teaching trainees about medical costs?

  • What effect does trainee knowledge of medical cost have on ordering patterns or behaviour?

  • Do physicians in other practice settings, or under different compensation models, have greater knowledge of medical costs than physicians at academic institutions?

Acknowledgments

The authors would like to thank Bradley Herrin, MD of the Department of Pediatrics, Yale School of Medicine, for his assistance in data collection.

References


Footnotes

  • Prior presentations: an abstract for this manuscript was presented at the Northeast Regional Meeting of the Society of General Internal Medicine as an oral presentation and at the national meeting of the Society of General Internal Medicine as a poster presentation.

  • Twitter Follow Theodore Long at @tglong8

  • Contributors All authors listed have contributed sufficiently to the project to be included as authors, and all those who are qualified to be authors are listed in the author byline.

  • Competing interests None declared.

  • Ethics approval Institutional Review Board, Yale School of Medicine.

  • Provenance and peer review Not commissioned; externally peer reviewed.
