Observations on life as an NHS R&D director
In the early 1990s, an expert group chaired by Professor Anthony Culyer was commissioned by the government to report on the state of research and development (R&D) within the National Health Service (NHS) in the UK, and to make recommendations which would secure and strengthen its future. His report, published in 1994, drew important conclusions which have had a far reaching effect on R&D within the NHS. The financial resources associated with R&D entered the NHS vernacular with his name attached, so called “Culyer money”, and the consequence of his recommendations has been the construction of an entirely new NHS organisational element, both centrally in the Department of Health and in NHS trusts and other local NHS organisations. R&D directors and managers are now a necessary part of the life of every NHS trust, large and small, and a new brigade of civil servants in the Department of Health has preoccupied itself with the appropriate use of NHS resources for R&D.
I have been an R&D director in two different NHS organisations for five of the seven years since the Culyer report began to be implemented. Having recently passed on the role, I aim in this paper to describe the positive force for good which these developments have been, as well as to note (usually with the wisdom of hindsight) how more realistic thinking among policymakers, and wiser approaches to local implementation, might have allowed us to achieve even more.
The Culyer report made two crucial observations. Firstly, it stated unequivocally that R&D must be regarded as a core activity of the NHS, and must therefore always have some funding priority. A truism for many of us perhaps, but clearly a principle susceptible to pressure in a cash restricted NHS increasingly preoccupied, in the mid-1990s and ever since, with its failure to deliver the health expectations of the nation. Secondly, it confirmed that the NHS was already spending substantially on R&D yet had little real idea of the extent of that spend, and no reliable mechanisms to identify it. Culyer drew the inevitable conclusion that, unless these deficiencies were corrected, a relentless restriction on R&D spending would follow as other health care priorities were driven into an NHS budget which in the mid-1990s was not growing in real terms.
The Culyer report was accepted virtually in its entirety, and NHS trusts were soon required to divert managerial resources to support R&D. Trusts such as my own, a provincial teaching hospital of moderate size, appointed R&D directors, typically consultants with a personal research track record, along with an R&D manager and other appropriate administrative support. The two immediate goals were the development of an R&D strategy for the trust and the accurate identification of the money presently being spent on R&D, the “Culyer money”. Enthusiasts grasped with relief the first substantial evidence that the NHS was truly committed to R&D; the sceptics immediately feared that accurate identification of our “Culyer money” would inevitably mean that we could lose as well as gain if the Department of Health sought to reallocate such funds to institutions perceived to be more successful in research terms.
The two major tasks—R&D strategy and financial management—remain the core of an R&D director’s life to this day, and more recently the implementation of “research governance” has been added to our tasks. Although inevitably interlocked, I will discuss each of these three elements separately.
My personal perspective is also coloured by local NHS organisational change. At the beginning I was R&D director at Leicester General Hospital, one of the three teaching hospitals in Leicester, each then a separate trust. For the last two years I have been R&D director of a single trust, University Hospitals of Leicester NHS Trust, formed by merging those three hospitals and now one of the half dozen largest trusts in the UK.
R&D FINANCE
At the core of Culyer’s recommendations was the need to identify, and secure for the future, R&D funding in the NHS. The newly appointed R&D directors were given the task in 1995–96 of identifying that funding. The advice on how this should be done was (perhaps understandably) somewhat flexible, and the consequence was a great variety of methodology at work in different trusts and their finance departments. A substantial element of the R&D spend was to be found in the salaries, or parts of salaries, of research active health professionals, and correct identification of the elements truly spent on R&D was challenging enough; but identification of the proportion of any hospital’s infrastructure being spent on R&D was an even more inexact science. The final statements of the amount of R&D money in each trust were inevitably viewed with varying degrees of confidence as an accurate reflection of the true spending on R&D. When it was announced that these declared funds had ipso facto become the R&D budget of the NHS (a total declared to be ∼£450 million), some disquiet followed. Variations in the R&D budgets declared by individual trusts were particularly striking and were not always perceived by outsiders as proportionate to R&D performance. While most teaching hospitals within the M25 motorway around London had R&D budgets in excess of £10m, and some three or four times more, few teaching hospitals beyond the M25 had a budget of more than £5m. Even broad comparisons of research output against these inputs suggested that the route by which the R&D budget had been defined was at best an approximation.
Further tensions were to follow. Individual trusts were soon asked to bid to retain their present R&D budget on the basis of current R&D activity and future planning, with the possible “carrot” that a strong bid might be followed by a diversion of additional funds to that trust. The combined processes of budget identification and the subsequent bid proved extremely demanding for R&D directors and their fledgling teams, and inevitably brought with them a frisson of uncertainty as to the changes which would follow. Perhaps predictably, any budget reallocations in the first round were extremely small, defended with the notion that major change could prove financially destabilising to trusts with large R&D budgets. Ironically, however, it transpired that the demanding phase of work leading to the declaration of the R&D budget had not after all entirely secured funding for R&D as Culyer had envisaged. The R&D budget for most hospital trusts fell in real terms over the next few years: in part for the laudable reason that some of the NHS R&D budget was “top sliced” centrally to support new R&D initiatives, and in part because the year on year increases in the R&D budget which the government made available fell consistently short of inflation. Since the majority of the R&D budget in most hospitals continued to support salaries, there was inevitable pressure even to maintain current levels of activity.
Such restriction of the R&D budget, following so soon after the optimism created by the government’s acceptance of the Culyer report, was a considerable disappointment for the NHS R&D community. After all, it seemed that even the powerful advocacy of NHS R&D by successive NHS R&D directors was not achieving priority when ministers and civil servants were increasingly challenged with delivering waiting list targets and other immediacies of health care. Only this year, for the first time since the Culyer report, has there been any growth in real terms in the NHS R&D budget, although even now the growth is proportionally less for R&D than for other aspects of the NHS.
REDEFINING THE R&D BUDGET
From the beginning there was broad understanding that the NHS R&D budget, once defined and protected, should be spent in two ways. Firstly, it must support the infrastructure necessary to make an organisation research competent. Secondly, it was widely recognised that R&D priorities and needs should be identified both locally and nationally to ensure that the rather modest R&D budget was used in the most coherent way. An extreme view which gained some support was that the NHS should support only R&D with an immediate impact on health care, and that any research more remote from the clinic, including all health related laboratory research, should be exclusively the province of medical schools. Soon, however, a broader and more balanced understanding of the range of R&D of legitimate relevance to the NHS emerged, and it continues to be maintained.
There was soon enthusiasm at the Department of Health for increasingly detailed definitions of R&D costings, this being perceived as the way to ensure “value for money” in the use of the R&D budget. When the costing of even straightforward clinical episodes, such as elective surgical procedures, was proving challenging, it could have been predicted that detailed R&D costing would prove unattainable, since so much research activity does not fit into neat activity definitions and fixed term projects. Misguidedly, the Department of Health understood accountability in terms of precise definition of the use of small amounts of money, rather than understanding that a large research competent organisation with good local management could be entrusted over several years with an R&D budget working to longer term measurable goals. Regrettably this has led over the last few years to much dissipation of energy in seeking levels of financial detail which are unachievable. R&D directors and their managers have been required to assist the Department of Health and its external advisors in developing financial information systems with little prospect of meaningful success. Discussions have sometimes become surreal; I have particularly unhappy memories of attending a nationally organised meeting at which accountants and civil servants asked a group of clinician researchers to define the amount of time per week, in hours or parts of hours, spent on each research project in which they were involved. The ensuing discussion, in which clinicians pointed out the absurdity of the question, particularly since all such work was done in the evenings and at weekends away from the rigours of the daily work of the NHS, was a model of failure in communication.
While such work on a national R&D financial system has been laboured and frustrating, financial progress has been much more substantial locally. In my own trust the R&D budget is now defined and recognised. It is not a notional amount hidden within a clinical directorate budget, but has attached to it specific staff, facilities, and resources. There is also a trust board and trust executive level agreement that the R&D director, in discussion with clinical directors, has true influence in the allocation of that resource, giving the opportunity for its redirection to areas of productivity and priority. Such agreements are mandatory if R&D is to progress.
R&D STRATEGY
My own experience of developing a strategic role for R&D within a large teaching hospital has been thoroughly positive, although it is important to recognise that some other NHS organisations, often with less research activity, have continued to face an uphill struggle. In my first role in a smaller trust, the trust board embraced enthusiastically the R&D strategy which we developed and was unequivocally supportive in seeking its implementation. In the merged trust things were taken a step further. The strengthening of R&D was one of the core reasons stated for the merger of the three hospitals in the city, and by appointing the R&D director to the trust board, the trust signalled the value it placed on this element of its work. Inevitably, establishing and promoting such a culture in smaller hospital trusts with rather little existing R&D activity, or in primary care where R&D has been poorly resourced, has been particularly challenging.
R&D collaboration
R&D is essentially collaborative, and real local progress will only occur when there is partnership both within and beyond the NHS. Beyond the NHS, partnership with universities is of course crucial. Discussions need to be wide reaching. In our own case there are three universities within Leicestershire: one contains our medical school and therefore has strong and long established strategic and funding links with the NHS. A second contains the nursing school, a professional field in which R&D has been relatively lightly developed. The third does not train health care professionals but has substantial expertise in bioengineering and sports science. Thus there is much to gain if all three universities are partners in NHS R&D. Within the NHS, although our trust was the dominant partner in terms of R&D budget and critical mass, partnerships with our mental health trust and with primary care had much to offer. In establishing a Leicestershire NHS R&D strategic alliance, that partnership was achieved through director level representation from primary care (through their own established primary care research alliance), from the hospital and mental health trusts, from the three universities, and from the health authority. This alliance has been a force for good and will now modify its shape, but not its goals, by incorporating representation from Northamptonshire to coincide with our new strategic health authority boundary. It is through this alliance that we have sought to make a powerful case that R&D is a justifiable priority for new NHS funding through the health improvement programme. We have argued vigorously that it is not possible to have a health improvement programme without R&D to direct some elements of that improvement, and that it is not possible to rely only on the centrally allocated R&D budget to achieve those goals as more and more new approaches to treatment and health care require evaluation before implementation.
Multiprofessional research
A second strategic issue relates to the development of multiprofessional research. Critics have complained that R&D in the NHS is controlled by a “medical” model, in which doctors lead most R&D and set the culture and context in which R&D is promoted. I would argue that thus far this has unavoidably been so, given the much more limited research capacity among nurses and other health care professionals. To increase this capacity requires strategic investment. This is not achieved by a broad increase in research awareness, which is an important educational goal but not an R&D budget priority, but rather by nurturing and sponsoring committed individuals developing their expertise within multiprofessional R&D teams; an approach which takes time. Such teams will more often than not be led by doctors at present, but this should be seen by all involved as a pragmatic necessity rather than a statement of hierarchy. Such approaches require careful discussion and mutual respect locally.
Collaboration with medical schools
A third key strategic issue is to understand correctly how research in medical schools and in the NHS is inextricably intertwined. All clinical research undertaken by clinical academics involving patients calls on the resources and indemnification of the NHS and must be properly documented and managed. Furthermore, in all medical schools a substantial proportion of clinical academics are funded entirely by the NHS for both the academic and clinical elements of the post. These complex arrangements are indicative of some longstanding anomalies in national budget allocation to higher education and health respectively, and are well beyond the influence of R&D directors. But such funding dependence between the local NHS and a medical school should be a point of contact and collaboration, with the mutual interdependence of the two organisations and the major gains for both parties being recognised. Thus a medical school simply could not exist without the NHS resource it receives to support its research and teaching. On the other hand the local NHS gains immeasurably from the expertise of a local medical school, which brings with it an upward spiral of excellence in recruitment and retention of high quality staff as well as the intellectual culture and environment which follows. But such a partnership also has tensions. Universities are powerfully driven by the research assessment exercise, and will increasingly wish to invest only in research activity likely to secure 5 and 5* ratings in any future assessment. The NHS also recognises excellence in R&D, but takes a more broadly based view. There will be NHS research priorities of little interest or value to a university, and the NHS will need to take risks, investing to increase research capacity and to build on proven areas of clinical research. There is much common ground, but it can often be those areas without overlap which provoke most tension. The R&D director must be a cultural leader within the trust to ensure that these distinctions and cooperations are properly understood. The R&D director must also be in a position to engage in constructive discussion with the dean of the medical school. My own experience has been extremely positive in this regard, and it is undoubtedly more straightforward when a single R&D director, representing the one local hospital trust, can discuss with the medical school the allocation of a single large R&D budget. The earlier days, when three teaching hospitals, each with its own R&D director, R&D strategy, and R&D budget, were in discussion with the medical school, were inevitably less effective.
RESEARCH GOVERNANCE
While the new NHS R&D structures were still finding their feet, establishing their strategic role, and securing their financial arrangements, a new challenge was presented to them. The notion of “research governance” was introduced to the NHS and the responsibility for its implementation fell to R&D directors.
The development and implementation of research governance have many parallels with those of clinical governance, which preceded it by a few years. The principle underlying clinical governance was unarguable: that we should seek to maintain excellence and improve the quality of every aspect of clinical care. Yet behind this positive truism many in the NHS sensed a negative aspect of clinical governance, namely that the government wished to create an NHS in which such major adverse incidents as the scandal surrounding paediatric cardiac surgery in Bristol could never happen again. Many clinicians regarded the principles of clinical governance as self-evident, and their espousal so entrenched in clinical practice that it was superfluous to develop detailed organisational arrangements to promote them. Only gradually has it been seen that clinical governance arrangements themselves provide a coordinating structure within which inappropriate practice can be identified early and correctly managed, risk reduced, and an upward spiral of improving quality maintained. Yet however good the systems, there can never be absolute guarantees that “it could never happen again”.
Research governance likewise is based on the self-evident principles that all research should be undertaken to the highest scientific, ethical, and financial standards. Yet it also carries the negative aspect that some in government hope research governance will create an NHS in which events such as the organ retention scandal at Alder Hey could never happen again. Yet however good the systems, there can never be such absolute guarantees.
The immediate response of many researchers to a complex research governance framework requiring extensive documentation was parallel to the response of many to clinical governance: to resent the intrusion of an excessively “controlling” organisation which required them to document the obvious and to provide assurances about elements so fundamental to their research ethos that they could not conceive they should ever need to be checked. Only gradually has the protection offered to investigators by such a scheme become clearer. Only by seeing the impact of a research governance framework in the early identification of potential governance issues, and their prompt and effective correction, have many investigators been able to see the value of the proposals. Nevertheless, R&D directors have faced considerable work in “winning the hearts and minds” of experienced researchers and helping them to understand that, whatever went before, they now work in a research governance environment in which not only must they do things well but they must be seen to be doing them well.
The role of R&D directors and their teams has been not only to establish the necessary local organisational framework, but also to satisfy the almost insatiable appetite of regional and national observers for information and procedure, while protecting wherever possible individual researchers from the bureaucratic drudgery which may act as a deterrent to research involvement. The R&D team must act as translator and interpreter so that the burden of demands for information can be better understood by individual researchers.
WHAT NEXT FOR R&D DIRECTORS?
With the near completion of a financial framework for R&D in the NHS, which it is to be hoped will change little over the next few years, and the implementation of research governance, the first phase of NHS R&D management is drawing to a close. Despite some of the challenges outlined in this short review, the necessary management systems for R&D in the NHS are now properly in place. Many key battles have been won, not least the placement of R&D unequivocally as a core aim of the NHS and a necessary requirement for a vigorous and effective clinical climate in any NHS organisation.
R&D directors must continue to stake these claims, to show the sceptics that money is well spent, and to show that research which will change clinical practice now and research which may only change it in 10 or more years are both legitimate calls on NHS resources. Complementary and respectful partnerships between the NHS and our universities can unleash an enormous potential of health related research. Carefully focused investment will gradually strengthen research capacity among non-medical health professionals, to the great gain of all.
PERSONAL REFLECTIONS
As in any senior NHS management position in the present day, to be an R&D director is to be pulled in many directions. To work as an R&D director without losing one’s grip on one’s own clinical practice or personal research portfolio is always stretching, and often requires the ruthless selection of those meetings which must be attended from among those which could be attended.
As in all senior NHS management positions, frustration and satisfaction walk hand in hand, particularly when those given responsibility for R&D in government are not always well versed in the realities of the researchers’ existence. My personal view is that the direction taken by policymakers and planners in the Department of Health has not always assisted researchers in the short term. Unsurprisingly, since it mirrors attitudes in other aspects of NHS planning and organisation, the talk has been about autonomy and local control, yet the requirement has been for unrelenting centralisation of information gathering. The information gathered has often not assisted researchers, but has allowed the construction of defences at the centre against criticism should things not go well. Much more could have been achieved, more quickly, if more of the energy of R&D directors and their teams had been reserved for imaginative strategic thinking and the planning of new research initiatives rather than for the introduction of sometimes unnecessarily complex machinery.
Will R&D succeed in the NHS? It has to. R&D directors, and the teams who support them, are now better placed than ever to ensure that success. I for one am glad to have been involved early along the way.