
Analysis and comment: Health policy

Evaluating and implementing new services

BMJ 2006; 332 doi: https://doi.org/10.1136/bmj.332.7533.109 (Published 12 January 2006) Cite this as: BMJ 2006;332:109
  Ann McDonnell, lecturer in nursing (a.mcdonnell@sheffield.ac.uk)1
  Richard Wilson, research fellow2
  Steve Goodacre, senior clinical lecturer in emergency medicine2
  1 School of Nursing and Midwifery, University of Sheffield, Sheffield S1 4DA
  2 School of Health and Related Research, University of Sheffield
  Correspondence to: A McDonnell
  • Accepted 17 November 2005

Evidence based health care should apply to the way that services are delivered as much as it does to treatments

Changes to the delivery and organisation of health services should be evaluated before they are widely implemented. Evaluation should be sequential, moving from theory to modelling, explanatory trials, pragmatic trials, and ultimately long term implementation.1 However, this sequence is rarely followed. New services are often implemented, or existing services changed, before evaluation can take place. Any subsequent evaluation has to use unreliable methods (such as an uncontrolled before and after design) and is, of course, too late to influence implementation. We use three examples from the NHS to show how enthusiasm can overtake evidence and the benefits of a more considered approach.

Changing the organisation of services

Implementing organisational change in health services requires substantial effort and typically needs to be driven by enthusiastic groups and individuals. There are many examples of delays in getting existing evidence into practice. The slow pace of organisational change is often seen as problematic in the drive towards an evidence based health service. However, sometimes the converse is true. Too much momentum may lead to inappropriate implementation of change before evaluation is complete. Managing this momentum offers the key to rational evaluation and implementation of changes in service organisation and delivery.

The drive for change in the way services are delivered can spring from various sources, including political imperatives, policy drivers, and enthusiasm from clinicians. Enthusiasm for improved services is desirable but can blind enthusiasts to the possible downsides of an intervention. Evidence based care may mean delaying the introduction of new treatments until robust evidence exists of their effectiveness. This approach is well suited to simple interventions aimed at individual patients, such as drugs. Here, momentum is often driven primarily by commercial imperatives. Although political and professional influences are brought to bear in the introduction of new drugs, as shown by recent controversies over treatments for Alzheimer's disease and multiple sclerosis, current regulatory frameworks attempt to ensure that new drugs cannot be prescribed before they have been thoroughly evaluated.


[Image] Would this service exist if it had been evaluated first? Credit: ITV/REX

When an existing device or operative technique is modified for a new purpose, the intervention is more complex. But even here, a framework of regulation helps to curb the enthusiasm of pioneers and ensure that use of the new technique is based on evidence as well as passion and commitment. Since 2003, the National Institute for Health and Clinical Excellence (NICE) interventional procedures programme and the Review Body for Interventional Procedures have been assessing the safety and efficacy of new procedures. They gather evidence by systematic review and formulate guidelines. Use of a new procedure may be restricted to certain circumstances or to specific healthcare facilities. This review process has attracted criticism from some who believe it will stifle change and innovation.

Achieving a balance between controlling the momentum for change and maintaining enthusiasm is more difficult for complex innovations such as new clinical services. We have selected three examples which show the importance of managing momentum as part of a planned framework for the development, implementation, and evaluation of new clinical services. In the first two, the pace of implementation outstrips the emergence of evidence. Both are top-down innovations, one driven by professional bodies and one by policy makers. The third is a bottom-up approach, where the pace of implementation and evidence are more evenly matched.

Acute pain teams

Acute pain teams were introduced in the United Kingdom in response to concerns from many professional groups that postoperative pain control was unacceptably poor and that new techniques such as patient controlled analgesia and epidural analgesia should be used with appropriate safeguards. In 1990, a report by the royal colleges of surgeons and anaesthetists recommended the introduction of acute pain teams in every hospital performing inpatient surgery.2 Although rigorous evidence of the effectiveness of these teams was lacking,3 84% of acute hospitals in England had an acute pain team by 2000, and surveys reported wide variation in their membership and activities.4 5

Currently, many teams are experiencing difficulties with funding, which is hampering development of the service.5 In cases such as this, where momentum for change overtakes the search for evidence, it may be difficult in the future to maintain established services in the face of competing financial pressures. This is also likely to affect staff morale.

NHS Direct

NHS Direct was set up in December 1997 as a telephone advice line run by nurses to provide “easier and faster advice and information for people about health, illness and the NHS so that they are better able to care for themselves and their families.” It was not primarily intended to reduce demand on other services, but the chief medical officer hoped that it would “help reduce or limit the demand” on immediate care services.6

An observational study in the three areas where NHS Direct was first established found that it did not reduce pressure on immediate care services but may have restrained increasing demand on part time general practitioners' out of hours services.7 However, by the time this study was published the service had been extended to cover large parts of the country.

An audit of NHS Direct estimated that about half of its £90m ($159m; €133m) annual cost was offset by encouraging more appropriate use of NHS services.8 This raises questions about the value of the remaining £45m spent each year. NHS Direct is associated with high consumer satisfaction,9 but so are most health services. It is underused by older people, ethnic minorities, and other disadvantaged groups.

NHS Direct now covers the whole of England and Wales, and it would be difficult to withdraw the service without substantial reorganisation and disruption of other services. Yet we are still uncertain whether the resources currently used to support NHS Direct are being well spent.

Examples of organisational changes without robust evidence

NHS diagnostic and treatment centres
Rapid access chest pain clinics
Critical care outreach services
Emergency department “see and treat”
Advanced access in general practice
NHS walk-in centres
Modern matrons
Nurse consultant roles
The internal market in the NHS

Stroke units

The development of stroke units in the United Kingdom has been slower and more organic. Stroke units began to appear in the 1950s, in the early days of the NHS. The underlying premise was that care of stroke patients could be improved if it was delivered in a more organised fashion. Only a few of these units were established in the 1950s, and one in Northern Ireland published an observational study of its performance before the end of the decade.10 Randomised controlled trials were first done in 1962, and a few formal trials were reported in the years up to the 1980s. Results from the initial studies suggested that stroke units produced benefits. However, the growth of stroke units remained slow and uneven, even into the 1990s. Further randomised controlled trials, with increasingly rigorous designs, continued to show benefits in outcome.11 Recent systematic reviews have also confirmed the effectiveness of stroke units.12

Overall, the pattern here has been one of innovation followed by a period of evaluation and reflection. Development and implementation have been incremental and supported, at least latterly, by rigorous evaluation of the benefits.

Power of momentum

Although introduction of acute pain teams was clinically driven whereas NHS Direct was politically driven, the process by which implementation overtook evaluation was similar. In both cases there was a perceived imperative to take prompt action based on clinical need or perceptions of public demand. Evaluation, to determine whether implementation would be effective, was an afterthought. The goal of action seemed to be service innovation itself, so the outcomes of any subsequent evaluation were poorly defined and could potentially be redefined in the light of negative evaluation. The box lists other examples in which delivery of services has been changed without robust evidence.

Conclusion

Health services are constantly changing. It is not always clear why change happens and how the tipping point is reached.22 Greenhalgh and colleagues have identified the key role that opinion leaders and champions have in organisational innovation.23 These champions may be politicians or professionals, but if they value action (or the appearance of action) over effective change it is not surprising that evaluation will be a low priority.

Evaluation should precede implementation and follow a staged approach, as recommended by the Medical Research Council.1 Explicit strategies to manage the pace of change need to be developed at an early stage and should involve the organisations responsible for changing service delivery in the NHS and for health services research. It should be explicitly recognised, particularly when change is driven by politicians or professional groups, that implementation of change is not an end in itself but should have clearly defined goals that are measured as part of a planned strategy for evaluation.

Although changing the way in which an existing clinical service is delivered may seem to present little risk, our preconceptions about what works in practice can often be wrong. For example, the use of air ambulances makes sound intuitive sense. However, formal evaluation showed that the benefits are limited and the costs substantial.24

We have focused on the role of politicians and professionals in driving implementation before proper evaluation, but in future, with increasing commercialisation of health services and the development of public-private partnerships, other players may be involved. If the health service community fails to develop explicit strategies to manage momentum, we risk being swept along by a tide of change driven in part by the need to improve profit margins rather than patient care.

Summary points

Changes to the delivery and organisation of health services should be evaluated before they are widely implemented

Too much momentum may lead to inappropriate implementation of change before evaluation is complete

A regulatory framework has been established to assess the safety and efficacy of therapeutic interventions

A similar approach needs to be taken for the development of new clinical services

Editorial by Gabbay and Walley and pp 107, 112

Acknowledgments

We thank James Munro for his helpful comments.

Footnotes

  • Contributors and sources The idea for this article came originally from AM and RW. The article was written jointly by AM, RW, and SG. AM is the guarantor.

  • Competing interests RW is project manager of the Review Body for Interventional Procedures and, as such, his salary is reimbursed to the University of Sheffield. AM and SG work on Department of Health funded research into the delivery and organisation of health services.

References
