Alongside the history of medicine is another history. The history of the organisation—and reorganisation—of the health service. Both pursuits have as their aim improvements to the care and treatment of patients. However, while medicine has been governed by an evidence-based approach, this has been less the case with the development of the organisation and delivery of the NHS, which has been more constrained by political and financial imperatives.
It is significant, then, that the Five Year Forward View[1] and, to some extent, Next Steps on the Five Year Forward View[2] underlined a commitment to evaluation. Robust evidence is critical for achieving many aims of this national strategy, including to “break down the barriers in how care is provided between family doctors and hospitals, between physical and mental health, between health and social care”.
One lesson from previous transformation programmes (for example, work to integrate health and care services or redesign care pathways) is that changes rarely have the intended effect from the outset. Healthcare is complex and unpredictable. Even with the best planning, things go wrong. Therefore, evaluations are needed to check that the ambition of these programmes is matched by demonstrable improvements in the quality and efficiency of the care provided.
More generally, evidence allows for a much more informed debate about what has been achieved so far by the health service, what the priorities are for the future, and what further changes may be needed. This kind of conversation can help leaders to galvanise and maintain support from clinicians and the public for change.
Unfortunately, the quality of the evidence underpinning changes to the organisation and delivery of healthcare has often fallen far short of what is expected in medicine. The Public Accounts Committee is one of the groups that have expressed concern.[3]
As an example of what could be done, this week the Health Foundation published an evaluation of the impact of redesigning urgent and emergency care in Northumberland, which included building the UK’s first specialist hospital devoted to emergency care, in Cramlington, and converting three existing A&E departments to urgent care centres.[4] The analysis was conducted by the Improvement Analytics Unit (IAU): a partnership between the Health Foundation and NHS England to provide local teams with access to robust information on the impact of changes to healthcare.
The evaluation compared Northumberland with a control area constructed by combining data from other parts of England. It found slight reductions in the time patients spent waiting in A&E (around 14 minutes) but a substantial increase in A&E attendances among the local population in the first year (around 13.6%).
The original intention behind the changes was to centralise emergency care for the most seriously ill and injured patients in a specialist facility, and so provide faster access to senior doctors and diagnostics. This may have happened for many patients, and there is nothing in these evaluation findings to suggest that the original idea to centralise emergency care was wrong.
Yet A&E attendances increased by more than emergency admissions, suggesting that, because of these changes, more people went to hospital without requiring inpatient care. Since the Northumberland primary and acute care system has sought to reduce inappropriate unplanned care in hospital settings, this finding suggests that further refinements to the model are needed.
In planning their next steps, it will be important for the local partners to understand the mechanisms at play more deeply. One possible explanation of the findings is that patients perceived A&E as a more attractive place to seek care once the new state-of-the-art hospital had opened. This line of argument points to making primary care a more attractive alternative to A&E.
Because there are several options for what to do next, these results now need to be combined with other sources of information, including the views of service users and healthcare practitioners, to plan the way forward. The robust quantitative evidence provided by the IAU is a foundation without which this planning could not credibly begin.
Unfortunately, many evaluations of NHS organisational change have not had access to such robust data. Where studies have been done, they have often lacked a control group, and conclusions have been drawn from changes over time in the outcomes experienced by the patients affected, changes that might have occurred with or without the intervention.
As a result, evidence about NHS organisational change is often misleading or not credible, with implications for the development of health services. The underlying issues are multifaceted, but include the (now commonly accepted) reality that many NHS organisations are seriously underpowered when it comes to using data effectively.[5]
Much more needs to be done if the history of changes to the organisation and delivery of healthcare is to live up to the scientific standards expected from its medical cousin. Ultimately, there must be demonstrable improvements in the quality of care provided for patients.
Adam Steventon, director of data analytics, Health Foundation, London, UK.
Competing interests: None declared.
1. NHS England. Five Year Forward View. London: NHS England, 2014.
2. NHS England. Next Steps on the Five Year Forward View. London: NHS England, 2017.
3. House of Commons Committee of Public Accounts. Integrating health and social care. HC 959. London: House of Commons, 2017.
4. O’Neill S, Wolters A, Steventon A. The impact of redesigning urgent and emergency care in Northumberland. London: Health Foundation, 2017. http://www.health.org.uk/publication/impact-redesigning-urgent-emergency-care-northumberland
5. Bardsley M. Understanding analytical capability in health care: Do we have more data than insight? London: Health Foundation, 2016. http://www.health.org.uk/publication/understanding-analytical-capability-health-care