20 Jun, 16 | by BMJ Clinical Evidence
374 interventions have been reported to be effective in experimental stroke; 97 were tested in clinical trials, but only one of these was shown to be effective. The conventional principle of drug development is that if a therapy improves outcome in animals, the next step is to test it in humans in a clinical trial, in the hope that important improvements in outcome will be seen. The reality is that there are huge amounts of animal data (too many drugs), from experiments that are neither designed with sufficient rigour nor adequately reported, and important improvements in outcome are rarely seen in clinical trials (too few medicines). Many animal experiments are failing in their objective of effectively informing human health.
And there are a lot of drugs around – in 2013, 350 publications describing animal experiments appeared every day! I understand that the average life scientist has neither the desire nor the need to consume this amount of information, but even within a limited research domain relevant data accumulate rapidly. In an endeavour where future ideas, decisions and directions are based on our existing knowledge, it is important that we are able to identify, critique and synthesise data in an unbiased, timely and useful manner.
In the CAMARADES group we have led the effort to apply systematic review and meta-analysis tools to preclinical studies. Our aim is to provide a platform to assess the quality and range of evidence, identify gaps in the field, assess publication bias, seek to explain discrepancies between preclinical and clinical trial results, and inform clinical trial design. In contrast to reviews of clinical data, in these preclinical reviews observing heterogeneity and identifying its sources is substantially more valuable than a headline summary estimate of how effective a drug is.
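To make the point about heterogeneity concrete: a random-effects meta-analysis partitions the observed spread of study results into sampling error and genuine between-study variance (tau-squared), and the I-squared statistic summarises what fraction of the variation reflects real differences between studies. The sketch below uses the standard DerSimonian-Laird estimator on entirely hypothetical effect sizes (not CAMARADES data), purely to show the mechanics:

```python
import math

# Hypothetical study results as (effect size, standard error) pairs.
studies = [(0.8, 0.2), (0.3, 0.25), (1.1, 0.3), (0.1, 0.15), (0.6, 0.2)]

def dersimonian_laird(studies):
    """Random-effects meta-analysis via the DerSimonian-Laird estimator."""
    w = [1 / se**2 for _, se in studies]                 # inverse-variance weights
    sw = sum(w)
    ybar = sum(wi * y for wi, (y, _) in zip(w, studies)) / sw
    # Cochran's Q: weighted squared deviations from the fixed-effect pooled mean.
    q = sum(wi * (y - ybar)**2 for wi, (y, _) in zip(w, studies))
    df = len(studies) - 1
    c = sw - sum(wi**2 for wi in w) / sw
    tau2 = max(0.0, (q - df) / c)                        # between-study variance
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0  # % variation beyond chance
    # Re-weight including tau2 and pool under the random-effects model.
    w_re = [1 / (se**2 + tau2) for _, se in studies]
    pooled = sum(wi * y for wi, (y, _) in zip(w_re, studies)) / sum(w_re)
    se_pooled = math.sqrt(1 / sum(w_re))
    return pooled, se_pooled, tau2, i2

pooled, se, tau2, i2 = dersimonian_laird(studies)
print(f"pooled effect = {pooled:.2f} +/- {1.96 * se:.2f}, "
      f"tau^2 = {tau2:.3f}, I^2 = {i2:.0f}%")
```

With these made-up inputs I-squared comes out high, which is exactly the situation described above: the interesting question is not the pooled number but why the studies disagree.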
We have now systematically collated data from over 7,000 studies covering 19 different disease models, reporting outcomes from nearly 200,000 animals. Some of our findings are striking: only 1% of these studies report performing a sample size calculation, and about two thirds fail to report randomisation, with a similar proportion failing to report blinded assessment of outcome – both of which are associated with overstatements of reported efficacy. An assessment of publication bias in 499 publications describing focal ischaemia found that around one in six experiments remain unpublished, leading to an overstatement of efficacy of at least 30%. Clearly, we need to address these limitations.
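The mechanics of that overstatement are easy to demonstrate: if the least favourable results are the ones that go unpublished, the average of the published literature drifts upwards. The toy simulation below uses assumed numbers chosen only for illustration (a true effect of 0.3 and the bottom sixth of results left in the file drawer), not the focal ischaemia data themselves:

```python
import random

random.seed(1)

# Simulate 600 experiments on a drug with a modest true effect.
TRUE_EFFECT = 0.3
results = [random.gauss(TRUE_EFFECT, 0.3) for _ in range(600)]

# Suppose the least favourable sixth of results stay unpublished.
results.sort()
published = results[len(results) // 6:]

true_mean = sum(results) / len(results)
pub_mean = sum(published) / len(published)
overstatement = 100 * (pub_mean - true_mean) / true_mean
print(f"all experiments: {true_mean:.2f}; published only: {pub_mean:.2f} "
      f"({overstatement:.0f}% overstatement)")
```

Under these assumptions, selectively losing one in six experiments inflates the apparent effect by roughly the 30% figure quoted above.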
Along with others, we have proposed strategies to address many of these problems, including, for instance, multicentre preclinical animal studies. The NC3Rs have developed two practical tools: the Experimental Design Assistant, to guide researchers through the design of animal experiments, and the ARRIVE guidelines, to improve the reporting of research using animals. Many strategies have been proposed to address publication bias, including the pre-registration of animal trials and even registries of such studies.
Better design and reporting should lead to better science, and should make research synthesis less challenging. It may be that fewer drugs “work” in animals if the experiments are performed properly, but if this means that fewer (and better tested) medicines are taken to clinical trial, the process will be more efficient and the chances of success will be higher. We need fewer drugs, and more medicines!
Dr. Emily Sena is a research fellow specialising in the validity of preclinical studies at the Centre for Clinical Brain Sciences, University of Edinburgh. She completed her PhD at the University of Edinburgh in 2010, which included a one-year exchange at the Department of Medicine, University of Melbourne. Her research interests are in the use of systematic review and meta-analysis of preclinical studies to improve understanding of critical facets of translational medicine and to develop new hypotheses for testing in the laboratory. Together with Professor Macleod she has established a multicentre international research collaboration facilitating the pooling and analysis of data from a range of disease models across the basic sciences. To support this she has developed an internationally accessible database, similar to the Cochrane Library, to facilitate interpretation of the preclinical data used to justify progression to clinical trial.
She will be speaking at Evidence Live 2016 on Wednesday 22nd June.