“Much poor research arises because researchers feel compelled for career reasons to carry out research that they are ill equipped to perform” — Douglas Altman, 1994
The research literature has become flooded with papers describing poorly designed and poorly reported research that goes unread and uncited. For example, Ioannidis recently estimated that only around 3% of systematic reviews were “decent and clinically useful”, with many being redundant (27%), fatally flawed (20%), or misleading (13%). [1] An analysis of articles in cardiovascular journals found that 15% were never cited and 46% received five or fewer citations. [2] The result is substantial waste of research effort: first in producing poor and unused research, and second in the effort spent on peer review, publication, and dissemination.
This squandering of research effort takes many forms, including failure to take adequate account of previous studies when planning new work, poorly formulated research questions, inadequate study designs, inefficient study conduct, incorrect analyses, and failure to report results. [3] Despite the efforts of journals and research grant agencies, peer review remains of variable quality; prospective trial registration remains incomplete, and reporting of trial results in registries is still uncommon. [4]
These poor practices are driven by “publish or perish” pressures, which distort the pursuit of research funding, publication, and professional promotion. Some organisations have tried to counter these effects. The DORA statement sets out 18 recommendations to better align assessment and incentives with good quality research, in particular emphasising scientific quality over publication metrics. [5] The recently published “Hong Kong Principles” set out guidance for performance assessment that should encourage better research practices by emphasising trustworthiness, rigour, and transparency. [6]
These pressures apply mainly to established researchers. In our view, we also need to investigate possible negative influences of medical training programs. Both medical students and trainee doctors report stress created by the research projects they must complete as part of their training. [7,8] There is little published analysis of the content of research training curricula, and none on the quality of research undertaken by trainees. We recently analysed the published research training curricula of 58 Australian medical and surgical specialty colleges and their specialty subdivisions. [9] Most colleges require trainees to undergo research training, and many require them to complete research projects. Colleges commonly place great importance on trainees leading research projects, with weight given to authorship and presenter status. In contrast, they give less weight to crucial elements of research apprenticeship: participating in research, acquiring research skills, and being supervised by staff who are suitably qualified in research methods. Some excellent research may well be produced by trainees who are enthused, equipped, and supervised to address important research questions. The problem comes from requiring almost all trainees to produce and present research findings. And it doesn’t stop at the end of training. Unwisely, research productivity also weighs heavily in promotions and career progression, even for trainees who do not plan to become researchers. [7]
These influences rest on two important but untested assumptions. The first is that specialist clinicians are the natural research leaders. While this is sometimes correct, it conflicts with contemporary values, which emphasise collaborative multidisciplinary teams performing well-designed studies that address important questions. The second untested assumption is that research projects are a good way of learning research skills. There is probably value in planning a project and considering its methodological and logistical challenges; however, the requirement to complete the research within a tight timeframe, alongside the competing pressures of clinical training, risks hurried, poorly conducted projects that contribute to research waste.
The late Doug Altman said “We need less research; better research; and more research for the right reasons”. [10] What would the clinical training world look like if we took Altman’s challenge seriously? A simple classification of research engagement is: (1) Users of research, who practice evidence-based medicine: they ask well-formulated questions, search for the best research, then appraise and apply it; (2) Doers of research, who are engaged in, but do not lead, primary research projects; this serves an apprenticeship function and plays an important role in generating evidence; (3) Leaders of research, who develop research ideas, apply for grants, and lead or supervise projects. [11] We think developing the skills of a “User of research” is essential for all trainees. Many should also be engaged as “Doers of research”, as that is integral to good clinical practice. However, not everyone is suited or motivated to be a “Leader of research”, and training programs should recognise these different needs. This would mean less research, but better research, and less research waste.
Paulina Stehlik, Institute for Evidence-Based Healthcare, Bond University, Gold Coast, Queensland, Australia and Evidence Based Practice Professorial Unit, Gold Coast Hospital and Health Service, Southport, Queensland, Australia
David Henry, Institute for Evidence-Based Healthcare, Bond University, Gold Coast, Queensland, Australia and Evidence Based Practice Professorial Unit, Gold Coast Hospital and Health Service, Southport, Queensland, Australia
Paul Glasziou, Institute for Evidence-Based Healthcare, Bond University, Gold Coast, Queensland, Australia
PS, DH, and PG are responsible for providing teaching in evidence-based practice and research to staff at Gold Coast Health (GCH) and recently published a review of specialty training research curricula (http://dx.doi.org/10.1136/bmjopen-2019-034962).
Competing interests: We have read and understood BMJ policy on declaration of interests and have no relevant interests to declare.
References:
1. Ioannidis JPA. The mass production of redundant, misleading, and conflicted systematic reviews and meta-analyses. The Milbank Quarterly 2016;94(3):485-514. doi: 10.1111/1468-0009.12210
2. Ranasinghe I, Shojaee A, Bikdeli B, et al. Poorly cited articles in peer-reviewed cardiovascular journals from 1997 to 2007: analysis of 5-year citation rates. Circulation 2015;131(20):1755-62. doi: 10.1161/CIRCULATIONAHA.114.015080 [published Online First: 2015/03/26]
3. Chalmers I, Glasziou P. Avoidable waste in the production and reporting of research evidence. The Lancet 2009;374(9683):86-89. doi: 10.1016/S0140-6736(09)60329-9
4. Jordan VMB, Farquhar CM. Is there any hope that we can reduce research wastage and prevent publication bias? Journal of Hospital Management and Health Policy 2019;3
5. DORA. San Francisco Declaration on Research Assessment: DORA; 2012 [updated 08/01/2020; accessed 08/01/2020]. Available from: https://sfdora.org/read
6. Moher D, Bouter L, Kleinert S, et al. The Hong Kong Principles for Assessing Researchers: Fostering Research Integrity. World Conference on Research Integrity, 2019.
7. Kasivisvanathan V, Tantrige PM, Webster J, et al. Contributing to medical research as a trainee: the problems and opportunities. BMJ 2015;350:h515. doi: 10.1136/bmj.h515
8. Mykkanen K, Tran V. The ACEM trainee research requirement is no longer relevant. Yes. Emergency Medicine Australasia 2017;29(6):724-25. doi: 10.1111/1742-6723.12892
9. Stehlik P, Noble C, Brandenburg C, et al. How do trainee doctors learn about research? Content analysis of Australian specialist colleges’ intended research curricula. BMJ Open 2020;10(3):e034962. doi: 10.1136/bmjopen-2019-034962
10. Altman DG. The scandal of poor medical research. BMJ 1994;308(6924):283-84. doi: 10.1136/bmj.308.6924.283
11. Del Mar C. Publishing research in Australian Family Physician. Australian Family Physician 2001;30(11):1094.