
Research design

Quantifying the burden of injury in ‘data-poor’ settings: a local-need-driven approach?

12 Oct, 16 | by Brian Johnston

Editor’s Note: earlier this year the journal published injury data from the Global Burden of Disease project. In an accompanying editorial I noted that many of the regional or sub-national estimates were “derived from aggregation and extrapolation of limited primary sources” and yet could “become the basis for policy or programming at an intensely local level.”

I saw this as a challenge to researchers, a call to “crowd source” burden of disease data from the subregions and subpopulations unrepresented, or simply estimated, in the global aggregate. If we identified those needs and provided resources for good data collection, data management and data reporting, the information collected would be immediately useful at the global scale and – one hopes – at the local level too.

Dr. Safa Abdalla, a member of our editorial board, approaches that suggestion with some caution and – in this guest post – draws distinctions between the needs and experience of researchers and public health professionals in “data-rich” and “data-poor” environments. – Brian Johnston (Editor-in-Chief)


Some parts of the world, typically in the low- and middle-income country classification range, lack solid basic information about the frequency and distribution of injuries in their populations. That is not to say that they lack the sources or the capacity to measure them, but in those same places the public health practice machinery has been occupied (not entirely unduly, of course) with a cluster of conditions, such as communicable diseases, that international actors have invested heavily in tackling. In such an environment, local objective assessments of all potentially impactful conditions may not have been deemed necessary. As a result, priority setting has been skewed towards those conditions of historical focus, without heavy reliance on local epidemiological evidence.
The very first global burden of disease and injury assessment, and its subsequent versions, have highlighted the need to consider the burden of all conditions that could realistically affect human health – including injuries – in a way that allows objective comparisons and, consequently, objective priority setting. Arguably, data from so-called ‘data-poor’ countries have not always been sufficient or accessible enough to feed into these global-level estimation projects, and data gaps have been filled with an assortment of methods that continue to evolve to date, probably at a rate that surpasses the rate of improvement in the quantity and quality of data from those countries.
The burden of disease assessment methodology is very demanding, not only computationally but in terms of data input, requiring epidemiological estimates at the very granular level of disease and injury sequelae, and synthesizing those into a range of novel summary measures (disability-adjusted life years, for example). Yet incidence, prevalence and mortality of any condition at a broader level are key inputs for country- or locality-level policy development and health service planning and monitoring. It is in measuring those epidemiological quantities that the value of country-level estimation in data-poor settings lies, without necessarily delving into the complexities (and, for the time being, relatively unnecessary luxury) of summary measure calculation. In addition, country-level assessments can uncover gaps in data systems that, when addressed, can create a seamless flow of better quality data for local decision making.
But with whom does the onus of carrying out such local-level estimation reside? Undeniably, global estimation efforts have produced country-specific estimates, stimulated country data hunts that fed data into their machinery and, in a few ‘data-rich’ countries, facilitated full burden of disease and injury assessments too. However, to date, injury burden estimates for the vast majority of ‘data-poor’ countries come from indirect estimation in these global projects. One can argue that, alternatively, an approach driven by the need for public health action (be it strategy updating or service development) would be the most beneficial for producing estimates for those very countries at national, sub-national or subgroup levels. This approach entails that a local team of researchers, public health practitioners and other stakeholders evaluate all their data sources, use them in a simple and transparent fashion to develop the best estimates that fit their purpose, and take action based on the estimates and other relevant input, while also identifying the data gaps and working to fill them. Arguably, informing local public health action should take priority over informing the global view, but global burden estimation efforts can still (and must) benefit from the products of this process. However, the process needs to be driven by local demand for estimates, not by the need to fill gaps in the global estimates. It should also be led, undertaken and owned by local teams of public health practitioners, analysts and researchers. The reason is that assessing and using health data are basic public health functions that all public health practitioners and analysts in any country should be capable of carrying out.
Relying on external support from ‘global project’ teams to develop country estimates denies public health practitioners and researchers in those ‘data-poor’ countries the opportunity to hone their skills in public health data assessments and epidemiological estimation. It also denies them ownership of any subsequent efforts to improve data availability via epidemiological studies or administrative data collection.
This approach need not be limited to injury burden assessment, but it is most needed there. This is mainly because injuries in many low- and middle-income countries have been neglected for so long that epidemiological assessments of other conditions traditionally associated with those countries are likely more abundant. Hopefully, as more and more country teams assess, use and improve their own injury data sources, this reality will eventually change.

Safa Abdalla
twitter: @Safa12233

Censoring research

23 Jun, 16 | by Barry Pless

I am posting this for all Injury Prevention blog readers who are researchers or interested in research. I do so in part because John Langley is one of the pioneers in our field and was one of the senior members of our editorial board from IP’s earliest days. But I also do so because the issue that prompted him to write this op-ed for his local Otago paper is by no means restricted to New Zealand. It is a widespread and important issue that has the potential to corrupt research in all manner of ways. IBP

Read it and think carefully about the implications for your own work. Thanks to John for sharing it and to the ODT for permission to reproduce it. Thankfully Injury Prevention, to the best of my knowledge, has never had to deal with this issue.

Restrictive publication clauses in health research contracts

I was both pleased and disappointed to read the two ODT articles (20 February, 5 March) on this subject. Public airing of this important issue is long overdue. I was disappointed as I gained the impression that despite several scientists publicly expressing their concerns, the University of Otago Deputy Vice-Chancellor for Research appeared to have none. The Deputy Vice-Chancellor’s apparent lack of concern, however, is consistent with my experiences of research administration at the University.

For 20 years I was the Director of the University of Otago’s Injury Prevention Research Unit. This Unit was entirely dependent on funds from government agencies. Contrary to my expectations, the University of Otago did not pro-actively seek to protect me, or the public, from clauses in draft contracts that placed restrictions on publishing research findings. Protecting researchers increases the chances that they can serve the public interest, and meet the University’s legislated role of being the critic and conscience of society.

I spent a significant amount of time challenging clauses that would allow a government agency to censor a finding they disagreed with, or deny the right to publish work at all. On a couple of occasions during contract negotiations I was reminded by government agencies that they could purchase the outputs they wanted from other Universities or private organisations who would accept the restrictive clauses.

Why would Universities be accepting these clauses? I believe a key factor is the importance of, and competition associated with, generating research income. Collectively, NZ’s eight publicly funded universities derive approximately 15% of their income from research contracts. The success a University has, relative to others, in obtaining external research funds also has a significant role in determining the funding it receives from government through the Performance Based Research Funding scheme. Research income is critical to ensuring high quality research, thereby maintaining and enhancing a university’s reputation and thus attracting students, staff, and further funding. The purchasers of research can use the competition between universities, and researchers, to their advantage in getting restrictive publication clauses accepted.

Private research suppliers can accept restrictive publication clauses as they are typically uninterested in publishing in peer-reviewed journals and have no statutory or moral obligation to serve the public interest. While they would produce a research report for the purchaser, such reports are often not published, not easy to access, and not subject to rigorous quality control, e.g., independent peer review.

It was also my experience that many senior researchers did not care about the restrictive clauses. Why would this be so? Success in attracting research funding is, for most researchers, critical to pursuing their research, producing publications, and to promotion and public recognition.

This behaviour contrasts with the Royal Society of New Zealand’s (RSNZ) Code of Professional Standards and Ethics in Science, Technology, and the Humanities. Section 8.1 of that code states that members of the Society must: “oppose any manipulation of results to meet the perceived needs or requirements of employers, funding agencies, the media or other clients and interested parties whether this be attempted before or after the relevant data have been obtained;”

I accept the right of a purchaser of research to see an advance copy of any paper for publication and to make any comments on it. But requiring modification beyond correcting factual errors is unacceptable. Even suggestions to tone down a phrase here and there, or to insert some qualifiers, are problematic when they come from a purchaser concerned to minimize bad publicity that might arise from the paper. It also places the researchers in a bind. Should they comply? If they don’t comply, are they putting at risk future research funding from the purchaser? I suggest they might be. Why deal again with a ‘difficult’ group of researchers when you can purchase the work elsewhere?

The best approach to this issue is transparency. Make the deliberations between researchers and purchaser accessible to all as in the open review practiced by some scientific journals so readers can trace the discussion. This approach would not deal with contracts that explicitly prohibit the researchers publishing at all.

The Health Promotion Agency (HPA), a crown entity charged with promoting healthy lifestyles, recently put out a request for proposals (RFP) to assess whether the reduction in trading hours in Wellington has any impact on alcohol-related harm. The ‘indicative’ contract for this RFP stated: “The Supplier will not publish the results of the Services undertaken pursuant to this Contract”. Only when challenged did HPA advise that it was a negotiable clause. Potential University researchers interested in bidding for such research should be able to take the ‘indicative’ contract as a reflection of the intent of the purchaser. Irrespective of this, preparing a high quality research proposal involves significant resources, and many researchers would consider it not worth the effort given the uncertainty of their right to publish.

In effect, some government purchasers are getting to decide what findings, if any, the public gets to see from research the public has paid for, either by pressuring some universities to accept restrictive clauses or by buying what they want from private suppliers.

Universities of New Zealand, the representative body of New Zealand’s eight universities, and the RSNZ need to enter into discussions with Government with a view to ensuring government research RFPs do not impose these restrictions. Interference with researchers’ ability to bid for, execute, and publish research compromises the role universities have as critics and conscience of society.

We need an independent audit of government research contracting to determine the degree to which restrictive publication practices and the use of private suppliers are undermining the public’s right to be fully informed of the findings of research they have paid for.

Emeritus Prof John Langley

Focusing on the ‘why’ and the ‘how’

20 Jan, 16 | by Sheree Bekker



I draw attention to a recent post from The BMJ blog – Chris Baker: Child obesity in India? Tell me something I don’t know! – as it struck me as relevant to the field of injury prevention.

The BMJ blog post centres on the fact that only two qualitative studies on child obesity in India have been published in the past 15 years, with the majority of research being prevalence studies – and concludes:

…let us divert resources away from the “what” and “who” of child obesity towards the “why” and “how.” These questions require the application of qualitative research methods with families and health professionals to explore the lived experience of being overweight or obese, and the broader social and cultural beliefs related to this growing burden.

As we know, and as a quick search for qualitative studies in Injury Prevention shows, our field does indeed recognise the importance of qualitative work, with skilled researchers using qualitative methods to answer the types of ‘why’ and ‘how’ questions that we encounter with regards to our injury prevention interventions.

Over and above the qualitative/quantitative debate, however, this blog post struck me as pertinent to readers of Injury Prevention because it raises the important point of relevancy: relevancy of methods to the research question, and relevancy of research questions to the population.

Relevancy matters.

Do make sure to read the post for thought-provoking points that are raised as to the types of questions we should be asking, and thus the deeper issues that we can seek to uncover and address through our intervention research.

Latest from Injury Prevention