The field of systematic review (into which Archimedes, we believe, sneaks under the ‘rapid review’ heading) has long held a solid foundation for what a systematic review needs to do. It needs a clear question, a comprehensive search, an assessment of the included studies’ bias and quality, a synthesis (which may be mathematical: a meta-analysis), and a set of conclusions that draw all of this together.
What it has long struggled with is how to ‘best do’ each of these areas. ‘Best’ is itself problematic – take ‘best’ searches, for example. Do they find every last scraplette of possible information, taking three months of daily specialist work, when the bulk of the data, leading to the same practical conclusion, was found in the first week? (And how do you know – prospectively – when the tipping point into ‘enough’ has been reached?)
A number of reviews find both good and not-so-good studies that answer a question. If we include all of them, we may end up with a biased (untruthful) answer, which is practically unhelpful. If we include only the good studies, or in some cases the single good study, we end up with an answer so imprecise that it is also practically unhelpful.
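To make the precision side of that trade-off concrete, here is a minimal sketch (in Python, using made-up effect sizes and standard errors, not data from any real review) of fixed-effect inverse-variance pooling. Notice how the pooled confidence interval is reasonably tight when all five hypothetical studies are combined, and balloons when we keep only the one ‘good’ study.

```python
import math

# Hypothetical studies: (effect estimate, standard error). Purely illustrative.
all_studies = [(0.30, 0.10), (0.25, 0.15), (0.40, 0.20), (0.10, 0.25), (0.35, 0.30)]
good_studies = all_studies[:1]  # pretend only the first study is at low risk of bias

def pool(studies):
    """Fixed-effect inverse-variance pooling: weight each study by 1/SE^2."""
    weights = [1.0 / se ** 2 for _, se in studies]
    pooled = sum(w * est for (est, _), w in zip(studies, weights)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)

for label, studies in [("all studies", all_studies), ("good study only", good_studies)]:
    est, (lo, hi) = pool(studies)
    print(f"{label}: estimate {est:.2f}, 95% CI {lo:.2f} to {hi:.2f}")
```

The point of the sketch is only this: excluding studies buys us less bias at the price of a wider interval, and the review has to live somewhere on that curve.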
Where we have arrived is trying to balance the good against the perfect. We will undertake a search sufficient for our needs: Archimedes will want a couple of databases, well explored; your government health funder should expect a fair bit more effort than that. Mostly, we’re going to look for syntheses that give us the best answer we have in order to act; at times that may be ‘inaction’, in the sense of not moving to the new drug, wibble-setting or superScan. We do this in the full and clear knowledge that mostly we work on a tissue-thin membrane of evidence, tied together with gossamer threads of experience and washed constantly with a soup of hope.
- Archi