The aged among you may be aware that wing mirrors in other countries carried a warning: "Objects in the rear view mirror may appear closer than they are". Those of a certain persuasion may be unable to read that without singing along too. There's a physical and psychological truth to the statement which we should be wary of when looking at research, and which we often miss.
We've written here a lot about biases, and in retrospective studies these can be huge and unseen. The elephants we need to speak of are: the accuracy of discovery in records; the ones we missed because they were miscoded or unreadable; the variability of people; and the spark that started the fire. Plus the usual one – stats aren't helpful at making truth emerge from garbage.
The spark needs to be questioned – what is it that got the writers of this retrospective study to look at the issue? If it was after a (probably chance) run of Three Things, and those three are incorporated into the data and analysis, you'll have skewed your sample towards 'finding' something. How did you find the records you were looking for? Was the list generated by three of you scratching your heads for those kids? Recall biases may well highlight those with more dramatic outcomes (or parents). How did the looking back at cases get recorded… do the authors tell you about all the entries they couldn't read, or that didn't contain enough data? The things that are recorded may be quite variable too, and may not follow the same rules for reproducibility which are incredibly annoying-but-needed in clinical trial work.
It's really disappointing to keep remembering this. It's frustrating to think "what I'm reading might be untrue". It's irritating to wait for better research to occur. Evidence based medicine is about holding and explaining those spiky bits as well as trumpeting the next great step though.
Sorry, folks.
- Archi