Three Dangerous Words!

There is a common saying that three of the most dangerous words used in healthcare are ‘in my experience’, because relying on experience alone can cause some big problems. However, I think another three words being used a lot recently can be just as dangerous: ‘research has shown’!

Healthcare has moved away from individual, experience-based practice towards a more scientific, evidence-based practice, in an effort to stop clinicians’ bias and ignorance adversely affecting patients’ care and treatment. This includes physiotherapy, where historically clinical experience has often been highly respected and revered over and above science, research and evidence.

Clinical experience is no doubt useful for many things; in fact, for some things I think it is probably the most reliable thing we have (ref). However, clinical experience should never be fully trusted or solely relied upon. History is littered with examples of experts relying on their experience too much and being very, very wrong, at huge cost to others (ref).

Drunken Evidence

Evidence-based practice is slowly being adopted within physiotherapy, with more physios reading, engaging with, and participating in research, which is a great thing. But as much as I’m an advocate for research and evidence, it needs to be recognised that evidence-based practice has some pitfalls if it isn’t used and implemented carefully and sensibly.

Unfortunately, I see many physios not using research and evidence carefully or sensibly. In fact, I see some physios using research like a drunk uses a lamppost: more for support than illumination. That is, they only find research that supports what they already think and use it to justify what they already do.

Some therapists think they are evidence-based clinicians if they find a research paper demonstrating that a treatment they use ‘works’ (usually by the p-value being <0.05). But this is NOT how evidence-based practice works!
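To show just how little a single ‘significant’ result can mean, here is a minimal simulation sketch in Python (my own made-up example, not from any real trial). It imagines a study of a treatment that genuinely does nothing, but where twenty different outcomes get measured; the group size and outcome count are invented purely for illustration.

```python
# Hypothetical example: a treatment with NO real effect, tested on 20 outcomes.
# The numbers (group size, outcome count) are invented purely for illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_per_group, n_outcomes = 30, 20

p_values = []
for _ in range(n_outcomes):
    treatment = rng.normal(0, 1, n_per_group)  # same distribution in both groups,
    control = rng.normal(0, 1, n_per_group)    # so any 'difference' is just noise
    p_values.append(stats.ttest_ind(treatment, control).pvalue)

print(f"smallest p-value across {n_outcomes} outcomes: {min(p_values):.3f}")
print(f"outcomes with p < 0.05: {sum(p < 0.05 for p in p_values)}")
# With 20 independent comparisons, the chance of at least one false positive
# is roughly 1 - 0.95**20, i.e. about 64%.
```

Report only the outcome that ‘worked’ and you have a publishable, citable, and completely misleading p < 0.05.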

PubMed Googling

Evidence-based practice is the very difficult and complex skill of interpreting, synthesising, weighing, and implementing the evidence base in its entirety to support our methods, treatments and interventions. It is not reading one, two, or even a few papers that already support your own beliefs. If you’re basing your practice on one or two research papers, it’s usually because you have only read one or two research papers and are unaware of others that may challenge or question them.

Many healthcare professionals, and I will include myself here at times, cherry-pick the evidence base, using what they want and ignoring what they don’t want. Most use PubMed like they use Google: a quick keyword search, often not going past the first page of results, clicking on the first paper that catches their eye. This means they often find what they want to know rather than what they need to know. It also means they get very skewed views and beliefs about what they think works and what they think doesn’t.

Bad Science

One of the issues with the evidence base is that there are literally hundreds of papers published to support anything you want, especially in the field of physiotherapy, which is notorious for publishing bad science, with lots of low-quality, highly biased, poorly designed research trials (ref). For example, you can find papers showing how woolly pants cure low back pain, how ultrasound applied clockwise is more effective, and even how spinal manipulation reverses death. There is literally citable research out there to support whatever you want, or don’t want: K-tape helps, or it doesn’t; manual therapy helps, or it doesn’t; even exercise helps, or it doesn’t.

There is literally a shit tonne of shit research out there, and this quagmire of crap is being defecated daily, which means you have to wade waist-deep through it to find the good stuff. This is time-consuming, difficult, frustrating, hard work, and it means good-quality research is very hard to find and often goes unnoticed, whereas bad research is very easy to find and often gets promoted.

Broken Science

Many say that science and evidence-based practice are broken because of these issues, but that’s nonsense. Evidence-based practice isn’t broken; it’s just very, very, very hard to do well, and it’s often abused and misused by those who don’t understand it. Evidence is a tool, and like any tool, it’s only as good as the person using it.

Unfortunately, levels of scientific literacy and understanding are terrible within the general public and not that much better in healthcare, with most clinicians unable to distinguish good-quality, rigorous, ethical research from poor-quality, flawed, biased research.

It still amazes me how many healthcare professionals hold a Bachelor of Science yet couldn’t tell you the difference between specificity and sensitivity, reliability and validity, efficacy and effectiveness, or statistical significance and clinical meaningfulness. And don’t even get me started on how many clinicians do not understand the issues with p-values, the role of effect sizes, blinding, control groups, randomisation, power, publication bias, data mining, p-hacking, HARKing and the replication crisis.
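As a concrete (and entirely invented) illustration of that last distinction, here is another short Python sketch: give a hypothetical trial a big enough sample and a half-point shift on a 0–100 pain scale comes out ‘statistically significant’, even though the effect size is trivial and no patient would ever notice it. All the numbers are assumptions chosen purely for demonstration.

```python
# Made-up illustration: statistically significant is not the same as clinically meaningful.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 20_000  # hypothetical patients per group

# Pain scores on a 0-100 scale; the 'treatment' shifts the mean by only 0.5 points,
# far below any sensible threshold for a clinically important difference.
control = rng.normal(50, 10, n)
treatment = rng.normal(49.5, 10, n)

result = stats.ttest_ind(treatment, control)
pooled_sd = np.sqrt((control.var(ddof=1) + treatment.var(ddof=1)) / 2)
cohens_d = (control.mean() - treatment.mean()) / pooled_sd

print(f"p-value: {result.pvalue:.4g}")   # very likely < 0.05 at this sample size
print(f"Cohen's d: {cohens_d:.2f}")      # around 0.05, a trivial effect
```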

Some really useful resources for better understanding all the issues mentioned above can be found here and here. Also check out the Science Daily website and the Everything Hertz podcast, as I find these excellent resources for improving your understanding of research, statistics and the basic scientific method.

The Truth

A common misunderstanding of evidence-based practice is that it will give clear and definitive answers. Sometimes it can, but often it doesn’t. Research never really proves anything true; it mostly just tells us what’s more probable and less wrong!
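To put a rough number on ‘more probable and less wrong’, here is a back-of-the-envelope Bayes calculation in Python. The prior, the power and the false-positive rate are all assumptions I’ve picked purely for illustration, but the point stands: even a single well-run positive trial only moves a treatment from ‘probably doesn’t work’ to ‘more likely than not’, which is nowhere near certainty.

```python
# Back-of-the-envelope Bayes: how much should ONE positive trial shift our belief?
# All three inputs are assumptions for illustration only.
prior = 0.10   # assumed prior probability that the treatment genuinely works
power = 0.80   # assumed probability the trial detects the effect if it is real
alpha = 0.05   # probability of a false-positive result if the treatment does nothing

# Bayes' rule: P(treatment works | positive trial)
posterior = (prior * power) / (prior * power + (1 - prior) * alpha)

print(f"belief before the trial: {prior:.0%}")              # 10%
print(f"belief after one positive trial: {posterior:.0%}")  # ~64%, not 100%
```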

Many think research and evidence will give them quick, simple, easy answers to messy, complex and confusing questions about how they should assess, diagnose and manage people with pain. It won’t, not at all. If anything, reading research can make things even harder and more complicated, which is why many give up after trying for a bit.

Reading research doesn’t give clinicians answers; it gives them an appreciation of uncertainty and an ability to recognise what’s probably less wrong and what’s probably more right. But only if they’re able to think critically, have good levels of scientific literacy and, more importantly, are tolerant of complexity and uncertainty.

Many healthcare professionals lack tolerance of uncertainty because they don’t want to look ignorant or stupid to others. For example, no patient wants to hear or see a clinician dithering and dallying, stuttering and stammering, scratching their head wondering what to do next.

A lack of tolerance of uncertainty in healthcare is also due to societal pressures and the deeply rooted expectation that healthcare professionals should always know what to do when patients come to see them. It is still assumed by many that the clinician alone decides what to do and how to proceed, rather than this being a shared process between the patient and the clinician.

Due to these issues, many clinicians hide and mask their uncertainty by abusing and misusing research to give themselves an air of confidence and certainty. It also lets them avoid difficult and awkward discussions with patients about the uncertainty and complexity of treatments and outcomes.

Summary

As I said at the beginning, there is no doubt that clinical experience can be useful in some situations, but using it alone is fraught with problems. There is also no doubt, however, that relying on poorly conducted, highly biased, and methodologically flawed research has just as many problems.

Healthcare professionals need to be careful that phrases like “I know this works” or “This is what I have always done” are not mindlessly replaced with “the evidence says” or “research has shown”.

As always, thanks for reading.

Adam

6 COMMENTS

  1. Important topic in this world of InstaEducation. It seems contradictory that you speak of probabilities and quite appropriately the need for uncertainty, yet much of what you post clearly suggests there ARE right or wrong interventions, with only a single systematic review as “Truth”. Now perhaps my analysis of your point is not aligned with yours, but I guarantee you 95% (p<0.05) of your followers (n=54,530) just Googled “p hacking” using the private browser function?

  2. You, sir, are brilliant. Thank you for humbling me, …one more time. “I see physios using research much like a drunk uses a lampost, that is they use it more for support than illumination.” That was me! “Research never really proves anything true, right or correct. Research actually tells us what’s more probable, likely and less wrong, not what’s true!” → I am going to work really hard to become this me from now on. You have become my mentor, and I hope to meet you someday. Thank you.

  3. Ahh. You made my day again. Your language has the freshly clear crispness of a mountain stream.
