Welcome to the second installment of my blog-report from Infection Prevention 2015, focused on improving the systems around the delivery of safe healthcare, and infection prevention and control:
Part II: Improving the systems
The economics of HCAI is going to become increasingly important as the NHS – and healthcare systems worldwide – continue to “seek efficiency savings” (aka demand more for less). So the overview of HCAI economics from Dr Nick Graves (QUT, Australia) was timely. I find it remarkable that we are still so reliant on the 2000 Plowman report to gauge the cost of HCAI – surely there must be a more sophisticated approach? There is something rather uncomfortable about setting an ‘acceptable’ level of HCAI, or putting a £ value that we would be prepared to pay to save a life, but this is exactly what we have to do to manage the demands of scarcity. Dr Graves presented some useful worked examples to illustrate his point, around coated catheters, hip replacements, hand hygiene improvement, and MRSA screening. In most cases, there comes a point where a health benefit is too expensive to ‘purchase’, which is an uncomfortable but very real choice across all areas of healthcare (e.g. cancer drugs).
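The trade-off Dr Graves described can be made concrete with a toy cost-effectiveness calculation: an intervention is only worth ‘purchasing’ if its incremental cost per health gain falls below a willingness-to-pay threshold. A minimal sketch (all figures below are invented for illustration, not taken from the talk):

```python
def icer(extra_cost: float, extra_qalys: float) -> float:
    """Incremental cost-effectiveness ratio: extra cost per
    quality-adjusted life year (QALY) gained."""
    return extra_cost / extra_qalys

# Hypothetical example: coated catheters cost £50,000 more across a
# hospital and gain an estimated 2.5 QALYs by averting infections.
ratio = icer(extra_cost=50_000, extra_qalys=2.5)

# A commonly cited NHS-style willingness-to-pay threshold (£ per QALY).
WILLINGNESS_TO_PAY = 30_000

fund_it = ratio <= WILLINGNESS_TO_PAY
print(f"ICER = £{ratio:,.0f} per QALY; fund the intervention: {fund_it}")
```

The uncomfortable part is the threshold itself: moving it up or down is precisely the act of putting a £ value on health that the talk confronted.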
Dr Jenny Wilson (UWL) gave a helpful ‘meet the experts’ session on SSI. For somebody fairly new to SSI surveillance (like me!) it was invaluable. SSI causes considerable increased morbidity and mortality, and yet we capture only a fraction of SSIs in most existing surveillance approaches. Furthermore, it seems that the harder you look for SSIs, the more you identify (aka ‘seek and ye shall find’), especially when you add robust post-discharge surveillance. So, how can we improve? A recognition that surveillance definitions are not the same as clinical definitions will help, but we need to explore new ways of identifying SSIs. Finally, the ‘One Together’ initiative may help us to be a little more co-ordinated in our surveillance approaches.
Dr Nikki Maran (NHS Lothian) explored the importance of human factors in infection prevention and control. Healthcare is dangerous when you compare it with peer activities. Why is this? Because healthcare workers are lazy and don’t care about causing harm? Absolutely not. Humans have heroic capacities: compassion, empathy, care, knowledge, resilience… But healthcare workers tend to be more Clark Kent than Superman (although patients tend to expect the latter!). Mistakes happen (especially when staff are tired, stressed or, interestingly, insulted)! Also, are we setting achievable targets? Careful observation of an anaesthetist (like Dr Maran) at work identifies so many hand contacts and so little time! This calls for a modified procedure to make it achievable for human beings. So how can we improve? Knowledge and expertise don’t really affect the likelihood of error and, whilst rules can be helpful, we’re all like toddlers and like to push the boundaries. We need to develop a safety culture, exploring and harnessing human factors to improve our delivery of safe, effective healthcare.
Dr Jason Tetro (‘the Germ Guy’) began with the infectiousness of infection prevention enthusiasm (although, disappointingly, he gave no estimation of the basic reproduction number, R0)! Jason gave an overview of the cornerstones of effective science communication: education, enrichment, engagement, and entertainment. I was listening carefully because I have to give a similar talk at ECCMID next year. Not sure whether I will be able to skilfully weave in some local culture (in this case, regular Beatles references)! The talk got me thinking: should we consider paying professional marketeers to launch our IPC campaigns?
Finally, to round off the conference, we were privileged to host Mark Gallagher, a Formula 1 executive. Mark drew a series of thoughtful comparisons between F1 and healthcare. The priorities of F1 and healthcare are closely aligned: 1. Great performance; 2. Robust, reliable systems; 3. Safety. The history of safety in F1 is chequered (do you see what I did there…), to say the least. F1 used to kill at least 1 driver per year and if you drove for 5 seasons before 1960, you had a 25% chance of death (not including the countless fans and race marshals also killed). The safety culture in F1 was driven by public scrutiny from televised races in the 1990s. How similar, then, that our own safety culture is being driven by the scrutiny of public reporting. Mark detailed a watershed race weekend in 1994 when two drivers were killed (Roland Ratzenberger and Ayrton Senna – and I listed Ratzenberger first for a reason). In both cases, human factors contributed: Senna’s team made modifications to his steering without reference to the due process, and Ratzenberger ignored the protocol to return to the pits for a once-over of his car following an accident. F1 has learnt much from these tragedies; the question for us is: do we learn enough from our daily tragedies of transmission? Mark spent most of his time talking about high-performance teamwork. An F1 team has a huge range of skills, experience and pay grades in the same team, and the natural hierarchy arising from this situation can be damaging, with fear of recrimination (or actual recrimination) silencing junior team members. It’s the same situation in the operating theatre or resus room. Not only can a flag from a junior team member avert a disaster, it can also be an unexpected source of innovation (if we listen). Team leadership is crucial, and the ability to inspire their team is what sets a great driver apart from a good driver.
Michael Schumacher was the greatest in Mark’s view, and his mantra was: “Once something is a passion, the motivation is there”. Finally, Mark considered the similarity of pit stops to a clinical team in full flow – and I was struck by the similarity of Mark’s videos of pit stops with Dr Maran’s videos of an anaesthetic team at work! We watched a video of a 23s pit stop from the 1980s – slow, chaotic and dangerous. F1 has improved the process and safety around pit stops dramatically: times have plummeted from 23s to <2s (slightly skewed by no refuelling these days), with improved safety. The infamous pit stop fire engulfing Jos Verstappen is a case in point (again caused by human factors). Although dramatic, nobody was injured. Wouldn’t it be helpful if a patient was engulfed in harmless flames whenever they acquired a line infection? The team responsible would be much less likely to make the same mistake again! So, F1 systems are robust and configured to allow team members to make mistakes, with checklists and buddy systems in place. Do we have enough flex in our systems to allow for “to err is human”?