Highlights from APIC 2014

I couldn’t make it to APIC this year, but I have picked out a few highlights. More than 300 abstracts were presented, so I can only scratch the surface here, but the good news is that they’re all available in an AJIC supplement.

Multidrug-resistant Gram-negative rods

One of the oral presentations was on controlling CRE in Texas (Cifelli et al). The interventions comprised improvements in lab identification and patient electronic tagging, and front-line infection prevention and control practices (dedicated rooms, equipment and staff etc). It’s difficult to know which of these approaches (if any!) made the difference: we still don’t know what works to control CRE.

A group from Louisville explored transmission of CRE in an LTAC (Kelley et al). LTACs have previously been shown to be a hotbed for CRE transmission in some parts of the USA. They found that almost half of patients who acquired CRE were admitted to beds that had been previously occupied by a CRE patient, which brings a new meaning to ‘hotbed’! This links in with previous studies showing that admission to a room previously occupied by a patient with MDROs is a risk factor for acquisition. It also shows that CRE (K. pneumoniae at least) can survive for long enough on surfaces to bring indirect transmission via environmental contamination into play.

Definitions and terminology surrounding CRE, and MDR-GNR in general, are in a state of confusion, and both require urgent clarification. A survey of 79 hospitals by Jadin et al for their definitions of MDR-GNR yielded virtually 79 different definitions! This makes it challenging for facilities to communicate clearly about MDR-GNR, since what qualifies as MDR-GNR in one hospital may not make the cut in another. And this is not even accounting for variations in lab diagnostics!

A small prevalence survey of CRE carriage in Michigan by Berriel-Cass et al found that 2 (3.8%) of 53 patients were colonized. Neither colonized patient had a history of CRE, while one patient who did have a history of CRE screened negative! It’s difficult to know who is at high risk for CRE carriage, and even more difficult to know how long they will carry it for. However, we probably know enough to conclude that “once positive always positive” is a sensible (if somewhat conservative) approach.

The rest

A fascinating study from Arizona by Sifuentes et al evaluated a hygiene intervention in an LTCF. A number of bacteriophages were used as markers for pathogenic virus transmission and inoculated onto hands and surfaces. The viruses spread rapidly throughout the facility over a short time period (measured in hours), and a hygiene intervention significantly reduced the level of contamination of hands and surfaces. Most similar work has been performed in the acute setting, so some data from the non-acute setting are particularly welcome. This study illustrates the dynamic interplay between hand and surface contamination. In a way, hands are just another highly mobile fomite that is not disinfected frequently enough!

Jinadatha et al performed a very timely study exploring whether serial passage of bacteria with sub-lethal UV exposure prompts reduced susceptibility to UV. The study demonstrates that 25 serial exposures to UV did not affect bacterial UV susceptibility. However, the study did not explore whether other useful mutations may have occurred in the “survivors”; perhaps this is a job for whole genome sequencing in a follow-up study?

Faecal microbiota transplantation (FMT) is quickly becoming the standard of care for recurrent CDI. A study by Greig et al tells the story of implementing an FMT programme. The literature on FMT is impressive, but the ‘nuts and bolts’ of implementation are challenging. Where do you get the donor stool from? How do you screen the donors? Who performs the procedure? Who pays? Will it work here? These are just some of the questions that need to be negotiated to implement an FMT programme successfully. The message from this study: it’s worth it – 83% of patients with recurrent CDI had resolution within 30 days.

Finally, I remain rather skeptical that “CA-CDI” is really on the rise. I may have to revise my opinion based on this abstract by Rogers and Rosacker, showing that a community-based educational intervention reduced the rate of CA-CDI!

Which transmission route is most important for influenza?

“We are no more in the aerosol camp than the contact camp” conclude the authors. And this seems to be how it is in terms of influenza transmission routes – you’re either in one camp or the other. This 2010 PLoS Computational Biology paper is hardly hot off the press, but it is important and it does, to an extent, put the question of which camp you are in for influenza transmission to bed: you need to pitch your tent in different camps depending on the circumstances.

The paper describes a model to compare the various transmission routes for influenza, principally airborne, droplet and contact. The study evaluates four transmission routes: ‘respirable particles’ (<10 µm), ‘inspirable particles’ (>10 µm, <100 µm), ‘direct droplet spray’ (>100 µm) and ‘contact’. The model tests 10,000 scenarios, considering possible variation in virus properties, host susceptibility and environmental factors (such as the number of influenza shedders).

The key finding is that contact transmission had the highest average basic reproduction number (R0) (1.7), followed by droplet (0.27), respirable (0.05) and inspirable (0.006) particles (Figure). However, that is only part of the story. Of the 10,000 scenarios evaluated, contact alone was associated with high transmission in 3,069, all four routes in 342, and none in 4,765. In high host density settings, all routes were more frequently important. Conversely, when self-inoculation was more common (i.e. when simulated individuals touched their simulated noses, eyes and mouths more frequently), contact transmission was more important.

Figure: Basic reproduction number (R0) of four influenza transmission routes, ‘respirable particles’ (<10 µm), ‘inspirable particles’ (>10 µm, <100 µm), ‘direct droplet spray’ (>100 µm) and ‘contact’.

The findings are interesting and probably very important. It’s a shame they were not able to evaluate the relative importance of contact transmission involving contaminated surfaces compared with contact transmission that occurs independently of surfaces (this has been evaluated elsewhere). Also, I remain suspicious of modeling in general. If the simplifying assumptions are too simplistic (which is often the case), the model spits out garbage, which is worse than useless. Put another way, Bertha can produce anything if she’s given the right inputs! Plus, it’s difficult to know how applicable these findings are to other respiratory viruses.

Still, the paper does shed light on the relative importance of influenza transmission routes. Which is most important? Well, that depends on the context. If you’re in a small room, airborne and droplet transmission are key. If you’re admitted to a room following the discharge of a patient with influenza, then contact transmission is key. Hence, we need to be flexible when considering influenza transmission routes and ‘contextualize’ our interventions accordingly.

Citation: Spicknall IH, Koopman JS, Nicas M, Pujol JM, Li S, Eisenberg JN. Informing optimal environmental influenza interventions: how the host, agent, and environment alter dominant routes of transmission. PLoS Comput Biol 2010; 6: e1000969.

Image: Sanofi Pasteur.

MERS-CoV: a survival guide for you and your patients

This time last week, we were on red alert that a MERS-CoV pandemic was gathering pace. The news over the last few days has been more encouraging, with no new cases reported in Saudi Arabia, the epicenter of the MERS-CoV cases. However, there are still plenty of patients with MERS-CoV who need to be cared for around the world. Like SARS-CoV, but unlike the “usual suspects” that cause HCAI such as MRSA and C. difficile, MERS-CoV has the capacity to affect both healthcare workers and patients. Thus, I hope that this ‘survival guide’ will prove useful to those on the front line.

I presented a webinar on ‘MERS-CoV: coming to a hospital near you? Infection prevention and control challenges’. You can download the slides here. I came across a fantastic blog whilst preparing the webinar: ‘Virology Down Under’ by Dr Ian Mackay. I’ve used some of his excellent images in the slides, with his kind permission.

There are two transmission routes to consider for MERS-CoV: droplet / aerosol and contact / fomite. CDC recommends both airborne and contact isolation procedures to reflect these transmission routes. This involves placement of the patient in a negative pressure airborne infection isolation room, and the use of gloves, gowns, eye protection and N95 (FFP3) mask, and, of course, hand hygiene. PHE recommends a similar approach.

First and foremost, as a respiratory virus, inhalation of infected droplets is likely to be the most important transmission route. However, whilst MERS-CoV is not an ‘airborne’ virus (such as measles), patients can generate aerosols comprising small droplet nuclei that travel for long distances. A recent study of influenza suggests that the generation of aerosols is surprisingly common, to the extent that ‘living and breathing is an aerosol generating procedure’. Indeed, a recent study showed that a MERS-CoV aerosol diminished by only 7% over 10 minutes (compared with 95% for influenza). The use of an N95 (FFP3) mask will prevent direct inhalation of droplets / aerosols, and gloves, gowns and eye protection will prevent contact with mucous membranes and contamination of clothing or hands for subsequent nasal inoculation. But, if MERS-CoV aerosol is generated in the patient’s room (which seems likely), how long will it last, and will the next patient admitted to the room be at risk?

Let’s assume a patient sheds a MERS-CoV infectious aerosol of 6-log (1,000,000 virus particles). I’m not aware of any infectious dose data for MERS-CoV yet, but for SARS-CoV it can be as low as <20 plaque forming units. Given the decay rate of 7% over 10 minutes, infectious aerosol above the infectious dose could still be present a little under 26 hours after the discharge of the patient! Even if the virus was shed at a lower titre, infectious aerosol times would still be considerable (Table). Could this be a job for automated room disinfection systems, which address both surface and airborne contamination? One such system, hydrogen peroxide vapour, has recently been shown to inactivate the SARS-CoV surrogate, TGEV.

Shed titre    Time to reach 20 virus particles
1,000,000     26 hours
100,000       20 hours
10,000        15 hours
1,000         9 hours
100           4 hours

Table: Relationship between shed titre of MERS-CoV and time to reach 20 virus particles.
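The table can be sanity-checked with a quick back-of-the-envelope sketch, assuming simple exponential decay (the 7% loss per 10 minutes compounding) and taking 20 particles as the infectious dose, based on the SARS-CoV figure:

```python
import math

DECAY_PER_10_MIN = 0.07  # MERS-CoV aerosol loses ~7% of its titre every 10 minutes
INFECTIOUS_DOSE = 20     # assumed, based on the SARS-CoV figure of <20 pfu

def hours_to_reach_dose(shed_titre, dose=INFECTIOUS_DOSE):
    """Hours for an aerosol to decay from shed_titre down to the infectious dose."""
    ten_min_intervals = math.log(dose / shed_titre) / math.log(1 - DECAY_PER_10_MIN)
    return ten_min_intervals * 10 / 60

for titre in (10**6, 10**5, 10**4, 10**3, 10**2):
    print(f"{titre:>9,} -> {hours_to_reach_dose(titre):.1f} hours")
```

This lands within an hour or so of each value in the table (the 6-log titre comes out at roughly 25 hours, depending on how the 7% loss is compounded).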

Whilst respiratory viruses are not that great at surviving on surfaces compared with C. difficile spores and some vegetative bacteria, they can survive long enough to bring contact / fomite transmission into play. A number of reviews have concluded that contact / fomite transmission is an important route for influenza and other respiratory viruses such as rhinovirus. The SARS-CoV and surrogates exhibit unusual survival properties compared with other respiratory viruses, with survival times often measured in days, weeks or even months. Recent data suggests that MERS-CoV shares this property, surviving for >2 days when dried onto hard surfaces compared with only a few hours for influenza tested in the same study. Fortunately, these enveloped viruses are inactivated rapidly by usual hospital disinfectants, so I can understand the CDC’s recommendation for standard environmental disinfection.

So, how to protect yourself and your patients? The answer is simple for direct patient care: wear your gown, gloves, goggles and N95 (FFP3) mask, and wash your hands! But whilst it’s simple in theory, compliance with these measures in practice is not as good as you may expect. Prof Seto published a study in the Lancet in 2003 on compliance with personal protective equipment (PPE) and hand hygiene during the SARS epidemic. Remarkably, even when caring for patients known to be infected with SARS, <30% of healthcare workers self-reported that they wore a mask, gloves and gown and washed their hands as they should have done. And this was self-reported, so you’d expect there to be a bias towards compliance! Most importantly, none of the healthcare workers who complied with all four measures became infected.

Figure: Healthcare worker compliance with mask, glove, gown use, and hand washing during the SARS epidemic, stratified by those who became infected with SARS.

To summarise: how to protect you and your patients from MERS-CoV:

  • Place patient in negative pressure airborne infection isolation room, where available.
  • Wear the correct PPE when in the room (gloves, gown, N95 / FFP3 mask, goggles), and wash your hands.
  • Pay attention to the potential for contaminated surfaces and air, particularly following the discharge of the patient.
  • Oh, and if you go on holiday to Saudi Arabia, don’t kiss any camels!

Perspectives from ECCMID 2014: the box set

I’ve published a few ‘Perspectives from ECCMID’ on the blog over the last few days, so thought it would be useful to post a summary:

You may also be interested in some other updates from ECCMID elsewhere in the blogosphere:

Perspective from ECCMID Part IV: We need to stop polluting our planet with antibiotics

Professor Joakim Larsson gave a frankly chilling lecture on antibiotic pollution and its impact on the environmental resistome. Antibiotic resistance genes are fairly common in soil bacteria, and indeed pre-date the use of antibiotics. Furthermore, the reservoir of resistance genes in soil bacteria seems to have increased since the 1940s, when we began using antibiotics. This creates a huge reservoir of resistance determinants to the tune of some 10^30 bacteria, an unimaginably massive number that we can only begin to understand through analogy. Fortunately, there is not a free flow of antibiotic resistance genes from environmental to hospital bacteria. However, where there’s close contact and selective pressure, transfer of resistance genes from environmental bacteria to hospital pathogens does occur. Prof Larsson introduced the idea of the ‘minimal selective concentration (MSC)’: the lowest antibiotic concentration at which resistant bacteria out-compete susceptible ones, reflecting the cost-benefit equation for bacteria carrying antibiotic resistance genes.
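The MSC idea can be illustrated with a toy model (all parameters below are invented for illustration, not measured values): the resistant strain pays a small growth cost, but the antibiotic suppresses the susceptible strain far more strongly, so above some low concentration (well below the MIC) resistance becomes a net win.

```python
# Toy illustration of the minimal selective concentration (MSC).
# All parameters are invented for illustration, not measured values.
SUSCEPTIBLE_GROWTH = 1.00  # relative growth rate with no antibiotic present
RESISTANCE_COST = 0.05     # resistant strain grows 5% slower in the absence of drug
KILL_SUSCEPTIBLE = 0.50    # growth lost per unit of antibiotic (susceptible strain)
KILL_RESISTANT = 0.02      # growth lost per unit of antibiotic (resistant strain)

def net_growth(conc, resistant):
    """Net growth rate of a strain at a given antibiotic concentration."""
    if resistant:
        return SUSCEPTIBLE_GROWTH - RESISTANCE_COST - KILL_RESISTANT * conc
    return SUSCEPTIBLE_GROWTH - KILL_SUSCEPTIBLE * conc

def msc():
    """Concentration where the two growth rates cross: above it, resistance wins."""
    return RESISTANCE_COST / (KILL_SUSCEPTIBLE - KILL_RESISTANT)

print(f"MSC = {msc():.3f} antibiotic units")
```

The point of the sketch is simply that selection for resistance can begin at concentrations far below those that stop the susceptible strain growing altogether, which is why sub-therapeutic antibiotic pollution matters.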

This problem is driven by the appropriate and inappropriate use of antibiotics in human medicine, agriculture and aquaculture. Indeed, we all know about the high rates of NDM-1 in the New Delhi water supply; the modern day John Snow’s water pump handle (although the solution is not as obvious)? Another important driver is antibiotic contaminated effluent from pharmaceutical factories producing antibiotics. A large amount of pharmaceutical production of antibiotics occurs in emerging markets, such as India. Many countries have tight regulations on what factories are allowed to release into the surrounding environment, but in some countries discharges are largely unregulated. One plant in India released a phenomenal amount of one particular antibiotic, ciprofloxacin, in waste water: 44 kg per day. To put this in context, that’s almost 5x the amount of the same antibiotic consumed by the whole of Sweden per day, and the concentration of the antibiotic in the waste water was higher than therapeutic levels of the drug in humans! Unsurprisingly, this provides a strong selective pressure for the development of antibiotic resistance in the bacteria surrounding the factories. There are special treatments available to reduce or eliminate antibiotic contamination of factory waste (e.g. active carbon filtration or ozone treatment), but incentives are required to ensure that these technologies are implemented in the resource-limited settings where the factories are based.

Prof Larsson is planning some research to help understand the relationship between environmental bacteria and hospital pathogens, for example through his ‘NoCURE’ (Novel Carbapenemases – UnRaveling the Environmental reservoir) project, and the BacMet database for registering biocide and metal resistance genes, both of which are worth checking out. As we approach the end of the antibiotic era, the last thing we need to be doing is polluting our planet with antibiotics, which provides a selective pressure for the development of resistant bacteria, some of which will find their way into hospitals sooner or later.

You can view some other ‘Perspectives from ECCMID’ here.

Image: ‘Effluent tank’ by Bob Shand.

Perspective from ECCMID Part III: CDI synthetic “repoopulation” (bacteriotherapy) closer than you think & “CA-CDI” still pie in the sky

Bacteriotherapy for CDI is closer than you think

As our understanding of the importance of a happy, healthy microbiota develops, it seems increasingly clear to me that bacteriotherapy (administration of a controlled multi-species dose of bacteria) is a real prospect for the treatment of CDI (and most likely other conditions). This is illustrated by the dramatic effectiveness of faecal microbiota transplantation (FMT) for recurrent CDI. FMT is pretty crude, in every sense; synthetic FMT would be safer and more palatable. But I hadn’t realized how far the research towards available bacteriotherapy for CDI had advanced. Dr Trevor Lawley gave an expert overview of his research programme, which is pointed in this direction.

Dr Lawley began by describing the human microbiota as a fingerprint: it’s consistent and unique. The microbiota is highly organized, to reflect its function, resulting in microenvironments. Antibiotics are like an atomic bomb, resulting in huge perturbation of the gut microbiota. The idea of bacteriotherapy to redress the balance is not new: pioneers of bacteriotherapy (aka “repoopulation”) for CDI date back to at least 1989.

So, which bacteria get the nod to be included in the synthetic mix? It’s not an easy question, since examining the massively populous human microbiota is a daunting prospect and requires the application of novel tools (see Fig 1 of this excellent open-access review for a useful summary of the methods to examine the human microbiota and microbiome). Human trials and mouse models indicate that single-species therapy and probiotics are equivocal at best. These are blunt weapons to complement the nuclear fallout of the antibiotic A-bombs! Dr Lawley’s research has found an irreducible minimum of 6 species that are necessary for effective bacteriotherapy (in mice at least). Now all that is required is to find a common growth medium…oh, and do some human trials!

Another speaker, Dr Cornely, mentioned another approach to preventing CDI: the prophylactic administration of metronidazole. If you’ve read my Perspective from ECCMID on Selective Decontamination, you can probably guess which approach I’d choose.

“CA-CDI” still pie in the sky

A number of speakers contributed to the debate on whether “community-acquired” CDI is on the rise. Dr Scott Weese outlined the potential for foodborne risk of CDI, beginning with a ‘disclosure’ that we can all relate to: “I like to eat but I don’t like foodborne illness”! C. difficile is present in food animals (especially young ones) and strains are shared with humans. Rates of carriage are low, but Dr Weese made a good point on cumulative exposure: if 2% of burgers are C. difficile contaminated, I eat C. difficile on my 98th burger (not exactly, but you get the point). Plus, C. difficile spores can survive usual cooking times (which is not so relevant for me: I like my burger meat rare)! The carriage of C. difficile in animals, combined with the high carriage of C. difficile in small human animals, means that exposure to C. difficile is probably a daily event. But is this a risk? For a healthy 25 year old in the community, probably no. For a haematology inpatient, probably yes.
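Dr Weese’s cumulative exposure point can be made precise in a couple of lines, taking the 2% contamination figure at face value and assuming each burger is an independent draw. Strictly, the expected wait for the first contaminated burger is 1/0.02 = 50 burgers, but the cumulative chance of at least one exposure climbs steadily long before then:

```python
P_CONTAMINATED = 0.02  # assumed: 2% of burgers carry C. difficile

def prob_at_least_one_exposure(n_burgers, p=P_CONTAMINATED):
    """Chance of eating >=1 contaminated burger among n, assuming independence."""
    return 1 - (1 - p) ** n_burgers

# Expected wait for the first contaminated burger is 1/p = 50 burgers.
for n in (10, 50, 98, 200):
    print(f"after {n:>3} burgers: {prob_at_least_one_exposure(n):.0%} chance of exposure")
```

By the 98th burger the cumulative chance of exposure is well over 80%, so the spirit of the quip holds even if the arithmetic is loose.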

Dr Marjolein Hensgens considered whether CDI is still primarily nosocomial. The distinction between community and hospital onset is easy, but community vs. hospital acquisition is much more challenging, and epidemiological distinctions are approximate at best. For example, in the UK, a “Trust-apportioned” (= hospital acquired) case requires a specimen from an inpatient who has been in the same hospital for at least 4 days. Any readmission (even if they were in the hospital the previous week) is considered “non Trust-apportioned”, but it’s important to remember that this is not the same as “community-acquired”. The fact that Trust-apportioned and non Trust-apportioned cases have tracked each other so closely during the UK reductions suggests that almost all cases were healthcare-associated (Figure 1).

Figure 1: the number of CDI cases reported to Public Health England, defined as “Trust-apportioned” or “non Trust-apportioned”, from 2007 onwards.

An important US study suggested a stepwise increase in CA-CDI. However, this apparent increase could be explained by a number of other factors. Firstly, a high proportion of patients with apparent CA-CDI have actually had healthcare exposure of some kind if you look hard enough (82% in this study). So this upward trend in “CA-CDI” could very well be HA-CDI with unrecognized healthcare exposures. Secondly, it is difficult to know whether there have been any changes in the number of diarrhoeal stools tested in the community. Infectious diarrhoea has always been common in the community, but is rarely tested for CDI. Thirdly, comparing the epidemiology of patients who develop CDI in the community with those who develop CDI in hospitals could result in a misleading picture; a more appropriate comparator would be patients who have non-CDI diarrhoea in the community. Finally, does WGS prove that hospital acquisition of CDI is now rare? No, it only proves that transmission from known symptomatic CDI cases is less frequent than you may expect. There are many other sources for hospital acquisition of CDI, not least asymptomatic carriers. We’re surrounded by C. difficile, so of course a degree of CA-CDI occurs. But is it increasing? I still think no – or at least, not rapidly due to a phase-shift in epidemiology (like the one we saw with the emergence of CA-MRSA in the late 1990s).

You can view some other ‘Perspectives from ECCMID’ here.

Image: C. difficile‘ by AJ Cann.

Perspective from ECCMID 2014 Part II: What to do about MDR-GNR?

I was hoping that the ECCMID 2014 session on ‘Outbreaks of MDR Gram-negative bacteria: what works and what does not work?’ would bring some answers from large, controlled studies to improve the evidence base for MDR-GNR control. I’m sorry to report that most of what was presented only served to highlight the limitations of the evidence base! There’s a bit of a Catch 22 here: in most settings, the problem lies in outbreaks, but the answers lie in large, adequately controlled cluster randomized studies in endemic settings.

  • Dr Weterings from NL provided a rather bleak start to the session, reporting an outbreak of carbapenem-resistant K. pneumoniae in a hospital and nursing home. Environmental cultures regularly grew the outbreak strain (including from a shared glucose meter), and the control measures that were effective in the hospital were more challenging to implement in the nursing home.
  • Dr Gonzalez-Galan found a bundle of interventions dramatically effective to reduce the rate of endemic MDR A. baumannii. The bundle comprised surveillance, hand hygiene audit, and a checklist for environmental cleaning and contact precautions compliance. But which element of the bundle worked, and were any elements redundant?
  • Dr Cohen reported an MDR A. baumannii outbreak in Israel affecting 70% of ventilated patients at its peak, which forced colistin as the empiric VAP therapy. Proper disinfection of the ventilators brought the problem under control. Similarly, an endoscopy-associated ESBL K. pneumoniae outbreak in Norway (reminiscent of the NDM outbreak in Chicago) was controlled by implementing proper endoscope disinfection.
  • Probably the most useful presentation of the session was from Dr Cataldo, presenting a systematic review of interventions for MDR-GNR. Most of the included studies (78% of 86) were in outbreak settings, and plagued by low quality. Nonetheless, bundles were twice as effective as single interventions (45% vs. 28%). The review struggled to determine convincingly which element of the bundles was most effective, but hand hygiene, contact precautions and education came through as the pillars of effective bundles.
  • Dr Dettenkofer showed that an educational intervention improved compliance with standard precautions (especially hand hygiene and, to a lesser extent, the inappropriate use of examination gloves for some procedures). However, ‘standard precautions’ are far from standard, and it seems that you need to go further than standard precautions to control MDR-GNR.
  • Dr Hussein showed that standing over healthcare workers and telling them to wash their hands improved compliance (unsurprisingly!). I venture that hospitals would only take this measure in extreme circumstances, although hand hygiene “enforcers” are not without precedent.
  • Dr Perencevich reported that the Hawthorne effect tends to strike after 15 minutes of observation, so hand hygiene observations should be kept short and sweet. (Incidentally, hand hygiene compliance was higher among doctors than nurses in this study; I think it’s the first time I’ve ever seen it this way around!)
  • Dr Hansen presented data from the PROHIBIT collaborative, who found that alcohol based hand rub usage tracks the prevalence of antimicrobial resistance across Europe. However, the rate of red and yellow cards in the Euro 2008 football championships also correlates with antimicrobial resistance rates across Europe, and national consumption of chocolate correlates with the national rate of Nobel laureates: correlation doesn’t necessarily mean causation!
  • Finally, Dr Langelar reported that Dutch national healthcare inspectorate visits were effective in raising standards. But was this papering over the cracks or effecting culture change?
  • I am sure there were lots of good posters on this topic too, but I didn’t get very far with those. Perhaps somebody else did and would like to provide some additional information?

Dr Evelina Tacconelli gave a thoughtful talk comparing the various international guidelines for MDR-GNR, reflecting on the recently published ESCMID version. The subject is broad, specifically in terms of which MDR-GNR, and in which setting: guidelines for CRE in a general hospital population would look quite different to guidelines for CRAB in the ICU. Dr Tacconelli focused on the areas of controversy: isolation for ESBL carriers, how to prioritise limited side rooms (see the useful ‘Lewisham’ isolation prioritization tool in Appendix 6 of these Irish guidelines), selective digestive decontamination, and the need for bundles. Finally, Dr Tacconelli referenced a neat model of the effectiveness of various infection control interventions for controlling the spread of CRKP. This is a clever study, and probably useful, but much like Bertha (showing my age), incorrect inputs result in meaningless (or worse, misleading) outputs.

Dr Anna-Pelagia Magiorakos discussed some of the challenges of implementing guidelines, reminiscent of Dr Evonne Curran’s recent talk on a similar subject. One important point is to have some guidelines to implement! Countries lacking guidelines for the control of MDR-GNR tend to have higher rates (ECDC and PROHIBIT data). Once you have some guidelines, barriers to implementation need to be overcome: time, culture, resources, lack of understanding or belief that they will work, competence, habit, routines and “ivory tower” guidelines written by those detached from the coal-face, to name but a few!

So are we any closer to knowing what works to control MDR-GNR following ECCMID 2014? Bundles are more effective than single interventions, but we still don’t know which elements of the bundle are most important, and this will vary by pathogen and setting. We need more studies like the commendable but complex MOSAR Lancet ID study.

You can view some other ‘Perspectives from ECCMID’ here.

Image credit: Iqbal Osman.

Perspective from ECCMID 2014 Part I: a voice against ‘selective’ digestive decontamination (SDD)

I enjoyed this year’s ECCMID in Barcelona very much, and came away feeling scientifically, culturally and culinarily enriched! Many thanks to the organizers for such a broad and interesting programme. One of the most interesting sessions was the very final session, on controlling MDROs in the ICU. The session boiled down to the pros and cons of three approaches to decolonization: selective digestive decontamination (SDD), mupirocin for MRSA nasal decolonization, and chlorhexidine gluconate (CHG) bathing. The faculty of Dr Brun-Buisson, Dr Harbarth, Dr Bonten and Dr Huang made it an engaging session.

Selective digestive decontamination (SDD)

The problem is antibiotic resistant bacteria, particularly in the ICU. Is the solution really indiscriminate use of antibiotics to temporarily suppress the load of antibiotic resistant bacteria in the gut? It doesn’t make a lot of sense to me either.

Selective oral decontamination (SOD) or selective digestive decontamination (SDD) is not a new concept, and has been around for some 45 years. Only recently have impressive studies emerged demonstrating that SDD and, to a lesser extent, SOD suppress the load of antibiotic resistant bacteria in the gut, reduce mortality and reduce transmission (de Jonge, de Smet and Daneman). But it’s not without collateral:

  • The use of antibiotics leads to antibiotic resistance, sooner or later. A number of studies suggest that SDD is not associated with an increased overall prevalence of resistant bacteria (not least the impressive Daneman review). Indeed, one study showed that rates of resistance actually decreased on units using SDD. However, these studies conflate the potential for reducing transmission (and hence reducing unit-level prevalence) with the risk of selecting resistant sub-populations, which both seem likely. One particular concern is the emerging data that SDD drives colistin resistance. Are we playing with fire by overusing our drug of last resort? Furthermore, in a recent study taking a microbiome-wide approach, the abundance of key antibiotic resistance genes doubled on units using SDD.
  • Speaking of the microbiome, another speaker described antibiotics as a ‘microbiome-busting atomic bomb’, so perhaps we should rename SDD ‘scorched earth decontamination’ (SED)! The importance of a happy, healthy microbiota is beginning to dawn on us. We need to make friends with our microbiome, not obliterate it with unindicated antibiotics.
  • The impressive studies showing the value of SDD have been performed in the Netherlands, which has a low rate of antibiotic resistance. Will SDD be as effective elsewhere, where the background rates of antibiotic resistance are higher? Indeed, the Daneman study showed a notable (although not statistically significant) increase in the prevalence of MRSA on units using SDD. Is this a case of ‘squeezing the MDRO balloon’?
  • SDD temporarily suppresses gut colonization with multidrug-resistant Gram-negative rods but rarely decolonizes permanently. From an infection control viewpoint, it should not change the ‘once positive, always positive’ status quo for resistant Enterobacteriaceae.

I appreciate that I’m presenting a polarized and rather one-sided case against SDD here. But for me, whether SDD works is the wrong question: is it the right thing to do? If (perish the thought) I’m a patient in the ICU, then SDD works for me. However, if I’m working on an ICU in 2024 wondering what to do with a pan-drug resistant Gram-negative bacterium, SDD (in 2014) doesn’t work for me.

Mupirocin

Dr Huang presented the key findings from her impressive study of universal CHG combined with MUP. Mupirocin resistance in S. aureus can be low-level (mutational) or high-level (acquisition of mupirocin resistance genes). The use of MUP has been associated with the development of both high- and low-level resistance. Indeed, several updates from ECCMID show this. For example, Dr Sarah Deeny’s poster showed that low-level resistance appeared to develop during hospitalization. Plus, a study from our group showed that detection of phenotypic high- or low-level mupirocin resistance only represents three quarters of the picture, since carriage of mupirocin resistance determinants remains “silent” 25% of the time.

So, the key question hanging over Dr Huang’s study is the value of universal MUP over and above universal use of CHG. Dr Huang presented an excellent analysis table on this point, which I’ve reproduced below:

Table: Weighing the pros and cons of universal mupirocin use (reproduced with permission from Dr Huang).

One of Dr Huang’s most powerful arguments was that the burden of mupirocin use is in decolonizing a large number of patients prior to elective surgery. Ergo, if you’re going to save MUP, then save it for the highest risk patients (e.g. ICU patients). However, the counter here is that local use of MUP is likely to drive local MUP resistance on the ICU. So, I still feel that we should not recommend the universal use of MUP.

Chlorhexidine

Dr Bonden, Dr Huang and Dr Harbarth reviewed the impressive studies showing that CHG bathing provides strong protective effects against a range of MDROs (for example the Climo, Milstone and Vernon studies). These studies are not without their critics – some say that the effect on reducing relatively benign coagulase-negative staphylococcal BSI amplifies the overall effect. However, both the data and the rationale are strong: if you reduce the amount of MDRO on the patient’s skin (‘source control’), you reduce the chances of endogenous infection, and of transmission to others. Unlike antibiotics, CHG is a biocide with a less specific molecular target, which makes resistance more challenging from a bacterial viewpoint. However, reduced susceptibility to chlorhexidine must be monitored carefully. A number of studies have hinted that reduced susceptibility to chlorhexidine may be an emerging problem (for example the Batra, Otter and Lee studies). But increases in bacterial MICs (for Gram-positive bacteria at least) appear to be a long way below the applied concentration. It’s worth noting, though, that the measured CHG skin concentration in one study (15-312 mg/L before the daily bath and 78-1250 mg/L after it) was much lower than the applied CHG concentration (10,000 mg/L). This is around the CHG MIC for some Gram-negatives, and potentially brings the subtly reduced susceptibility to CHG reported in MRSA into play. On balance though, the rationale and data on reduced susceptibility are cautionary but not enough to recommend against universal use in the ICU given the clinical upside.
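To put those concentrations in context, here is a minimal back-of-envelope sketch of the arithmetic. The skin concentrations and the applied concentration are taken from the study quoted above; the MIC values are illustrative assumptions chosen only to show the comparison, not figures from the studies cited.

```python
# Back-of-envelope comparison of chlorhexidine (CHG) concentrations.
# All values in mg/L. The MICs below are ILLUSTRATIVE ASSUMPTIONS for the
# sake of the arithmetic, not values taken from the studies discussed.

applied = 10_000                 # CHG concentration applied during bathing
skin_before = (15, 312)          # measured skin levels before the daily bath
skin_after = (78, 1250)          # measured skin levels after the daily bath

# The skin retains only a small fraction of the applied concentration:
dilution_worst = applied / min(skin_before)
print(f"Skin may retain as little as 1/{dilution_worst:.0f} of the applied concentration")

illustrative_mics = {
    "susceptible S. aureus": 2,
    "MRSA with reduced CHG susceptibility": 8,
    "some Gram-negative rods": 32,
}

for organism, mic in illustrative_mics.items():
    worst = min(skin_before)     # lowest measured skin concentration
    status = "above" if worst > mic else "at or below"
    print(f"{organism}: lowest skin level ({worst} mg/L) is {status} the MIC of {mic} mg/L")
```

The point of the sketch is simply that the safety margin is judged against what is actually on the skin, not against the 10,000 mg/L in the bottle; at the lowest measured skin levels, organisms with even modestly elevated MICs start to come into play.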

Conclusion

What should be the standard of care for ICU patients? My current view is: universal CHG, targeted mupirocin for MRSA decolonization and absolutely no SDD!

Headlines from ECCMID

I’ll be posting some blogs on some of these topics over the coming days. You can view some other ‘Perspectives from ECCMID’ here.

  • We are still no closer to figuring out what works to control multidrug resistant Gram-negative rods (including CRE).
  • CDI does not seem to be emerging as a community pathogen, despite apparent increases in community-associated cases.
  • Bacteriotherapy in the form of synthetic faecal microbiota transplantation (FMT, aka ‘transpoosion’) is getting close.
  • We need to stop polluting our planet by pumping antibiotics into our environment.
  • As one tweeter (@marina_manrique) put it, whole genome sequencing (WGS) has become a bit like the One Ring from Tolkien’s Lord of the Rings: ‘one ring to rule them all, one ring to bind them, one ring to bring them all, and in the outbreak find them (out – the other methods that is)’.

Picture credit: ‘Antibiotics’.

What does lab diagnosis of MDR-GNR have to do with SURFing?

I met the Service Users Research Forum (SURF) yesterday, and they asked me to give a presentation on the emergence and detection of multidrug-resistant Gram-negative bacteria (you can download my slides here). I found these slides by Dr Katie Hopkins (PHE) useful in preparing mine. It was my first interaction with a patient-led research group and I enjoyed the meeting very much. I found the SURF members and their academic support team from the University of West London to be engaged, engaging, knowledgeable and thirsty for knowledge. Their questions were insightful and their suggestions were thought-provoking. Informal discussions on a current research proposal (for enhanced surveillance of carbapenem-resistant Gram-negatives) gave me some useful ideas; researchers can easily lose sight of the patient perspective. I can see why funders such as NIHR now insist on seeing patient involvement in the development of research proposals and I am sure I will be SURFing again in the near future!

I put together the flow chart below to try and summarise the diagnostic approach to the lab detection of MDR-GNR. I would appreciate any thoughts you have on this flow chart…

[Flow chart: the diagnostic approach to lab detection of MDR-GNR]

Chronic wound? No problem – a splash of oxygen peroxide should do the trick


I was involved in a recently published RCT of a novel wound care system (‘BioxyQuell’), which trialled the application of an aqueous oxygen peroxide (AOP, aka aqueous ozone) lavage on venous leg ulcers. The study isn’t perfect, but the results are encouraging and should serve as a basis for further evaluations.

The RCT was performed in the community setting. Sixty-one patients with chronic venous leg ulcers were enrolled, and randomised to either six AOP treatments or sham placebo treatments with sterile water over two weeks. The AOP and placebo arms were both treated with standard-of-care compression bandaging between treatments and for the duration of the study. Patients who completed the initial 8-week RCT were enrolled into a follow-up study evaluating wound healing at 12 weeks, and at 6 and 12 months. The key results are:

  • No significant difference in wound healing at 8 weeks (the primary outcome). But, something surprising happened during the follow-up study: every single one of the AOP treated patients had healed by 12 months vs. 50% in the placebo arm (Figure 1). Could it really be that a 2 week AOP treatment period has such a remarkable impact on long-term wound healing? It’s just about plausible if there was some pretty fundamental change to the wound bed. But I’d need to see some more data to be convinced!
  • Almost 50% of patients in the placebo arm were healed by 8 weeks. The inclusion criteria required a chronic leg ulcer more than 3 months old, so why did 50% of placebo patients heal? I suspect the answer lies in good compression bandaging, which was mandated during the trial. There’s a temptation to apply the bandages too loosely to be effective in order to improve patient comfort.
  • AOP patients were significantly less likely to be withdrawn from the study due to wound infection during the first 4 weeks of the RCT (Figure 2). This stark difference makes sense: AOP is a biocide with the capacity to dismantle biofilms, which are involved in wound infection.
  • Pain and bacterial contamination were significantly lower in the AOP arm during the treatment phase. Wound size was also significantly reduced in the AOP arm when accounting for patients with deteriorating infected wounds.

Figure 1: Wound healing in the AOP and placebo arm during the RCT (up to 8 weeks) and follow-up study (up to 12 months).


Figure 2: Patient withdrawals due to wound infection during the first 8 weeks of the RCT.


As ever, the study raises as many if not more questions than it answers:

  • What is the optimal treatment regimen for AOP? Is two weeks enough? Is three treatments per week about right (and feasible)? Put another way, how would this study have looked if the AOP treatment was performed for 4 weeks instead of 2? The fact that improvements in wound size reduction, pain, bacterial contamination and withdrawal due to wound infection trailed off after the 2 week treatment period indicates that a longer treatment period should be explored.
  • What is the best application of AOP in healthcare? Acute care? Primary care? Trauma? Diabetic foot?
  • Is ‘wound healing’ the right outcome measure? Perhaps prevention of withdrawals due to wound infection is the most important finding of this study. And don’t underestimate the value of the reduced pain in the AOP arm – the ulcers are very painful and make sufferers pretty miserable!

This was a fair-sized randomised, sham placebo-controlled, double-blind study performed with tight inclusion and exclusion criteria in the community setting. This sort of RCT is surprisingly rare in the wound care literature. However, it was terminated early (due to the lack of a significant difference in wound healing and the fact that it took years to recruit a decent number of patients) and the primary outcome (wound healing at 8 weeks) was non-significant. Nonetheless, wound size, pain and bacterial contamination were reduced by AOP treatment, the long-term wound healing results are tantalizing, and the reduced withdrawals due to wound infection are tangible.

Article citation: O’Halloran PD, Winter PK, Otter JA, Adams NM, Chewins J. Aqueous oxygen peroxide treatment of VLUs in a primary care-based randomised, double-blind, placebo-controlled trial. J Wound Care 2014;23:176-90.

Post script: in memoriam

The statistician who formulated and executed the statistical analysis for this study, Carla White, sadly died (too young) during the latter stages of getting this article published. I am sorry not to have the opportunity to work with her again.