38th International Symposium on Intensive Care and Emergency Medicine

Brussels, Belgium. 20-23 March 2018

    P001 Reduced cellular respiration and ATP production in an in vitro model of sepsis

    V Herwanto1, Y Wang1, M Shojaei1, B Tang2, AS McLean2

    1University of Sydney, Westmead, Australia; 2Nepean Hospital University of Sydney, Nepean Clinical School, Kingswood, Australia

    Introduction: Leukocyte dysfunction may play a role in sepsis pathogenesis. Established evidence shows that leukocyte dysfunction leads to a reduced immune response and, consequently, increased sepsis-related mortality. Impaired metabolism has recently been proposed as one possible mechanism underpinning leukocyte dysfunction in sepsis. In this study, we investigated global changes in leukocyte metabolism in sepsis using an established in vitro model of lipopolysaccharide (LPS) stimulation.

    Methods: Peripheral blood mononuclear cells (PBMC) were isolated from healthy volunteers (n=4) and incubated with 62.5 ng/mL LPS. Mitochondrial respiration was measured using an Agilent Seahorse XF Analyzer (Cell Mito Stress Test Kit). Total cellular oxidative stress was measured using the DCFDA Cellular Reactive Oxygen Species (ROS) Detection Assay Kit (Abcam), and mitochondrial superoxide using MitoSOX™ (Life Technologies). Apoptosis was measured with the Annexin V-FITC Apoptosis Detection Kit (Abcam). Oxidative stress and apoptosis were evaluated on a BD FACSCanto flow cytometer, and flow cytometry data were analyzed using FlowJo Software V10.

    Results: LPS stimulation of PBMC from healthy volunteers showed a trend of decrease in both oxidative phosphorylation and cellular respiration (Fig. 1). This decrease in cellular metabolism was accompanied by a trend towards an increase in cell death in the stimulated leukocytes (Fig. 2). The increase in cell death was associated with an increase in oxidative stress (total and mitochondria) (Fig. 2), suggesting that the adverse effect of LPS on cellular metabolism may be mediated by an imbalance in redox potential.

    Conclusions: The LPS stimulation model could provide a useful approach to study the effect of sepsis on leukocyte metabolism. Further study is required to better understand the mechanism of reduced leukocyte metabolism, including the possible role of oxidative stress in reducing cellular respiration and causing leukocyte cell death.

    Fig. 1 (abstract P001).

    Cellular metabolism as measured in oxygen consumption rate (OCR) (n = 4). Basal denotes energetic demand of the cell under baseline condition; spare respiratory capacity denotes the capability of the cell to respond to energetic demand; proton leak denotes remaining basal respiration not coupled to ATP production, can be a sign of mitochondrial damage; ATP production shows ATP produced by the mitochondria to meet the energetic need of the cell.

    Fig. 2 (abstract P001).

    Number of apoptotic cells, total cellular ROS, and mitochondrial superoxide as measured by Annexin V, DCFDA, and MitoSOX (n = 4).

    P002 Noninvasive technique using fluorescence imaging to assess vascular permeability in sepsis

    T Shimazui1, T Nakada1, L Fujimura2, A Sakamoto2, M Hatano2, S Oda1

    1Chiba University Hospital, Chiba, Japan; 2Chiba University, Chiba, Japan

    Introduction: Conventional assays to quantify vascular permeability in animal studies require sacrificing the animals; this is a barrier to evaluating temporal changes or responses to therapeutic approaches in a single individual. In vivo fluorescence imaging can potentially quantify vascular permeability without sacrificing animals. However, the use of this noninvasive approach to assess vascular permeability in remote organ injury caused by systemic inflammatory disease, such as sepsis, has not been reported.

    Methods: A cecal ligation and puncture (CLP)-induced septic mouse model was compared to sham and hydrocortisone-pretreated (CLP + HC) mouse models. The lung was taken as the injured remote organ and the footpad as a noninvasive observation site. A mixture of Evans blue (EB) and the fluorescent dye Genhance 750 was injected into the mice, and EB extraction in the harvested lung was assessed as a conventional indicator of vascular permeability. Fluorescent intensities in the harvested lung and footpad were assessed, and their correlation was analyzed to investigate this novel, noninvasive approach to estimating lung vascular permeability.

    Results: EB extraction in the harvested lung in the CLP group was significantly higher than in the other groups (CLP vs. sham, P=0.0012; CLP vs. CLP + HC, P=0.011). Fluorescent intensity in the footpad and harvested lung in the CLP group was also significantly higher than in the other groups (footpad, CLP vs. sham, P<0.0001; CLP vs. CLP + HC, P=0.0004; lung, CLP vs. sham, P<0.0001; CLP vs. CLP + HC, P<0.0001). The fluorescent intensity of the footpad was strongly correlated with that of the lung (r=0.95).

    Conclusions: The fluorescence imaging technique may be useful for assessment of vascular permeability based on EB quantification. The footpad fluorescent intensity was strongly correlated with that of the lung, and may be a suitable indicator in noninvasive estimation of lung vascular permeability.

    P003 Correlation of systemic and regional glycocalyx injury parameters in non-septic critically ill patients

    T Tamosuitis1, A Pranskunas1, N Balciuniene1, D Damanskyte1, E Sirvinskas1, A Vitkauskiene1, E Sneideris1, C Boerma2

    1Lithuanian University of Health Sciences, Kaunas, Lithuania; 2Medical Center Leeuwarden, Leeuwarden, Netherlands

    Introduction: The relationship between systemic glycocalyx degradation markers and regional glycocalyx thickness in non-septic critically ill patients is unclear. Conjunctival sidestream dark field (SDF) imaging for the purpose of glycocalyx thickness estimation has never been performed. We aimed to investigate whether changes in glycocalyx thickness in the conjunctival and sublingual mucosa are associated with global glycocalyx shedding markers.

    Methods: In this single-centre prospective observational study, using techniques for direct in-vivo observation of the microcirculation, we performed a single measurement of glycocalyx thickness in both the ocular conjunctiva and the sublingual mucosa in mixed cardiac surgical (n=18) and neurocritical (n=27) patients and compared these data with age-matched healthy controls (n=20). In addition, we measured systemic syndecan-1 levels.

    Results: In both the sublingual and conjunctival regions, we observed a significant increase of the perfused boundary region (PBR) in neurocritical and cardiac surgical ICU patients compared to controls (2.20 [2.04-2.42] vs. 1.76 [1.63-2.08] and 2.19 [2.01-2.36] vs. 1.70 [1.61-2.00], p<0.05). Syndecan-1 was significantly increased in ICU patients compared with controls, and in cardiac patients compared with neurocritical patients (120.0 [71.0-189.6] vs. 18.0 [7.2-40.7], p<0.05). We detected a weak correlation between syndecan-1 and sublingual PBR (r=0.40, p=0.002), but no correlation between global glycocalyx damage markers and conjunctival glycocalyx thickness.

    Conclusions: Conjunctival glycocalyx thickness evaluation using SDF videomicroscopy is feasible, and the conjunctival glycocalyx is impaired in non-septic ICU patients; however, only measurements in the sublingual mucosa correlate with systemic glycocalyx shedding markers. Global glycocalyx damage is more severe in cardiac surgical than in neurocritical patients.

    P004 Glycocalyx degradation in sepsis at admission

    D Beurskens1, M Bol2, B Broddin2, C Reutelingsperger1, T Delhaas1, M Poll2, G Nicolaes2, J Sels2

    1Maastricht University, Maastricht, Netherlands; 2MUMC, Maastricht, Netherlands

    Introduction: The purpose of this study is to investigate the endothelial glycocalyx at the onset of sepsis. We hypothesize that the perfused boundary region (PBR) in microvessels (5-25 μm), measured by sidestream dark field imaging (SDF), and plasma markers of glycocalyx shedding are increased in non-survivors.

    Methods: We studied 31 sepsis patients and divided them into survivors (n=17) and non-survivors (n=14) (30-day mortality). SDF measurements and blood sampling were performed within 24 h of ICU admission. ELISAs were used to quantify syndecan-1, angiopoietin-1 (Ang-1) and angiopoietin-2 (Ang-2). Non-parametric tests (Spearman for correlations, Mann-Whitney for group comparisons) were used to assess statistical significance. A p<0.05 was considered significant. Results are presented as median (25th-75th percentile).
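    The two nonparametric statistics named above can be sketched in plain Python. This is illustrative only: the study presumably used a statistics package, and none of the values below are the study's data.

```python
# Rank-based statistics as in the Methods: Mann-Whitney U for group
# comparison and Spearman's rho for correlation (illustrative sketch).

def ranks(xs):
    """Average 1-based ranks; tied values share the mean of their ranks."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of the 1-based positions i..j
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def mann_whitney_u(a, b):
    """U statistic for two groups; smaller U means stronger separation."""
    combined = ranks(list(a) + list(b))
    rank_sum_a = sum(combined[:len(a)])
    u_a = rank_sum_a - len(a) * (len(a) + 1) / 2
    u_b = len(a) * len(b) - u_a
    return min(u_a, u_b)

def spearman_rho(x, y):
    """Spearman correlation = Pearson correlation applied to ranks."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)
```

    In practice one would obtain p-values from the null distributions of these statistics (e.g. via a statistics library); only the test statistics are sketched here.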

    Results: Syndecan-1 levels were higher in non-survivors than in survivors (519.2 (255.6-1056.7) vs. 178.0 (58.1-298.2) ng/ml, p=0.005) (Fig. 1). PBR tended to be higher in non-survivors (2.0 (1.9-2.2) μm) than in survivors (1.9 (1.7-2.1) μm) (p=0.05) (Fig. 2). Syndecan-1 correlated positively with APACHE II (ρ=0.60; p=0.02) and the Ang-2/Ang-1 ratio (ρ=0.59; p=0.004), but not with PBR.

    Conclusions: Plasma markers of glycocalyx shedding at ICU admission are predictors of 30-day mortality and correlate with APACHE II score. However, there is no correlation between these plasma markers and glycocalyx thickness measured by SDF imaging on ICU admission. Further studies should address the cause of this apparent discrepancy to define the role of SDF imaging in the assessment of glycocalyx shedding in sepsis.

    Fig. 1 (abstract P004).

    Boxplot of syndecan-1 for survivors and non-survivors

    Fig. 2 (abstract P004).

    Boxplot of PBR for survivors and non-survivors

    P005 The role of glycocalyx shedding and platelet adhesion in sepsis-induced microvascular dysfunction

    CL Manrique-Caballero, CJ Baty, MH Oberbarnscheidt, A Frank, FG Lakkis, BS Zuckerbraun, MR Pinsky, JA Kellum, H Gomez

    University of Pittsburgh, Pittsburgh, PA, USA

    Introduction: Sepsis is associated with endothelial activation leading to alterations in global microcirculatory blood flow distribution. These effects might constitute key mechanisms leading to organ dysfunction. The aim of this study was to investigate the role of glycocalyx shedding and platelet adhesion on the development of microvascular dysfunction in the renal peritubular capillary system, and its association to the development of acute kidney injury (AKI).

    Methods: C57BL/6 mice (n=6-8/group) received vehicle (control) or hyaluronidase (140 IU IA q8h for 24 hours), or underwent cecal ligation and puncture (CLP). Platelet adhesion and rolling in renal peritubular capillaries were assessed and quantified using multiphoton intravital microscopy (IVM) in five different areas [1]. Plasma syndecan-1 (SDC-1) and hyaluronan were measured to assess glycocalyx shedding. Neutrophil gelatinase-associated lipocalin (NGAL) and creatinine levels were used as markers of AKI and measured using commercially available assays.

    Results: Hyaluronidase and CLP resulted in shedding of the glycocalyx (Fig. 1A) and increased platelet adhesion and rolling compared to control (Fig. 2). Enzymatically induced shedding did not involve a systemic inflammatory response, as shown by IL-6 levels (Fig. 1B). Irrespective of a systemic inflammatory response, shedding of the glycocalyx and increased platelet adhesion and rolling were associated with increased markers of AKI (Fig. 1C, D).

    Conclusions: Shedding of the glycocalyx increases platelet adhesion and rolling, resulting in AKI, suggesting this as a potential mechanism of organ injury. The increase in AKI markers in the noninfectious hyaluronidase model without IL-6 elevation indicates that glycocalyx denudation and secondary platelet adhesion may be mechanisms of organ injury independent of sepsis-induced systemic inflammation.


    1. Singer G et al. Microcirculation 13(2):89-97, 2006.

    Fig. 1 (abstract P005).

    Glycocalyx components, IL-6 and AKI markers

    Fig. 2 (abstract P005).

    Platelet adhesion and rolling

    P006 AMPK protects against sepsis-induced endothelial dysfunction through cytoskeleton modulation

    M Angé1, L Bertrand1, C Beauloye1, S Horman1, D Castanares-Zapatero2

    1Institut de Recherche Experimentale et Clinique - Pôle de Recherche Cardiovasculaire, Brussels, Belgium; 2Cliniques Universitaires Saint Luc - Intensive Care Unit, Brussels, Belgium

    Introduction: Endothelial dysfunction plays a major role in sepsis-related organ dysfunction and is characterized by vascular leakage. AMP-activated protein kinase (AMPK) is known to regulate actin cytoskeleton organization and interendothelial junctions (IEJs), contributing to endothelial barrier integrity. We have already demonstrated its role in defence against sepsis-induced hyperpermeability [1], but the underlying mechanisms remain unknown. This project aims to identify molecular targets involved in the beneficial action of AMPK against endothelial barrier dysfunction.

    Methods: Experiments were performed in human dermal microvascular endothelial cells. α1AMPK activity was modulated using a specific siRNA or treatment with two pharmacological AMPK activators (AICAr, 991). We investigated the effect of this modulation on the expression/phosphorylation of connexin 43 (Cx43) and heat shock protein 27 (HSP27), two proteins playing a key role in the maintenance of IEJs and in actin dynamics, respectively.

    Results: We show that α1AMPK is required to sustain Cx43 expression, which was drastically reduced in cells transfected with a siRNA specifically targeting α1AMPK. HSP27 expression, by contrast, was not affected by α1AMPK deletion. However, both AMPK activators increased HSP27 phosphorylation on Ser82 in an α1AMPK-dependent manner, while they had no effect on Cx43. Our results also reveal that HSP27 phosphorylation coincided with the appearance of actin stress fibers at the cell periphery, suggesting a beneficial role for pHSP27, as well as for F-actin stress fibers, in vascular barrier function through reinforcement of endothelial tethering.

    Conclusions: Our work identifies the regulation of Cx43 expression and HSP27 phosphorylation as potential protective responses underlying the beneficial action of AMPK against endothelial barrier dysfunction. AMPK could consequently represent a new therapeutic target during sepsis.


    1. Castanares D et al. Crit Care Med. 41(12):e411-e422, 2013

    P007 Melusin protects from LPS-induced cardiomyopathy through modulation of calcium channel signalling

    P Arina1, A Costamagna1, L Brazzi1, M Brancaccio2, R Giuseppe1, N Vitale2, L Del Sorbo1, P Capello3, A Mazzeo4, L Mascia5, VM Ranieri6, M Sorge2, V Fanelli4

    1Università di Torino, Dipartimento di Anestesia e Rianimazione, Torino, Italy; 2Università di Torino, Molecular Biotechnology Center - Torino, Torino, Italy; 3Università di Torino, Department of Molecular Biotechnology and Health Sciences, Torino, Italy; 4Università di Torino, Dipartimento di Scienze Chirurgiche, Torino, Italy; 5Università degli Studi di ROMA "La Sapienza", Dipartimento di Scienze E Biotecnologie Medico-Chirurgiche, Roma, Italy; 6Università degli Studi di ROMA "La Sapienza", Dipartimento di Scienze Cardiovascolari, Respiratorie, Nefrologiche, Anestesiologiche E Geriatriche, Roma, Italy

    Introduction: Sepsis-induced cardiomyopathy (SIC) is a serious condition during sepsis, with a mortality rate of up to 70% (1). SIC manifests clinically as impaired left ventricular contractility (2). Melusin is a muscle-specific protein involved in sustaining cardiomyocyte survival through the activation of AKT signaling pathways (3). The PI3K-AKT signaling pathway plays a pivotal role in regulating calcium channel activity (4). We hypothesized that Melusin overexpression could exert a protective effect on cardiac function during septic injury.

    Methods: Animals were treated with an intraperitoneal injection of lipopolysaccharide (LPS) at 12 mg/kg. SV129-strain Melusin-knockout mice (KO) and FVB-strain mice with cardiac-specific overexpression of Melusin (OV) were compared. Each group was studied together with a wild-type control group (WT). Cardiac parameters were assessed at 0 and 6 hours by echocardiography. Another cohort of animals was sacrificed 6 hours after 20 mg/kg LPS treatment, and cardiac tissue and blood samples were harvested for Western blot analysis to quantify the expression of AKT, P-AKT and CACNA1C, and for ELISA analysis of troponin levels.

    Results: In the SV129 WT, Melusin KO and FVB WT groups, fractional shortening (FS) was significantly impaired after LPS challenge and was associated with compensatory tachycardia (Fig. 1). The FVB OV group showed no decrease in FS. Consistent with the increased AKT phosphorylation observed in OV mice, CACNA1C expression was also significantly higher, both at baseline and after LPS treatment, in OV mice compared to WT mice (Fig. 2). Troponin levels did not differ between mouse groups after LPS treatment.

    Conclusion: Melusin has a protective role in LPS-induced cardiomyopathy, likely through AKT phosphorylation controlling CACNA1C protein density.


    1. Neri M et al. Mediators Inflamm. 2016:3423450, 2016

    2. Sato R et al. J Intensive Care. 3:48, 2015

    3. De Acetis M et al. Circ Res. 96(10):1087–94, 2005

    4. Catalucci D et al. J Cell Biol. 184(6):923–33, 2009.

    Fig. 1 (abstract P007).

    Panel A: (A) Changes in fractional shortening (FS) versus time; experimental groups are SV129 WT and SV129 Melusin KO (n=12 each). (B) Changes in heart rate (HR) versus time; same groups as in (A). (C) M-mode echocardiography of the left ventricle (LV) before and after LPS challenge. Panel B: (A) Changes in FS versus time; experimental groups are FVB WT (n=8) and FVB Melusin OV (n=12). (B) Changes in HR versus time; same groups as in (A). (C) M-mode echocardiography of the LV before and after LPS challenge.

    Fig. 2 (abstract P007).

    (A) Changes in CACNA1C expression versus LPS exposure; experimental groups are FVB WT sham and FVB Melusin OV sham (n=2 each). (B) Changes in pAKT expression versus LPS exposure; same groups as in (A). (C) Changes in the pAKT/AKT ratio versus LPS exposure; same groups as in (A). (D) Western blot. (E) Changes in plasma troponin levels before and after LPS challenge; experimental groups are FVB WT (white: sham and treated) and FVB OV (gray: sham and treated).

    P008 Hepatic function is impaired after abdominal surgery and prolonged sedation and mechanical ventilation

    S Liu1, M Jakob2, A Kohler2, D Berger1, S Jakob1

    1Inselspital, Bern University Hospital, University of Bern, Department of Intensive Care Medicine, Bern, Switzerland; 2Inselspital, Bern University Hospital, University of Bern, Department of Visceral Surgery and Medicine, Bern, Switzerland

    Introduction: Liver dysfunction is frequent in sepsis, but its pathophysiology remains incompletely understood. Since altered liver function has also been described in ICU patients without sepsis [1,2], the influence of sepsis may be overestimated. We hypothesized that sedation and prolonged mechanical ventilation after abdominal surgery is associated with impaired liver function independent of sepsis.

    Methods: Sedated and mechanically ventilated pigs underwent abdominal surgery for regional hemodynamic monitoring and were subsequently randomized to fecal peritonitis or control (n=4 each), followed by 80 h of observation. The indocyanine green (ICG) retention rate 15 minutes after injection of 0.25 mg/kg ICG (ICG R15) was determined at baseline and at 11, 32 and 80 h after sepsis induction (SI), and at the same time points in controls. Concurrently with ICG R15, plasma volume, total hepatic perfusion (ultrasound transit time), bilirubin and liver enzymes were measured. ANOVA for non-parametric repeated measurements was performed in each group separately.
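    The ICG R15 readout used above is conventionally the fraction of the injected dye still in plasma at 15 minutes, expressed as a percentage of the initial concentration. A minimal sketch of that computation (an assumed formula; the abstract does not state how R15 was derived):

```python
# ICG R15: percent of the initial plasma ICG concentration remaining
# 15 minutes after injection. Higher retention suggests reduced hepatic
# uptake/clearance. (Illustrative sketch, not the study's actual pipeline.)

def icg_r15(c_initial, c_15min):
    """Retention rate at 15 min as a percentage of the initial concentration."""
    if c_initial <= 0:
        raise ValueError("initial concentration must be positive")
    return 100.0 * c_15min / c_initial
```

    For example, if the plasma concentration falls from 10 to 1 (arbitrary units) over 15 minutes, retention is 10%.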

    Results: ICG R15 increased over time without significant differences between groups (Table 1). There was a parallel increase in bilirubin in septic but not control animals. The other measured parameters were similar in both groups at the end of the experiment.

    Conclusion: Liver function was impaired under sedation and prolonged mechanical ventilation after abdominal surgery, even in animals without sepsis. The underlying reasons should be further explored.


    1. Koch A, et al. Crit Care 15:R266, 2011.

    2. Sander M, et al. Crit Care 13:R149, 2009.

    Table 1 (abstract P008). The Friedman test was used to evaluate the effect of time on each parameter; only Baseline and SI + 80 h are shown in this table. No differences between groups. SI, sepsis induction; PV, plasma volume; HF, hepatic blood flow; ALT, alanine aminotransferase; AST, aspartate aminotransferase; PT, prothrombin time

    P009 Preclinical evaluation of non-anticoagulant heparin in mouse models of inflammation and sepsis

    GA Nicolaes1, DM Beurskens1, P Garcia de Frutos2, J Van Daal3, R Schrijver3, B Kool3, C Aresté2, S De Kimpe3, CP Reutelingsperger1

    1Cardiovascular Research Institute Maastricht, Maastricht, Netherlands; 2Department of Cell Death and Proliferation, Institute of Biomedical Research of Barcelona, Barcelona, Spain; 3Matisse Pharmaceuticals BV, Geleen, Netherlands

    Introduction: Previous work has shown the cytoprotective properties of antithrombin-affinity depleted heparin (AADH), which neutralizes cytotoxic extracellular histones [1], major mediators of death in sepsis [2,3]. AADH was produced from clinical-grade heparin, resulting in preparations that have lost >99.5% of their anticoagulant activity. To gain insight into the mechanisms and basic pharmacological aspects of AADH's protective properties, we performed a systematic analysis of how AADH is tolerated in mice and ascertained its effects in three different in vivo models of inflammation and infection.

    Methods: Dose ranging studies, short term and medium term, were performed in C57BL/6 mice. The effects of i.v. administration of extracellular histones in the presence or absence of AADH were assessed in mice. We further analysed the effect of AADH in models of Concanavalin A- and MRSA-mediated lethality. In all studies we assessed clinical signs, lab parameters and histology.

    Results: AADH was well tolerated in both short-term and intermediate-term (up to 7 days) experiments in mice, in the absence of any signs of tissue bleeding. AADH was able to revert the cytotoxic properties of i.v. administered histones.

    In a Concanavalin A-mediated model of sterile inflammation, we confirmed that AADH has protective properties that counteract the cytotoxic effects of extracellular histones. In an in vivo lethal MRSA model, AADH was shown, for the first time, to confer a survival benefit.

    Conclusions: We conclude that AADH increases overall survival by neutralizing extracellular histones and represents a promising product for further development into a drug for the treatment of inflammatory diseases and sepsis.


    [1] Wilhagen et al, Blood 123:1098-1101, 2014

    [2] Xu et al, Nat Med 15:1318-1321, 2009

    [3] Thromb Res 136:542-7, 2015

    P010 Differential modulation of plasminogen activator-mediated thrombolysis by recombinant thrombomodulin and activated protein C

    Z Siddiqui1, S Raghuvir1, J Fareed1, R Wasmund1, O Iqbal1, W Jeske1, D Hoppensteadt1, K Tanaka2, K Tsuruta2

    1Loyola University Medical Center, Maywood, IL, USA; 2Asahi Kasei Pharma America Corporation, Waltham, MA, USA

    Introduction: Urokinase (UK) and tissue plasminogen activator (tPA) mediate thrombolytic actions by activating endogenous plasminogen. Thrombomodulin (TM) complexes with thrombin to activate protein C and thrombin-activatable fibrinolysis inhibitor (TAFI). Activated protein C (APC) modulates coagulation by digesting factors V and VIII and activates fibrinolysis by decreasing PAI-1 functionality.

    Methods: The purpose of this study was to compare the effects of rTM and APC on urokinase- and tPA-mediated thrombolysis utilizing thromboelastography (TEG).

    Results: Native whole blood was activated using a diluted intrinsic activator (APTT reagent, Triniclot). The modulation of thrombolysis by tPA and UK (Abbott, Chicago, USA) was studied by supplementing these agents to whole blood and monitoring TEG profiles. APC (Haematologic Technologies, VT, USA) and rTM (Asahi Kasei Pharma, Tokyo, Japan) were supplemented to the activated blood at 0.02 – 3.0 ug/ml. The modulation of tPA- and UK-induced thrombolysis by APC and rTM was studied in terms of thromboelastographic patterns. The effect of both APC and rTM on plasma-based systems supplemented with tPA was also investigated.

    Conclusions: In comparison to rTM, APC produced a stronger anticoagulant effect in terms of R time, K time, angle and MA. At 3.0 ug/ml, neither rTM nor APC produced any direct fibrinolytic effects. APC also strongly augmented the lytic effects of tPA and urokinase. rTM at lower concentrations stabilized the clot against fibrinolysis.

    P011 Anticoagulant actions of recombinant thrombomodulin and activated protein C and their neutralization by factor VIII inhibitor bypassing activity (FEIBA)

    Z Siddiqui1, W Jeske1, M Lewis1, P Aggarwal1, O Iqbal1, D Hoppensteadt1, K Tsuruta2, J Fareed1, S Mehrota1, R Wahi1

    1Loyola University Medical Center, Maywood, IL, USA; 2Asahi Kasei Pharma America Corporation, Waltham, MA, USA

    Introduction: Thromboelastographic (TEG) analysis represents a global approach to monitoring the clottability of native whole blood. rTM (recombinant thrombomodulin) is a mild anticoagulant with pleiotropic actions modulated through its complexation with thrombin. Activated protein C (APC) is a stronger anticoagulant which mediates its action via digestion of factors Va and VIIIa. FEIBA (factor VIII inhibitor bypassing activity) contains non-activated factors II, IX and X and activated factor VII. The purpose of this study was to compare the relative anticoagulant effects of rTM and APC by thromboelastographic analysis, and their neutralization by graded amounts of FEIBA.

    Methods: Citrated whole blood samples were supplemented with rTM or APC at a concentration of 3 ug/mL (n=20). TEG analysis was performed on a TEG 5000 system, in which clotting was initiated by re-calcification of the whole blood, and the standard parameters R time, K time, MA and angle were measured. The relative neutralization profiles of APC and rTM by FEIBA at 1.0, 0.1 and 0.01 U/ml were investigated.

    Results: At 3 ug/ml, rTM produced a mild anticoagulant effect, as evident from its TEG profile in terms of prolongation of R and K times and a marked decrease in angle and maximum amplitude. APC at 3 ug/mL produced a relatively stronger anticoagulant effect on all TEG parameters. Supplementation of FEIBA at 0.1 U/mL to whole blood mixtures containing 3 ug/mL of either agent completely neutralized rTM and partially neutralized APC.

    Conclusions: These results indicate that APC is a stronger anticoagulant than rTM as measured by TEG analysis. FEIBA is very effective in neutralizing the anticoagulant effects of rTM but produces weaker neutralization of APC. These results suggest that FEIBA can neutralize rTM at much lower levels than the dosing proposed for bleeding control in hemophilia.

    P012 HLA-DRA and CD74 on intensive care unit admission related to outcome in sepsis

    S Cajander1, B Stammler Jaliff2, A Somell2, A Bäckman1, H Alpkvist2, V Özenci2, J Wernerman2, K Strålin2

    1Faculty of Medicine and Health, Örebro University, Örebro, Sweden; 2Karolinska University Hospital, Stockholm, Sweden

    Introduction: mRNA expression of the major histocompatibility complex class II-related genes HLA-DRA and CD74 has been found to be a promising marker of sepsis-induced immunosuppression. In the present study, we aimed to study how expression of HLA-DRA and CD74 on intensive care unit (ICU) admission was related to death and/or secondary infections in patients with sepsis.

    Methods: During a full year, adult patients admitted to the ICU of Karolinska University Hospital Huddinge were consecutively subjected to blood sampling within 1 hour of ICU admission. Patients treated with antibiotic therapy were eligible for inclusion. The plausibility of infection (definite, probable, possible, none) was determined based on the Centers for Disease Control and Prevention (CDC) criteria. Patients with sepsis (definite/probable/possible infection and a SOFA score increase of >=2) were screened for death within 60 days and for secondary infections 48 h to 60 days after ICU admission, using the CDC criteria. HLA-DRA and CD74 mRNA expression was determined by reverse transcription quantitative PCR.

    Results: Among 579 ICU admissions, a blood sample for RNA analysis was collected in 551 cases. Two hundred fifty-seven patients met the inclusion criteria and provided written informed consent. Sepsis was noted in 134 patients. Among the sepsis patients, 36 (27%) died, 32 (24%) had a secondary infection, and 60 (45%) died and/or had a secondary infection. Table 1 shows HLA-DRA and CD74 expression in relation to death and secondary infections.

    Conclusions: mRNA expression of HLA-DRA on ICU admission was significantly decreased in patients with sepsis who died or contracted secondary infections within 60 days. CD74 expression was not significantly decreased in patients with a negative outcome.

    Table 1 (abstract P012). Median values and interquartile ranges (IQR) of HLA-DRA and CD74 in 134 sepsis patients. Calculations were performed on logarithmic scale due to log-normal distribution

    P013 Acid-base profile of patients with infection during the first 24 hours of intensive care unit admission

    RM Roepke, BA Besen, PV Mendes, LU Taniguchi, M Park

    Hospital das Clínicas HCFMUSP, Faculdade de Medicina, Universidade de São Paulo, São Paulo, Brazil

    Introduction: Acid-base disturbances are common in patients with infection admitted to the intensive care unit (ICU). Most attention in this population is given to hyperlactatemia as a prognostic factor, although other acid-base disturbances may also affect patient outcomes. Our objective was to describe the acid-base profile of this population and to determine the association between different acid-base abnormalities and ICU mortality.

    Methods: Retrospective cohort of patients admitted with infection to an intensive care unit. Patients were stratified according to pH (<7.35; 7.35-7.45; >7.45) and then according to standard base excess (SBE) (<-2; -2 to +2; >+2). In each of these strata and in the whole population, the proportions of acid-base disturbances during the first 24 hours of ICU admission were quantified. To assess the association between acid-base disturbances and outcome, a logistic regression model was fit, adjusting for age, sex and SAPS 3 score.
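    The two-level stratification described above (pH bands, then SBE bands) can be sketched as a simple classification. The category labels below are illustrative, not the authors' terminology:

```python
# Two-level acid-base stratification as in the Methods: first by pH,
# then by standard base excess (SBE, mEq/L). Labels are illustrative.

def ph_stratum(ph):
    if ph < 7.35:
        return "acidemia"
    if ph > 7.45:
        return "alkalemia"
    return "normal pH"

def sbe_stratum(sbe):
    if sbe < -2:
        return "metabolic acidosis"
    if sbe > 2:
        return "metabolic alkalosis"
    return "normal SBE"

def classify(ph, sbe):
    """Return the (pH stratum, SBE stratum) pair for one patient."""
    return (ph_stratum(ph), sbe_stratum(sbe))
```

    Note that the two axes are independent, which is how metabolic acidosis by SBE can appear even in patients with normal or alkalemic pH, as reported in the Results.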

    Results: 605 patients were analysed; 304 (50%) were acidemic and 244 (40%) presented with a normal pH. Metabolic acidosis (as assessed by SBE) was observed in all subgroups, regardless of pH level (pH <7.35: 287/304 [94%]; pH 7.35-7.45: 184/244 [75%]; pH >7.45: 34/57 [60%]). Lactic acidosis was observed in 71% of the whole population; strong ion gap (SIG) acidosis in 75%; SID (hyperchloremic) acidosis in 58%; metabolic alkalosis in 7%; and respiratory acidosis in 13% of patients. In multivariate analysis, lactic acidosis (OR 1.85 [95% CI 1.19-2.88]), albumin (OR 0.49 [95% CI 0.34-0.69]) and phosphate (OR 1.15 [95% CI 1.05-1.26]) were the acid-base variables independently associated with ICU mortality.

    Conclusions: The most common acid-base disturbance in patients with infection is SIG acidosis, although among the strong ions only lactic acidosis is independently associated with worse outcomes. Variations in weak anions are also independently associated with worse outcomes.

    P014 The relationship between serum zinc level and sepsis-induced coagulopathy

    Y Irie, H Ishikura, R Hokama, M Koie, K Muranishi, K Hoshino, R Yuge, T Kitamura, M Iwaasa

    Fukuoka University Hospital, Fukuoka city, Japan

    Introduction: Recent work has clarified that zinc plays a pivotal role in inflammation and that its deficiency results in worsening of severe organ dysfunction, including coagulopathy, in patients with sepsis.

    The purpose of the present study is to clarify the relationship between serum zinc level and organ dysfunction, especially sepsis-induced coagulopathy.

    Methods: This was a single-center retrospective observational study conducted from June 2016 to September 2017. Blood samples were collected on ICU admission.

    Results: 128 patients were enrolled; of the 108 analyzed, 100 had sepsis and 8 did not. Serum zinc levels were significantly lower in the sepsis group than in the non-sepsis group (33.5±19.2 vs 66.5±14.2 μg/dL, p<0.01). We then divided the sepsis group into two groups by SOFA score (SOFA ≥8 and SOFA <8). Zinc level was significantly lower in the SOFA ≥8 group than in the SOFA <8 group, whereas calcium, phosphorus and magnesium did not differ significantly between the two groups. Among the components of the SOFA score, zinc level correlated most strongly with the coagulation score (r=-0.49, p<0.001). On ROC analysis for predicting DIC, the area under the curve for serum zinc was 0.700, with a cutoff value of 25 μg/dL (sensitivity 0.629, specificity 0.783, p<0.001). Using this cutoff, the sepsis group was divided into two groups: the 28-day mortality rate of the zinc <25 μg/dL group was significantly higher than that of the zinc >25 μg/dL group.
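The ROC-based cutoff selection used for serum zinc can be illustrated with a minimal, pure-Python sketch that maximizes Youden's index (sensitivity + specificity - 1) over candidate thresholds. The numbers below are synthetic, not the study data; the assumption that *lower* zinc predicts DIC (so the positive test is "value ≤ cutoff") follows the abstract.

```python
# Illustrative ROC cutoff selection by Youden's index (synthetic data).

def best_cutoff(values, labels):
    """Return (cutoff, sensitivity, specificity) maximizing Youden's J.
    labels: 1 = DIC, 0 = no DIC; a positive test is value <= cutoff."""
    best = (None, 0.0, 0.0, -1.0)
    for c in sorted(set(values)):
        tp = sum(1 for v, y in zip(values, labels) if v <= c and y == 1)
        fn = sum(1 for v, y in zip(values, labels) if v > c and y == 1)
        tn = sum(1 for v, y in zip(values, labels) if v > c and y == 0)
        fp = sum(1 for v, y in zip(values, labels) if v <= c and y == 0)
        sens = tp / (tp + fn) if tp + fn else 0.0
        spec = tn / (tn + fp) if tn + fp else 0.0
        j = sens + spec - 1.0
        if j > best[3]:
            best = (c, sens, spec, j)
    return best[:3]

# Hypothetical zinc values (ug/dL) and DIC labels
zinc = [12, 18, 22, 30, 35, 40, 55, 60]
dic = [1, 1, 1, 0, 1, 0, 0, 0]
cutoff, sens, spec = best_cutoff(zinc, dic)
```

In practice a dedicated library (e.g. scikit-learn's `roc_curve`) would be used; the loop above only makes the sensitivity/specificity trade-off explicit.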

    Conclusions: Our results suggest that serum zinc levels are closely related to sepsis-induced coagulopathy.

    P015 Association between inflammatory markers and cognitive outcome in patients with acute brain dysfunction due to sepsis

    G Orhun1, E Tuzun2, P Ergin Ozcan1, C Ulusoy2, E Yildirim2, M Kucukerden2, H Gurvit3, F Esen1

    1Istanbul University, Medical Faculty of Istanbul, Anesthesiology and Intensive Care, Istanbul, Turkey; 2Istanbul University, Institute of Experimental Medicine, Neuroscience, Istanbul, Turkey; 3Istanbul University, Medical Faculty of Istanbul, Department of Neurology, Behavioral Neurology and Movement Disorders Unit, Istanbul, Turkey

    Introduction: Sepsis-induced brain dysfunction has been neglected until recently due to the absence of specific clinical or biological markers. There is increasing evidence that sepsis may pose substantial risks for long term cognitive impairment.

    Methods: To identify clinical and inflammatory factors associated with acute sepsis-induced brain dysfunction (SIBD), serum levels of cytokines, complement breakdown products and neurodegeneration markers were measured by ELISA in 86 SIBD patients and 33 healthy controls. The association between these biological markers and cognitive test results was investigated.

    Results: SIBD patients showed significantly increased IL-6, IL-8, IL-10 and C4d levels and decreased TNF-α, IL-12, C5a and iC3b levels compared with healthy controls. No significant alteration was observed in markers of neuronal loss and neurodegeneration (neuron-specific enolase (NSE), amyloid β, tau). Increased IL-1β, IL-6, IL-8, IL-10 and TNF-α and decreased C4d, C5a and iC3b levels were associated with septic shock, coma and mortality. Transient mild cognitive impairment was observed in 7 of 21 patients who underwent neuropsychological assessment. Cognitive dysfunction and neuronal loss were associated with increased duration of septic shock and delirium, but not with baseline serum levels of inflammation and neurodegeneration markers.

    Conclusions: Increased cytokine levels, decreased complement activity and increased neuronal loss are indicators of poor prognosis and adverse events in SIBD. Cognitive dysfunction and neuronal destruction in SIBD do not seem to be associated with systemic inflammation factors and Alzheimer disease-type neurodegeneration but rather with increased duration of neuronal dysfunction and enhanced exposure of the brain to sepsis-inducing pathogens.

    P016 Altered serum profile of aromatic metabolites reflects the biodiversity reduction of gut microbiota in critically ill patients

    N Beloborodova, E Chernevskaya, A Pautova, A Bedova, A Sergeev

    Federal Research and Clinical Center of Intensive Care Medicine and Rehabilitology, Moscow, Russia

    Introduction: High levels of some aromatic microbial metabolites (AMM) in serum are related to the severity and mortality of critically ill patients [1]. Several studies have discussed the imbalance and loss of diversity of the gut microbiota, but there are practically no data on gut microbial metabolites in critical conditions and only limited data in healthy people [2, 3]. The aim of this work was to analyze the relationship between serum and fecal levels of AMM in ICU patients.

    Methods: 13 paired serum and fecal samples (SFS), collected simultaneously, from ICU patients with nosocomial pneumonia (group I), 21 SFS from ICU neurorehabilitation patients (group II) and 5 SFS from healthy people were taken for GC/MS analysis. The following AMM were measured: phenylpropionic (PhPA), phenyllactic (PhLA), p-hydroxybenzoic (p-HBA), p-hydroxyphenyllactic (p-HPhLA), p-hydroxyphenylacetic (p-HPhAA), p-hydroxyphenylpropionic (p-HPhPA) and homovanillic (HVA) acids. Data are presented as medians with interquartile range (IQR, 25-75%) using STATISTICA 10.

    Results: The sum of the levels of the 4 most relevant metabolites (4AMM) - PhLA, p-HPhLA, p-HPhAA and HVA - in serum samples from group I and group II was 0.9 (0.6-9.6) μM and 0.7 (0.5-1.0) μM, respectively, higher than in healthy people - 0.4 (0.4-0.6) μM (p<0.05). We suppose that the AMM profiles in blood and intestine are correlated. In particular, the SFS of healthy people are characterized by a prevalence of PhPA, whereas in non-survivors AMM were not detected in feces and only HVA dominated in serum, in the absence of the others (Fig. 1).

    Conclusions: The AMM profiles in gut and serum are interrelated; AMM in serum probably reflect the disruption and loss of biodiversity of the gut microbiota in critically ill patients.


    1. Beloborodova NV. Sepsis. Chapter 1. 2017. DOI: 10.5772/68046

    2. McDonald D et al. mSphere 1(4): e00199-16, 2016

    3. Jenner AM et al. Free Radic Biol Med 38:763–772, 2005

    Fig. 1 (abstract P016).

    Comparison of the quality profiles of AMM in serum and feces

    P017 Can biomarkers help identify the type of blood stream infection in septic patients?

    H Brodska1, V Adamkova1, K Pelinkova1, A Studena1, J Zavora1, T Drabek2

    1Charles University, Prague, Czech Republic; 2University of Pittsburgh, Pittsburgh, PA, USA

    Introduction: Sepsis is one of the most prevalent causes of morbidity and mortality in hospitalized patients worldwide. Early initiation of targeted antibiotic therapy is crucial. Blood stream infections are commonly divided into Gram-positive (G+) and Gram-negative (G-). However, blood culture (BC) read-out may be delayed or cultures may be negative (NEG). Biomarkers may help guide antibiotic therapy prior to BC results [1]. We tested the hypotheses that 1) biomarkers will discriminate between BC- and BC+ patients; 2) biomarkers will discriminate between G+ and G- sepsis; 3) biomarkers will correlate with severity of illness.

    Methods: With IRB approval, a patient cohort (n=60) admitted to a mixed ICU for suspected sepsis was enrolled in a prospective observational study. BC and biomarkers of sepsis (C-reactive protein, CRP; procalcitonin, PCT; presepsin, PRE; leukocytes, LEU) were assessed, and SOFA and qSOFA were determined on admission. Data are displayed as mean±SD or median [IQR]. One-way ANOVA with post-hoc Tukey's test, the Kruskal-Wallis test or the Mann-Whitney test were used as appropriate. Pearson's test was used to assess correlation between SOFA and biomarkers.

    Results: CRP was the only biomarker different between BC- (33 [3, 64] mg/L) vs. BC+ (147 [51, 256] mg/L) patients (p=0.003). Numerically higher values were observed in G- patients. CRP was higher in G+ (p=0.006) and G- (p=0.023) vs. NEG. PCT was higher in G+ vs. NEG (p=0.037). LEU were higher in G- vs. NEG (p=0.044) and vs. G+ (p=0.015). PRE was not different between groups (Table 1). In BC- patients, SOFA score did not correlate with any biomarkers. In BC+ patients, SOFA correlated with CRP (p=0.005) and LEU (p=0.023). PRE correlated with SOFA in G+ patients (p=0.032).

    Conclusions: In our limited sample-size pilot study, tested biomarkers showed limited capacity to identify BC+ patients and the type of infection. Higher values of biomarkers observed in G- sepsis warrant further study.


    [1] Brodska H et al. Clin Exp Med 13(3):165-70, 2013.

    Table 1 (abstract P017). See text for description

    P018 Clinical and diagnostic value of plasma concentration of nitric oxide in newborns with respiratory diseases

    MG Pukhtinskaya, V Estrin

    State Medical University, Rostov-on-Don, Russia

    Introduction: Since nitric oxide (NO) is an essential component of the immune system, the dynamics of plasma NO concentration were studied in order to predict the development of sepsis [1, 2].

    Methods: With the approval of the Ethics Committee, 200 full-term newborns with respiratory diseases on mechanical ventilation were included and retrospectively divided into two groups (I, n=46, sepsis at days 4-5; II, n=154, without sepsis). On days 1, 3-5 and 20, plasma concentrations of NO, NOS-2, NOS-3 and ADMA were measured by ELISA (Multilabel Coulter Victor-21420, Finland). Cut-off points were selected by ROC analysis.

    Results: The statistical power of the study was 86.7% (α<0.05). At admission, plasma NO concentration was decreased and ADMA increased (p<0.05) in patients of groups I and II relative to healthy newborns. After 3-5 days, plasma concentrations of NO, NOS-2, NOS-3 and ADMA increased (p<0.05) in both groups. NO concentration in patients with sepsis (group I) was lower (p<0.05) than in group II patients at all stages of observation.

    NO concentration in plasma of less than 7.30 μmol/l at admission predicted the development of sepsis with a sensitivity of 88.00% and a specificity of 82.66%.

    Conclusions: The significance of a low NO concentration in the development of sepsis supports further study of the efficacy, safety and cost-effectiveness of sepsis prevention with inhaled nitric oxide or other NO donors.


    1. Ryazantseva NV. Cytology 54:105-111, 2012.

    2. Puhtinskaya M. Critical Care 18:288, 2014.

    P019 Plasma myeloperoxidase conjugated-DNA level as predictors of outcome and organ dysfunction in septic shock patients: possible therapeutic effect of hemoperfusion with a polymyxin B immobilized fiber column

    N Takeyama, Y Maruchi, T Gocho, H Mori, N Takenaka, M Tsuda, H Noguchi

    Aichi Medical University, Aichi, Japan

    Introduction: We have previously reported that hemoperfusion with polymyxin B immobilized on polystyrene fibers in a cartridge (PMX-DHP) reduces endothelial damage by selective removal of activated neutrophils. Ex vivo perfusion experiments demonstrated that activated neutrophils adhered preferentially to PMX filters and that the remaining neutrophils caused less endothelial damage. There is, however, no report on the effect of PMX-DHP on neutrophil extracellular traps (NETs) formation. The objectives of this study were to investigate the correlations of plasma myeloperoxidase (MPO)-conjugated DNA level with the degree of organ dysfunction, disease severity and ICU mortality in septic shock patients. We also investigated the effect of PMX-DHP on MPO-DNA level.

    Methods: Sixty-five septic shock patients treated with PMX-DHP, admitted to the ICUs of 35 Japanese hospitals, were enrolled. Septic shock was identified using the older ACCP/SCCM definition (1997). Plasma MPO-DNA was measured by sandwich ELISA with anti-MPO and anti-DNA monoclonal antibodies.

    Results: On day 1, septic shock patients displayed a marked increase in plasma MPO-DNA level compared with healthy volunteers (p=0.008). Plasma MPO-DNA levels were significantly decreased on days 3 and 7 after PMX hemoperfusion. On day 7, MPO-DNA level was inversely correlated with both mean arterial pressure (p=0.048) and P/F ratio (p=0.003), and positively correlated with the SOFA score (p=0.017). Using the Wilcoxon signed-rank test, high MPO-DNA on day 3 was found to be associated with hospital mortality (p=0.019).

    Conclusions: High MPO-DNA levels on days 3 and 7 in septic shock patients are associated with the degree of organ dysfunction and with hospital mortality. The beneficial effects of PMX-DHP may be at least partially due to inhibition of excessive NETs formation.

    P020 Prognostic role of neutrophil lymphocyte ratio (NLR) in critical illness

    M Ferrari, G Keegan, K Williams, ID Welters

    Royal Liverpool University Hospital, Liverpool, UK

    Introduction: The neutrophil-to-lymphocyte ratio (NLR) has been used to predict patient outcomes, with higher values linked to a worse prognosis [1]. However, most studies investigate NLR only on admission to critical care [2]. In this study we analysed NLR values and trends over a 7-day observation period to define different time courses in survivors and non-survivors of critical illness.

    Methods: This retrospective study included Intensive Care Unit (ICU) patients admitted to the Royal Liverpool University Hospital over a 4-year period. Age, gender, sepsis status (type and date of sepsis), length of ICU stay, 28-day mortality and ICU outcome were collected. White cell, lymphocyte and neutrophil counts were recorded, and NLR values calculated, for the first 7 days after ICU admission. Patients with haematological malignancies and readmissions were excluded.

    Results: Data were available for 542 patients. 28-day mortality was 20.3% and ICU mortality 16.6%. Mean NLR in survivors decreased over the 7-day period, whereas mean NLR in non-survivors remained high throughout. Comparing the mean NLR of the first 2 days (days 1 and 2) with that of the last 2 days (days 6 and 7), patients were divided into 3 groups: decreasing, stable and increasing NLR. The 28-day mortality was respectively 12.4%, 29.9% and 37.8% (p < 0.01) and the ICU mortality 9.7%, 24.8% and 32.4% (p < 0.01).
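The trend grouping used above can be sketched as follows. This is an illustrative sketch, not the study code: the 10% tolerance defining a "stable" NLR is our assumption, since the abstract does not state the threshold separating the three groups.

```python
# Illustrative sketch: classify the NLR trend by comparing the mean of
# days 1-2 with the mean of days 6-7 (tolerance band is an assumption).

def nlr(neutrophils: float, lymphocytes: float) -> float:
    """Neutrophil-to-lymphocyte ratio."""
    return neutrophils / lymphocytes

def nlr_trend(nlr_by_day, tolerance=0.10):
    """Classify a 7-day NLR course (day 1 first) as
    'decreasing', 'stable' or 'increasing'."""
    early = (nlr_by_day[0] + nlr_by_day[1]) / 2   # mean of days 1-2
    late = (nlr_by_day[5] + nlr_by_day[6]) / 2    # mean of days 6-7
    if late < early * (1 - tolerance):
        return "decreasing"
    if late > early * (1 + tolerance):
        return "increasing"
    return "stable"

# A survivor-like course: NLR falling over the week
course = [18.0, 16.0, 12.0, 10.0, 9.0, 8.0, 7.5]
```

With the synthetic course above, `nlr_trend(course)` falls into the "decreasing" group, the group with the lowest mortality in the Results.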

    Conclusions: NLR trend over the first 7 days correlates with mortality in ICU, differing significantly between survivors and non-survivors, and could be used as an outcome predictor.


    1. Hwang SY et al. Am J Emerg Med 35(2):234-239, 2017.

    2. Salciccioli JD et al. Crit Care 19:13, 2015.

    P021 The platelet, eosinophil, age, red cell distribution width (PEAR) score and out-of-hospital outcomes in critical illness survivors

    L Maher1, L Roeker2, T Moromizato3, F Gibbons4, KB Christopher5

    1The Lahey Hospital & Medical Center, Burlington, VT, USA; 2Memorial Sloan Kettering Cancer Center, New York, NY, USA; 3Okinawa Southern Medical Center and Children’s Hospital, Naha, Japan; 4Massachusetts General Hospital, Boston, MA, USA; 5Brigham and Women's Hospital, Boston, MA, USA

    Introduction: Simple and accurate identification of high-risk critical illness survivors may allow for targeted monitoring and interventions that may improve outcomes. Our primary study objective was to determine if a risk prediction score based on demographics and surrogates for inflammation measured at ICU admission was predictive for post-hospital outcomes.

    Methods: The PEAR score was previously derived and validated for mortality using ICU admission data (age, gender, surgical vs. medical patient type, red cell distribution width, platelet count and peripheral blood eosinophil count). We performed a two-center cohort study of 67,591 patients admitted to a MICU or SICU from 1997-2012 who survived to hospital discharge. The primary outcome was 90-day post-hospital discharge mortality. Adjusted odds ratios were estimated by logistic regression models including terms for the PEAR score, sepsis, number of acute organ failures, Deyo-Charlson comorbidity index, and the actual length of stay less the national geometric mean length of stay.

    Results: The cohort was 58% male and 49% surgical ICU, with a mean age of 62 years and a 90-day mortality rate of 7.8%. 10% were diagnosed with sepsis. Mean length of stay was 11.2 days. The unplanned 30-day readmission rate was 14%. Patients in the second highest and highest quartiles of PEAR score had adjusted ORs of 90-day post-discharge mortality of 3.95 (95% CI 3.45-4.52; p<0.001) and 8.21 (95% CI 7.16-9.43; p<0.001) respectively, relative to patients in the lowest quartile. The AUC for the adjusted prediction model was 0.77. Further, patients in the second highest and highest quartiles had adjusted ORs of 30-day readmission of 1.38 (95% CI 1.29-1.48; p<0.001) and 1.64 (95% CI 1.52-1.78; p<0.001) respectively, relative to patients in the lowest quartile.

    Conclusions: Our simple PEAR score robustly predicts the risk of out-of-hospital outcomes in critical illness survivors.

    P022 Validation of the neutrophil-lymphocyte count ratio and leukocyte score as prognostic markers in community acquired pneumonia

    M Morales-Codina, R Subirana, A Pérez, N Bacelar, J Font, N Angrill, M Gallego, E Diaz, J Vallés

    CSU Parc Tauli, Sabadell, Spain

    Introduction: Some studies suggest the Neutrophil-Lymphocyte Count Ratio (NLCR) and Leukocyte Score (LS) are better at stratifying Community Acquired Pneumonia (CAP) severity than traditional methods. Both can be obtained quickly from standard, cost-effective tests but require further investigation [1, 2].

    Methods: A retrospective analysis was performed on all ED adults with CAP from Oct. 2009 to Jan. 2011. Demographics, FINE, CURB-65, ATS criteria and initial blood tests were collected; admission to ICU and outcome at 30 days were evaluated. The prognostic value of the scoring systems was compared through ROC curves. Cutoffs used were >=10:1 for NLCR and a score >=2 points for LS.
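The ROC-curve comparison of scoring systems rests on the area under the curve (AUC), which equals the probability that a randomly chosen positive case scores higher than a randomly chosen negative case. A minimal sketch of that identity, on synthetic data rather than the study data:

```python
# Illustrative sketch: AUC via the probability-of-correct-ranking identity.

def auc(scores, labels):
    """AUC = P(score of a positive > score of a negative); ties count 0.5.
    labels: 1 = event (e.g. ICU admission or death), 0 = no event."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Perfect separation gives AUC = 1.0; reversed ranking gives 0.0;
# an uninformative score hovers around 0.5.
perfect = auc([0.9, 0.8, 0.3, 0.2], [1, 1, 0, 0])
```

Values such as the 0.530 reported for NLCR against 30-day mortality sit close to the 0.5 of an uninformative score, which is the substance of the abstract's conclusion.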

    Results: 1059 patients were enrolled (mean age 65.3 ± 19.7 years, 61.7% male). The most prevalent comorbidities were COPD (18.2%), chronic renal disease (10.3%) and solid neoplasm (7.2%). The ICU admission rate was 6.2%, with an average APACHE II of 17.6 points. Overall mortality was 8.3% (16.7% in the ICU). The average CURB-65 score was 1.4 ± 1.2 points and FINE 92.6 ± 43.5. 21.3% met >=3 minor ATS criteria.

    In our population the areas under the ROC curves for predicting ICU admission and 30-day mortality were, respectively, 0.616 and 0.530 for NLCR and 0.559 and 0.615 for LS, versus 0.654 and 0.875 for the FINE score, 0.756 and 0.739 for the ATS criteria, and 0.706 and 0.796 for the CURB-65.

    Conclusions: The NLCR and LS showed lower discriminative power than the traditional FINE score, ATS criteria and CURB-65 when applied to a much larger population than in their original studies.


    1. Cornelis PC et al. PLoS One 7:e46561, 2012

    2. Blot M et al. Open Forum Infectious Diseases 1:ofu075, 2014

    P023 Biomarkers of platelet activation and their prognostic value in patients with sepsis-associated coagulopathy

    D Hoppensteadt1, G Wegryzn1, A Walborn1, P Maia1, S Walborn1, R Green1, M Mosier1, M Rondina2, J Fareed1

    1Loyola University Medical Center, Maywood, IL, USA; 2University of Utah School of Medicine, Salt Lake City, UT, USA

    Introduction: Sepsis-associated disseminated intravascular coagulation (SAC) is associated with decreased platelet counts and reduced platelet formation. Widespread activation of platelets contributes to vascular occlusion, fibrin deposition and multi-organ dysfunction, contributing to a two-fold increase in mortality. Our purpose was to measure markers of platelet function in the plasma of patients with clinically established SAC and to determine their association with disease severity and outcome.

    Methods: Plasma samples from 103 adult intensive care unit (ICU) patients with sepsis and suspected SAC were collected at baseline and on days 4 and 8. DIC scores were calculated using platelet count, D-Dimer, INR, and fibrinogen. Patients were categorized as having no DIC, non-overt DIC, or overt DIC. Plasma levels of CD40L, von Willebrand Factor (vWF), platelet factor-4 (PF-4), and microparticles (MP) were quantified using commercially available ELISA methods.
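The DIC scoring described in the Methods (platelet count, D-dimer, INR, fibrinogen) follows the general shape of the ISTH overt-DIC algorithm. The sketch below is illustrative only: the INR thresholds are our assumptions standing in for the ISTH PT-prolongation bands, and the non-overt banding is simplified for illustration.

```python
# Illustrative DIC score in the spirit of the ISTH overt-DIC algorithm.
# INR thresholds are ASSUMED surrogates for PT prolongation bands.

def dic_score(platelets_e9_per_L, d_dimer_band, inr, fibrinogen_g_L):
    """d_dimer_band: 0 = no increase, 2 = moderate, 3 = strong (ISTH bands)."""
    score = 0
    if platelets_e9_per_L < 50:
        score += 2
    elif platelets_e9_per_L < 100:
        score += 1
    score += d_dimer_band
    if inr >= 1.7:        # assumed surrogate for PT prolonged > 6 s
        score += 2
    elif inr >= 1.3:      # assumed surrogate for PT prolonged 3-6 s
        score += 1
    if fibrinogen_g_L < 1.0:
        score += 1
    return score

def dic_category(score):
    if score >= 5:
        return "overt DIC"
    if score >= 1:
        return "non-overt DIC"   # simplified banding for illustration
    return "no DIC"
```

For example, a patient with platelets 40×10⁹/L, strongly elevated D-dimer, INR 1.8 and fibrinogen 0.8 g/L scores 8 and is categorized as overt DIC under this sketch.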

    Results: Markers of platelet activation on ICU day 0 were significantly elevated in patients with sepsis and suspected DIC compared with normal healthy individuals (p<0.001). Levels of platelet-associated biomarkers were compared between survivors and non-survivors: PF4 was significantly decreased in non-survivors (p=0.0156). When patients were stratified by platelet count, CD40L, vWF, PF4 and MP all varied significantly, exhibiting a stepwise elevation with increasing platelet count.

    Conclusions: Markers of platelet activation were significantly elevated in patients with SAC compared with healthy individuals. PF4 levels differed significantly by DIC score and mortality, distinguishing non-survivors from survivors. CD40L, vWF, PF4 and MP showed a significant association with platelet count, increasing in a stepwise manner with increasing platelet count (Table 1).

    Table 1 (abstract P023). Markers of platelet activation on Day 0 vs. platelet count

    P024 Prognostic value of mean platelet volume in septic patients: a prospective study

    A Chaari

    King Hamad University Hospital, Bussaiteen, Bahrain

    Introduction: Mean Platelet Volume (MPV) has been reported as a valuable marker of inflammatory diseases. The aim of the current study is to assess the prognostic value of MPV in septic patients.

    Methods: Prospective study including all patients admitted to the intensive care unit (ICU) with sepsis or septic shock. Demographic, clinical and laboratory data were collected. The MPV was checked on admission and on day 3. Two groups were compared: Survivors and non-survivors.

    Results: Thirty-four patients were included. Median age was 69 [62-77] years and the sex ratio was 1.8. Median APACHE II score was 21 [16-28]. Platelet count on admission was 264 [177-391] with an MPV of 8.4 [8-9.3] fL. On day 3, platelet count was 183 [101-265] with an MPV of 8.6 [8-9.4] fL. MPV increased by day 3 in 19 patients (55.9%). Mechanical ventilation was required for 20 (58.8%) patients and CRRT for 14 (41.2%). The ICU length of stay was 7 [3-12] days. Twelve patients (35.3%) died in the ICU.

    Survivors were younger than non-survivors (66 [60-76] versus 73 [69-86] years; p=0.016) and had a lower APACHE II score (20 [15.8-25.5] versus 27 [20-33.5]; p=0.04). MPV on admission and on day 3 was comparable between the two groups (respectively 8.3 [8-9.5] versus 8.5 [7.6-9.3] fL; p=0.999, and 8.5 [7.9-9.3] versus 9 [8.2-9.6] fL; p=0.369). However, the platelet count on day 3 was significantly lower in non-survivors (227 [132-336] versus 113 [44-231]; p=0.049). The ICU length of stay was 7 [3-12] days in survivors and 8.5 [3.5-12] days in non-survivors (p=0.623).

    Conclusions: A decrease in platelet count, but not an increase in MPV, was associated with increased mortality in critically ill septic patients.

    P025 Endotoxin activity assay levels measured within 24 hours after ICU admission affected patients’ severity assessments

    A Kodaira1, T Ikeda2, S Ono2, S Suda2, T Nagura2

    1Tokyo Medical University, Tokyo, Japan; 2Tokyo Medical University, Hachioji Medical Center, Tokyo, Japan

    Introduction: Endotoxin is a major component of the cell wall of Gram-negative bacteria and is the principal molecule responsible for the induction of septic shock. A prospective cohort study (MEDIC study) of 857 consecutive new ICU patients evaluated the usefulness of endotoxin activity assay (EAA) as a diagnostic tool in sepsis and septic shock.

    Methods: EAA values measured within 24 hours of ICU admission were classified into a high-risk (H) group (EAA >0.6; n=154; mean age ± SD 64±13; median 70) and a low-risk (L) group (EAA <0.4; n=174; 68±16; 72). Patient severity (APACHE II score, SOFA score) and sepsis-related biomarkers (procalcitonin, IL-6, angiopoietin-2) were then compared between the groups. Results are expressed as mean ± SD (median). The Mann-Whitney U-test and the chi-square or Fisher's exact test were used for statistical analysis.

    Results: The APACHE II score of the H-group was 26.5±9.5 (27.0), while that of the L-group was 19.9±9.1 (18.0); the difference between the groups was statistically significant (p<0.05). The SOFA score of the H-group was 9.6±4.1 (10.0) and that of the L-group 7.2±4.6 (6.5) (p<0.05).

    PCT in the H-group was 37.8±58.5 (11.7) and in the L-group 9.6±25.5 (1.6), with no significant difference between the groups. IL-6 in the H-group was 19,483±61,281 (1160) versus 6256±39,321 (144) in the L-group. Angiopoietin-2 in the H-group was 12,822±10,593 (10,100) versus 6004±4441 (4105) in the L-group. These two biomarkers differed significantly between the groups.

    The survival rate of the H-group was 78.9% (153 survived, 41 died) and that of the L-group 85.5% (147 survived, 25 died); the difference was not statistically significant.

    Conclusions: These results indicate that the EAA value measured within 24 hours after ICU admission is a useful marker for a patient’s severity assessment, but not for outcome prediction.

    P026 Interleukin 10 release after ex vivo stimulation of whole blood predicts clinical outcomes in sepsis

    H Perrichet1, C Martin-Chouly2, JT Ross3, C Rousseau2, P Seguin1, N Nesseler1

    1University Hospital Pontchaillou, Rennes, France; 2Rennes 1 University, Rennes, France; 3University of California, San Francisco, CA, USA

    Introduction: Sepsis profoundly alters immune homeostasis, inducing first a systemic pro-inflammatory and then an anti-inflammatory state. We evaluated the prognostic value of ex vivo lipopolysaccharide (LPS) stimulation of whole blood in septic patients at days 1 and 7 after intensive care unit (ICU) admission.

    Methods: This prospective cohort study included patients with severe sepsis or septic shock admitted to a surgical ICU of a university hospital. Blood was drawn on day 1 and day 7, and stimulated ex vivo with LPS for 24 hours. Tumor necrosis factor alpha (TNF), interleukin (IL) 1, IL6 and IL10 were measured. Twenty-three healthy adults served as controls. Outcomes were ventilator and ICU-free days, SOFA score at day 1 and 7, and need for dialysis during the course of sepsis.

    Results: Forty-nine patients were included (mean age 62 ± 15 years). The blood of septic patients was less responsive to ex vivo stimulation with LPS than that of healthy controls, as demonstrated by lower TNF, IL1, IL6 and IL10 release (Fig. 1). At day 1, patients above the 50th percentile of IL10 release had significantly fewer ventilator and ICU-free days than those in the lower 50th percentile (Fig. 2). In contrast, patients in whom IL10 release increased between day 1 and day 7 had significantly lower SOFA scores at day 1 and 7 and need for dialysis, and more ICU-free days than patients in whom IL10 release decreased (Table 1).

    Conclusions: Greater LPS-stimulated IL10 release in septic patients at day 1 was associated with poorer clinical outcomes and may reflect the severity of the forthcoming immunoparalysis. However, an increase in IL10 release between day 1 and day 7 was associated with favorable outcomes, perhaps signaling immune restoration.

    Table 1 (abstract P026). Clinical outcomes of septic patients according to interleukin 10 release evolution between day 1 and 7 after ICU admission
    Fig. 1 (abstract P026).

    LPS-stimulated IL10 levels in septic patients and controls at day 1 and 7 after ICU admission. Results are presented in picograms per milliliter (pg/mL)

    Fig. 2 (abstract P026).

    Number of ICU-free days to day 28 between the lowest group of LPS-stimulated IL10 production (quartiles I and II) and the highest group of LPS-stimulated IL10 production (quartiles III and IV)

    P027 Is serum procalcitonin a reliable marker of bacterial sepsis after hyperthermic intraperitoneal chemotherapy with cytoreductive surgery (HIPEC-CRS)?

    Y Al Drees, A Alrbiaan, A Elhazmi, T Amin, N Salahuddin

    King Faisal Specialist Hospital and Research Centre, Riyadh, Saudi Arabia

    Introduction: Hyperthermic Intraperitoneal Chemotherapy with Cytoreductive Surgery (HIPEC-CRS) is a curative treatment modality for peritoneal carcinomatosis. Extensive debulking surgery, peritoneal stripping and multiple visceral resections, followed by intraperitoneal instillation of heated high-dose chemotherapeutic agents, lead to a ‘high-inflammatory’ syndrome. Serum procalcitonin (PCT), a biomarker for bacterial sepsis, might therefore be of limited utility in the heightened inflammatory state after HIPEC-CRS. Our aim was to determine the trends of PCT in the early postoperative phase of HIPEC-CRS and to compare trends in patients with and without bacterial sepsis.

    Methods: In a case-control design, we reviewed all patients undergoing HIPEC-CRS over a 24-month period (2015-2017). Patients were divided into 2 groups based on whether they developed bacterial sepsis in the first 5 days after surgery (infected vs non-infected). Summary data are expressed as medians and ranges. Two-tailed nonparametric tests were performed and considered significant at p values of less than 0.05.

    Results: Data from 82 patients were analyzed. Infections developed in 16% (13 patients), with Escherichia coli the predominant pathogen (36% of isolates). PCT levels (ng/mL) were elevated postoperatively in both infected and non-infected patients: day 1, infected 0.97 (IQR 0.5, 3.2) vs non-infected 0.68 (0.2, 1.6), p = not significant (ns); day 2, infected 1.14 (0.8, 4.2) vs 1.2 (0.5, 3.4), p = ns; day 3, infected 1.82 (0.6, 11.5) vs 0.73 (0.3, 2.2), p = 0.05. The differences became statistically significant only by the 4th day (Fig. 1): day 4, infected 1.32 (0.4, 8.2) vs 0.53 (0.19, 1.1), p = 0.012; day 5, infected 0.85 (0.09, 3.9) vs 0.28 (0, 0.7), p = 0.047.

    Conclusions: HIPEC-CRS is associated with an early postoperative increase in PCT levels, independent of the presence of bacterial sepsis. The study demonstrates that HIPEC-CRS is itself a stimulus for PCT release and that decisions on antimicrobial therapy should not be based solely on elevated PCT values.

    Fig. 1 (abstract P027).

    Daily serum Procalcitonin levels post HIPEC-CRS shows statistically significant difference between infected and non-infected patients only on the 4th post-operative day

    P028 Early prognostic value of serum procalcitonin in post cardiac surgery patients with fever

    S Sudarsanan, A Pattath, A Omar, P Sivadasan, S Aboulnaga, F Hamwi, A Al Khulaifi

    Heart Hospital, Hamad Medical Corporation, Doha, Qatar

    Introduction: Early outcome after cardiac surgery is an area of growing interest, and the risks involved have given rise to several predictive models for the assessment of postoperative outcome [1]. Procalcitonin (PCT) is emerging as a possible predictive tool in the cardiothoracic intensive care unit (CTICU). We aimed to test the predictive power of PCT for early morbidity, prolonged ventilation, and ICU and hospital stay in patients developing early fever after cardiac surgery.

    Methods: A retrospective descriptive study done in a tertiary cardiac center, enrolling patients who stayed for more than 24 hours postoperatively in the CTICU. Risk stratification included the additive EuroSCORE; PCT was measured immunoluminometrically prior to surgery and every 48 hours in response to onset of fever.

    Results: We screened 501 consecutive patients who underwent open heart cardiac surgery, of whom 119 were enrolled in the study. Patients were divided into two groups based on the level of PCT: those with a value > 2 ng/ml (Group 1) and those with a level < 2 ng/ml (Group 2). Compared with Group 2, Group 1 was associated over the postoperative course with prolonged ICU stay (p=0.04), length of mechanical ventilation (p=0.05), length of hospitalization (p=0.05), acute kidney injury (p=0.04) and culture positivity (p=0.02). Multivariate analysis showed that PCT > 2 ng/ml was significantly associated with positive cultures (p=0.023).

    Conclusions: A rise of serum PCT signals early ICU morbidity and prolonged lengths of ventilation, ICU stay and hospital stay.


    1. Iyem H. Cardiovasc J Afr 20(6):340-3, 2009.

    P029 Analysis of the trend of procalcitonin levels in post cardiac surgery patients

    S Sudarsanan, A Pattath, A Omar, P Sivadasan, S Aboulnaga, F Hamwi, A Al Khulaifi

    Heart Hospital, Hamad Medical Corporation, Doha, Qatar

    Introduction: The diagnostic utility of procalcitonin (PCT) in cardiac surgery remains controversial [1], as the systemic inflammatory response (SIRS) induced by cardiopulmonary bypass is claimed to be associated with elevated levels of PCT [2]. We aimed to find a correlation between the level of PCT and the yield of positive blood cultures in postoperative fever among patients with an intensive care unit (ICU) stay of more than 24 hours after cardiac surgery.

    Methods: Single center retrospective descriptive study over five years, enrolling patients who stayed for more than 24 hours postoperatively in the cardiothoracic ICU. PCT was assayed immunoluminometrically prior to surgery and every 48 hours in response to onset of fever.

    Results: We screened 501 patients, of which 119 were enrolled in our study. Patients were divided into two groups according to the presence (Group 1) or absence (Group 2) of a positive culture. The mean PCT was significantly higher in Group 1 (19.0±4.6 versus 9.9±2.7, p=0.033). Moreover, patients in Group 1 had prolonged ICU stay, hospital stay and length of mechanical ventilation (p=0.00, 0.00, and 0.01 respectively).

    Conclusions: The results showed that post cardiac surgery bacterial infections were associated with a rise of PCT, in contrast with patients who developed SIRS alone. The outcome measures were significantly worse in the culture positive group.


    1. Boeken U, et al. Cardiovasc Surg 8(7):550-4, 2000.

    2. Prat C, et al. J Cardiac Surg 23(6):627-32, 2008

    P030 Prognostic value of procalcitonin, pro-adrenomedullin, copeptin and atrial natriuretic peptide as predictors of duration of artificial lung ventilation and length of stay in intensive care unit for newborns and children of the first year of life after cardiac surgery with cardiopulmonary bypass

    A Khrustalev, D Popov, O Stepanicheva

    Bakoulev Scientific Center for Cardiovascular Surgery, Moscow, Russia

    Introduction: Due to sequelae in the postoperative period, children in the first year of life may require prolonged artificial lung ventilation (ALV) and a prolonged length of stay (LOS) in the intensive care unit (ICU) after cardiac surgery. Procalcitonin (PCT), pro-adrenomedullin (MR-proADM), copeptin (CT-proAVP) and atrial natriuretic peptide (MR-proANP) may be predictors of duration of ALV and LOS in ICU.

    Methods: 42 patients aged 153 (44-252) days (range 4-360 days) underwent cardiac surgery with cardiopulmonary bypass for severe congenital heart disease. Levels of PCT, MR-proADM, CT-proAVP and MR-proANP were measured dynamically before surgery and on days 1, 2, 3 and 6 after the operation with the Kryptor compact plus analyzer. Data are presented as medians with interquartile range. The Mann-Whitney U-test was used to compare the data. Values of p<0.05 were considered statistically significant.
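    The nonparametric comparison described above can be sketched in a few lines of Python. This is an illustrative brute-force Mann-Whitney U statistic with hypothetical biomarker values, not the authors' analysis code:

```python
from statistics import quantiles

def mann_whitney_u(a, b):
    """Brute-force Mann-Whitney U statistic for sample a versus sample b.
    Counts pairs (x, y) with x > y; tied pairs contribute 0.5 each."""
    u = 0.0
    for x in a:
        for y in b:
            if x > y:
                u += 1.0
            elif x == y:
                u += 0.5
    return u

def summarize(values):
    """Median with interquartile range, as used to report biomarker levels."""
    q1, q2, q3 = quantiles(values, n=4, method="inclusive")
    return q2, (q1, q3)

# Hypothetical day-1 PCT values (ng/ml) for short vs prolonged ALV groups
short_alv = [0.4, 0.6, 0.9, 1.1, 1.3]
long_alv = [1.0, 1.8, 2.5, 3.1, 4.0]
print(mann_whitney_u(long_alv, short_alv))  # → 23.0 (near the maximum of 25)
print(summarize(long_alv))                  # → (2.5, (1.8, 3.1))
```

The brute-force count is O(n·m) but transparent; for the small per-group sample sizes typical of such studies this is entirely adequate.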

    Results: 24 patients (57%) required ALV for more than 72 hours. In this group statistically significantly higher levels of PCT, MR-proADM and MR-proANP were found throughout the whole period (Table 1). The level of CT-proAVP increased to statistical significance from the 3rd day after the operation. 23 patients were in the ICU for more than 168 hours. In this group statistically significantly higher levels of PCT and MR-proADM were found throughout the whole period (Table 2). The higher level of MR-proANP was statistically significant on the 1st and 6th days after surgery, with a tendency towards increased values on the 2nd and 3rd days. CT-proAVP increased to statistical significance from the 2nd day after the operation and remained so throughout the studied period.

    Conclusions: PCT, MR-proADM and MR-proANP can be used as predictors of prolonged ALV for children in the first year of life after cardiac surgery with cardiopulmonary bypass. The level of CT-proAVP can be considered from the 3rd day after surgery. PCT and MR-proADM may be used to predict the LOS in the ICU. MR-proANP and CT-proAVP can be considered from the 1st and 2nd days after surgery, respectively.

    Fig. 1 (abstract P030).

    Duration of ALV

    Fig. 2 (abstract P030).

    Length of stay in ICU

    P031 Association of inflammatory and hemostatic biomarkers with inflammasomes in septic patients at risk of developing coagulopathy

    D Hoppensteadt, R Green, A Walborn, G Wegrzyn, S Walborn, M Mosier, J Fareed

    Loyola University Medical Center, Maywood, IL, USA

    Introduction: The inflammasome contributes to the innate immune response through the identification, by pattern recognition receptors (PRRs), of pathogens including bacteria and viruses. The purpose of this study was to quantitate inflammasome levels in patients with sepsis and suspected coagulopathy and to determine their potential relevance to various biomarkers of hemostatic dysregulation.

    Methods: Plasma samples from 52 adults with sepsis and suspected coagulopathy were analyzed. Fibrinogen was measured using a clot based method on an ACL-ELITE coagulation analyzer. Cortisol, D-dimer, PAI-1, NLRP-3 inflammasomes, MP-TF, fibronectin, and CD40L were measured using commercially available ELISA assays.

    Results: When comparing patients with sepsis and suspected DIC to the normal plasma samples, there was a significant elevation in NLRP-3 inflammasome levels in the sepsis cohort (p < 0.0001) (Fig. 1). The NLRP-3 inflammasome concentration in the sepsis cohort did not correlate with most other biomarkers; however, an elevated level of NLRP-3 inflammasomes was significantly associated with increased levels of PAI-1 (p < 0.0004) (Table 1).

    Conclusions: The current study shows a significant relationship between inflammasomes and PAI-1 levels in patients with sepsis associated coagulopathy. The positive correlation between NLRP-3 inflammasomes and PAI-1 suggests that the activation of inflammasomes may have a role in the upregulation of PAI-1.

    Table 1 (abstract P031). Inflammatory and hemostatic biomarkers correlated with NLRP-3 inflammasome levels in patients with sepsis and suspected DIC
    Fig. 1 (abstract P031).

    NLRP-3 inflammasomes in patients with sepsis and suspected DIC on Day 0 (n=52) compared to normal healthy controls (n=24)

    P032 Heparin-binding protein (HBP) as an index of pro-inflammation in sepsis

    T Gkavogianni, E Tzouveli, D Benas, G Papagiannopoulou, S Grigoropoulou, A Spanos, E Giamarellos-Bourboulis

    ATTIKON University Hospital, Athens, Greece

    Introduction: Easily measurable biomarkers to indicate the state of immune activation in sepsis remain an unmet need. HBP is secreted from neutrophils and it is increased in sepsis. However, its association with the innate immune function is poorly understood.

    Methods: Plasma was isolated on three consecutive days from 30 patients with ventilator-associated pneumonia meeting the Sepsis-3 definitions. Monocytes were also isolated on day 1 and stimulated with lipopolysaccharide (LPS) for the production of tumour necrosis factor-alpha (TNFalpha). HBP, ferritin and TNFalpha were measured by enzyme immunoassay. Changes of HBP over time were associated with final outcome.

    Results: A positive association was found between ferritin concentrations and circulating HBP on day 1 (rs: +0.371, p: 0.0002). The median value of HBP on day 1 was 177 ng/ml. The stimulated production of TNFalpha in relation to the median HBP level is shown in Fig. 1. Among 20 survivors, the mean change of HBP from baseline was -12.2% +/- 20.2%; it was +146.8% +/- 89.4% among 10 non-survivors (p: 0.028). ROC curve analysis showed that a more than 18% increase of HBP after 48 hours was associated with 81% specificity for 28-day mortality (odds ratio for unfavorable outcome 8.50; p: 0.017).

    Conclusions: HBP seems to identify patients who lie at the pro-inflammatory arm of sepsis, since it correlates positively with ferritin and with increased stimulated production of TNFα from circulating monocytes. An increase of more than 18% in the first 48 hours indicates progression towards an unfavorable outcome.

    Fig. 1 (abstract P032).

    See text for description

    P033 Integration of heparin-binding protein (HBP) and one sign of quick SOFA score (qSOFA) to predict 30-day outcome.

    E Kyriazopoulou, C Psarrakis, C Moschopoulos, I Christou, P Fragkou, T Marantos, E Karofyllakis, K Roussakis, E Giamarellos-Bourboulis

    Attikon University Hospital, Athens, Greece

    Introduction: Early prediction of the risk of death among patients admitted to the Emergency Department (ED) remains an unmet need. The prognostic performance of HBP, which is secreted by neutrophils, was prospectively validated in a series of sequential ED admissions.

    Methods: HBP and elements of qSOFA were analyzed prospectively in 310 serial ED admissions (main reasons for admission: acute abdominal pain 28.4%; fever 24.5%; vomiting/diarrhea 23.9%; dyspnea 22.3%; neurologic signs 11.3%; non-specific complaints 38.1%; most patients were admitted for more than one reason). Upon ED admission patients were scored as low-risk, intermediate-risk or high-risk at the discretion of the physician. HBP was measured in blood samples upon admission by enzyme immunoassay.

    Results: HBP was significantly greater among patients who died very early (Fig. 1). In five out of six patients who died early, HBP was greater than 15 ng/ml. We combined HBP greater than 15 ng/ml and the presence of one sign of qSOFA into a new score; this had 82.4% sensitivity to predict 30-day mortality. The respective sensitivity of two signs of qSOFA was 23.5% (p: 0.002). The use of this new score allowed better stratification into high-risk of patients originally considered at triage as low-risk (Fig. 2).

    Conclusions: We propose HBP more than 15 ng/ml and one qSOFA sign as an early score for 30-day mortality at the ED.

    Fig. 1 (abstract P033).

    See text for description

    Fig. 2 (abstract P033).

    See text for description

    P034 Usefulness of heparin binding protein in early diagnosis of septic shock

    C Balci, E Haftaci, B Koyun

    Health Sciences University, Kocaeli Derince Traning Hospital, Kocaeli, Turkey

    Introduction: Despite our growing knowledge of its pathophysiology, septic shock still remains one of the most important causes of hospital mortality. Early diagnosis and treatment at an early stage of septic shock are thought to decrease its mortality. There have been ongoing studies in recent years researching the usability of heparin binding protein (HBP) in the early diagnosis of sepsis [1]. We sought to assess the usability of a C-reactive protein (CRP), procalcitonin (PCT) and HBP biomarker combination in the early diagnosis of septic shock.

    Methods: Thirty patients with a diagnosis of septic shock, expected to stay in the intensive care unit for more than 24 hours and aged between 22 and 75 years, were included in the study. Data were collected from blood samples drawn on admission, at the 24th hour, and on the day of discharge or death.

    Results: The best cut-off value for HBP was 124 ng/mL, with specificity 0.82 and sensitivity 0.77. Compared with the other biomarkers, HBP was the best predictor of progression to organ dysfunction (area under the receiver operating characteristic curve (AUC) = 0.801).

    Conclusions: Although many biomarkers exist for the early diagnosis of septic shock, CRP and PCT are the most commonly used markers in current clinical practice. The usability of HBP in the early diagnosis of sepsis is still being researched. We conclude that the combination of PCT, CRP and HBP is usable for diagnosing septic shock.


    1. Holub M, Beran O. Crit Care 16(3):133, 2012.

    P035 Change of ADAMTS-13 during sepsis is associated with outcome

    I Vasileiadis1, M Politou2, N Rovina1, S Dimopoulos2, E Tripodaki3, A Kyriakoudi1, E Ntouka1, E Stavrou1, A Koutsoukou1

    1Sotiria Hospital, National and Kapodistrian University of Athens, Athens, Greece,2Onasseio Cardiac Surgery Center, Athens, Greece,3Agios Savvas Regional Cancer Hospital, Athens, Greece

    Introduction: Reduced ADAMTS-13 and increased von Willebrand Factor (vWF)/ADAMTS-13 ratio have been observed in sepsis and are associated with the severity of the disease [1,2]. However, their change during the septic episode and in the event of a change in the clinical status of the septic patients has not been investigated. The aim of the study was to assess the variation of these hemostatic parameters in critically ill patients during the course of a septic episode.

    Methods: We monitored 34 septic patients admitted in the Intensive Care Unit (ICU). 23 improved (group A) while 11 deteriorated (group B). We assessed vWF, ADAMTS-13 and the vWF/ADAMTS-13 ratio on admission in ICU (time point 0) and at the time of a change in patients’ clinical condition (remission or deterioration, time point 1).

    Results: In group A, ADAMTS-13 and the vWF/ADAMTS-13 ratio did not significantly change (567.0±296.0 vs 670.7±534.5 ng/ml, p=0.238 and 0.709±0.588 vs 0.876±0.687, p=0.34 respectively) while vWF increased (326.2±122.7 vs 407.0±157.6 % of norm., p=0.028) at time point 1 compared to time point 0. In group B, ADAMTS-13 decreased (831.4±586.1 vs 482.0±277.8 ng/ml, p=0.026) while vWF and the vWF/ADAMTS-13 ratio increased (389.5±170.5 vs 525.3±141.0 % of norm., p=0.02 and 0.779±0.851 vs 1.490±1.060, p=0.002) at time point 1 compared to time point 0. There was a non-statistically-significant greater increase (% change) of vWF (53±63 versus 35±63%, p=0.4) in group B patients compared to group A patients. ADAMTS-13 percentage difference (> or <= 22%) was associated with sepsis outcome (χ²=8.7; HR: 5.86; 95% CI: 1.6-22.1; p=0.009).

    Conclusions: Hemostatic disorders, as assessed by vWF and ADAMTS-13 levels, were detected in septic patients, and their changes differed according to the evolution of the septic episode. ADAMTS-13 changes may be associated with outcome.


    1. Fukushima H et al. Shock 39(5):409-14, 2013.

    2. Azfar MF et al. Clin Invest Med 40(2):E49-E58, 2017.

    P036 Integration of biomarkers and clinical signs for the early diagnosis of sepsis

    L Lazaridis1, S Karatzas2, A Prekates3, K Toutouzas2, C Mathas4, J Popp5, J Olsen6, E Giamarellos-Bourboulis1

    1Attikon University Hospital, Athens, Greece,2Ippokrateion General Hospital, Athens, Greece,3Tzaneion Hospital, Piraeus, Greece,4Aghia Olga Hospital, Athens, Greece,5Leibniz Institute for Photonic Technology, Jena, Germany,6Virogates SA, Copenhagen, Denmark

    Introduction: The Sepsis-3 Task Force has introduced the quick sequential organ failure assessment (qSOFA) as a diagnostic tool for the early diagnosis of sepsis. However, the Sepsis-3 criteria and qSOFA have not yet been prospectively validated. INTELLIGENCE-1 (NCT03306186) aims at this prospective validation through the integration of clinical signs and biomarkers.

    Methods: 100 adult patients with at least one sign of qSOFA and infection, acute pancreatitis or recent operation were prospectively followed up. Blood was sampled within the first 24 hours; patients with HIV infection, neutropenia or multiple injuries were excluded. Sepsis was diagnosed using the Sepsis-3 criteria. Soluble urokinase plasminogen activator receptor (suPAR) was measured by enzyme immunoassay.

    Results: Sixty patients were classified with sepsis using the Sepsis-3 definitions. Presence of at least two signs of qSOFA had 56.7% sensitivity, 95.0% specificity, 92.8% positive predictive value and 38.0% negative predictive value for the diagnosis of sepsis. The integration of qSOFA signs and suPAR improved the diagnostic performance (Fig. 1).

    Conclusions: Two signs of qSOFA have significant positive predictive value for sepsis but low sensitivity. This is improved after integration with suPAR.

    The INTELLIGENCE-1 study is supported by the European Commission through the Seventh Framework Programme (FP7) HemoSpec.

    Fig. 1 (abstract P036).

    See text for description

    P037 TRIAGE Study protocol: Assessment of biomarkers to predict clinical worsening of patients with sepsis admitted in the Emergency Department

    T Lafon1, C Vallejo1, L Barbier2, MA Cazalis2, T Daix1, A Desachy1, V Gissot3, PF Laterre4, K Tazarourte5, B François1

    1Centre Hospitalier Universitaire Dupuytren, Limoges, France,2bioMérieux SA, Marcy l’Etoile, France,3CHU de Tours/Université François Rabelais, Tours, France,4Cliniques Saint-Luc, Brussels, Belgium,5Groupement Hospitalier Edouard Herriot - Hospices Civils de Lyon, Lyon, France

    Introduction: Sepsis is a frequent reason for admission to the Emergency Department (ED) and its prognosis mainly relies on early diagnosis. In addition, no validated prognostic tool is currently available. Therefore, identification of patients at high risk of worsening in the ED is key. The TRIAGE objective was to assess the prognostic value of a blood marker panel to predict early clinical worsening of patients admitted to the ED with suspected sepsis.

    Methods: TRIAGE was a prospective, multicenter (11 sites in France and Belgium) study on biological samples conducted in partnership with bioMerieux S.A. Patients admitted to the ED with suspected or confirmed community-acquired infection of less than 72h were included. Exclusion criteria were: admission to the ED for more than 12 hours, septic shock at admission, immunodepression, and sepsis syndrome within 30 days prior to admission. The protocol included 5 clinical and biological time points (H0, H6, H24, H72, D28). Patients were classified into 3 groups at admission (infection, sepsis, severe sepsis) and divided into 2 evolution/prognosis groups depending on whether or not they worsened from their initial condition to severe sepsis or septic shock, and on the evolution of the SOFA score. The evolution criteria were centrally evaluated by an independent adjudication committee of sepsis experts including emergency physicians and intensivists. Patients were followed up to day 28 for mortality.

    Results: The study duration was 3 years with 600 patients included (102 excluded). The centralized analysis is in progress to select the combination of biomarkers with the best prognostic performance comparing both evolution/prognosis groups. Currently, 125 patients have been classified as worsening and some results will be available in 2018.

    Conclusions: TRIAGE is the largest prospective multicenter study assessing the prognostic value of a panel of blood markers in EDs, which could help identify septic patients at risk of worsening at the time of admission to the ED and develop specific management.

    P038 Immune profiling of host response biomarkers through transcriptomic analysis using the FilmArray® system.

    DM Tawfik1, L Ganee1, A Bocquet1, V Moucadel1, J Textoris1, T Rimmele2, G Monneret2, S Blein1, M Rol3, J Montgomery4, F Mallet1, A Pachot1, J Yugueros Marcos1, REALISM Study group5

    1bioMerieux, Lyon, France,2Hospices Civils de Lyon, Lyon, France,3Bioaster Technology Research Institute, Lyon, France,4BioFire Diagnostics LLC, Salt Lake City, UT, USA,5REALISM Study group, Lyon, France

    Introduction: Immune status characterization in Intensive Care Unit (ICU) patients presents a major challenge due to the heterogeneity of response. In this study, the FilmArray® system was used with customized gene assays to assess the immune profile of critically ill ICU patients from within the REALISM cohort compared to healthy volunteers.

    Methods: A customized FilmArray® pouch containing 24 assays was designed: 16 target and 8 reference genes. Detection and semi-quantification of assays from whole blood collected in PAXgene tubes occurs in the device within 1 hour. A total of 20 subjects from the REALISM cohort were tested in duplicate: 1 trauma, 5 septic shock and 5 surgery patients, along with 9 healthy volunteers. Patient selection was based on HLA-DR expression on monocytes and on a phytohaemagglutinin (PHA)-stimulated T-cell proliferation assay, so as to cover various immune profiles.

    Results: Quantification cycle values of the target genes were normalized by the geometric mean of the reference genes to account for the different cell counts among specimens. The number of CD3+ cells and HLA-DR expression, determined by flow cytometry, showed good correlation with CD3D and CD74 gene expression, respectively. Seven genes showed significant differences in expression levels between the healthy volunteers and patient groups: CD3D, CD74, CTLA4 and CX3CR1 were down-regulated, while IL-10, IL1RN and S100A9 were up-regulated in the patient populations. The use of the relative quantitative difference of some markers was able to distinguish and emphasize the variability between the patient groups while homogenizing the discrepancy among healthy volunteers.
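    The normalization step can be illustrated in a short sketch: subtracting the mean reference-gene Cq in cycle space is equivalent to dividing the 2^-Cq expression of the target by the geometric mean of the reference-gene expressions. The Cq values below are hypothetical, not study data:

```python
from statistics import fmean

def normalized_expression(target_cq, reference_cqs):
    """Relative expression of a target gene, normalized to reference genes.
    Subtracting the arithmetic mean of reference Cq values in cycle space
    equals dividing 2^-Cq expression by the geometric mean of the
    reference-gene expressions."""
    delta_cq = target_cq - fmean(reference_cqs)
    return 2.0 ** (-delta_cq)

# Hypothetical Cq values: one target gene against two reference genes
print(normalized_expression(25.0, [20.0, 22.0]))  # ΔCq = 4 → 2^-4 = 0.0625
```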

    Conclusions: The FilmArray® system was shown to allow host transcriptomic analysis of immune-relevant genes directly from PAXgene tubes in only one hour. These results show great potential for the development of a fully automated immune profiling tool, enabling close monitoring of critically ill patients.

    P039 Rapid biophysical analysis of host immune cells enables diagnosis of sepsis in the emergency department

    M Macdonald1, R Sheybani1, A DeWitt2, S Brierre3, T Caffery3, T Jagneaux3, C Thomas3, D Di Carlo4, H Tse1, A Shah1, H O’Neal3

    1CytoVale, San Francisco, CA, USA,2Baton Rouge General Medical Center, Baton Rouge, LA, USA,3Louisiana State University Health Sciences Center, Baton Rouge, LA, USA,4University of California, Los Angeles, CA, USA

    Introduction: Early, rapid diagnosis is integral to the efficient, effective treatment of sepsis; however, there is no gold standard for diagnosis, and biochemical surrogates are of limited and controversial utility. The CytoVale system measures biophysical properties of cells by imaging thousands of single cells per second as they are hydrodynamically stretched in a microfluidic channel. This platform has been shown to measure dozens of mechanical, morphological, and cell surface biomarkers of WBC activation simultaneously [1,2]. In this study, we show the performance of the CytoVale system in measuring biophysical markers for sepsis detection in the emergency department (ED).

    Methods: We conducted an IRB-approved prospective cohort study of Emergency Department (ED) patients with 2+ SIRS criteria and evidence of organ dysfunction. 307 patients were included for analysis. Blood samples for the CytoVale assay were collected in the ED, and the diagnosis of sepsis was adjudicated by blinded clinician review of the medical record. Captured imaging data were analyzed using computer vision to quantify mechanical parameters per cell, and a logistic model was trained to discriminate patients who had sepsis from those who did not.
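    A logistic model over per-cell mechanical features can be sketched as follows. The weights, bias, and feature values below are purely hypothetical for illustration; they are not the trained model from the study:

```python
import math

def logistic_score(features, weights, bias):
    """Probability-like sepsis score for one cell from its mechanical
    features (e.g. diameter, aspect ratio) via a logistic model."""
    z = bias + sum(w * f for w, f in zip(weights, features))
    return 1.0 / (1.0 + math.exp(-z))

def patient_score(cells, weights, bias):
    """Average the per-cell scores over a patient's measured leukocytes."""
    return sum(logistic_score(c, weights, bias) for c in cells) / len(cells)

# Hypothetical weights for (diameter in um, aspect ratio) and two measured cells
weights, bias = (0.4, 2.0), -6.0
cells = [(9.0, 1.2), (11.0, 1.5)]
print(patient_score(cells, weights, bias))  # a value strictly between 0 and 1
```

Aggregating thousands of per-cell scores into one patient-level score is one plausible design; the study's actual feature set and aggregation are not described in the abstract.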

    Results: We found substantial biophysical differences between cells from septic and non-septic patients as observed at both the single cell level (Fig. 1) and when looking at the overall leukocyte populations (Fig. 2). A multiparameter classification algorithm to discriminate septic from non-septic patients based on biophysical markers currently yields a sensitivity of 88% with a negative predictive value of 95%.

    Conclusions: In patients presenting to the ED with 2 of 4 SIRS criteria and evidence of organ dysfunction, the CytoVale system provides a potentially viable means for the early diagnosis of sepsis via the quantification of biophysical properties of leukocytes.


    1. Gossett DR et al. PNAS 20:7630–5, 2012

    2. Crawford K et al. AJRCCM, under review, 2017

    Fig. 1 (abstract P039).

    Mechanical phenotyping reveals new biophysical markers of WBC activation

    Fig. 2 (abstract P039).

    Performance of two biophysical markers (diameter and aspect ratio) in discriminating non-septic and septic patients at the leukocyte population level

    P040 Oxidative stress and other biomarkers to predict the presence of sepsis in ICU patients

    V Tsolaki, M Karapetsa, G Ganeli, E Zakynthinos

    ICU, Larissa, Greece

    Introduction: Early identification of sepsis adds a survival benefit in ICU patients. Several biomarkers have been evaluated, yet an optimal marker is still lacking [1].

    Methods: We prospectively determined oxidative status in patients admitted to the general Intensive Care Unit of the University Hospital of Larisa. Oxidative status was determined by measuring the novel static (sORP) and capacity (cORP) oxidation-reduction potential markers. Other biomarkers (BNP, presepsin, CRP) were measured, and their discriminative properties for the detection of sepsis were evaluated.

    Results: Oxidative status was evaluated in 152 consecutive patients. Patients with severe sepsis and septic shock had significantly higher sORP values than patients without sepsis (173.31±20.44 vs 164.11±18.78, p=0.006), while cORP did not differ (0.34±0.31 vs 0.37±0.20, ns). Patients with cerebral damage had the lowest sORP on admission, while surgical and medical patients had the highest sORP values (157.2±18.33 vs 174.04±18.1 respectively, p<0.001). sORP could predict the presence of severe sepsis (OR 1.107, p=0.009), along with presepsin (OR 1.002, p<0.0001), C-reactive protein (OR 1.161, p=0.013) and brain natriuretic peptide (OR 1.001, p=0.046). Presepsin (AUC 0.893, p<0.0001) and CRP (AUC 0.743, p<0.0001) had the best discriminating properties. The presence of a microorganism in blood or bronchial secretions could be predicted from the values of sORP (AUC 0.633, p=0.042) and CRP (AUC 0.653, p=0.02).

    Conclusions: Oxidative status differs between septic and non-septic patients admitted to the ICU and could serve as a prognostic marker for the presence of sepsis.


    1. Singer M, et al. JAMA 315(8):801-810, 2016

    P041 Relationship between pre-operative C-reactive protein elevation and major cardiovascular events after vascular surgery

    I Ben Naoui, A El Ghali, AG Ammous, I Nefzi, A Saidi, A Ammous, A Cherif

    La Rabta Hospital, Tunis, Tunisia

    Introduction: C-reactive protein (CRP) is reported to be an effective marker for the assessment of vascular inflammation activity and the prediction of acute coronary events [1]. We hypothesized that preoperative CRP elevation is related to the occurrence of postoperative adverse cardiovascular outcomes.

    Methods: We prospectively included patients scheduled to undergo various vascular surgeries from December 2016 to September 2017. We assessed demographic data, comorbidities, the revised cardiac risk index (RCRI) and biomarkers (CRP, high-sensitivity cardiac troponin (Ths), creatinine and urea) in the preoperative period. We also noted type and duration of surgery, intraoperative blood loss, ICU stay and mortality. We evaluated CRP as a predictive marker of major cardiovascular events, defined as chest pain, Ths elevation, electrocardiogram changes, arrhythmia, pulmonary embolism or stroke occurring within 3 postoperative months.

    Results: During our study, 30 patients were scheduled to undergo vascular surgeries. Of these, 66% developed adverse cardiac events (Table 1). ROC analysis showed the predictive value of CRP for major cardiovascular events (Fig. 1). The cut-off value of CRP was 54, giving 85% sensitivity and 82% specificity.
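    Selecting a cut-off of this kind from ROC data amounts to maximizing the Youden index (sensitivity + specificity - 1) over candidate thresholds. A minimal sketch with hypothetical CRP values and outcomes, not the study data:

```python
def sens_spec(values, labels, cutoff):
    """Sensitivity and specificity when 'value >= cutoff' predicts an event."""
    tp = sum(1 for v, y in zip(values, labels) if v >= cutoff and y)
    fn = sum(1 for v, y in zip(values, labels) if v < cutoff and y)
    tn = sum(1 for v, y in zip(values, labels) if v < cutoff and not y)
    fp = sum(1 for v, y in zip(values, labels) if v >= cutoff and not y)
    return tp / (tp + fn), tn / (tn + fp)

def best_cutoff(values, labels):
    """Cutoff maximizing the Youden index (sensitivity + specificity - 1)."""
    return max(set(values), key=lambda c: sum(sens_spec(values, labels, c)))

# Hypothetical preoperative CRP values and cardiovascular-event outcomes
crp = [10, 20, 40, 55, 60, 80, 15, 70]
event = [False, False, False, True, True, True, False, True]
print(best_cutoff(crp, event))  # → 55
```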

    Conclusions: Our study pointed out that preoperative CRP elevation could have a very strong predictive value for postoperative cardiovascular events in vascular surgery; this is in line with results shown by previous studies [1].


    1. Yunxiang Li. Int J Clin Exp Pathol 9:11904-11910, 2016

    Table 1 (abstract P041). Major cardiovascular events immediately after surgery
    Fig. 1 (abstract P041).

    The area under the curve for CRP elevation is 0.891

    P042 Impact of age in critically ill infected patients: a post-hoc analysis of the INFAUCI study

    S Moreira1, J Baptista1, F Froes2, J Pereira3, J Gonçalves-Pereira4, C Dias5, J Paiva3

    1Centro Hospitalar e Universitário de Coimbra, Coimbra, Portugal,2Hospital Pulido Valente, Centro Hospitalar Lisboa Norte, Lisboa, Portugal,3Centro Hospitalar São João, Porto, Portugal,4Hospital Vila Franca de Xira, Vila Franca de Xira, Portugal,5Centro de Investigação em Tecnologias e em Serviços de Saúde, Porto, Portugal

    Introduction: Elderly patients are particularly susceptible to bacterial infections and sepsis, and they comprise an increasing proportion of intensive care unit (ICU) admissions. Our aim was to evaluate the impact of age on critically ill infected patients.

    Methods: We performed a post-hoc analysis of all infected patients admitted to the ICU enrolled in a 1-year prospective, observational, multicenter study involving 14 ICUs. Patients aged <65, 65-74 and >=75 years were compared (groups A, B and C). Multidrug-resistance (MDR) was defined as acquired non-susceptibility to at least one agent within three or more antimicrobial categories.

    Results: Of the 3766 patients analyzed, 1652 (43.9%) were infected on ICU admission. Of these, 828 (50%) belonged to group A, 434 (23%) to group B and 440 (27%) to group C. Group C patients were more dependent and had higher SAPS II and Charlson scores (p<0.05). ICU and hospital length of stay did not differ between groups. Microorganism isolation and bacteremia were higher in group B (53% and 24%, respectively) than in groups A (45% and 19%, respectively) and C (47% and 17%, respectively; p<0.05). Septic shock was present in 58% of patients and was more frequent in groups B (55%) and C (55%) than group A (48%). The most common sources of infection were respiratory and intra-abdominal. Isolation of gram-negative bacteria was significantly increased in groups B and C (p=0.034). The most commonly isolated bacteria were Escherichia coli (17%), Staphylococcus aureus (15%) and Pseudomonas aeruginosa (8%) for all groups. In total, 151 isolates (22%) corresponded to MDR bacteria, of which 57% were Staphylococcus aureus. Age was not a risk factor for infection by MDR. All-cause mortality in ICU and hospital was: 23% and 30%; 29% and 40%; 36% and 53% - respectively for groups A, B, and C (p < 0.001).

    Conclusions: Older patients (65-74 years) were more likely to present with bacteremia, which could account for the increased severity of sepsis and higher all-cause mortality. Age was not a risk factor for MDR infection.

    P043 Review of rejected microbiology specimens from intensive care

    AR Garg1, E Sherry1, L Verrinder1, JM Patel2

    1University Hospital Birmingham, Birmingham, UK,2University of Birmingham, Birmingham, UK

    Introduction: The rapid identification of pathogens from patient samples is crucial. Delays can have serious implications for patients and for infection prevention and control [1]. The aim of this project was to identify the number of microbiology samples sent, the number rejected and the reasons for rejection, with the intention of reducing such instances.

    Methods: Data was collected retrospectively on ICU admissions from January-June 2017 to a university hospital in the UK. Patients were identified and data collected using the Intensive Care National Audit and Research Centre (ICNARC) database and from electronic patient records. Data collected included: demographics, length of stay, microbiology samples sent and details on the rejected samples.

    Results: 530 patients were identified, with a total of 4725 samples (median 4 samples/patient) sent to microbiology. Of these, 144 (3%) were rejected. 100 patients (18%) had at least one sample rejected; the median number of rejected samples per patient was 1 (range 1-10). The most commonly rejected samples were urine (22%), blood (20%), faeces (19%) and sputum (8%). 69 (48%) of the rejected samples were resent for testing (median 1 day; range 0-20 days). Reasons for sample rejection are shown in Table 1. Most rejections occurred within 48 hours of admission (Fig. 1).

    Conclusions: This study confirms that a large number of samples are sent to microbiology. Although only a small proportion are rejected, this still represents a substantial absolute number, with most rejections occurring during the first days of admission. The reasons for sample rejection are remediable through improved training and vigilance. A bespoke guide to microbiology sample collection, coupled with a training program for healthcare professionals, has been introduced with the aim of reducing sample rejections from 3% to 0.5%.


    1. Vincent JL et al. Crit Care Med 43:2283-2291, 2015

    Table 1 (abstract P043). Reasons for sample rejection
    Fig. 1 (abstract P043).

    Timeline for rejection of samples

    P044 Microbiological colonization of ICU healthcare workers’ mobile phones

    A Galazzi1, E Broggi1, A Grancini1, M Panigada1, I Zainaghi1, F Binda1, T Mauri2, G Grasselli2, I Adamini1, A Pesenti2

    1Fondazione IRCCS Ca’ Granda Ospedale Maggiore Policlinico, University of Milan, Milano, Italy,2Fondazione IRCCS Ca’ Granda Ospedale Maggiore Policlinico, Dipartimento di Fisiopatologia Medico-Chirurgica e dei Trapianti, University of Milan, Milan, Italy

    Introduction: Careful hand hygiene of health-care workers (HCWs) is recommended to reduce transmission of pathogenic microorganisms to patients [1]. Mobile phones are commonly used during work shifts and may act as vehicles of pathogens [2,3]. The purpose of this study was to assess the colonization rate of ICU HCWs’ mobile phones before and after work shifts.

    Methods: Prospective observational study conducted in an academic, tertiary-level ICU. HCWs (including medical and nursing staff) had their mobile phones sampled for microbiology before and after work shifts on 6 different days. Samples were taken with eSwab in a standardized modality and seeded on Columbia Agar plus 5% sheep blood. A semiquantitative growth evaluation was performed at 24 and 48 hours after incubation at 35°C.

    Results: Fifty HCWs participated in the study (91% of department staff). One hundred swabs were taken from 50 mobile phones. Forty-three HCWs (86%) reported habitual use of their phones during the work shift, and 38 of them (88.4%) usually kept their phones in a uniform pocket. All phones (100%) were positive for bacteria. The most frequently isolated bacteria were coagulase-negative staphylococci, Bacillus sp. and MRSA (97%, 56% and 17%, respectively). No patient admitted to the ICU during the study period was positive for the bacteria found on HCWs’ mobile phones. No difference in bacterial types or burden was found between the beginning and the end of work shifts.

    Conclusions: HCWs’ mobile phones are consistently colonized, mainly by flora resident on HCWs’ hands, even before the work shift and irrespective of the patients’ microbiological flora. Further studies are warranted to investigate the role of mobile phone bacterial colonization in the ICU setting and to determine whether routine cleaning of HCWs’ mobile phones may reduce the rate of infection transmission in critical patients.


    1. WHO Guidelines on Hand Hygiene in Health Care. 2009

    2. Ulger F et al. Ann Clin Microbiol Antimicrob 8:7, 2009

    3. Russotto V et al. J Intensive Care 3:54, 2015

    P045 Microbiological contamination of mobile phones carried by health care workers in intensive care units and operating room

    P Mogrovejo, S Castro, V Arízaga, E Guerrero, A Loja, L Tamayo, H Aguirre

    Santa Inés Hospital, Cuenca, Ecuador

    Introduction: The mobile phones (MP) of health care workers (HCWs) can be colonized by pathogenic bacteria. They can act as vectors of drug-resistant bacteria and may contribute to nosocomial infections. The aim of this study was to evaluate the prevalence of MP bacterial colonization in the adult intensive care unit (AICU), pediatric intensive care unit (PICU) and operating room (OR) of a tertiary-level hospital.

    Methods: Sixty samples were collected from the AICU (n=25), PICU (n=15) and OR (n=20) between August and September 2017. Samples were randomly selected and taken at the end of the HCWs’ duty with a sterile swab covering all MP surfaces. Samples were inoculated onto sheep blood and eosin methylene blue agar for culture. Isolated bacteria were identified according to standard microbiological techniques. Antibiotic sensitivity testing was performed using the disc diffusion method.

    Results: The overall MP bacterial colonization rate was 95%. The main results are detailed in Table 1. The most common non-pathogenic bacterium was Staphylococcus epidermidis, n=18 (90%). The isolated pathogenic bacteria were methicillin-susceptible Staphylococcus aureus n=14 (38%), methicillin-resistant Staphylococcus aureus n=10 (27%), resistant Staphylococcus epidermidis n=4 (11%), Acinetobacter baumannii n=4 (11%), Klebsiella pneumoniae n=4 (11%) and resistant Acinetobacter baumannii n=1 (2%). No significant difference in colonization rates of pathogenic bacteria was found between MPs with a case, n=35 (59%), and MPs without a case, n=25 (41%) (p=0.753).

    Conclusions: We found high rates of MP colonization with pathogenic bacteria. An educational program is necessary to reduce the contamination and transmission of these high-risk microorganisms.


    Table 1 (abstract P045). Colonization rates per department and subtype of isolated bacteria

    P046 Assessment of the variability of airborne contamination levels in an intensive care unit over a 24 hour period

    M Booth1, L Dougall2, E Khoo3, H Hood3, S MacGregor2, M Maclean2

    1Glasgow Royal Infirmary, Glasgow, UK,2University of Strathclyde, Glasgow, UK,3University of Glasgow, Glasgow, UK

    Introduction: The objective of this study was to evaluate the variability in the dynamics and levels of airborne contamination within a hospital intensive care unit, in order to establish an improved understanding of the extent to which airborne bioburden contributes to cross-infection of patients. Microorganisms from the respiratory tract or skin can become airborne through coughing, sneezing and periods of increased activity such as bed changes and staff rounds. Current knowledge of the clinical airborne microflora is limited; however, it is estimated that 10-33% of nosocomial infections are transmitted via the air.

    Methods: Environmental air monitoring was conducted in Glasgow Royal Infirmary ICU, in the open ward and in patient isolation rooms. A sieve impactor air sampler was used to collect 500 L air samples every 15 minutes over 10 hour (08:00-18:00 h) and 24 hour (08:00-08:00 h) periods. Samples were collected, room activity logged and the bacterial contamination levels were recorded as CFU/m3 of air.
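    As an aside, the conversion from a fixed-volume plate count to the reported CFU/m3 is a simple scaling (a minimal Python sketch; the colony count used below is hypothetical, not study data):

```python
# Sketch: converting a plate colony count from a fixed-volume air sample
# to CFU per cubic metre of air (1 m3 = 1000 L). The colony count below
# is hypothetical, not study data.
def cfu_per_m3(colony_count: int, sample_volume_l: float) -> float:
    return colony_count * 1000 / sample_volume_l

# A 500 L sample is simply scaled by a factor of 2
print(cfu_per_m3(255, 500))  # 510.0
```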

    Results: A high degree of variability in airborne contamination levels was observed over the course of both the 10 hour day and the 24 hour period. Counts ranged from 12-510 CFU/m3 over 24 hours in an isolation room occupied for 10 days by a patient with C. difficile infection. Contamination levels were lowest during the night and in unoccupied rooms, with an average value of 20 CFU/m3. Peaks in airborne contamination were directly related to increased room activity.

    Conclusions: This study demonstrates the degree of airborne contamination that can occur in an ICU over a 24 hour period. Numerous factors contributed to microbial air contamination. Consideration should be given to improved infection control strategies and decontamination technologies that could be deployed within the clinical environment to reduce airborne contamination levels, with the ultimate aim of reducing healthcare-associated infections from environmental sources.

    P047 New practice of fixing the jugular venous catheter on the thorax and its impact on infection

    F Goldstein, C Carius, A Coscia

    QuintaD’or, Rio de Janeiro, Brazil

    Introduction: Central line-associated bloodstream infection (CLABSI) is an important concern in the ICU, particularly in units with a high density of central venous catheter use. Any measure that reduces CLABSI is important in reducing the morbidity and mortality of hospitalized patients. We therefore present a retrospective study comparing the fixation site (neck vs. thorax) of ultrasound-guided catheters implanted in the jugular vein and evaluating its impact on the incidence of CLABSI. The purpose of our study was to identify whether fixing the catheter on the thorax has any positive impact on the reduction of CLABSI.

    Methods: A retrospective single-center study comparing infection rates between 2012, when the traditional technique of catheter fixation on the neck was used, and 2015, when 100% of catheters were fixed on the thorax. The criteria for CLABSI were defined by the Infection Commission of QuintaD`or Hospital, which also provided the CLABSI data. During this period there were no changes in the unit's team and the patient profile was the same. No antibiotic-impregnated deep vein catheters were used in the patients included in the study. Groups were compared with Fisher's exact test. All patients hospitalized in the intensive care unit with an indication for a short-term central venous catheter in the internal jugular vein were included. Patients with a short-term central venous catheter at other sites, a hemodialysis catheter or a PICC were excluded.

    Results: During 2012, 98 internal jugular vein catheters were placed in our unit using the traditional technique, with fixation on the neck; 6 cases of CLABSI were detected in this period. In 2015, 127 internal jugular vein catheters were placed in the same unit, all fixed on the thorax. Although more catheters were placed that year, there were no cases of CLABSI. This position appears to provide better catheter fixation, preventing the dressing from coming loose.

    Conclusions: During 2015, although more patients had short-term deep vein catheters, we had fewer CLABSI events in our unit than in 2012. Fisher's exact test yielded a p-value of 0.476 for this association. Fixation of the internal jugular vein catheter on the thorax seems to contribute to the prevention of CLABSI. Further prospective, randomized studies are required to evaluate the contribution of thoracic fixation of jugular vein catheters to CLABSI prevention.
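    For readers unfamiliar with the test used here, a two-sided Fisher's exact p-value for a 2x2 table can be computed directly from the hypergeometric distribution. The sketch below is illustrative only and uses hypothetical counts, not the study's data:

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact p-value for the 2x2 table [[a, b], [c, d]]."""
    row1, row2, col1 = a + b, c + d, a + c
    n = row1 + row2

    def p_table(x):
        # hypergeometric probability of x events in row 1, margins fixed
        return comb(row1, x) * comb(row2, col1 - x) / comb(n, col1)

    p_obs = p_table(a)
    lo, hi = max(0, col1 - row2), min(col1, row1)
    # sum the probabilities of all tables as extreme as (or more extreme
    # than) the observed one; the tolerance guards float comparisons
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs * (1 + 1e-9))

# Hypothetical 2x2 table: 3/4 events in one group vs 1/4 in the other
print(round(fisher_exact_two_sided(3, 1, 1, 3), 4))  # 0.4857
```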

    P048 The Oral Biofilm Index in patients hospitalized on an intensive care unit

    R Marinho1, J Marinho1, A Marinho1, J Frias-Bulhosa2

    1Centro Hospitalar do Porto, Porto, Portugal,2Universidade Fernando Pessoa, Porto, Portugal

    Introduction: The oral cavity of a hospitalized patient presents a different flora from that of healthy people. After 48 hours of hospital stay, the oral flora contains a larger number of microorganisms whose growth and proliferation can be responsible for secondary infections such as pneumonia. The objective of our study was to assess the dental plaque index of patients on admission to an intensive care unit and to reassess it 7 days later, to evaluate the efficacy of oral hygiene.

    Methods: Prospective, descriptive, observational study in an intensive care unit of the CHP. Data were collected on demographics, reason for admission, hospital length of stay, feeding protocol, need for respiratory support and oral hygiene protocol. The Greene & Vermillion Simplified Oral Hygiene Index (OHI-S) was used as the assessment tool within the first 24 h and on day 7.

    Results: 74 patients were evaluated, of whom 42 were excluded for not meeting the minimal dentition requirement. The remaining 32 patients had a mean age of 60.53±14.44 years; 53.1% were male, and most were medical or surgical admissions (37.5% each). Mean hospital length of stay was 15.69±6.69 days. The majority of patients were sedated (75%), under ventilator support (81.3%) and on enteral nutrition via nasogastric tube feeding. The initial OHI-S score was 0.67±0.45, rising to 1.04±0.51 (p<0.05) 7 days later.

    Conclusions: Various studies have demonstrated the importance of good oral hygiene in preventing bacterial growth and reducing the risk of nosocomial infections. In this study, we observed a significant worsening of oral hygiene one week after admission. Although this may be unimportant for patients staying one week, it could indicate an increased risk of nosocomial infection for longer-staying patients, who could benefit from a more effective oral hygiene protocol.

    P049 Positive pocket cultures and infection risk after cardiac electronic device implantation: a retrospective observational single-center cohort study

    P Pekić1, M Bura2, N Marić1

    1University Hospital "Sveti Duh", Zagreb, Croatia,2Neuropsychiatric hospital "dr. Ivan Barbot", Popovača, Croatia

    Introduction: Positive pocket cultures after implantation of cardiac implantable electronic devices (CIEDs) are often found without clinically apparent infection. Infections related to CIEDs are a serious complication requiring complete device removal and prolonged antimicrobial therapy, and can adversely affect patient outcome.

    Methods: We performed a retrospective observational single-center cohort study of 251 patients who received de novo implantation of a pacemaker, cardioverter-defibrillator or cardiac resynchronization therapy device over a two-year period. Each device was implanted using a standard aseptic technique according to local protocol, with antibiotic (cefazolin) prophylaxis given before the procedure. Pocket aspirate was taken after irrigating the wound with normal saline, just before device placement.

    Results: We analyzed 251 patients (58.6% male, 41.4% female). The most frequently implanted device was a DDD pacemaker, followed by a VVI pacemaker. Mean length of hospital stay was 12.02±8.34 days. There were 54 (21.5%) positive cultures, with 3 (1.19%) clinically apparent infections overall, which required prolonged IV antibiotics, device removal and reimplantation after resolution of the infection. Regarding microbiology, S. epidermidis (48.2%) and other coagulase-negative staphylococci (29.6%) were the most frequent findings, which contrasts with the cultures described in the literature. The only statistically significant risk factors for a positive pocket culture were male sex and the presence of a urinary catheter. Invasive vascular devices, previous intrahospital infection and diabetes were not found to increase the likelihood of a positive pocket culture.

    Conclusions: Positive pocket cultures after CIED implantation are a frequent finding, mostly due to contamination and colonization. The risk factors for such a finding differ from the usual and expected clinical circumstances. Our results are otherwise consistent with those in the literature. The most important preventive measure in CIED implantation appears to be strict aseptic technique.

    P050 Use of dry bathing for intensive care patients

    W Yacov, Y Polishuk, A Geal-dor, G Yosef- hay

    Kaplan Medical Center, Rehovot, Israel

    Introduction: Intensive care patients are at constant risk of infection due to suppression of their immune system, the use of invasive procedures and medical equipment, and healthcare-associated infections (HAI). Chlorhexidine gluconate (CHG) is an antiseptic and disinfectant. Research has shown that daily CHG bathing is effective in reducing skin and central line-related infections (Climo, 2013). It is also referred to in the Ministry of Health recommendations "Prevention of septicemia due to central lines" (2011).

    Methods: Unit guidelines for patient dry bathing were written in May 2015, after which implementation and instruction of nursing staff began. Quality was monitored by observation, using a 15-item questionnaire covering categories such as preparation of the CHG solution, staff protection actions, care of infusions and surgical wound dressings, bathing performance and documentation.

    Results: A gradual rise to 97% was observed in the performance of dry bathing according to the unit guidelines.

    Conclusions: 97% of observed dry baths were performed according to the guidelines. Points for improvement: correct care of infusions and surgical wound dressings, and verifying the use of separate wipes for each body part. Next we will examine the correlation between the use of dry baths and the extent of infections in the unit. Dry baths are now considered an integral part of the daily nursing routine. They have no substantial costs, help prevent infectious complications and add to patient safety.

    P051 Pragmatic selective digestive decontamination (SDD): ventilator-associated pneumonia (VAP) rates & local antibiotic resistance

    J Highgate1, A Rashid2, J Kilic1, F Baldwin1

    1Brighton and Sussex University Hospitals NHS Trust, Brighton, UK,2University of Brighton, Brighton, UK

    Introduction: Despite the reductions in mortality reported with SDD, concerns about bacterial resistance and alteration of the microbiome limit its use. A retrospective observational study was conducted on the effect of local SDD protocols on VAP rates and resistance patterns. Over a 2-year period, 2 regimens were used, dependent on drug availability and hospital antibiotic stewardship concerns. The study was designed to review practice and identify any risks of partial implementation.

    Methods: Patients ventilated on a general intensive care unit were identified via clinical information systems. Three periods were reviewed for adherence to SDD protocols: pre-SDD (Jan-Feb 2014), full (July-Sept 2015) and partial (July-Sept 2016). High-risk patients during both SDD periods also received IV antibiotics for 96 hours. Patients admitted with pneumonia or tuberculosis were excluded from the VAP analysis. The remaining patients’ records were reviewed and the Clinical Pulmonary Infection Score (CPIS) was calculated for each ventilated day to identify VAP rates. Positive respiratory microbiological results for all patients admitted to the ICU during each period were reviewed to assess wider changes in local resistance patterns.

    Results: Protocol adherence was assessed in 71 patients during the full SDD period and 70 during the partial period (Table 1). The number of patients included in the VAP rate analysis was 38 pre-SDD, 50 during full SDD and 37 during partial SDD. There were no significant changes in resistance patterns or Clostridium difficile rates (Table 2).

    Conclusions: Compliance with the available enteral antibiotics was reasonable, but compliance with IV antibiotics was poor. It is accepted that alterations and non-adherence to protocols risk the development of resistant bacterial strains. Within our unit no decrease in VAP rates was seen, but reassuringly no increased rates of extended bacterial resistance were identified during the treatment periods.

    Table 1 (abstract P051). SDD Protocol adherence & VAP rates
    Table 2 (abstract P051). Number of resistant organisms isolated

    P052 Arterial catheter colonization and related bloodstream infection in ICU: is this an issue?

    S Moreira, C Silva, JP Baptista, C Monteiro, A Marques, P Coutinho, J Pimentel

    Centro Hospitalar e Universitário de Coimbra, Coimbra, Portugal

    Introduction: Arterial catheters are commonly used in Intensive Care Units (ICU) and are among the most frequently manipulated vascular access devices. Our aim was to evaluate the rate of arterial catheter-related bloodstream infection and colonization.

    Methods: This was a 12-month, prospective, single-center cohort study performed in a multipurpose ICU. All arterial catheters inserted in the ICU or in place on admission were cultured and assessed for colonization or catheter-related bloodstream infection (CRBI).

    Results: We enrolled 119 patients (63.8% male, mean age 59±17 years, SAPS II 42±21), in whom a total of 141 arterial catheters were analyzed over 1552 catheter-days. Radial arterial catheters accounted for 88.7% (n=125), femoral arterial catheters for 7.8% (n=11) and other sites for 3.5% (n=5). Signs of dysfunction were found in 28.8% and 45.5% of radial and femoral catheters, respectively. Radial catheter colonization (n=5) and CRBI (n=1) occurred at rates of 3.0 and 0.8 per 1000 catheter-days, respectively; femoral catheter colonization (n=2) and CRBI (n=1) occurred at rates of 10.8 and 5.4 per 1000 catheter-days, respectively. Mean catheterization time was significantly longer for colonized catheters/CRBI (21±8 days; 95% CI: 14-28) than for arterial catheters with negative cultures (10±8 days; 95% CI: 9-12; p=0.002). Colonized lines grew Acinetobacter baumannii (n=3), Staphylococcus epidermidis (n=1), Enterococcus spp (n=1) and Pseudomonas aeruginosa (n=1). CRBIs were caused by Staphylococcus epidermidis (n=1) and Staphylococcus haemolyticus (n=1).
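    The incidence rates above are expressed per 1000 catheter-days, a normalization that can be sketched as follows (the event and exposure counts in the example are illustrative, not taken from the study):

```python
def rate_per_1000_catheter_days(events: int, catheter_days: float) -> float:
    # incidence density: events per 1000 device-days of exposure
    return events / catheter_days * 1000

# Illustrative example: 2 colonization events over 185 catheter-days
print(round(rate_per_1000_catheter_days(2, 185), 1))  # 10.8
```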

    Conclusions: The incidence of radial arterial catheter colonization and CRBI was lower than rates reported in the literature. Colonization and CRBI rates were higher for femoral catheters, which also showed dysfunction more frequently. Prolonged catheterization was associated with colonization and CRBI.

    P053 A multimodality approach to decreasing ICU infections using hydrogen peroxide, silver cations and compartmentalization, with Acinetobacter as an infection marker

    A Al Bshabshe1, M Hamid 1, A Assiri2, M Joseph1

    1King Khalid University, Abha, Saudi Arabia, 2Aseer Central Hospital, Abha, Saudi Arabia

    Introduction: Nosocomial infections in the intensive care unit (ICU) represent a substantial health threat [1, 2]. ICU infections are largely attributable to extended hospital stay and result in high morbidity and mortality.

    Methods: A cross-sectional study was conducted at the intensive care unit of Aseer Central Hospital, Saudi Arabia over a 13-month period (2014-2015). The intervention program included the application of a mist of hydrogen peroxide and silver cations, and the physical separation and compartmentalization of the intensive care unit. The GLOSAIR™ 400 System was used to deliver the mist of hydrogen peroxide and silver cations. Hydrogen peroxide is an oxidizing agent that kills microorganisms.

    Results: A total of 103 strains of Acinetobacter species were identified from patients over the 13-month period (Fig. 1). The mean infection rate decreased from 14.3 in the first three months of the program to 4 in the last three months of continuous application.

    Conclusions: The program, using the three procedures, offered a significant decrease in ICU infections as measured by counts of Acinetobacter, one of the most hazardous nosocomial pathogens.


    1. Burgmann H et al. Intensive Care Med 36:1597-1601, 2010.

    2. Boncagni F et al. Minerva Anestesiol 81:765-775, 2015.

    Fig. 1 (abstract P053).

    A linear regression of Acinetobacter species recovered from the intensive care unit, Aseer Central Hospital, Saudi Arabia: A one year trend analysis (y = - 0.7912x + 13.615; R2 = 0.44)
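    The trend line in Fig. 1 is an ordinary least-squares fit; a minimal sketch of how such a slope, intercept and R² are obtained is shown below (the monthly counts in the example are hypothetical, not the study's data):

```python
# Ordinary least-squares fit of y = a*x + b with coefficient of
# determination R^2; inputs are paired monthly observations.
def linear_fit(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    a = sxy / sxx                      # slope
    b = my - a * mx                    # intercept
    ss_res = sum((y - (a * x + b)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return a, b, 1 - ss_res / ss_tot   # slope, intercept, R^2

# Hypothetical monthly Acinetobacter counts over 6 months
slope, intercept, r2 = linear_fit([1, 2, 3, 4, 5, 6], [14, 11, 13, 9, 10, 8])
print(slope < 0)  # a negative slope indicates a downward trend
```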

    P054 A review of current practice of continuous antibiotic infusions on intensive care units in England

    G Page, E Turner, C Day

    Royal Devon and Exeter Hospital, Exeter, UK

    Introduction: The efficacy of β-lactam antibiotics is related to the time above the MIC. Continuous or extended infusions can be used to increase the time above the MIC, especially in patients with normal or increased drug clearance. Administering antibiotics by continuous infusion is not a new concept: a 1992 review examined the outcomes of continuous infusions [1], and more recently an improvement in mortality has been demonstrated [2]. Our perception was that uptake of this low-cost intervention was not common, so we undertook a survey to determine how commonly continuous infusions are used in England.
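    To illustrate the pharmacokinetic rationale, the sketch below uses a simple one-compartment IV model with entirely hypothetical parameters (the dose, volume of distribution, half-life and MIC are assumed values, not drawn from the survey) to compare the fraction of a dosing interval spent above the MIC after a bolus with the steady-state concentration of a continuous infusion of the same daily dose:

```python
import math

def fraction_above_mic_bolus(dose_mg, vd_l, half_life_h, mic_mg_l, interval_h):
    """Fraction of the dosing interval with concentration above the MIC
    after an IV bolus, assuming one-compartment first-order elimination."""
    c0 = dose_mg / vd_l                     # peak concentration (mg/L)
    k = math.log(2) / half_life_h           # elimination rate constant (1/h)
    if c0 <= mic_mg_l:
        return 0.0
    t_above = math.log(c0 / mic_mg_l) / k   # time until C decays to the MIC
    return min(t_above, interval_h) / interval_h

def css_continuous(daily_dose_mg, vd_l, half_life_h):
    """Steady-state concentration (mg/L) of a continuous infusion."""
    clearance = (math.log(2) / half_life_h) * vd_l   # L/h
    return (daily_dose_mg / 24) / clearance

# Hypothetical beta-lactam: 1 g q8h bolus vs 3 g/day continuous infusion,
# Vd 20 L, half-life 1 h, MIC 4 mg/L
print(fraction_above_mic_bolus(1000, 20, 1, 4, 8))   # ~0.46 of the interval
print(css_continuous(3000, 20, 1))                   # ~9 mg/L, above the MIC
```

    With these assumed numbers the bolus regimen keeps the concentration above the MIC for under half the dosing interval, while the continuous infusion stays above it throughout, which is the argument the abstract summarizes.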

    Methods: A telephone survey of all intensive care units in England was undertaken. Questions included:

    • Are you using continuous or extended antibiotic infusions?

    • Which antibiotics are you using for continuous or extended infusions?

    • If not currently using has it been considered?

    Data was collected over a week in June 2017.

    Results: There was an 87% response rate. 73 units (44.5%) continuously infuse some antibiotics; however, 71.2% of those infuse only vancomycin and not β-lactams. Only 21 of the total responders (12.8%) infuse antibiotics other than vancomycin (i.e. β-lactams).

    Conclusions: The theoretical advantage of continuous infusion of β-lactam antibiotics has been described for over 20 years, and there is now evidence that it may improve survival. Despite this, uptake in England has been slow.


    1. Craig WA et al. Antimicrob Agents Chemotherap 36(12):2577-83, 1992

    2. Roberts JA et al. Am J Respir Crit Care Med 194(6):681-91, 2016

    P055 Infections in a tertiary referral hospital intensive care unit in Rwanda

    J Mvukiyehe1, P Banguti1, R Elisabeth2, J Richard3, E Tuyishime1

    1University of Rwanda, Kigali, Rwanda,2Harvard University, Boston, MA, USA,3Minnesota University, Minneapolis, MN, USA

    Introduction: Infections contribute to a significant proportion of morbidity and mortality worldwide. While many infections are successfully managed with antimicrobial therapy, rates of antimicrobial resistance (AMR) are increasing. Certain patient populations such as those admitted to intensive care units (ICU) are at high risk.

    Methods: We conducted a retrospective, observational study of all ICU patients at a tertiary referral hospital in Rwanda from January 2015 through December 2016. We collected data on diagnosis, ICU length of stay, mortality and hospital length of stay, as well as microorganism, site of culture, AMR and antibiotics prescribed.

    Results: Overall, 331 patients were admitted to the ICU. Most patients were admitted from the main operating theater (n=150, 45%). The most common admitting diagnoses were sepsis (n=113, 34%) and head trauma (n=90, 27%). A total of 268 samples were collected from the 331 patients, most commonly blood (n=110, 33%) and tracheal aspirate (n=22, 7%). The most common organisms isolated were Klebsiella (n=30, 29%), Acinetobacter (n=20, 19%), E. coli (n=16, 15%), Proteus (n=15, 14%), Citrobacter (n=8, 8%), S. aureus (n=7, 7%), Pseudomonas (n=5, 5%) and other (n=9, 9%). Of the Klebsiella isolates, 100% and 76% were resistant to ceftriaxone and cefotaxime, respectively. Of the E. coli isolates, 86% and 71% were resistant to ceftriaxone and cefotaxime, respectively. All Acinetobacter isolates were resistant to both ceftriaxone and cefotaxime.

    Conclusions: There is an alarming rate of antimicrobial resistance to commonly used antibiotics in the ICU. Expanding antibiotic options and strengthening antimicrobial stewardship are critical for patient care.

    P056 The last three days

    G Latten1, P Stassen2

    1Zuyderland MC, Sittard-Geleen, Netherlands,2Maastricht UMC+, Maastricht, Netherlands

    Introduction: This study provides an overview of the prehospital course of patients with a (suspected) infection in the emergency department (ED). Most research on serious infections and sepsis has focused on the hospital environment, while potentially most delay, and therefore possibly the best opportunity to improve treatment, lies in the prehospital setting.

    Methods: Patients were included in this prospective observational study during a 4 week period in 2017. All patients aged 18 years or older with a suspected or proven infection were included. Prehospital, ED and outcomes were registered.

    Results: In total, 2452 patients visited the ED during the study period, of whom 440 (17.9%) had a (suspected) infection (Fig. 1). Median duration of symptoms before the ED visit was 3 days (IQR 1-7 days), and 23.9% of patients were using antibiotics before arrival in the ED. Most patients (83%) had been referred by a general practitioner (GP), while 41.1% of patients had already visited their GP during the current disease episode. Twenty-two patients (5.0%) experienced an adverse outcome (ICU admission and/or 30-day all-cause mortality): these patients were less often referred by a GP (59.1 vs. 84.2%, p=0.001) and were considered more urgent both by EMS and in the ED.

    Conclusions: The prehospital phase for patients with an infection provides a window of opportunity for improving care. Patients had been ill for a median of 3 days before the ED visit, 41.1% had already visited their GP during the current disease episode, and 23.9% were using antibiotics. Future research should focus on quality improvement programs in the prehospital setting, targeting patients and/or primary care professionals.

    Fig. 1 (abstract P056).

    See text for description

    P057 Severe tetanus: clinical data in a Moroccan medical intensive care unit

    H Ezzouine, Z Sghier, Y Hafiani, K Mediouni, A Benslama

    Faculty of Medicine and Pharmacy, University Hassan II, Casablanca, Morocco

    Introduction: Worldwide, the prevalence of tetanus has decreased. However, even though progress has been made in the effort to eradicate tetanus, it may still be a cause of admission to intensive care. The objectives of our study were to determine the epidemiological, clinical and prognostic characteristics of severe tetanus in our unit.

    Methods: We conducted a retrospective study in the medical intensive care unit of Ibn Rushd hospital in Casablanca, Morocco, from 2010 to 2016. We studied the epidemiological, clinical and prognostic characteristics of patients admitted for severe tetanus.

    Results: The incidence of severe tetanus was 2.04%; all patients were male, and 41.9% were aged between 31 and 40 years. An integumentary portal of entry was found in 85.7%. Contractures were present in 69% of cases. On intensive care unit admission, 21.4% of the patients were sedated. Anti-tetanus vaccination had never been updated in any patient. According to the Dakar score, 28.6% of patients were Dakar 1, 54.8% Dakar 2 and 16.6% Dakar 3. For the Mollaret classification, the crude form was found in 44.2%, the acute generalized form in 32.6% and the severe form in 20.9% of cases. Mechanical ventilation was necessary in 83.3%. Diazepam and baclofen were used in 92.9%, phenobarbital in 76.2% and propofol in 42.85%. Serotherapy was given to all patients and a preliminary vaccination dose to 26.9%. All patients received antibiotics: penicillin G in 33.3% and metronidazole in 76.2%. Mortality was 61.9%. The length of intensive care stay was significantly longer in these patients. The need for intubation, its duration and the occurrence of autonomic dysfunction significantly influenced mortality.

    Conclusions: To improve the prognosis of these severe forms of tetanus, it is highly important to identify the warning signs and refer patients to intensive care early for appropriate management.

    P058 Bloodstream infections in the ICU of a tertiary care hospital: analysis of resistance patterns

    T Melissopoulou, S Kolovou, M Panoutsopoulou, T Kypraiou, M Papadimitriou, O Apostolou, J Deliolanis, A Pantazatou, A Skiada, J Pavleas, J Floros

    Laiko Hospital, Athens, Greece

    Introduction: Bloodstream infections (BSIs) are associated with increased mortality in the ICU. The aim of the study was to evaluate the epidemiology and resistance patterns of BSIs in our ICU during the period 2013 to 2017.

    Methods: Bacteria and fungi isolated from the blood of patients hospitalized in a mixed ICU during the study period were retrospectively analyzed. Sensitivity testing was performed with disk diffusion (Kirby-Bauer) and the Microscan Walkaway 96 plus system for minimum inhibitory concentrations.

    Results: During the study period 1198 patients were hospitalized in the ICU. BSIs were diagnosed in 284 cases (23.7%). The isolated microorganisms were Acinetobacter baumannii (29%), Klebsiella pneumoniae (15%), other Enterobacteriaceae (8%), Pseudomonas aeruginosa (6%), Stenotrophomonas maltophilia (1%), enterococci (20%), staphylococci (8%) and Candida spp. (13%). Of the A. baumannii isolates, 97% were resistant to carbapenems, 9.6% to colistin, and 31% to tigecycline. Of the K. pneumoniae isolates, 80% were resistant to carbapenems, 70% to colistin, and 4.5% to tigecycline. Of the P. aeruginosa isolates, 44% were resistant to carbapenems and all were susceptible to colistin. The rate of resistance to vancomycin was 56% for the E. faecium isolates and 5.5% for E. faecalis, while resistance to methicillin among the coagulase-negative staphylococci was 90%. The most commonly isolated species of Candida was C. albicans.

    Conclusions: Multi-drug resistant isolates, especially A. baumannii and Enterobacteriaceae, are a serious problem in our ICU. Gram positive bacteria are less common, but the resistance of enterococci to vancomycin is significant. Antibiotic stewardship and infection control measures should be applied more strictly.

    P059 Nosocomial sinusitis in intensive care unit patients

    I Titov1, S Melnyk2, M Grynovska1

    1Ivano-Frankivsk National Medical University, Ivano-Frankivsk, Ukraine, 2Ivano-Frankivsk Regional Clinical Hospital, Ivano-Frankivsk, Ukraine

    Introduction: Nosocomial sinusitis (NS) is a complication of critically ill patients which develops 48-72 h after admission and is mostly linked but not limited to such invasive procedures as nasotracheal intubation and nasogastric tube placement. NS is often overlooked as a source of pyrexia of unknown origin, meningeal manifestations, sepsis and ventilator associated pneumonia in ICU patients. CT scanning and sinus puncture are used to confirm the inflammatory process and identify the pathogen behind it.

    Methods: A retrospective case study of 6,479 ICU patients over the period 2012-2016 was performed. We analysed data from CT scans of the paranasal sinuses and bacteriological findings of samples obtained by sinus puncture.

    Results: 644 (9.9%) patients were suspected of NS on the 5-7th day of stay in the ICU. The CT scan confirmed pathological changes in 464 patients (7.1%). Hemisinusitis was detected in 422 patients (90.9%) and pansinusitis in 41 patients (8.8%). There was also an isolated case of maxillary sinusitis in 1 patient (0.2%). A pathogenic culture was identified in only 297 (64%) samples; 34.6% of these yielded a single bacterium and 65.4% a polymicrobial association. Gram positive bacteria were detected in 16.1% of cases and Gram negative in 49.5%. Most cases revealed multiple antibiotic resistance.

    Conclusions: 1. NS proved to be largely caused by Gram negative bacteria and polymicrobial associations. The use of broad spectrum antibiotics in the ICU may explain the sterile cultures.

    2. Early identification of at-risk patients in the ICU, as well as the use of screening CT scans, may facilitate timely diagnosis and adequate treatment of patients.

    3. Preventive considerations include: elevation of the head of the bed, the use of orogastric tubes in sedated and comatose patients on ventilation, nasotracheal intubation only if indicated, removal of the nasogastric tube at night, and proper hygiene.

    P060 The impact of TB on ICU in a high incidence area of the UK: a ten year analysis

    J Barrett1, A Keeley1, D Keane1, L John1, W Lynn2, N Sabir1

    1Northwick Park Hospital, London, UK, 2Ealing Hospital, London, UK

    Introduction: Despite the wide availability of effective treatment, a minority of patients with TB become critically ill. Here we describe the presentation, demographics and outcomes of patients with TB admitted to the ICU in our trust, which is based in an area of high TB incidence in London (incidence >50/100,000).

    Methods: The London TB register was cross-referenced against ICU records from 01.01.2007 to 31.12.2016. Clinical data were collected for matched patients.

    Results: 78 patients were identified: 50% of South Asian origin, 29.5% of African origin and 12% UK born. 31% of patients had multifocal TB. TB sensitivities: 89% fully sensitive, 7% mono-drug resistant, 4% multi-drug resistant TB. Median length of ICU stay was 5 days (IQR 2-14). 6 patients were readmitted. 23 patients died on the ICU (29.5%), and 10 patients died prior to hospital discharge. Median time to death following ICU admission: 15 days (IQR 5-35).

    Conclusions: Only 78 of 4,011 TB patients (2%) required critical care intervention (Table 1). Those admitted to the ICU were older and more likely to have pulmonary, CNS, miliary or abdominal TB (Table 2). Mortality was high despite critical care input in a unit familiar with managing TB and with 24-hour access to Infectious Diseases advice within the trust, likely due to overwhelming organ dysfunction, patient frailty and advanced TB infection. Rates of drug-resistant TB were low and comparable to UK-wide rates over the period (5% mono-drug resistant, 2% MDR), and thus unlikely to have contributed to the majority of deaths.

    Table 1 (abstract P060). Reasons for ICU admission
    Table 2 (abstract P060). Characteristics of ICU TB patients vs non-ICU TB patients

    P061 Short term antibiotics prevent early VAP in patients treated with mild therapeutic hypothermia after cardiac arrest

    T Daix1, A Cariou2, F Meziani3, PF Dequin4, C Guitton5, N Deye6, G Plantefève7, JP Quenot8, A Desachy9, T Kamel10, S Bedon-Carte11, JL Diehl12, N Chudeau13, E Karam14, F Renon-Carron1, A Hernandez Padilla1, P Vignon1, A Le Gouge4, B François1

    1Centre Hospitalier Universitaire Dupuytren, Limoges, France, 2Hôpital Cochin (AP-HP) et Université Paris Descartes, Paris, France, 3Université de Strasbourg (UNISTRA), Faculté de Médecine, Hôpitaux universitaires de Strasbourg/Nouvel hôpital civil, Strasbourg, France, 4CHU Bretonneau, Tours, France, 5CHU de Nantes, Nantes, France, 6CHU Lariboisière, AP-HP, Paris, France, 7CH Victor Dupouy, Argenteuil, France, 8CHU François Mitterrand, Dijon, France, 9Centre Hospitalier d'Angoulême, Angoulême, France, 10CHR d'Orléans, Orléans, France, 11Centre Hospitalier de Périgueux, Périgueux, France, 12HEGP, AP-HP, Paris, France, 13Centre hospitalier du Mans, Le Mans, France, 14Centre hospitalier général, Brive-la-Gaillarde, France

    Introduction: Patients treated with mild therapeutic hypothermia after cardiac arrest with shockable rhythm are at high risk of ventilator-associated pneumonia (VAP) [1]. Although retrospective studies suggest a benefit of short-term (48 h) antibiotics in this setting [2], this practice is not currently recommended. The primary objective was to demonstrate that systematic antibiotic prophylaxis can reduce the incidence of early VAP (<7 days). The impact on the incidence of late VAP and on day-28 mortality was also assessed.

    Methods: Multicenter, placebo-controlled, double-blind, randomized trial. ICU patients >18 years, mechanically ventilated after out-of-hospital resuscitated cardiac arrest related to an initial shockable rhythm and treated with mild therapeutic hypothermia were included. Moribund patients and those requiring extracorporeal life support, receiving ongoing antibiotic therapy, with known chronic colonization by multiresistant bacteria or with known allergy to beta-lactam antibiotics were excluded. Either IV amoxicillin-clavulanic acid (1 g/200 mg) or placebo was administered 3 times a day for 2 days. All pulmonary infections were recorded and blindly confirmed by an adjudication committee.

    Results: In the intention-to-treat analysis, 196 patients were analyzed (treatment group n=99; mean age 60.5±14.4 years, sex ratio=4, SOFA score 8.7±3.1). Global characteristics of cardiac arrest were similar (no flow: 3.5 min vs 3.8 min; low flow: 21.8 min vs 18.2 min). 60 VAP were confirmed, including 51 early VAP: 19 in the treatment group vs 32 in the placebo group (HR=0.546; 95% CI [0.315-0.946]) (Fig. 1). Occurrence of late VAP (4% vs 5.1%) and day-28 mortality (41.4% vs 37.5%) were not affected by the study procedure.

    Conclusions: Short-term antibiotic prophylaxis significantly decreases incidence of early VAP in patients treated with mild therapeutic hypothermia after out-of-hospital cardiac arrest related to shockable rhythm and should be recommended.


    1. Mongardon N et al. Crit Care Med 39:1359-64, 2011.

    2. Davies KJ et al. Resuscitation 84:616-9, 2013.

    Fig. 1 (abstract P061).

    Incidence of early VAP

    P062 Withdrawn

    P063 Clinical burden of inappropriate empirical antimicrobial therapy of in-hospital complicated intra-abdominal infections

    S Carelli, T Taccheri, MS Vallecoccia, SL Cutuli, V Di Gravio, G De Pascale, M Antonelli

    Fondaz. Policlinico A. Gemelli, Rome, Italy

    Introduction: Complicated intra-abdominal infections (cIAIs) remain a common cause of morbidity and mortality among ICU patients and therapeutic failure still occurs. This study aimed to assess the clinical impact of an initial inappropriate empirical therapy of cIAIs.

    Methods: This retrospective study enrolled patients admitted to the ICU of the Fondaz. Pol. A. Gemelli in Rome between February 2010 and February 2017 with a diagnosis of in-hospital cIAI. Comparisons between patients receiving initial inappropriate antimicrobial therapy (IIAT) and initial appropriate antimicrobial therapy (IAAT) were performed.

    Results: A total of 137 patients were included (IIAT=44 and IAAT=93). Baseline characteristics were comparable, with the exception of the SOFA score at infection, which was higher in IAAT (p=0.04). Secondary peritonitis was the main type of cIAI (45.5% in IIAT and 40.9% in IAAT), followed by abdominal abscess and biliary tract infection. Secondary bacteraemia was significantly more frequent in IIAT (p=0.03). Conversely, IAAT had a higher rate of adequate source control (p=0.01). Empirical therapy of IAAT patients more frequently included anti-Gram-positive agents (p=0.016) and carbapenems (p=0.01), while rates of empirical dual anti-Gram-negative and antifungal coverage did not differ (Fig. 1). The rate of MDR and polymicrobial infections was significantly higher in IIAT when associated with septic shock at the occurrence of infection (p=0.03; Fig. 2). IAAT showed significantly lower mortality at 28 and 90 days (p<0.01) as well as higher rates of clinical cure and microbiological eradication than IIAT (p<0.01). At multivariate analysis, adequate source control [p=0.04, OR 0.25 (0.09-0.65)] and IIAT [p<0.01, OR 11.4 (4.02-32.3)] were independently associated with 28-day mortality.

    Conclusions: In cIAIs, appropriate empirical antibiotic therapy and early infection source control are closely associated with better outcomes.

    Fig. 1 (abstract P063).

    See text for description

    Fig. 2 (abstract P063).

    See text for description

    P064 Prospective observational study evaluating antibiotic prescription patterns and microbial isolates and their correlation with hemodynamic stability in ICU patients

    M Bhattacharyya1, A Saha2, T Chatterjee2, S Todi1

    1AMRI Hospitals, Kolkata, India, 2Jadavpur University, Kolkata, India

    Introduction: Antibiotics are the most commonly prescribed drugs in the ICU. In the era of antibiotic resistance, it is difficult to choose antibiotics during a septic episode. The choice of antibiotic depends mainly on the clinical diagnosis, culture sensitivity and local flora; whether severity of illness really matters is not well known. Our objectives were to study the antibiotic prescription pattern and whether the choice of antibiotic varies according to hemodynamic stability in patients admitted to the ICU, and to study the microbiological isolates and their variability according to hemodynamic stability in ICU patients.

    Methods: All ICU patients more than 18 years of age who received antibiotics and in whom cultures had been sent were included in the study. Patients discharged against medical advice and those in whom treatment had been withdrawn were excluded. This prospective observational study was conducted between July 2016 and March 2017. Patients were divided into stable and unstable groups according to hemodynamic parameters, and antibiotic usage and microbiological isolates were correlated. ICU mortality and length of stay were compared between the hemodynamically stable and unstable groups.

    Results: 786 sepsis episodes were analysed. Mean age was 65 years, patients were predominantly male, and the average APACHE IV score was 58 (SD 25). There were 444 patients in the unstable group, of whom 71% were discharged, compared with 86% discharged in the stable group. Antibiotic combination therapy was used more in hemodynamically unstable patients (p=0.3). Beta-lactam/beta-lactamase inhibitor (BLBLI) combinations were used more in the stable group. Drug resistance in microbiological isolates did not reveal any statistically significant difference between the stable and unstable groups.

    Conclusions: There is a tendency to administer combination antibiotics to sicker patients with hemodynamic instability. Prevalence of microbial flora did not show any statistical difference. Outcome was worse in hemodynamically unstable patients.

    P065 The clinical significance of Candida score in critically ill patients with candida infection

    H Al-Dorzi1, R Khan1, T Aldabbagh1, A Toledo1, S Al Johani1, A Almutairi2, S Khalil2, F Siddiqui2, Y Arabi3

    1King Abdulaziz Medical City, Riyadh, Saudi Arabia, 2MSD, Riyadh, Saudi Arabia, 3King Saud bin Abdulaziz University for Health Sciences, Riyadh, Saudi Arabia

    Introduction: Candida score (CS) is used to identify patients with invasive candidiasis in the ICU, but its clinical use has not become widespread. Our objective was to evaluate the clinical significance of CS in a mixed population of ICU patients.

    Methods: This was a prospective observational study of critically ill patients who had Candida species growth during their stay in any of six different ICUs of a tertiary-care center. Two intensivists classified patients as having Candida colonization or invasive candidiasis according to predefined criteria. CS was calculated for each patient on the day of Candida species growth as follows: 1 point for parenteral nutrition + 1 point for surgery + 1 point for multifocal Candida colonization + 2 points for severe sepsis. The Receiver Operating Characteristic (ROC) curve was plotted to assess the ability of CS to discriminate between invasive candidiasis and Candida colonization.
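    The additive scoring rule described above can be expressed as a short function; this is an illustrative sketch (the function and variable names are ours, not from the study protocol):

    ```python
    def candida_score(parenteral_nutrition: bool, surgery: bool,
                      multifocal_colonization: bool, severe_sepsis: bool) -> int:
        """Candida score as described in the abstract: 1 point each for parenteral
        nutrition, surgery and multifocal Candida colonization, plus 2 points for
        severe sepsis (range 0-5)."""
        return (int(parenteral_nutrition) + int(surgery)
                + int(multifocal_colonization) + 2 * int(severe_sepsis))

    # A patient on parenteral nutrition with multifocal colonization and severe sepsis:
    print(candida_score(True, False, True, True))  # → 4
    ```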

    Results: CS was 1.6±0.9 in patients with Candida colonization (N=261) and 2.4±0.9 in those with invasive candidiasis (N=120) (p<0.001). However, only 38.7% of invasive candidiasis cases had CS >= 3 (compared with 8.0% of Candida colonization cases; p<0.001). The ROC curve (Fig. 1) showed that CS had fair ability to discriminate between invasive candidiasis and Candida colonization (area under the curve 0.71, 95% confidence interval 0.65 to 0.77; p<0.001). In patients with invasive candidiasis, CS was similar in hospital survivors and nonsurvivors (2.2±0.9 and 2.5±0.8, respectively; p=0.13). CS did not discriminate between survivors and nonsurvivors (area under the ROC curve 0.61, 95% confidence interval 0.46 to 0.75; p=0.15).

    Conclusions: CS was higher in patients with invasive candidiasis than in those with Candida colonization. However, its ability to discriminate between these patients was only fair. CS was not associated with hospital mortality.

    Fig. 1 (abstract P065).

    ROC curve for Candida score discriminating between invasive candidiasis and Candida colonization

    P066 Poor reliability of creatinine clearance estimates in predicting fluconazole exposure in liver transplant patients

    M Lugano, P Cojutti, F Pea

    ASUIUD, Udine, Italy

    Introduction: Invasive candidiasis (IC) is a frequent complication in liver transplant (LT) recipients, especially during the first 1-3 months after LT. Fluconazole is a triazole antifungal used for prophylaxis and treatment of IC. Due to its renal elimination, dose adjustments are usually based on estimated creatinine clearance (eCrCL). However, the reliability of eCrCL in predicting fluconazole clearance has never been investigated in this population. The aim of this study was to conduct a population pharmacokinetic (popPK) analysis in a cohort of LT patients who underwent therapeutic drug monitoring (TDM) in order to find out which covariates may influence fluconazole pharmacokinetics (PKs).

    Methods: This retrospective study included LT patients who were admitted to the intensive care unit of our University Hospital between December 2007 and May 2016, and who were treated with intravenous fluconazole in the first months after LT. TDM of fluconazole was performed with the intent of attaining the efficacy pharmacodynamic target (AUC24h/MIC > 55.2). The tested covariates were: age, gender, CKD-EPI eCrCL, time from LT, serum albumin and transaminases, SAPS II score. PopPK was carried out with Pmetrics software.

    Results: Nineteen patients (mean±SD age, weight and serum creatinine of 60±8.4 years, 75±16.8 kg and 1.0±0.62 mg/dL, respectively) with a total of 89 fluconazole trough plasma concentrations were included in the popPK analysis. Mean±SD fluconazole distribution volume (Vd) and clearance (CL) were 27.02±10.78 L and 0.55±0.19 L/h. Age and time from LT were the only clinical covariates significantly correlated with fluconazole Vd and CL, respectively. Conversely, CKD-EPI eCrCL was unable to predict fluconazole CL.

    Conclusions: CKD-EPI eCrCL is unreliable in predicting fluconazole exposure in LT recipients. Accordingly, in this population fluconazole dose adaptation should be based on measured CrCL, and TDM may be helpful in optimizing drug exposure.

    P067 Outcomes of a candidiasis screening protocol in a medical ICU

    M Boujelbèn1, I Fathallah1, H Kallel1, D Sakis1, M Tobich1, S Habacha1, N Ben Salah1, M Bouchekoua2, S Trabelsi2, S Khaled2, N Kouraichi1

    1Hôpital Régional Yasminette, Ben Arous, Tunisia, 2Hôpital Charles Nicolle, Tunis, Tunisia

    Introduction: The aim is to determine the incidence, characteristics and risk factors of invasive candidiasis (IC) in critically ill patients by using a weekly screening protocol.

    Methods: A 9-month prospective study was conducted in a 6-bed MICU. The candidiasis screening consisted of cultures of plastic swabs (from different body sites), urine and respiratory tract samples. It was conducted upon admission and on a weekly basis for all patients. The decision to treat was based on clinical and microbiological features.

    Results: 97 patients were included. The colonization rate with Candida spp. was 28.8% (n=28). 415 screening samples were collected, with a positivity rate of 27.9% (n=118). Table 1 describes the isolated Candida species by site. Antifungal resistance was tested in 72 (62%) isolates. The resistance rate to fluconazole was 13.8% (n=10). The antifungal resistance of Candida albicans is detailed in Table 2. 14 (14.4%) patients presented an IC, with a mean age of 54.3±18 years and a mean SAPS II of 48±18.7. 7 (50%) presented acute renal failure upon admission. 85.7% (n=12) of the patients needed mechanical ventilation. The median length of stay was 29 days [18.5-62.5] and the mortality rate was 42.9% (n=6). The mean SOFA score upon infection was 8.5±2.79. The Candida score was >= 2.5 and the colonization index was >= 0.5 in 92.8% (n=13) and 78.5% (n=11) of the patients, respectively. Only one patient had a positive blood culture. Mannan antigen and anti-mannan antibodies were screened in only five patients, with a positivity rate of 100% (n=5). The most frequently isolated species was Candida albicans, 64.3% (n=9). Multivariate analysis showed that prior use of imipenem for more than 6 days was a risk factor for IC (OR=9.37, 95% CI [1.15-75.8], p=0.03).

    Conclusions: This study described the ecology and epidemiology of Candida species in our MICU, with a high IC rate and high mortality. Prior imipenem use was a risk factor for IC.

    Table 1 (abstract P067). Isolated Candida species by site
    Table 2 (abstract P067). Candida albicans’ antifungal resistance



    P069 Invasive fungal infections (IFI) – harmonizing the guidelines for India

    Y Mehta1, O Shrivastav2, K Bajan3, A Khurana4, A Qamra4

    1Medanta The Medicity, Gurgaon, India, 2Jaslok Hospital, Mumbai, India, 3PD Hinduja Hospital, Mumbai, India, 4Wockhardt, Mumbai, India

    Introduction: ICU-acquired infection is as high as 42.7 episodes per 1000 patient-days in lower-middle-income countries like India (WHO), almost three times higher than in high-income countries [1]. Candida infection is the 3rd most commonly acquired nosocomial infection in India, burdening debilitated patients with longer ICU stays [2]. There are no definite guidelines on whether and when to start antifungal treatment specific to India, where IFI risk is high and diagnostic facilities are limited. Currently, intensivists across India use antifungals according to their clinical experience and selective application of international guidelines, leading to non-uniform patient outcomes.

    Methods: In an endeavour to harmonize antifungal therapy and educate intensivists from small cities of India, 2 intensivists and 1 infectious disease specialist of international repute were approached to design a module on 'Invasive Fungal Infections – When to Start Anti-fungals in ICU' [Fig. 1]. IFI in India was summarised into a compact 1-hour session for dissemination of knowledge, using IDSA 2016 as the reference guideline. 12 intensivists from across India were trained on the module by our faculty. The module was rolled out to intensivists and pulmonologists, focusing particularly on tier-2 and tier-3 cities where avenues for learning are limited [Fig. 2].

    Results: The module covered epidemiology, diagnostic challenges and antifungal therapies in candidemia. It also included candiduria, aspergillosis and mucormycosis, which intensivists encounter infrequently. 4 meetings have been conducted and over 150 intensivists have been trained so far, and further training sessions are planned in the near future.

    Conclusions: This module serves as a good academic tool for awareness, education and harmonisation of antifungal treatment amongst HCPs across India.


    1. WHO Report Burden of Endemic HCI, 2011

    2. Oberoi JK JIMSA 23 No. 1, January - March 2010

    Fig. 1 (abstract P069).

    When to start Anti-fungal therapy in my patient?

    Fig. 2 (abstract P069).

    Design of Anti-fungal module development

    P070 Incidence of Trichosporon spp. urinary tract infections in ICU

    E Belesiotou, C Routsi, C Vrettou, E Magira, E Douka, P Kaltsas, M Nepka, E Perivolioti, E Kraniotaki, S Zakinthinos

    Evaggelismos General Hospital, Athens, Greece

    Introduction: Trichosporon species are fungi found in nature and in the normal human flora, but they can be opportunistic pathogens associated with medical devices (biofilm formation), especially in intensive care unit (ICU) patients.

    Methods: After cultivation of urine samples, identification was based on the Vitek 2 system, API 20C AUX and API ID 32C (bioMérieux®), and susceptibility testing on Vitek 2 and E-test.

    Results: During the last two years, 1/10/2015-30/9/2017, among 2359 ICU patients (27,884 patient-days), we detected Trichosporon spp. in 31 patients, 24 (77.4%) men and 7 (22.6%) women; the minimum stay was 28 days. 27/31 (87%) strains were T. asahii and 4 (13%) T. mucoides. All patients were exposed to antibiotics, had medical devices and had other infections with multidrug-resistant strains. In 10 patients the infection persisted over one month, and 2 patients died during Trichosporon spp. funguria. According to antifungal susceptibility testing, 3 T. asahii strains had intermediate sensitivity (MIC:16:I) and 3 strains were resistant (MIC:16:R) to fluconazole. All T. asahii strains were sensitive to amphotericin B. All T. mucoides were sensitive to flucytosine; 2/4 (50%) T. mucoides were resistant to voriconazole (MIC:4:R), 2/4 (50%) to amphotericin B (MIC:8-16:R), and 1/4 (25%) to fluconazole (MIC:32:R). The echinocandins are not active against Trichosporon spp. In most patients, treatment was administration of voriconazole.

    Conclusions: Trichosporon spp. was detected after 28 days in the ICU in 31/2359 patients (1.31%), an incidence of 1.11 per 1000 patient-days (31/27,884). The majority were men. The azole drugs and amphotericin B showed activity against Trichosporon spp., but recommendations must be based on in vitro susceptibility data together with clinical experience and features.
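    The two rates quoted in the conclusion follow directly from the counts in the results; a minimal sketch (the helper name is ours) reproducing both the period prevalence and the incidence density:

    ```python
    def funguria_rates(cases: int, patients: int, patient_days: int) -> tuple:
        """Return (period prevalence as % of patients,
        incidence density per 1000 patient-days), each rounded to 2 decimals."""
        prevalence_pct = 100 * cases / patients
        incidence_per_1000_pd = 1000 * cases / patient_days
        return round(prevalence_pct, 2), round(incidence_per_1000_pd, 2)

    # 31 Trichosporon cases among 2359 patients over 27,884 patient-days:
    print(funguria_rates(31, 2359, 27884))  # → (1.31, 1.11)
    ```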

    P071 Acinetobacter baumannii ventilator-associated pneumonia epidemiology, risk and prognosis factors

    W Sellami, W Essmat, I Ben mrad, Z Hajjej, H Gharssallah, I Labbene, M Ferjani

    Military Hospital, Tunis, Tunisia

    Introduction: Ventilator-associated pneumonia (VAP) is the most common nosocomial infection in critically ill patients, affecting up to 30-50% of them, with a high mortality rate. Acinetobacter baumannii (AB) has emerged as a pathogen frequently incriminated in VAP in Tunisia. The aim of this study was to describe the epidemiological characteristics of Acinetobacter baumannii VAP and to identify the risk factors and predictors of poor outcome of VAP with AB.

    Methods: A retrospective study was conducted in the intensive care unit of the Military Hospital of Tunis from January 2015 to December 2016. All patients with documented VAP were included. Patients with AB VAP were compared with patients whose VAP was due to other pathogens.

    Results: Seventy patients (10%) developed VAP. The incidence of VAP with AB was 6.28%. Previous antibiotic therapy, unlike the underlying disease, was identified as a risk factor for Acinetobacter baumannii-induced pneumonia. AB was resistant to ceftazidime in 100% of cases and to imipenem in 97.5%, with sensitivity to colistin in 100% of cases. Multidrug-resistant AB accounted for 22.5% and highly resistant AB for 77.5%. Patients with AB pneumonia were more frequently complicated by acute respiratory distress syndrome than other patients (37.5% versus 8.9%, p=0.02), leading to higher mortality (52.5% versus 20%, p=0.02).

    Conclusions: The increasing incidence of VAP due to multidrug-resistant and highly resistant AB predicts high morbidity and mortality. Hence, the risk factors related to poor outcome in VAP need to be identified. The implementation of infection-control measures, mainly prevention of cross-transmission, may be needed to improve outcome.

    P072 Single versus combination antibiotic therapy in the treatment of gram negative infections

    S Chatterjee1, A Bhakta1, J Basak2, S Todi1

    1AMRI Hospitals, Kolkata, India, 2Jadavpur University, Kolkata, India

    Introduction: This study assessed whether empiric combination antibiotic therapy directed against Gram-negative bacteria is associated with lower intensive care unit (ICU) mortality compared to single antibiotic therapy.

    Methods: Retrospective cohort study of prospectively collected data conducted in the ICU of a tertiary care hospital in India between July 2016 and March 2017. All consecutive infection episodes treated with empiric antibiotic therapy and with a subsequent positive culture for Gram-negative bacteria were included. Primary and secondary outcomes were all-cause ICU mortality and ICU length of stay (LOS). Outcomes were compared between infection episodes treated with single vs. combination antibiotic therapy.

    Results: Of a total of 214 episodes of Gram-negative infection, 66.4% received combination antibiotic therapy. Baseline demographic and clinical characteristics of the single vs. combination therapy groups were similar (mean age: p=0.07; sex: p=0.3; mean APACHE IV score: p=0.07). Overall ICU mortality did not differ significantly between the single and combination antibiotic groups (30.2% vs. 27%; p=0.7). In the single antibiotic group, ICU mortality was significantly higher for antibiotic-resistant than for antibiotic-sensitive bacteria (77.8% vs. 18.5%, p=0.0002). In the combination group, significantly lower ICU mortality was noted if the bacteria were sensitive to even one antibiotic compared with pan-resistant bacteria (21.4% vs. 63.6%, p=0.0001). ICU LOS was similar between antibiotic-sensitive and antibiotic-resistant bacteria in both the single and combination therapy groups (single, antibiotic-sensitive vs. antibiotic-resistant: mean LOS±SD 14.6±12.7 vs. 12.8±11 days, p=0.6; combination: 15.5±13.3 vs. 11.2 days, p=0.1).

    Conclusions: Irrespective of the number of antibiotics prescribed as empiric therapy, patient outcome depended solely on the sensitivity pattern of the bacteria isolated.

    P073 Pharmacokinetics of trimethoprim and sulfametrole in critically ill patients on continuous haemofiltration

    R Welte1, J Hotter1, T Gasperetti1, R Beyer1, R Bellmann-Weiler1, S Eschertzhuber1, M Zaruba1, I Lorenz1, M Ströhle2, M Joannidis1, R Bellmann1

    1Medical University of Innsbruck, Innsbruck, Austria, 2General Hospital of Innsbruck, Innsbruck, Austria

    Introduction: The combination of trimethoprim and sulfametrole (TMP-SMT, Rokiprim®) is active against multidrug-resistant bacteria and Pneumocystis jirovecii. In critically ill patients undergoing continuous veno-venous haemofiltration (CVVH), however, its use is limited because pharmacokinetic data are lacking.

    Methods: Pharmacokinetics of both drugs were determined after standard doses in patients on CVVH and in critically ill patients with approximately normal renal function. Quantification of TMP and SMT was done by high pressure liquid chromatography (HPLC) and UV detection after pre-purification by solid phase extraction. The total clearance (CLtot) was estimated from arterial plasma levels and the haemofilter clearance (CLHF) from plasma and ultrafiltrate concentrations.
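    The haemofilter clearance estimated here from plasma and ultrafiltrate concentrations is conventionally computed as the sieving coefficient times the ultrafiltrate flow rate. A minimal sketch with purely illustrative numbers (not the study's data; the function name and values are ours):

    ```python
    def haemofilter_clearance(c_ultrafiltrate: float, c_plasma: float,
                              q_ultrafiltrate_l_per_h: float) -> float:
        """CL_HF = sieving coefficient (C_ultrafiltrate / C_plasma)
        multiplied by the ultrafiltrate flow rate (L/h)."""
        sieving_coefficient = c_ultrafiltrate / c_plasma
        return sieving_coefficient * q_ultrafiltrate_l_per_h

    # Illustrative values: 40 mg/L in plasma, 28 mg/L in ultrafiltrate,
    # ultrafiltrate flow 2 L/h -> CL_HF = 0.7 * 2 = 1.4 L/h
    print(haemofilter_clearance(28.0, 40.0, 2.0))
    ```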

    Results: Six patients on CVVH (3 after the first dose, 3 at steady state) and nine patients off CVVH were enrolled (4 after the first dose, 7 at steady state). After a single dose, CLtot of SMT was 3.5 (1.8-3.8; median [range]) and 1.7 (1.1-2.7) L/h on and off CVVH, respectively. At steady state, we observed a CLtot of 1.0 (0.5-1.0) and 0.3 (0.2-0.9) L/h on and off CVVH, respectively. Steady-state trough levels (Cmin) of SMT amounted to 52-113 mg/L in patients on CVVH and 18-145 mg/L in patients off CVVH. CLtot of TMP was 4.4 (2.5-5.3) L/h on CVVH and 5.4 (3.2-9.9) L/h off CVVH after the first dose. At steady state, its CLtot amounted to 0.8 (0.4-0.8) and 1.0 (0.6-1.9) L/h on and off CVVH, respectively. Cmin was 4-12 mg/L on CVVH and 3-9 mg/L off CVVH. CLHF accounted for 22-68% of the CLtot of SMT and 28-72% of the CLtot of TMP.

    Conclusions: Exposure to both antimicrobial agents is highly variable, but comparable in patients on and off CVVH. As considerable amounts of SMT and TMP are eliminated by CVVH, no excessive accumulation appears to take place during treatment with standard doses.

    P074 The positive impact of meropenem stewardship intervention at a Brazilian intensive care unit

    W Freitas1, M Davi1, L Souza1, M Couto1, L Lourenço1, R Eiras1, H Primo1, J Páramo1, J Garcia1, M Alves2

    1Hospital Casa de Portugal, Rio de Janeiro, Brazil; 2Universidade Federal de Juiz de Fora, Juiz de Fora, Brazil

    Introduction: This study aimed to analyze the evolution of meropenem Defined Daily Doses (DDD) at a Brazilian Intensive Care Unit (ICU) during the six months before and after an antimicrobial stewardship intervention.

    Methods: From June to November 2017, meropenem courses for inpatients at the Hospital Casa de Portugal ICU, Rio de Janeiro, Brazil, were limited to seven days, and meropenem DDD, CRE ID, consumption of 100 mL saline for dilution, equipment for infusion of antibiotics, global mortality and BSI deaths were recorded. The same data were retrospectively collected for December 2016 to May 2017. Results of both periods were compared by Student's t-test.

    Results: Meropenem DDD ranged from 143.01 to 187.70 (December 2016 to May 2017) and from 22.87 to 160.76 (June to November 2017), with averages of 174.7 and 101.6, respectively (T(10) = 3.60, p = 0.005). CRE ID also decreased, ranging from 9.6 to 34.9 (December 2016 to May 2017) and from 1 to 18.1 (June to November 2017), with averages of 16.2 and 7.2, respectively. The use of saline for dilution per 1000 patient-days fell from 2930-4436 (average 3310) to 2619-2980 (average 2868), with a concomitant decrease in equipment for infusion per 1000 patient-days from 995-2116 (average 1236) to 918-1054 (average 967). Global mortality ranged from 17.41 to 22.96 (December 2016 to May 2017) and from 11.61 to 17.9 (June to November 2017), with averages of 19.5 and 14.5, respectively (T(10) = 3.78, p = 0.004). A drop in the mean number of BSI deaths, from 2.6 to < 1, was also observed between the two periods.

    Conclusions: The meropenem stewardship intervention had a positive impact in the Intensive Care Unit evaluated.

    P075 Colistin ‘MIC’ creep, a harbinger for resistance? Study for monitoring antimicrobial resistance trends (SMART) - South Africa 2015-2016

    B Magazi, R Holl

    MSD, Midrand, South Africa

    Introduction: Loss of colistin as a clinical option has profound public health implications. Widespread use of colistin in agriculture and humans has seen the emergence of Mcr-1 mediated resistance amongst South African patients [1]. We sought to describe the trends of colistin minimum inhibitory concentrations (MIC) over two years using data collected by SMART.

    Methods: SMART monitors the in vitro susceptibility of clinical aerobic and facultative gram-negative bacterial isolates to selected antimicrobials of importance, enabling longitudinal analyses to determine changes over time. The dataset comprised bacterial isolates from four different South African private pathology laboratories and one public sector pathology laboratory from 2015 - 2016. The methods used in the study have been described elsewhere [2]. Isolate proportions between years were compared using the chi-squared test with Yates’ continuity correction.

    Results: 146 and 198 Pseudomonas aeruginosa isolates were collected in 2015 and 2016, respectively. The proportion of isolates with an MIC of 2 increased from 4.8% to 14.6% between 2015 and 2016 (p=0.00278) (Table 1). In neither year did any isolate have an MIC > 2. Non-significant changes were observed in Klebsiella pneumoniae and Escherichia coli.
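
    The year-on-year comparison above uses a chi-squared test with Yates' continuity correction (as stated in the Methods). A minimal sketch of that calculation; the counts below are hypothetical reconstructions from the reported percentages (roughly 4.8% of 146 and 14.6% of 198 isolates at MIC = 2), so the exact p-value may differ slightly from the abstract's.

```python
# Chi-squared test with Yates' continuity correction on a 2x2 table of
# isolate counts. Counts are reconstructed from the reported percentages
# and are therefore approximate.
from scipy.stats import chi2_contingency

# Rows: year (2015, 2016); columns: MIC = 2 vs MIC < 2
table = [[7, 139],    # 2015: ~4.8% of 146 isolates
         [29, 169]]   # 2016: ~14.6% of 198 isolates

chi2, p, dof, expected = chi2_contingency(table, correction=True)
print(f"chi2 = {chi2:.3f}, df = {dof}, p = {p:.5f}")
```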

    Conclusions: While the clinical and public health significance of this MIC creep amongst P. aeruginosa is unknown, it is disconcerting. Could this be similar to the precipice witnessed in the hVISA phenotype in Staphylococcus aureus [3]? Further studies are required to elucidate this. Meanwhile, we call for more prudent use of this agent and the development of colistin sparing treatment options for P. aeruginosa.


    1. Coetzee J et al. 106:449-450, 2016

    2. Morrissey I et al. Pharmaceuticals. 6(11):1335-1346, 2013

    3. McGuinness WA et al. Yale J Biol Med 90(2):269-281, 2017

    Table 1 (abstract P075). Colistin MIC Levels 2015-2016

    P076 Ceftazidime/avibactam for treating severe infections in the critically ill due to carbapenemase-producing Klebsiella pneumoniae

    A Corona, A Veronese, S Antinori, S Santini, C Soru, M Corbellino, F Cantarero, E Catena

    ASST Fatebenefratelli Sacco, PO SACCO - Milano, Milano, Italy

    Introduction: Carbapenemase-producing (cp) Klebsiella pneumoniae (KP) infection rates range between 5-50% and are associated with high mortality (19-51%). The use of ceftazidime-avibactam - a third-generation cephalosporin plus a new inhibitor of class A (KPC), class C (AmpC) and class D (OXA-48) beta-lactamases - may resolve life-threatening conditions.

    Methods: Case series of 14 patients undergoing compassionate treatment.

    Results: From April to November 2017, 14 patients (10 M and 4 F), median age 57 years (IQR 42.5-70.5), were given ceftazidime/avibactam for major KP-cp infections (meropenem MIC > 16): 9 bacteraemias, 3 secondary peritonitis and 2 UTI. 10/14 (71.5%) patients developed septic shock [median (IQR) SOFA score 10 (8-17)] and needed mechanical ventilation [median (IQR) 8 (4-17) days] and norepinephrine infusion [median (IQR) 3 (2-5) days]; 4 patients underwent renal replacement therapy. The median (IQR) treatment duration was 14 (13-14) days. In 41.6% of cases, combination antibiotic therapy (fosfomycin and colistin) was chosen. All patients experienced a clinical response within 72-96 hours of commencing ceftazidime/avibactam. Blood cultures became negative by 96 hours in 8/9 bacteraemic patients, as did the rectal swab in 5/14 patients. One infection recurred and a second course of treatment was given. 11/14 (78.5%) patients survived; death was caused by multi-organ failure. Susceptibility testing of the strains showed sensitivity to ceftazidime/avibactam but 100% resistance to carbapenems, quinolones, III/IV generation cephalosporins, tigecycline and piperacillin/tazobactam; 62.5% susceptibility to fosfomycin and colistin; and less than 50% susceptibility to aminoglycosides.

    Conclusions: All 14 strains of KP-cp were susceptible to ceftazidime-avibactam despite the high carbapenem resistance recorded in our ICU, owing to the rare identification of VIM/NDM-positive KP-cp. These preliminary data seem to confirm the efficacy and clinical utility of this antibiotic in critically ill patients.

    P077 Phage-based therapy against Acinetobacter baumannii lung infection in mice

    SM Wienhold1, MC Brack1, C Rohde2, G Nouailles1, C Seitz3, A Ross3, H Ziehr3, K Dietert4, C Gurtner4, O Kershaw4, AD Gruber4, M Rohde5, N Suttorp1, M Witzenrath1

    1Charité – Universitätsmedizin Berlin, corporate member of Freie Universität Berlin, Humboldt-Universität zu Berlin, and Berlin Institute of Health, Berlin, Germany; 2Leibniz Institute DSMZ - German Collection of Microorganisms and Cell Cultures, Braunschweig, Germany; 3Fraunhofer Institute for Toxicology and Experimental Medicine (ITEM), Braunschweig, Germany; 4Freie Universität Berlin, Berlin, Germany; 5Helmholtz Centre for Infection Research, Braunschweig, Germany

    Introduction: Multidrug-resistant (MDR) bacteria are an increasing problem in intensive care units. Lung infections caused by Acinetobacter baumannii are frequently difficult to treat. Phages have regained attention as a treatment option for bacterial infections owing to their specificity and lytic efficacy. The aim of this preclinical study was to determine the efficacy and safety of a novel phage preparation in mice.

    Methods: Mice were transnasally infected with an MDR A. baumannii strain [1] and treated intratracheally 12 hours later with a specific phage preparation or solvent. Phage Acibel004 [2] was produced as a suspension with efficient depletion of endotoxins. At defined time points, clinical parameters, bacterial burden in lung and bronchoalveolar lavage fluid (BALF) and cell influx were determined. Furthermore, lung permeability and cytokine release were quantified and histopathological examination was performed.

    Results: Mice treated with phages recovered faster from infection-associated hypothermia. 48 hours after infection, phage treatment led to a reduction in bacterial loads in lungs and BALF. In addition, lung permeability and cytokine production were reduced in phage-treated mice. Histopathological examination of the lungs showed less spreading of bacteria to the periphery in phage-treated mice, whereas cellular recruitment into the lung was unaffected. No adverse effects were observed.

    Conclusions: For the first time a highly purified phage against A. baumannii was successfully used in vivo. The current preclinical data support the concept of a phage-based therapy against pulmonary A. baumannii infections.


    1. Knapp S et al. Am J Respir Crit Care Med 173(1):122-129, 2006

    2. Merabishvili M et al. PLoS One 9(8):e104853, 2014

    P078 Efficacy of phage therapy against lethal methicillin-resistant Staphylococcus aureus (MRSA) ventilator-associated pneumonia (VAP) in rats

    J Prazak1, P Reichlin2, D Grandgirard1, G Resch3, M Qiao1, S Nansoz1, Y Que1, M Haenggi1

    1Inselspital, University of Bern, Bern, Switzerland; 2Medical School, University of Bern, Bern, Switzerland; 3University of Lausanne, Lausanne, Switzerland

    Introduction: VAP is common in critically ill patients and associated with high morbidity and mortality, especially when caused by antibiotic resistant bacteria. Recently, phage therapy has emerged as a promising non-antibiotic based treatment of antibiotic resistant bacterial infections. However, proof-of-concept experimental and clinical studies are missing before its wider use in clinical medicine. The goal of this experimental study was to compare the efficacy of phage therapy versus antibiotics for the treatment of MRSA in a rat model of VAP.

    Methods: Four hours after intubation and protective ventilation, rats were inoculated via the endotracheal tube with 6-9 x 10^9 CFU (LD100) of the MRSA clinical isolate AW7. The animals were subsequently extubated. Two hours after bacterial challenge, rats were randomised to receive intravenously either teicoplanin (n=8), a cocktail of 4 lytic anti-S. aureus bacteriophages (n=9) or a combination of both (n=6). Five animals served as controls (no treatment). Survival at 96 hours was the primary outcome. Secondary outcomes were bacterial counts in lungs, spleen and blood. Kaplan-Meier estimates of survival were computed and multiple comparisons of survival rates performed using the Holm-Sidak method.

    Results: Treatment with phages, antibiotics or the combination of both significantly increased survival (66%, 50% and 50% respectively, compared to 0% for controls, p<0.05). There were no statistical differences in survival rates between the treatment arms (Fig. 1). Treatment hindered systemic extension of the infection into the blood and spleen without impacting bacterial counts within the lungs, but the numbers were too small for statistical testing (Table 1).
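
    The multiple survival-rate comparisons above were adjusted with the Holm-Sidak method. A minimal pure-Python sketch of that step-down adjustment; the raw p-values below are hypothetical placeholders, not the study's actual pairwise results.

```python
# Holm-Sidak step-down adjustment for a family of pairwise comparisons.
def holm_sidak(pvals):
    """Return Holm-Sidak adjusted p-values in the original order."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    adjusted = [0.0] * m
    running_max = 0.0
    for rank, idx in enumerate(order):
        # Sidak correction for the m - rank hypotheses still in play
        adj = 1.0 - (1.0 - pvals[idx]) ** (m - rank)
        running_max = max(running_max, adj)  # enforce monotonicity
        adjusted[idx] = min(1.0, running_max)
    return adjusted

# Hypothetical raw p-values: phage vs control, antibiotic vs control,
# phage vs antibiotic
print(holm_sidak([0.01, 0.04, 0.20]))
```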

    Conclusions: Phage therapy performed as well as teicoplanin for the treatment of MRSA VAP when administered intravenously. Further experiments are needed to investigate whether phages administered via aerosol could clear the infection within the lungs.

    Fig. 1 (abstract P078).

    Effect of treatment with phages, antibiotics or combination of both on survival rate

    Table 1 (abstract P078). Bacterial count in lung, spleen and blood at euthanasia. Median [IQR]

    P079 Influence of amikacin inhalation on the efficacy of treatment of ventilator-associated pneumonia and ventilator-associated tracheobronchitis caused by multi-drug resistant gram-negative bacteria: comparative study

    A Yaroshetskiy1, N Rezepov2, I Mandel3, V Khodak4, V Konanykhin3

    1Pirogov Russian National Research Medical University, Moscow, Russia; 2City Clinical Hospital 1 67, Moscow, Russia; 3Sechenov University, Moscow, Russia; 4City Hospital 135, Nizhniy Novgorod, Russia

    Introduction: The aim of the study was a comparative evaluation of the clinical and microbiological efficacy of amikacin via an Aeroneb Pro nebuliser combined with standard antimicrobial therapy (AMTcomb) versus standard antimicrobial therapy alone (AMTst) in the treatment of ventilator-associated pneumonia (VAP) and ventilator-associated tracheobronchitis (VAT) caused by multi-drug resistant gram-negative bacteria.

    Methods: This prospective two-center study with retrospective control included patients with VAP and VAT. In the AMTst group (retrospective, n=25) we used a combination of meropenem 1 g every 8 h iv as continuous infusion, cefoperazone/sulbactam 4 g every 12 h iv as continuous infusion and amikacin 1 g iv every 24 h. In the AMTcomb group (prospective, n=25) we used the combination of AMTst and amikacin inhalation 500 mg every 12 h through an Aeroneb Pro nebuliser.

    Results: The clinical cure rate was 84% in AMTcomb versus 29.2% in AMTst (p<0.001); the Clinical Pulmonary Infection Score (CPIS) on day 7 was 6 (4-7) points in AMTst and 2 (0-4) points in AMTcomb (p<0.001). Recurrence of VAP/VAT was 29.2% in AMTst and 12.5% in AMTcomb (p=0.008). On day 7 the infectious agent titer in tracheal aspirate was 10^7 (10^3-10^8) CFU/mL in the AMTst group versus 10^3 (no growth-10^6) CFU/mL in AMTcomb (p=0.016). Microbiological eradication was observed in 13 patients in AMTcomb vs 1 patient in AMTst, and microbiological persistence in 6 patients in AMTcomb vs 17 patients in AMTst (p=0.002). In AMTcomb, sputum was less purulent on day 3 (p=0.016). Amikacin nebulisation did not lead to deterioration of organ dysfunction: on day 7 there was no difference in platelet count, creatinine or bilirubin levels compared to day 0 (p=0.102, p=0.297 and p=0.532, respectively).

    Conclusions: The addition of amikacin inhalation 500 mg every 12 h through an Aeroneb Pro nebuliser in patients with VAP and VAT was more efficacious than intravenous standard antimicrobial treatment alone, with a comparable safety profile.

    P080 Aerosolized colistin is an effective adjunct to systemic antibiotics in ventilator-associated pneumonia

    A Kuzovlev, A Shabanov, A Goloubev, V Moroz, A Grechko

    Federal Research and Clinical Center of Intensive Care Medicine and Rehabilitology, V.A. Negovsky research institute of general reanimatology, Moscow, Russia

    Introduction: The aim of the study was to assess the effectiveness of inhaled colistin (IC) as an adjunct to systemic antibiotics in the treatment of ventilator-associated pneumonia (VAP).

    Methods: 110 ICU patients with VAP were enrolled in this observational study. Resolution of VAP was assessed as the primary endpoint; eradication of pathogens in sputum, weaning time, duration of ICU stay and mortality were assessed as secondary outcomes. Patients were split into 2 groups: Gr. 1 (n = 60) - addition of IC to systemic antibiotics without changing the basic regimen; Gr. 2 (n = 50) - change of systemic antibiotics according to sensitivity. The groups were comparable. IC was administered at a dose of 2 million IU TID (Xselia Pharmaceuticals ApS, Denmark). Statistical analysis was performed using Statistica 7.0 (M, σ, Newman-Keuls test; p < 0.05).

    Results: The VAP resolution rate was 77% in Gr. 1 (vs. 50% in Gr. 2, p = 0.0295); eradication of pathogens from sputum by the 7th day of treatment was achieved in 80% of Gr. 1 and 60% of Gr. 2 (n = 12) (p > 0.05); weaning from ventilation was possible earlier in Gr. 1 than in Gr. 2 (7.8±1.3 days vs. 10.9±4.5 days, p = 0.0000); and the duration of ICU stay was shorter in Gr. 1 than in Gr. 2 (11.5±3.2 days vs. 17.1±2.3 days, p = 0.0000). No mortality differences were detected.

    Conclusions: Administration of inhaled colistin 2 million IU TID is effective as an adjunct to systemic antibiotics in the treatment of VAP. This modified treatment promotes more rapid resolution of VAP, earlier weaning from the ventilator and a shorter ICU stay, with no impact on mortality. The addition of IC to systemic antibiotics should be considered as a second-line regimen in VAP patients.

    P081 Factors associated with no de-escalation of empirical antimicrobial therapy in ICU settings with high rate of multi-drug resistant bacteria

    C Routsi1, K Arvaniti1, G Poulakou1, A Tourtoglou1, A Goufa1, A Vemvetsou1, A Kanavou1, E Hasou1, E Antoniadou1, C Nikolaou1, K Ntorlis1, S Kokkoris1, V Theodorou1, S Vasilliagkou2, H Giamarellou2

    1National and Kapodistrian University of Athens, Athens, Greece; 2Hellenic Society of Chemotherapy Study Group, Athens, Greece

    Introduction: De-escalation is recommended in the management of antimicrobial therapy in ICU patients [1]. However, this strategy has not been adequately evaluated in the presence of increased prevalence of multidrug-resistant (MDR) bacteria. The aim of this study was to identify factors associated with no de-escalation in ICUs with high rate of MDR bacteria [2].

    Methods: Prospective, multicenter study conducted in 12 Greek ICUs over a 1-year period. Patients with laboratory confirmed infections were included. SOFA score on admission, on septic episode and thereafter every 24 h over 14 days, infection site(s), culture results, antimicrobial therapy, and mortality were recorded. Only the first septic episode was analyzed. In order to assess the factors associated with no de-escalation, a multivariate analysis was performed.

    Results: A total of 211 patients (admission SOFA score 10±3) were analyzed; 43% had a septic episode on ICU admission and 57% had an ICU-acquired episode. De-escalation was applied in 44 (21%) patients; among the remainder, it was not feasible in 75 patients (44%) because of the recovery of MDR pathogens, and was not applied, although the microbiology results allowed it, in 92 patients (56%). Septic shock on the day of the septic episode was present in 67% and 79% of patients with and without de-escalation, respectively (p=0.072). Compared to no de-escalation, a de-escalation strategy was associated with a shorter duration of shock (4±5 vs. 9±7 days, p<0.001) and lower all-cause mortality (15.4% vs. 46.4%, p<0.001). Multivariate analysis showed that the variables associated with no de-escalation were a deteriorating clinical course, as indicated by an increasing SOFA score (OR 14.7, p<0.001), and the impossibility of de-escalation due to recovery of MDR pathogens (OR 27.3, p=0.008).

    Conclusions: Deteriorating clinical course and MDR pathogens are independently associated with no de-escalation strategy in critically ill patients.


    1. Rhodes A et al. Intensive Care Med 43:304, 2017

    2. Dimopoulos G et al. Minerva Anestesiol 81:405, 2015

    P082 Corticosteroid treatment in patients with severe influenza pneumonia: a propensity score matching analysis

    G Moreno1, A Rodríguez1, LF Reyes2, J Sole-Violan3, E Díaz4, M Bodí1, S Trefler1, J Guardiola5, JC Yébenes6, A Soriano7, I Martín-Loeches8, MI Restrepo2

    1Hospital Universitari Joan XXIII, Tarragona, Spain; 2University of Texas Health at San Antonio, San Antonio, TX, USA; 3Hospital Dr. Negrín, Gran Canaria, Spain; 4Hospital Parc Tauli, Sabadell, Spain; 5University of Louisville and Robley Rex VA Medical Center, Louisville, Kentucky, USA; 6Hospital de Mataró, Mataró, Spain; 7Hospital Clinic, Barcelona, Spain; 8Multidisciplinary Intensive Care Research Organization (MICRO), St James’s University Hospital, Trinity Centre for Health Sciences, Dublin, Ireland

    Introduction: Patients with influenza infection can develop acute respiratory failure and ultimately ARDS. Corticosteroids have been used broadly as immunomodulatory therapy despite the lack of evidence supporting their effect in patients with influenza infection. Our aim was therefore to determine the clinical predictors associated with corticosteroid administration and its association with ICU mortality.

    Methods: This is a secondary analysis of a prospective cohort study of critically ill patients with confirmed influenza pneumonia admitted to 148 ICUs in Spain between June 2009 and April 2014. Patients were stratified according to treatment with systemic corticosteroids administered within the first 24 hours of hospital admission. We used propensity-score matching (PSM) to reduce confounding when assessing the association with corticosteroid administration. The primary outcome was ICU mortality. Cox proportional hazards analysis was performed to investigate the association between baseline characteristics and steroid use, with ICU mortality as the dependent variable.
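
    Propensity-score matching, as named in the Methods, typically pairs each treated patient with the untreated patient whose propensity score is closest. A minimal sketch of the greedy 1:1 matching step, assuming propensity scores have already been estimated (in the study they would come from a model of corticosteroid use on baseline covariates); all identifiers and scores below are hypothetical.

```python
# Greedy 1:1 nearest-neighbour propensity-score matching with a caliper.
def greedy_match(treated, control, caliper=0.1):
    """Match each treated patient to the nearest unused control.

    treated/control: lists of (patient_id, propensity_score).
    Returns a list of (treated_id, control_id) pairs within the caliper.
    """
    pairs = []
    available = list(control)
    for t_id, t_ps in sorted(treated, key=lambda x: x[1]):
        if not available:
            break
        c_id, c_ps = min(available, key=lambda x: abs(x[1] - t_ps))
        if abs(c_ps - t_ps) <= caliper:
            pairs.append((t_id, c_id))
            available.remove((c_id, c_ps))  # each control used at most once
    return pairs

treated = [("T1", 0.62), ("T2", 0.35)]
control = [("C1", 0.60), ("C2", 0.36), ("C3", 0.90)]
print(greedy_match(treated, control))
```

    Outcomes (e.g. ICU mortality) are then compared within the matched pairs only.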

    Results: The total population comprised 1,846 patients with H1N1 pneumonia; corticosteroids were administered in 604 (32.7%) patients, with methylprednisolone the most frequently used agent (578/604 [95.7%]). The median daily dose was equivalent to 80 mg (IQR 60-120) for a median duration of 7 days (IQR 5-10). Asthma, COPD, hematological disease and the requirement for mechanical ventilation were independently associated with corticosteroid use (p<0.05). After PSM adjustment, patients with H1N1 pneumonia who received corticosteroids had higher ICU mortality than patients not treated with corticosteroids (27.5% vs. 18.8%; HR 1.44, 1.18-1.76; p=0.005) (Fig. 1).

    Conclusions: Administration of corticosteroids in patients with severe influenza pneumonia was associated with increased ICU mortality. We therefore advocate caution among physicians using these medications for influenza pneumonia.



    Fig. 1 (abstract P082).

    Cox hazards regression analysis

    P083 Tracheostomy dependence increases the risk of medical emergency team activation in pediatric patients

    B McKelvie1, A Lobos2, J Chan2, F Momoli2, J McNally2

    1London Health Sciences Centre - Children’s Hospital, London, Canada; 2Children’s Hospital of Eastern Ontario, Ottawa, Canada

    Introduction: The purpose of this study was to determine if pediatric inpatients with tracheostomies (PTIs) were at increased risk of medical emergency team (MET) activation compared to other ward patients. The requirement for a tracheostomy confers a significant risk for morbidity and mortality [1]. METs have been implemented to improve the detection and management of patients at risk for clinical deterioration. Based on their risk factors, we hypothesized that PTIs would be at higher risk of clinical deterioration than other patients and so would have higher MET activation rates.

    Methods: This retrospective cohort study was conducted at a tertiary pediatric hospital, the Children’s Hospital of Eastern Ontario (CHEO), in Ottawa, Canada. PTIs were identified using lists from subspecialty services, decision support and the operating room. MET activation data were obtained from a prospectively maintained database.

    Results: From 2008 to 2014 there were 42,041 admissions, including 264 involving PTIs. MET activations occurred in 1260 distinct admissions, including 33 PTI admissions. In the PTI group, the MET activation rate was significantly higher than in other ward patients (14 vs 2.9 per 100 admissions, p<0.001), and PTIs had 5.7 times increased odds of MET activation (OR 5.7, 95% CI 3.6-9.1). Almost all PTIs required intervention during the MET activation (94.6%) and 21.6% were admitted to the PICU.

    Conclusions: PTIs are at significantly higher risk for MET activation. After MET activation, almost 80% of PTIs remained on the wards, which suggests that the MET helped manage deterioration in these patients. Targeted strategies need to be developed to reduce the risk of PTIs on the wards, and based on our results the MET may play a central role in improving care for these patients.


    1. Watters K et al. Laryngoscope 126:2611–2617, 2016

    P084 qSOFA versus SIRS versus SOFA for predicting sepsis and adverse outcomes of patients in the intensive care unit. Preliminary report of Russian national study

    M Astafeva1, V Rudnov1, V Kulabukhov2, V Bagin1

    1City Clinical Hospital №40, Yekaterinburg, Russia; 2AV Vishnevsky Institute of Surgery, Moscow, Russia

    Introduction: In 2016, sepsis was redefined as a life-threatening organ dysfunction caused by infection. This concept, called Sepsis-3 [Singer M et al], defines only two stages - sepsis and septic shock. In addition, it was suggested that quick SOFA (qSOFA) may be a better predictor of in-hospital mortality than the SOFA score [Seymour CW et al]. However, the applicability of the Sepsis-3 and qSOFA concepts in low/middle-income countries is still unknown.

    Methods: The aim of our study was to define the role of qSOFA, SIRS and SOFA in predicting sepsis and adverse outcomes of patients in ICUs in Russia. Design: prospective observational multicenter study. We performed ROC analysis to estimate the sensitivity, specificity and predictive value of SIRS, qSOFA and SOFA for identifying sepsis and predicting ICU mortality.

    Results: The study included 335 patients from 22 ICUs in Russia. Sepsis, according to the Sepsis-3 criteria, was identified in 179 (53%) patients, and septic shock in 82 (24%) patients; 102 (30%) patients died during their ICU stay. For the prognosis of sepsis, the qSOFA scale, SIRS and SOFA demonstrated AUROCs of 0.683 (95% CI 0.626-0.736), 0.719 (95% CI 0.674-0.764) and 0.763 (95% CI 0.711-0.816), respectively (only the AUROCs of qSOFA vs SOFA differed significantly, p<0.01). For the prognosis of mortality, the qSOFA scale, SIRS and SOFA demonstrated AUROCs of 0.736 (95% CI 0.682-0.791), 0.594 (95% CI 0.546-0.642) and 0.843 (95% CI 0.798-0.889), respectively (all AUROCs differed significantly from each other, p<0.01). For the break-point of a qSOFA score >1 in the prognosis of mortality, the specificity was 65.2% and the sensitivity 70.6%.
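
    The AUROC values above can be computed directly from patient-level scores: the AUC equals the probability that a randomly chosen patient with the outcome scores higher than a randomly chosen patient without it (the Mann-Whitney formulation, with ties counting one half). A minimal sketch; the scores below are hypothetical SOFA-like values, not study data.

```python
# AUC of a score for a binary outcome via pairwise comparison.
def auc(event_scores, nonevent_scores):
    """Probability that a random event case outscores a random
    non-event case; ties count 0.5."""
    wins = 0.0
    for e in event_scores:
        for n in nonevent_scores:
            if e > n:
                wins += 1.0
            elif e == n:
                wins += 0.5
    return wins / (len(event_scores) * len(nonevent_scores))

died     = [9, 12, 7, 10]   # hypothetical scores of non-survivors
survived = [4, 6, 7, 3, 5]  # hypothetical scores of survivors
print(f"AUC = {auc(died, survived):.3f}")
```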

    Conclusions: In the prognosis of sepsis, the qSOFA scale does not differ significantly from the SIRS criteria, but in the prognosis of mortality it is significantly better than SIRS. qSOFA is significantly worse than the SOFA scale in the prognosis of both sepsis and death.

    P085 Detection of sepsis by qSOFA and SIRS in patients with infection in general wards

    J Luo, W Jiang, L Weng, J Peng, X Hu, C Wang, G Liu, H Huang, B Du

    Peking Union Medical College Hospital, Beijing, China

    Introduction: The international Sepsis-3 task force introduced the quick Sequential Organ Failure Assessment (qSOFA) score to supersede the systemic inflammatory response syndrome (SIRS) score as the screening tool for sepsis. The objective of this study was to prospectively assess the diagnostic value of qSOFA and SIRS among patients with infection in general wards.

    Methods: A prospective cohort study was conducted in ten general wards of a tertiary teaching hospital. Over a six-month period, consecutive patients who were admitted with infection or developed infection during their hospital stay were included. Demographic data and all variables for the qSOFA, SIRS and SOFA scores were collected. We recorded daily qSOFA, SIRS and SOFA scores until hospital discharge, death, or day 28, whichever occurred earlier. The primary outcome was sepsis at 28 days. Discrimination was assessed using the area under the receiver operating characteristic curve (AUROC) and sensitivities and specificities at the conventional cutoff value of 2.

    Results: Of 409 patients (median age 55 years [IQR 40-67]; male 225 [55%]; most common diagnosis pneumonia, 234 [57%]) identified with infection in general wards, 229 (56%) developed sepsis at a median of 0 (IQR 0-1) days; 146 patients (36%) and 371 patients (91%) met qSOFA and SIRS criteria at a median of 1 (IQR 0-5) and 0 (IQR 0-0) days, respectively. qSOFA performed better than SIRS in diagnosing sepsis, with an AUROC of 0.75 (95% CI 0.71-0.79) vs 0.69 (95% CI 0.64-0.74). With the conventional cutoff value of 2, qSOFA had lower sensitivity (53% [95% CI 47%-60%] vs. 98% [95% CI 95%-99%], p < 0.001) and higher specificity (87% [95% CI 81%-91%] vs. 18% [95% CI 13%-23%], p < 0.001) than SIRS (Table 1).
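
    The sensitivities and specificities above follow from dichotomising each score at the cutoff of 2 and tabulating screen-positive status against the sepsis outcome. A minimal sketch; the 2x2 counts below are hypothetical, chosen only to be roughly consistent with the reported 53%/87% qSOFA figures, and are not the study's actual tables.

```python
# Sensitivity and specificity from a 2x2 diagnostic table.
def sens_spec(tp, fn, tn, fp):
    """tp/fn: septic patients screen-positive/-negative;
    tn/fp: non-septic patients screen-negative/-positive."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Hypothetical qSOFA >= 2 table: 122 of 229 septic patients screen-positive,
# 156 of 180 non-septic patients screen-negative
sens, spec = sens_spec(tp=122, fn=107, tn=156, fp=24)
print(f"sensitivity = {sens:.2f}, specificity = {spec:.2f}")
```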

    Conclusions: Among patients with infection in general wards, the use of qSOFA resulted in greater diagnostic accuracy for sepsis than SIRS during hospitalization. qSOFA and SIRS scores can predict the occurrence of sepsis with high specificity and high sensitivity, respectively.

    Table 1 (abstract P085). Diagnostic performance of qSOFA and SIRS scores for sepsis-3

    P086 Prognostic accuracy of quick sequential organ failure assessment (qSOFA) score for mortality: systematic review and meta-analysis

    B Brandao Barreto1, M Luz2, D Gusmao-Flores1

    1Pós-graduação em Medicina e Saúde, Salvador, Brazil; 2Hospital da Mulher, Salvador, Brazil

    Introduction: The purpose of this study was to summarize the evidence assessing the qSOFA [1], calculated on admission to the emergency department (ED) or intensive care unit (ICU), as a predictor of mortality. The hypothesis was that this tool has good predictive performance.

    Methods: Systematic review and meta-analysis of studies assessing qSOFA as a prediction tool for mortality, found in the PubMed, OVID, EMBASE, SCOPUS and EBSCO databases from inception until November 2017. The primary outcomes were mortality (ICU mortality, in-hospital mortality, 30- and 90-day mortality). Studies reporting the sensitivity and specificity of qSOFA, making it possible to construct a 2x2 table, were included. The log diagnostic odds ratio (lnDOR) was summarized following the approach of DerSimonian and Laird using the software R (‘mada’ package). The summary ROC curve was created using the Reitsma (bivariate) model. RevMan 5 software was used to organize the data.

    Results: The search strategy yielded 266 citations. Of 134 unique citations, 48 met the inclusion criteria (426,618 patients). The sensitivity and specificity of each study are shown in Fig. 1. The meta-analytic DOR was 4.838 (95% confidence interval (CI): 3.808-6.146) and the lnDOR 1.576 (95% CI: 1.337-1.816) (Fig. 2). The pooled area under the summary receiver operating characteristic (SROC) curve was 0.717. By bivariate diagnostic random-effects meta-analysis, the summary estimate of sensitivity was 0.55 and the false positive rate 0.209. The chi-square goodness-of-fit test rejected the assumption of homogeneity, and the model allowing for heterogeneity fitted better (p = 0.3661).
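
    The pooling above rests on two steps named in the Methods: each study contributes a log diagnostic odds ratio (lnDOR) with its standard error, and the studies are combined with the DerSimonian-Laird random-effects model. A minimal sketch of both steps; the three 2x2 tables below are hypothetical, not the 48 included studies.

```python
import math

def ln_dor(tp, fp, fn, tn):
    """lnDOR and its SE from a 2x2 table, with a 0.5 continuity correction."""
    tp, fp, fn, tn = (x + 0.5 for x in (tp, fp, fn, tn))
    y = math.log((tp * tn) / (fp * fn))
    se = math.sqrt(1 / tp + 1 / fp + 1 / fn + 1 / tn)
    return y, se

def dersimonian_laird(effects):
    """Pool (estimate, SE) pairs with a random-effects model."""
    y = [e for e, _ in effects]
    w = [1 / se ** 2 for _, se in effects]
    k = len(effects)
    fixed = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, y))
    # Method-of-moments between-study variance, truncated at zero
    tau2 = max(0.0, (q - (k - 1)) / (sum(w) - sum(wi ** 2 for wi in w) / sum(w)))
    w_star = [1 / (se ** 2 + tau2) for _, se in effects]
    pooled = sum(wi * yi for wi, yi in zip(w_star, y)) / sum(w_star)
    se_pooled = math.sqrt(1 / sum(w_star))
    return pooled, se_pooled

# Hypothetical per-study 2x2 tables: (TP, FP, FN, TN)
studies = [ln_dor(*t) for t in [(40, 20, 35, 120), (25, 10, 30, 90), (60, 45, 50, 200)]]
pooled, se = dersimonian_laird(studies)
print(f"pooled lnDOR = {pooled:.3f} "
      f"(95% CI {pooled - 1.96 * se:.3f} to {pooled + 1.96 * se:.3f})")
```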

    Conclusions: The qSOFA has a poor performance to predict mortality in patients admitted to the ED or ICU.


    1. Singer M et al. JAMA 315:801-810, 2016

    Fig. 1 (abstract P086).

    Sensitivity and specificity of all included studies

    Fig. 2 (abstract P086).

    Forest plot lnDOR

    P087 Can early changes in SOFA score predict final outcome? An analysis of two cohorts

    E Karakike1, I Tsangaris1, A Savva1, C Routsi1, K Leventogiannis1, M Tsilika1, V Apollonatou1, I Tomos1, JL Vincent2, EJ Giamarellos-Bourboulis1

    1National and Kapodistrian University of Athens, Athens, Greece; 2Erasme Hospital, Université Libre de Bruxelles, Brussels, Belgium

    Introduction: Based on the new Sepsis-3 definitions, it may be hypothesized that changes from baseline SOFA (Sequential Organ Failure Assessment) score (δSOFA) could become a measure of treatment efficacy. To this end, the earliest time point in the sepsis course at which these changes are seen should be specified. This was attempted in the present study through test and validation cohorts.

    Methods: We used clinical data from two randomized clinical trials in which intravenous clarithromycin was compared to placebo in patients with Gram-negative infections meeting the 1991 sepsis definitions [1, 2]. We identified those patients who retrospectively met the Sepsis-3 definitions; 449 patients were in the test cohort [1] and 199 in the validation cohort [2]. δSOFA on days 2, 3, 5, 7, 14 and 28 was calculated. The areas under the ROC curves (AUC) were calculated to define the association with 28-day mortality.
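The δSOFA outcome criterion used below can be sketched as follows (an illustrative Python sketch, with hypothetical function names, under the assumption that the flagged criterion is a fall of less than 25% from the baseline SOFA):

```python
def delta_sofa(sofa_baseline, sofa_followup):
    """dSOFA: change from the baseline SOFA score (negative = improvement)."""
    return sofa_followup - sofa_baseline

def below_25pct_decrease(sofa_baseline, sofa_followup, threshold=0.25):
    """True if SOFA fell by less than `threshold` (default 25%) of baseline,
    the day-7 criterion associated with 28-day mortality in this study."""
    if sofa_baseline == 0:
        return False  # no baseline organ dysfunction to improve on
    decrease = (sofa_baseline - sofa_followup) / sofa_baseline
    return decrease < threshold
```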

    Results: The AUCs on follow-up days (Table 1) indicated that day 7 was the earliest time-point at which changes predicted outcome. On that day, a less than 25% decrease of SOFA was associated with 87.1% sensitivity for 28-day mortality (odds ratio 14.44; p=3.13 x 10^-22). The sensitivity was 94.9% in the validation cohort (odds ratio 6.95; p=2.1 x 10^-4).

    Conclusions: A less than 25% decrease of SOFA by day 7 is associated with 28-day mortality and could guide decision-making in future sepsis trials.

    The study was supported by the Horizon 2020 Marie-Curie Grant European Sepsis Academy.


    1. Giamarellos-Bourboulis EJ et al. J Antimicrob Chemother 69: 1111-8, 2014

    2. Giamarellos-Bourboulis EJ et al. Clin Infect Dis 46: 1157-64, 2008

    Table 1 (abstract P087). Change of SOFA to discriminate final outcome over follow-up

    P088 Delta SOFA scores for prediction of treatment outcome in sepsis and septic shock patients

    T Lertwattanachai1, P Dilokpattanamongkol1, V Tangsujaritvijit2

    1Faculty of Pharmacy, Mahidol University, Bangkok, Thailand,2Faculty of Medicine Ramathibodi Hospital, Mahidol University, Bangkok, Thailand

    Introduction: Sepsis and septic shock are the most common causes of death in intensive care units [1]. The aim of this study was to quantify the relationship between the change in Sequential Organ Failure Assessment (SOFA) score over 72 hours and in-hospital mortality as a treatment outcome in sepsis and septic shock patients.

    Methods: A retrospective cohort study was conducted in a tertiary hospital in Thailand. Sepsis or septic shock patients receiving carbapenems in the medical intensive care unit from April to September 2017 were recruited. Delta SOFA scores, calculated as SOFA on day 4 minus SOFA on day 1 after the first dose of carbapenem, and in-hospital mortality were collected.

    Results: There were 86 adult patients (54.7% men, mean age 66 ± 17 years, 70.9% septic shock) during the study period. The in-hospital mortality rate was 43%. The mean delta SOFA score was -0.060 ± 3.572 points. Comparing the two groups, the mean delta SOFA score was significantly lower in the survivor group than in the non-survivor group (-1.140 ± 3.116 vs. 1.380 ± 3.669; p < 0.001).

    Conclusions: The delta SOFA score during the first few days was a useful predictor of in-hospital mortality in critically ill patients with sepsis and septic shock. An increase in the delta SOFA score during the first 72 hours of treatment should prompt reassessment of the adequacy of antibiotic therapy.


    1. Bassetti M et al. Intensive Care Med. 44:73-75, 2017

    P089 Experience of having no set criteria for an outreach (rapid response) team in a new hospital

    H Tan, J Teh, F Lee, G Soo, N Horsahamay, S Tan, Y Lee, F Khan

    Ng Teng Fong General Hospital, Singapore, Singapore

    Introduction: An Outreach Team, akin to a Rapid Response Team, is made up of healthcare professionals assembled for the quick and effective review and management of rapidly deteriorating or gravely deteriorated patients [1]. This study aimed to examine the variety of patient referrals in terms of severity, patient demographics, reasons for referral and subsequent disposition.

    Methods: 258 patient records from July to October 2017 were randomly selected and reviewed retrospectively. Data were collated in an Excel spreadsheet, sorted according to the clinical questions, and percentages were calculated.

    Results: For the 258 referrals, severity was assessed by calculating the National Early Warning Score (NEWS): 51% of patients had a score of 0-4, 23% a score of 5-6, and 26% a score of 7 or more. 50% of patients were aged 61-70 years. 78% of referrals came from the Emergency Department (ED), where a consultant was involved in the referral decision; of these, 46% were referred during office hours (8AM to 5PM), when there was greater manpower to aid management. 19% of referrals came from inpatients on the general wards; 32% of these were made during office hours. 65% of referrals were transferred to IC/HD upon review; of the 35% who were not, 9 died and 7 were later admitted, either after procedures (2%) or because they deteriorated further (1%). For reasons for referral and disposition decisions, see Fig. 1.

    Conclusions: Despite having no set criteria for Outreach Team referrals, nearly 65% of referrals based on clinician concern resulted in admission to IC/HD. There was only a 1% re-admission rate on re-review among patients initially deemed unsuitable for IC/HD admission. Referrals were therefore made accurately and safely under a protocol of open clinician referral directly to IC consultants.


    1. DeVita MA et al. BMJ Quality & Safety 13: 251-254, 2004

    Fig. 1 (abstract P089).

    Consort diagram of reasons for referral and various dispositions

    P090 A novel model for early detection of patient deterioration in ICU

    Y Lichter1, D Stavi1, U Keler2, IM Pessach3, H Artsi1, N Adi1, I Matot1

    1Tel-Aviv Sourasky Medical Center, Tel Aviv, Israel,2Intensix, Netanya, Israel,3Sheba Medical Center, Safra Children’s Hospital, Tel-Hashomer, Israel

    Introduction: Prompt recognition of patient deterioration allows early initiation of medical intervention with reduction in morbidity and mortality. This digital era provides an opportunity to harness the power of machine learning algorithms to process and analyze big data, automatically acquired from the electronic medical records. The results can be implemented in real-time. Intensix (Netanya, Israel) has developed a novel predictive model that detects early signs of patient deterioration and alerts physicians. In this study we prospectively validated the ability of the model to detect patient deterioration in real time.

    Methods: The model was developed and validated using a retrospective cohort of 9246 consecutive patients admitted to the Intensive Care Unit of the Tel-Aviv Sourasky Medical Center, a tertiary care facility in Israel, between January 2007 and December 2015. In this study, we tested model performance in real time on a cohort of 333 patients admitted to the same ICU between June 2016 and August 2017. Significant events that led to major interventions (e.g. intubation, initiation of treatment for sepsis or shock) were tagged upon medical case review by a senior intensivist blinded to model alerts. These tags were then compared with model alerts.

    Results: A total of 136 patients suffered major events during the study period, of whom 109 were detected by the model, resulting in a sensitivity of 0.80, a specificity of 0.93 and a PPV of 0.89. The model AUC-ROC was 0.86.

    System operation and algorithm execution were fluent and reliable.

    Conclusions: We developed a machine-learning model that can reliably recognize patient deterioration in real time. Ongoing research aims at showing improved model validity and verifying its ability to precede clinical detection. Future research is needed to demonstrate positive effect on patient outcome.

    P091 Detecting acute deterioration in hospital – the NEWS is not enough

    T Buttle, P Parulekar, G Glover

    Guys and St Thomas’ NHS Foundation Trust, London, UK

    Introduction: The UK National Early Warning Score (NEWS) is advocated to detect acute deterioration in hospital [1], however NEWS may lack sensitivity to detect all patients requiring Critical Care (CC) admission.

    Methods: An analysis of the Rapid Response Team (RRT) database of a university hospital, covering all acute reviews and emergency calls from 1st March to 31st May 2017. Protocolised RRT calling criteria are NEWS >=5 or a single-parameter score of 3. For patients reviewed by the RRT, the probability of CC admission was calculated at each NEWS level. For admissions not meeting the calling criteria ('low NEWS'), the reason for admission was investigated. Data are shown as median [IQR] or count (%).

    Results: There were 2742 acute RRT reviews for 1009 patients; median [IQR] NEWS 4 [3-6]. 1240 reviews occurred despite a 'low NEWS' (Fig. 1). RRT review led to CC admission in 315 (31.2%) cases; median [IQR] NEWS 5 [3-8]. The probability of admission increased with higher NEWS (Fig. 1); however, 82 admissions had a 'low NEWS'. Of these, 51 were excluded due to a high NEWS trigger in the preceding 24 hrs or post-operative status. The remaining 31 (9.8%) represented genuine low NEWS cases: age 54 [36-64], 50% male, admission APACHE II 12 [8-17] and day 1 SOFA 2 [1-5]. Admission source was emergency department 29%, medical 42%, surgical 29%. Diagnoses are shown in Table 1. No low NEWS patients with sepsis were qSOFA positive. CC length of stay was 2 [1-4] days and ICU mortality was 9.6%.

    Conclusions: A high proportion of RRT activity occurs at low levels of abnormal physiology. Despite an association between NEWS and CC admission, NEWS fails to trigger for approximately one in ten admitted cases. Clinical concern remains an important component of the escalation of acutely ill patients. Meanwhile, novel markers of deterioration should be sought and validated.


    1. Smith GB et al. Resuscitation 84:465-70, 2013

    Table 1 (abstract P091). Diagnostic categories
    Fig. 1 (abstract P091).

    Number of reviews and proportion admitted at each level of NEWS

    P092 The impact of diurnal pattern of rapid response call activation on patients outcome

    J Silvestre, N Candeias, A Ricardo, R Marques, F Brás, J Nunes

    Hospital dos Lusiadas, Lisbon, Portugal

    Introduction: Although rapid response systems are known to reduce in-hospital cardiac arrest rate, their effect on mortality remains debated. The Rapid Response Call (RRC) is a system designed to escalate care to a specialised team in response to the detection of patient deterioration. There are diurnal variations in hospital staffing levels that can influence the performance of rapid response systems and patient outcomes. The objective of this study was to examine the relationship between the time of RRC activations and patient outcome.

    Methods: Review of retrospectively collected, linked clinical and administrative datasets, at a private hospital during a 34-month period. All patients with medical emergency team activation were included. Rapid response calls occurring between 18:00-07:59 were defined as ‘out of hours’.

    Results: Between January 2015 and October 2017 there were 209 RRCs. The triggers for RRC activation were nurse concern (101; 38.3%), modified early warning score (80; 28.3%) and cardiac arrest (28; 13.4%). 44 RRCs were "out of hours", the main activation trigger being a modified early warning score >5. "Out of hours" patients had a higher rate of ICU admission (31.7% versus 20%) and were more likely to have an in-hospital cardiopulmonary arrest (OR=1.4, p<0.002).

    Conclusions: The diurnal timing of RRCs appears to have significant implications for patient outcomes. Out of hours calls are associated with a poorer outcome. This finding has implications for staffing and resource allocation.

    P093 Sepsis in medical and surgical patients: a 6-year retrospective study

    G Zani, F Di Antonio, M Pichetti, A Garelli, C Gecele, F Menchise, M Valbonetti, S Mescolini, E Graziani, C Nencini, FD Baccarini, M Fusari

    Santa Maria delle Croci Hospital, Ravenna, Italy

    Introduction: Sepsis is life-threatening organ dysfunction caused by a dysregulated host response to infection [1, 2]. We compared organ failure incidence and evolution in medical versus surgical septic patients in ICU.

    Methods: Septic patients admitted to a general ICU from 2012 to 2017 were retrospectively analyzed for: SAPS II, SOFA score at ICU admission and worst value during the ICU stay, site and severity of infection, duration of MV, need for and timing of tracheotomy, need for and duration of vasoactive drugs, need for RRT, ICU-acquired infections, ICU and post-ICU LOS, and outcome. Trauma and neurological patients were excluded. A p value <0.05 was considered significant.

    Results: 956 septic patients were enrolled: 56% medical and 44% surgical. Medical patients were younger (66 vs 70 years, p<0.05) and had a worse SAPS II (53 vs 49, p<0.05). At ICU admission the SOFA score was higher in medical patients (9 vs 7, p<0.05), due primarily to neurological and renal dysfunction. During the ICU stay, medical patients showed haemodynamic worsening (8% increase in shock, p<0.05). Moderate ARF was prevalent in both groups; surgical patients had a higher need for MV (96% vs 83%, p<0.05), but of shorter duration than medical patients, who were more often treated with tracheotomy (26% vs 15%, p<0.05). AKI was more severe in medical patients and worsened in both groups, with no difference in the need for RRT. Targeted antibiotic therapy was more frequent in medical patients (63% vs 35%, p<0.05), but no differences emerged in duration or superinfections. Medical patients had a longer ICU LOS (8 vs 6 days, p<0.05) and a higher ICU mortality rate (26% vs 17%, p<0.05); they had a shorter post-ICU LOS (11 vs 16 days, p<0.05) with a higher but not significant in-hospital mortality rate (37% vs 33%).

    Conclusions: Septic medical patients had a worse outcome than surgical ones, probably related to a more severe clinical state at ICU admission and worsening organ function.


    [1] Rhodes A et al. Intensive Care Med 43:304-377,2017

    [2] Michael D et al. JAMA 317:847-848,2017

    P094 The use of the medication based disease burden index and drug counts to quantify comorbidity during intensive care and to predict long term mortality

    C McMechan1, M Booth2, J Kinsella1

    1University of Glasgow, Glasgow, UK,2Glasgow Royal Infirmary, Glasgow, UK

    Introduction: The aims of the project were to use the Medication based Disease Burden Index (MDBI) and drug counts to quantify comorbidity at ICU admission, to assess how comorbidity levels change during a stay in ICU, and to assess how comorbidity affects long-term survival after a stay in ICU. Pharmacy data offer an alternative method of quantifying comorbidity, and the MDBI is one such method that can predict mortality.

    Methods: Data were collected from patients admitted to Glasgow Royal Infirmary ICU between 01/01/14 and 31/12/15. Ethical approval was sought. These data were used to produce an MDBI score and a drug count before and after the ICU stay. Information on long-term mortality was also collected. T tests were used to determine the difference in comorbidity levels pre- and post-ICU. Kaplan-Meier curves were used to establish whether comorbidity affects long-term mortality. Logistic regression was used to determine which method of measuring comorbidity was better at predicting mortality.

    Results: A paired t test performed on 437 patients showed that comorbidity, as measured by the MDBI and drug counts, increases after a stay in ICU. Kaplan-Meier curves demonstrated that as comorbidity increases, long-term survival decreases. Survival time was calculated from the ICU discharge date to the date of death or end of study (31/01/17). The hazard ratio for the high MDBI group was 1.89 compared to the zero MDBI group, and for the high drug count group 1.81 compared to the zero drug count group. Cox proportional hazards models were fitted and results remained significant after adjusting for age. Logistic regression showed that the MDBI was better at predicting long-term mortality than drug counts.

    Conclusions: This study adds to the growing evidence that the MDBI is a useful tool for quantifying comorbidity and predicting long-term mortality. Further research is required to replicate its use in other populations, and potentially other specialties.

    P095 Adverse events among those admitted to the ICU: a retrospective cohort study using administrative data

    KM Sauro, A Soo, HT Stelfox

    University of Calgary, Calgary, Canada

    Introduction: The objective of this study was to estimate the frequency and types of, and factors associated with, adverse events (AEs) among those with an intensive care unit (ICU) admission. AEs are unintended, negative consequences of care that compromise patients' health and are costly to the healthcare system. In Canada, an estimated 7.5 per 100 hospital admissions are associated with an AE, but evidence suggests that the rate of AEs varies by hospital unit and by patient population. However, few studies have examined AEs in the ICU.

    Methods: This retrospective cohort study included patients admitted to 30 adult ICUs and CCUs in Alberta, Canada between May 2014 and April 2017. Validated ICD-10-CA algorithms for 18 patient safety indicators were used to estimate the frequency of any AE and of each type of AE. Regression analysis was used to examine factors associated with AEs.

    Results: Of 49,447 admissions, the typical admission was a 62 (IQR=21) year old male (64%), admitted for a non-surgical cardiac reason (35%). At least 1 AE was experienced by 12,549 (25%) ICU patients during their hospital admission. The most common AEs were respiratory complications (10%) and hospital-acquired infections (9%). Those who were re-admitted to ICU (OR=4.83, 95% CI=4.48, 5.20), admitted for a general surgical vs. non-surgical cardiac reason (OR=9.49, 95% CI=8.84, 10.20) or had >=2 comorbidities (OR=1.82, 95% CI=1.73, 1.92) had increased odds of an AE, while those who spent >50% of their hospital admission in ICU (OR=0.42, 95% CI=0.41, 0.44) had decreased odds of an AE. Those who experienced an AE stayed 5.8 days longer in ICU and 23.5 days longer in hospital, and had an increased risk of hospital mortality (OR=2.41, 95% CI=2.27, 2.55) compared with those who did not.

    Conclusions: AEs are common among patients admitted to ICU, highlighting the need for ongoing quality improvement initiatives to improve the safety of care.

    P096 Clinical characteristics and outcomes of toxic epidermal necrolysis in a Tunisian tertiary critical care unit (ICU)

    I Ben Saida, W Zarrougui, E Ennouri, N Sma, N Fraj, S Kortli, N Fathallah, M Boussarsar

    Farhat Hached Teaching hospital, Sousse, Tunisia

    Introduction: Toxic epidermal necrolysis (TEN) is a rare, potentially life threatening mucocutaneous disease. The aim of the study was to determine clinical characteristics and outcomes of patients admitted to ICU with a diagnosis of TEN.

    Methods: A retrospective study was performed in the ICU of Farhat Hached hospital, Sousse, between January 1995 and September 2017. Data were collected by reviewing the patients' medical charts. A multivariate regression analysis was used to identify risk factors for ICU mortality in these patients.

    Results: A total of 27 patients were recorded. Mean age was 43 years (range, 17 to 76); 19 (70.4%) were male. The median Charlson index was 1 [0-4]. Mean SAPS II was 29.59 ± 16. The average affected skin area was 50.5 ± 28.95% of total body surface area. Mucous membrane involvement was seen in the mouth or pharynx (21, 77.8%), eyes (18, 66.7%) and genital area (15, 55.6%). Nikolsky sign was positive in 25 patients. The most common drugs that triggered TEN were antibiotics (8/27, 29.62%), allopurinol (6/27, 22.22%), anticonvulsants (5/27, 18.51%), non-steroidal anti-inflammatory drugs (3/27, 11.11%), and antipsychotic drugs (1/27, 3.7%). 6 patients (22.2%) required mechanical ventilation, 7 (25.9%) vasoactive drugs and 2 (7.4%) renal replacement therapy. The major complications were acute renal failure (51.9%) and sepsis (29.6%). The mortality rate was 40.6%, much higher than the mortality predicted by the severity-of-illness score for TEN prognosis (SCORTEN). In univariate analysis, predictors of a fatal outcome were invasive mechanical ventilation (p=0.027), vasoactive drugs (p=0.026), acute renal failure (p=0.012), age (p=0.042) and Charlson index (p=0.01). Acute renal failure was the only independent factor for ICU mortality (OR, 19.8; 95% CI, 1.94-201.62; p=0.012).

    Conclusions: The present study demonstrated the severe prognosis of TEN patients. Acute renal failure was identified as the sole independent factor associated with mortality.

    P097 D-dimer and the national early warning score: identifying medical patients at low risk of 30-day mortality in a Danish emergency department

    H Lyngholm1, C Nickel2, J Kellett1, S Chang1, M Brabrand1

    1Hospital of South West Jutland, Esbjerg, Denmark,2University Hospital Basel, Emergency Department, Basel, Switzerland

    Introduction: The aim of this study was to prospectively validate the use of low D-dimer levels in combination with a low National Early Warning Score (NEWS) to identify medical patients at low risk of 30-day mortality in an unselected cohort representative of acutely ill patients normally seen in an emergency department (ED).

    Methods: In this prospective observational study, plasma D-dimer levels and the NEWS of all acutely ill, consenting adult medical patients presenting to the ED at the Hospital of South West Jutland were assessed at arrival. 30-day survival status was extracted from the Danish Civil Registration System, which ensured complete follow-up. Patients were sorted into high and low D-dimer with a cut-off value of 0.50 mg/L, and additionally into high and low NEWS with a cut-off score of 2.

    Results: The final study population consisted of 1516 patients with a median (25th-75th percentile) age of 66 years (52-77), of whom 49.4% were female. 791 (52.2%) patients had a low D-dimer (<0.50 mg/L), of whom 3 (0.38%, 95% CI 0.12-1.17%) died within 30 days; all of these patients had a low NEWS (<2). 725 (47.8%) patients had a high D-dimer (>=0.50 mg/L), of whom 32 (4.4%, 95% CI 3.14-6.18%) died; 12 (37.5%) of these had a low NEWS (<2). Comparing 30-day mortality for patients with high and low D-dimer, the odds ratio was 12.3 (95% CI 3.7-40.3). 14 of the 35 patients (40.0%) who died had a low NEWS at presentation to the ED.

    Conclusions: Low D-dimer levels appear to identify patients at low risk of 30-day mortality. The addition of NEWS does not appear to increase this ability. Further validation is needed.

    P098 Comparison of severity score models based on different sepsis definitions for predicting in-hospital mortality of sepsis patients in medical intensive care unit

    T Songsangjinda, B Khwannimit

    Prince of Songkla University, Hat Yai, Thailand

    Introduction: There are three generations of sepsis definition concepts: Systemic Inflammatory Response Syndrome (SIRS); Predisposition, Insult, Response, Organ dysfunction (PIRO); and Sequential Organ Failure Assessment (SOFA). However, the performance of these concepts has not been compared. The aim of our study was to evaluate and compare the performance of severity score models based on different sepsis definitions in predicting outcomes among sepsis patients.

    Methods: A retrospective analysis was conducted over a 10-year period. The primary outcome was in-hospital mortality and the secondary outcome was the composite of hospital death and ICU stay of more than 72 hours.

    Results: A total of 2,152 sepsis patients were enrolled. Hospital mortality was 45.9%. The mean APACHE II score was 23.9. The SOFA score had the highest performance for predicting hospital mortality, with an area under the receiver operating characteristic curve (AUC) of 0.86, statistically greater than the other scores (p<0.001, Fig. 1). The SOFA and qSOFA also showed good discrimination for the secondary outcome: the AUCs of SOFA (0.76) and qSOFA (0.76) were statistically greater than those of SIRS (0.59, p<0.001) and the PIRO models (Howell 0.72, Rubulotta 0.71, p=0.01). In the subgroup analysis (n=1,239), adding serum lactate (>2 mmol/L) improved qSOFA specificity from 32.4% to 54.1% with comparable sensitivity (96.9% and 94.7%). The performance of Howell's PIRO was not significantly changed.

    Conclusions: The SOFA score had the best performance for predicting hospital mortality among ICU sepsis patients. Our findings support the Sepsis-3 use of SOFA in the ICU setting.

    Fig. 1 (abstract P098).

    Comparison of the area under the receiver operating characteristic curve of all scores for predicting hospital mortality in ICU sepsis patients.

    P099 Trends in infection and sepsis incidence and mortality in Germany

    A Mikolajetz

    Jena University Hospital, Jena, Germany

    Introduction: Sepsis is one of the most prevalent diseases among hospitalized patients and an important contributor to hospital mortality. The study aims to assess trends in infection and sepsis incidence and mortality between 2010-2015 in Germany.

    Methods: We analyzed hospital discharge data from the years 2010-2015 using the Diagnosis-Related Groups Statistics of the German Federal Statistical Office, which contain nearly complete data on all inpatient hospital treatments in Germany. We identified cases of infection, infection with organ dysfunction, sepsis (including severe sepsis and septic shock) and severe sepsis (including septic shock) using ICD-10 codes recorded as primary and secondary discharge diagnoses and procedural OPS codes. We assessed incidences and discharge disposition, including mortality.

    Results: Incidences, mortalities and discharge disposition comparing 2010 and 2015 and the mean annual increase in incidence rates are reported in Tables 1 and 2.

    Conclusions: The annual increase in standardized sepsis incidence rates is greater than for infections, but similar to the increase for infection patients with organ dysfunction, who are less prone to coding incentives than sepsis codes. An increasing number of patients are discharged to nursing homes and hospices. Given the alarming increase in sepsis cases and deaths, this analysis confirms sepsis as a key priority for health care systems.

    Table 1 (abstract P099). Infections, 2010-2015 in Germany: Incidence, mortality and discharge disposition
    Table 2 (abstract P099). Sepsis, 2010-2015 in Germany: Incidence, mortality and discharge disposition

    P100 Initial management of sepsis by day of the week in the north west of England

    W Angus1, P Turton2, E Nsutebu2, C McGrath1

    1Wirral University Teaching Hospital NHS Foundation Trust, Wirral, UK,2Royal Liverpool and Broadgreen University Hospitals NHS Trust, Liverpool, UK

    Introduction: The Advancing Quality Sepsis Programme is an established approach to reducing variation and improving outcomes in the North West of England. It aims to improve clinical care by producing and implementing evidence-based bundles of care across a collaborative network of hospitals. Data are collected, analysed and fed back, enabling monitoring and comparison of the quality of sepsis care in the form of an Appropriate Care Score (ACS), mortality rate and length of stay.

    Methods: Between September 2014 and 2016, data from 25,358 patients who generated an inpatient sepsis code (ICD-10) were collected. Of these, 11,301 patients had confirmed sepsis (Sepsis-2 criteria) at presentation, and 5207 were either hypotensive (SBP <90 mmHg) or hyperlactataemic (lactate >4 mmol/L) at presentation. ACS and mean times to antibiotics, blood cultures and lactate measurement were calculated for each day of the week. Mortality and length of stay were measured, enabling comparison of weekday and weekend presentation. Data were analysed using SPSS software.

    Results: Comparing weekend to weekday presentation did not reveal any significant differences in ACS or in time to antibiotics, blood cultures or lactate measurement. Mortality rates and length of stay were not significantly different between the groups. There does not appear to be a weekend effect in sepsis care for this cohort of patients. More patients with hypotension and/or hyperlactataemia presented on a Monday.

    Conclusions: Quality of sepsis care was not significantly different between weekend and weekday presentation for patients in this cohort. There were no significant differences in mortality or length of stay when comparing weekday or weekend presentation. There were more septic patients with hyperlactataemia and/or hypotension presenting on a Monday, which may indicate a reluctance of septic patients to present over the weekend.

    P101 Weekend effect is not associated with delays in antibiotic administration in septic patients in the emergency department

    R Passos, J Ramos, B Fahel, C Starteri, G Silva, M Manciola, M Barbosa, J Caldas, S Farias, M Teixeira, A Gobatto, M Ribeiro, P Batista

    Hospital Sao Rafael, Salvador, Brazil

    Introduction: Patients with urgent admissions to hospital at weekends may be subject to a higher risk of worse outcomes, which may be due to differences in compliance with established processes. Because delay to antibiotic administration is an important measure of sepsis protocol efficiency and has been associated with worse outcomes, we aimed to assess the association of the weekend effect (weekend admission) with time to antibiotic administration.

    Methods: Patients included in the sepsis protocol in the emergency department (ED) of Hospital Sao Rafael from January 2016 to July 2017 were retrospectively evaluated. The sepsis protocol is meant to be activated for every patient with a suspected sepsis diagnosis in the ED. We evaluated the association of weekend (Saturday or Sunday) admission with time to antibiotic administration.

    Results: In the study period, 257 patients were evaluated, of whom 121 (47%) were male; mean age was 59 ± 23 years. Mortality was 27% (70 patients) and 113 (44%) were admitted to the ICU. The mean SOFA score was 2 ± 1.8 and the mean Charlson comorbidity index was 4 ± 3.2. Sixty-eight (26%) patients were admitted during the weekend. There was no difference in time to antibiotic administration between patients admitted at the weekend (31 ± 42 minutes) and on weekdays (31 ± 41 minutes). Mortality was also similar for both groups [OR (95% CI) = 0.83 (0.47-1.59)].

    Conclusions: In this cohort of patients with suspected sepsis in the ED, weekend admission was not associated with worse outcomes.

    P102 Relationship between intensive care unit hypotension and morbidity in patients diagnosed with sepsis

    K Maheswari1, M Stevens2, S Munson3, B Nathanson4, S Hwang3, A Khanna1

    1Cleveland Clinic, Cleveland, OH, USA,2Edwards Lifesciences, Irvine, CA, USA,3Boston Strategic Partners, Inc., Boston, MA, USA,4OptiStatim, LLC, Longmeadow, MA, USA

    Introduction: Current sepsis guidelines emphasize resuscitation of hypotension to a mean arterial pressure (MAP) of at least 65 mmHg [1]. A MAP of less than 90 mmHg appears to be associated with poor outcomes in postoperative patients in the intensive care unit (ICU) [2]. However, the extent of hypotension in critically ill septic patients during the ICU stay and its relationship with adverse outcomes are poorly defined. We determined the magnitude of hypotension in ICU patients with a diagnosis of sepsis and its association with major complications.

    Methods: With IRB approval, we evaluated records from a large US electronic health records database (Cerner HealthFacts®, Kansas City, MO) of adult patients with a diagnosis of sepsis and an ICU stay >= 24 hours from Jan 2010 to Nov 2016. Patients with a history of acute myocardial infarction or acute kidney injury (AKI) in the six months prior to ICU admission, or with < 5 MAP readings/ICU day, were excluded. Hypotension exposure was defined and analyzed via three methods: total time spent with MAP <65 mmHg, time-weighted average (TWA) MAP below 65 mmHg, and the number of MAP readings <65 mmHg. Analyses were repeated for different MAP thresholds (<55, <75, <85 mmHg). We estimated the association between hypotension exposure and a major morbidity composite of mortality, myocardial injury and AKI using multivariable logistic regression models.
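Two of the exposure metrics described above (total time below a MAP threshold, and a TWA-style depth-below-threshold measure) can be sketched as follows. This is a minimal illustrative Python sketch, not the study's analysis code, assuming step-wise carry-forward of each timestamped reading:

```python
def hypotension_exposure(readings, threshold=65.0):
    """Summarize hypotension exposure from timestamped MAP readings.

    readings: list of (minutes_since_admission, map_mmHg), sorted by time.
    Each reading is carried forward until the next one (step interpolation).
    Returns (minutes_below, twa_below): total minutes with MAP below the
    threshold, and the time-weighted average depth (mmHg below threshold)
    over the whole monitored period.
    """
    total_minutes = 0.0
    minutes_below = 0.0
    area_below = 0.0  # mmHg x minutes accumulated below the threshold
    for (t0, m0), (t1, _) in zip(readings, readings[1:]):
        dt = t1 - t0
        total_minutes += dt
        if m0 < threshold:
            minutes_below += dt
            area_below += (threshold - m0) * dt
    twa_below = area_below / total_minutes if total_minutes else 0.0
    return minutes_below, twa_below
```

Repeating the call with threshold=55, 75 or 85 reproduces the sensitivity analyses at the other MAP cut-offs.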

    Results: 10,495 patients met all qualifying criteria. 74% of sepsis patients experienced ICU hypotensive events with MAP <65 mmHg; 40% with MAP <55 mmHg. The number of minutes the average patient spent with MAP < 65 mmHg per ICU day will be presented, as will unadjusted/adjusted rates for the morbidity outcome, and unadjusted rates for composite components for all MAP thresholds.

    Conclusions: The result of this analysis will determine the amount and duration of ICU hypotension that is associated with major morbidity in patients with sepsis.


    1. Rhodes et al. Crit Care Med 45:486-552, 2017

    2. Khanna AK et al. SCCM 2018 (Abstract #177)

    P103 The role of hyperoxia in sepsis mortality

    CV Cosgriff1, LA Celi2

    1Harvard T.H. Chan School of Public Health, Boston, MA, USA; 2Massachusetts Institute of Technology, Cambridge, MA, USA

    Introduction: The role of hyperoxia during oxygen administration in sepsis mortality remains uncertain and controversial [1, 2]. We hypothesized that the duration of hyperoxia while ventilated in the ICU was associated with increased in-hospital mortality in septic patients.

    Methods: The Medical Information Mart for Intensive Care database (MIMIC-III), containing data for ~60,000 ICU admissions at Beth Israel Deaconess Medical Center from 2001 to 2012, was used to derive a cohort of ventilated sepsis patients [3]. Hyperoxia was defined as arterial oxygen saturation by pulse oximetry (SpO2) >98%. Extracted SpO2 values were transformed to estimate time spent hyperoxic, and patients were grouped into bins derived from quartiles of hyperoxic duration. Group 1 (lowest quartile) was taken as the reference. The association between hyperoxic group and in-hospital mortality was examined by logistic regression.

    Results: Of the 46,476 patients in MIMIC-III, 2,591 met criteria for inclusion. After adjustment for age, gender, ethnicity, unit type, length of stay, duration of ventilation, disease severity, burden of comorbidity, and the use of vasopressors, hyperoxic group 4 was significantly associated with in-hospital mortality (OR = 2.47, p = 0.004, 95%CI: [1.35, 4.59]), as was group 3 (OR = 1.93, p = 0.013, 95%CI: [1.16, 3.28]). Group 2 was not associated (OR = 0.84, p = 0.564, 95%CI: [0.46, 1.53]). Figure 1 summarises these results.
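    For readers unfamiliar with the reporting format: an odds ratio with a Wald 95% CI is derived from a log-odds estimate and its standard error. The unadjusted, 2x2-table version of the calculation is sketched below (illustrative counts only; the abstract's ORs come from the adjusted multivariable model and cannot be reproduced this way):

```python
import math

def odds_ratio_wald_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio and Wald 95% CI from a 2x2 table:
        a = deaths in exposed group,   b = survivors in exposed group
        c = deaths in reference group, d = survivors in reference group
    """
    or_ = (a * d) / (b * c)
    # Standard error of log(OR) from the cell counts
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log_or)
    hi = math.exp(math.log(or_) + z * se_log_or)
    return or_, lo, hi

# Illustrative counts, not study data
or_, lo, hi = odds_ratio_wald_ci(30, 70, 15, 85)
print(f"OR = {or_:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
# -> OR = 2.43, 95% CI [1.21, 4.87]
```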

    Conclusions: Odds of in-hospital death were nearly double in group 3 and more than double in group 4; we conclude that longer durations of hyperoxia are associated with increased in-hospital mortality in sepsis patients.


    1. Vincent JL et al. Can Respir J 2017:2834956, 2017

    2. Hafner S et al. 5:42, 2015

    3. Johnson AEW et al. Scientific Data 3:160035, 2016

    Fig. 1 (abstract P103).

    Adjusted OR for in-hospital expiration for the hyperoxic groups as compared to group 1 (lowest quartile of hyperoxic duration).

    P104 Treatment and outcomes of vasodilatory shock in an academic medical center

    ND Nielsen1, F Zeng2, ME Gerbasi3, G Oster3, A Grossman3, NI Shapiro4

    1Tulane School of Medicine, New Orleans, LA, USA; 2LaJolla Pharmaceutical Company, San Diego, CA, USA; 3Policy Analysis Inc, Brookline, MA, USA; 4Beth Israel Deaconess Medical Center, Boston, MA, USA

    Introduction: Consensus clinical guidelines recommend maintaining mean arterial pressure (MAP) >=65 mmHg for vasodilatory shock (VS) patients. This study uses the Medical Information Mart for Intensive Care (MIMIC-III) database to examine treatment and outcomes in patients with severe VS in a real-world setting.

    Methods: We identified patients in the MIMIC-III database, which contains information for 61,532 admissions to intensive care units (ICU) at Beth Israel Deaconess Medical Center (BIDMC) in Boston, Massachusetts between 2001 and 2012. Inclusion criteria: 1) aged >=18 years, 2) treated with vasopressors for >=6 hours. Exclusion criteria: cardiac surgery, vasoplegia, cardiogenic shock, intra-aortic balloon pump, extracorporeal membrane oxygenation, large-volume blood transfusion, cardiac tamponade or pulmonary embolism. The primary outcome was mortality.

    Results: There were 5,922 ICU admissions. Among these, patients who consistently maintained MAP above 65 mmHg (n=167, 3%), 60 mmHg (n=418, 7%), and 55 mmHg (n=949, 16%) while in the ICU had lower mortality rates than patients with one or more MAP excursions below these thresholds (n=5755, n=5504, and n=4973, respectively): 11% vs 31% for MAP <65 (p<0.0001); 10% vs 32% for MAP <60 (p<0.0001); and 10% vs 34% for MAP <55 (p<0.0001). When assessing exposure to hypotension for >=2 continuous hours, ICU mortality rates were 31% for the 65 mmHg threshold (n=4741), 34% for the 60 mmHg threshold (n=3498), and 41% for the 55 mmHg threshold (n=2090). ICU mortality rates were 41%, 52% and 66% for patients with MAP below 65 (n=1686), 60 (n=746), and 55 (n=354), respectively, for >=8 continuous hours.

    Conclusions: Most patients did not have MAP control based on current clinical guidelines, and not achieving the recommended target MAP was associated with worse outcomes. However, since association does not imply causation, trials aimed at more aggressively achieving a MAP >=65 are warranted, and quality of care initiatives aimed at improving MAP control for patients with VS may be helpful.

    P105 IgM-enriched immunoglobulins as adjunctive treatment to antimicrobial therapy for patients in septic shock

    A Corona, A Veronese, C Soru, F Cantarero, S Santini, I Cigada, G Spagnolin, E Catena

    ASST Fatebenefratelli Sacco, PO SACCO - Milano, Milano, Italy

    Introduction: Sepsis is responsible for both immune hyperactivity with inflammatory damage and immune suppression and paralysis. A few studies support the role of IgM-enriched immunoglobulin G as an adjunct to antimicrobial treatment [1].

    Methods: Case-control prospective study. From 12/2016 to 11/2017, patients experiencing septic shock - admitted with a first-24h SAPS II > 25 associated with a SOFA score > 4 - underwent treatment with IgM-e-IG, given for three days at a total dosage of 500 mg/kg. The therapy response was assessed on clinical, microbiological and rheological data. All cases were 1:1 matched with analogous controls.

    Results: Over the study period, 17 patients [cases, median age 50 (49-62)] experiencing severe infection (peritonitis, meningitis, community-acquired pneumonia and UTI) and in septic shock were recruited, treated with IgM-enriched IgG (Pentaglobin) and matched with specific controls. No differences were found in basic patient characteristics except median (IQR) SAPS II, which was slightly higher in cases [56 (43-67) vs. 46 (35-61), p=0.464]. No differences were found in the trends over the first 72 hrs in WBC, PCT, CRP, lactate and PLT, even though the SOFA score decreased significantly more in cases [-4 (-5;-2) vs. -1.5 (-2;-0.5), p=0.015]. 28-day survival was 100% in cases and 75.5% in controls (Log Rank = 1.93, p=0.165). VLAD (variable life adjusted display) showed 4.48 more lives saved than expected by SAPS II in cases versus 0.77 in controls, despite a similar SMR (p=0.419).

    Conclusions: The reduced mortality may be related to a quicker recovery from sepsis-related organ damage. Prospective randomized controlled trials are warranted to corroborate these preliminary data.


    1. Cavazzuti I et al. Intensive Care Med 40(12):1888-96, 2014

    P106 A phase 1b study of anti-PD-L1 (BMS-936559) in sepsis

    RS Hotchkiss1, E Colston2, S Yende3, DC Angus4, LL Moldawer5, ED Crouser6, GS Martin7, CM Coopersmith8, S Brakenridge5, FB Mayr3, PK Park9, K Zhu2, M Wind-Rotolo2, T Duan2, J Ye2, Y Luo2, IM Catlett2, K Rana2, DM Grasela2

    1Washington University School of Medicine, St Louis, MO, USA; 2Bristol-Myers Squibb, Inc., Lawrenceville, NJ, & Wallingford, CT, USA; 3Veterans Affairs Pittsburgh Healthcare System and University of Pittsburgh, Pittsburgh, PA, USA; 4University of Pittsburgh Critical Care Medicine CRISMA Laboratory, Pittsburgh, PA, USA; 5University of Florida, Gainesville, FL, USA; 6The Ohio State University, Columbus, OH, USA; 7Dept. of Medicine, Division of Pulmonary, Allergy, Critical Care & Sleep Medicine, Emory University, Atlanta, GA, USA; 8Dept. of Surgery, Emory University, Atlanta, GA, USA; 9University of Michigan, Ann Arbor, MI, USA

    Introduction: The PD-1/PD-L1 immune checkpoint pathway is involved in sepsis-associated immunopathy. We assessed the safety of anti-PD-L1 (BMS-936559, Bristol-Myers Squibb) and its effect on immune biomarkers and exploratory clinical outcomes in participants with sepsis-associated immunopathy.

    Methods: Participants with sepsis/septic shock and absolute lymphocyte count <=1100 cells/μL received BMS-936559 i.v. (10–900mg; n=20) or placebo (PBO; n=4) + standard of care and were followed for 90d. Primary endpoints were death and adverse events (AEs); secondary endpoints were monocyte (m)HLA-DR levels and clinical outcomes.

    Results: Apart from the treated group being older (median 62y treated pooled vs 46y PBO) and sicker ([>=]3 organ dysfunctions: 55% treated pooled vs 25% PBO), baseline characteristics were comparable. 6/24 (25%) participants died (10mg: 2/4 [50%]; 30mg: 2/4 [50%]; 100mg: 1/4 [25%]; 300mg: 1/4 [25%]; 900mg: 0/4; PBO: 0/4). All participants had AEs (grade 1–2: 75%), with one participant (30mg) having potentially drug-related AEs (grade 1–2 increases in amylase, lipase and LDH). 3/20 (15%) treated pooled and 1/4 (25%) PBO had a serious AE, with none deemed drug-related. AEs of special interest (AEOSI, i.e. potentially immune-related) occurred in [>=]1 participant per group, with diarrhea (33%) the most common. All but 3 AEOSI (1 lung infiltration, 2 diarrhea) were grade 1–2. At the two highest doses there was a trend toward an increase in mHLA-DR expression (>5000 mAb/cell) that persisted beyond 30d. No clear dose-relationship or between-group difference in clinical outcomes (duration of organ support, viral reactivation, ICU/hospital length of stay) was seen.

    Conclusions: In this sick population, BMS-936559 was well tolerated. There were no AEs indicative of an excessive drug-induced pro-inflammatory state. At higher doses, a trend toward sustained restoration of mHLA-DR expression was seen. These findings justify further study of PD-1/PD-L1 inhibitors in sepsis.

    P107 Effects of a non-neutralizing humanized monoclonal anti-adrenomedullin antibody in a porcine two-hit model of hemorrhage and septic shock

    C Thiele1, TP Simon1, J Szymanski1, C Daniel2, C Golias1, J Struck3, G Marx1, T Schürholz4

    1Uniklinik RWTH Aachen, Aachen, Germany; 2Universität Erlangen-Nürnberg, Erlangen, Germany; 3Adrenomed AG, Hennigsdorf, Germany; 4Universitätsmedizin Rostock, Rostock, Germany

    Introduction: Adrenomedullin (ADM) is a vasoactive peptide that improves endothelial barrier function in sepsis but may cause hypotension and organ failure. Treatment with an ADM monoclonal antibody (mAB) showed improvement in murine sepsis models. Here, we tested the effects of the humanized anti-ADM mAB Adrecizumab (AC) in a porcine two-hit model of hemorrhagic (HS) and septic shock (SSH).

    Methods: In a randomized, blinded study, 12 German Landrace pigs (31+/-2 kg) were bled to half of their baseline MAP for 45 minutes (HS). SSH was induced using an E. coli clot (7-9x10^11 CFU/kg BW) placed into the abdominal cavity 6 hours after HS. Animals received either 2 mg/kg BW Adrecizumab or vehicle (VH) immediately after SSH induction. After 4 hours, resuscitation was initiated using balanced crystalloids and noradrenaline to maintain a CVP of 8-12 mmHg, a MAP >65 mmHg and a ScvO2 >70% for another 8 hours. Hemodynamics, laboratory parameters and kidney histology were assessed. A general linear model, Mann-Whitney U or chi-squared test was used for statistics where appropriate. P<0.05 was considered significant.

    Results: Volume resuscitation was significantly lower in the AC group compared to the VH group (5300 vs. 6654 ml; p=0.036). Vasopressor therapy was necessary in significantly fewer animals in the AC group (33 vs. 100%; p=0.014). The Horowitz index was higher in the AC group (375 vs 286 mmHg, p=0.055). Kidney histology showed significantly fewer granulocytes in both cortex (9.1 vs. 31.1 n/mm2; p=0.02) and medulla (19.3 vs 53.0 n/mm2; p=0.004) in AC-treated animals. After induction of sepsis, plasma ADM increased immediately in both groups, but more quickly and more markedly in the AC group (p=0.003 for time*group effect).

    Conclusions: In this two-hit shock model, treatment with Adrecizumab disproportionately increased plasma ADM levels. Hemodynamics and pulmonary function were improved and histological kidney damage was reduced. Thus, therapy with Adrecizumab may provide benefit in septic shock, and clinical investigation of this candidate is warranted.

    P108 Prognosis of patients excluded by the definition of septic shock based on their lactate levels after initial fluid resuscitation: a prospective multi-center observational study

    B Ko1, W Kim2, T Lim1

    1Hanyang University Hospital, Seoul, South Korea; 2University of Ulsan College of Medicine, Asan Medical Center, Seoul, South Korea

    Introduction: Lactate levels should be measured after volume resuscitation (as per the Sepsis-3 definition) [1]. However, currently, no studies have evaluated patients who have been excluded by the new criteria for septic shock. The aim of this study was to determine the clinical characteristics and prognosis of these patients, based on their lactate levels after initial fluid resuscitation.

    Methods: This observational study was performed using a prospective, multi-center registry of septic shock. From among a cohort of patients with refractory hypotension requiring vasopressors, we compared 28-day mortality between patients who were excluded by the new definition (lactate <2 mmol/L after volume resuscitation) and those who were not (lactate >=2 mmol/L after volume resuscitation).
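    The group assignment reduces to a simple rule on pre- and post-resuscitation lactate. A sketch (hypothetical function, not the registry's code), applied to patients already known to have vasopressor-dependent refractory hypotension:

```python
SEPSIS3_LACTATE = 2.0  # mmol/L, the Sepsis-3 cutoff

def classify_lactate_group(initial, post_fluid):
    """Classify a vasopressor-dependent patient with refractory
    hypotension by lactate relative to the Sepsis-3 cutoff."""
    if post_fluid >= SEPSIS3_LACTATE:
        return "septic shock (Sepsis-3)"
    if initial >= SEPSIS3_LACTATE:
        return "excluded: lactate normalized after fluids"
    return "lactate never elevated"

print(classify_lactate_group(3.5, 1.6))
# -> excluded: lactate normalized after fluids
```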

    Results: Of 567 patients with refractory hypotension requiring vasopressors, 435 had elevated lactate levels, 83 did not have elevated lactate levels (neither initially nor after volume resuscitation), and 49 (8.2%) had elevated lactate levels initially that normalized after fluid resuscitation (Fig. 1). These 49 patients were thus excluded by the new definition of septic shock. Significantly lower 28-day mortality was observed in these patients than in those who had not been excluded (8.2% vs 25.5%, p=0.02).

    Conclusions: It seems reasonable for septic shock to be defined by lactate levels after volume resuscitation; however, given the small sample size, a further large-scale study is needed.


    1. Shankar-Hari M et al. JAMA 315(8):775-787, 2016.

    Table 1 (abstract P108). Comparison of outcomes between patients with restored perfusion and those compatible with the Sepsis-3 definition
    Fig. 1 (abstract P108).

    Patient Flow Diagram

    P109 Apoptotic cells induce cytokine homeostasis following LPS treatment

    D Mevorach

    Hadassah-Hebrew University, Jerusalem, Israel

    Introduction: Allocetra™, donor leukocytes containing early apoptotic cells and no necrotic cells, was shown to be safe and potentially efficacious for the prevention of aGVHD (Mevorach et al., BBMT 2014). We tested the effects of early apoptotic cells on cytokines/chemokines of patients with aGVHD, and in mice treated with LPS and IFN-γ.

    Methods: LPS and IFN-γ were used to trigger cytokine/chemokine release in vitro and in vivo in mice, and in patients treated for aGVHD. Cytokines/chemokines were evaluated in 13 patients. Mouse and human IL-1β, IL-2 to IL-10, IL-12p70, IL-13, IL-15, IL-17A, IL-22, IL-27, IL-31, IL-32, IP-10, RANTES, GRO, IFN-γ, GM-CSF, TNF-α, MIP-1α, MIP-1β, MIP-2, MCP-1, MCP-3, MIG, and ENA-78 were evaluated (Luminex technology, Merck Millipore). The IFN-γ effect was evaluated by STAT1 phosphorylation.

    Results: Significant downregulation (p<0.01) of about 30 pro- and anti-inflammatory cytokines, including IL-6, IP-10, TNF-α, MIP-1α, MIP-1β and IL-10, was documented. The IFN-γ effect on macrophages and dendritic cells was inhibited at the level of phosphorylated STAT1. IFN-γ-induced expression of CXCL10 and CXCL9 in macrophages was reduced. Patients treated in vivo with higher dosages of apoptotic cells had lower cytokine/chemokine levels compared to those treated with lower dosages, in inverse correlation with aGVHD staging. In vitro binding of apoptotic cells to LPS was documented.

    Conclusions: The cytokine storm is significantly modified towards homeostasis following apoptotic cell treatment. The mechanism is multifactorial and was shown to include TAM receptor triggering, NF-κB inhibition, and LPS binding. These results, together with previous studies showing significantly higher murine survival in LPS and cecal ligation and puncture sepsis models, suggest that apoptotic cells may be used to treat patients with sepsis. A multicenter clinical trial in septic patients is planned for 2018.

    P110 Efficacy of continuous haemodiafiltration using a polymethylmethacrylate membrane haemofilter (PMMA-CHDF) in the treatment of sepsis and acute respiratory distress syndrome (ARDS)

    M Sakai

    12628 tomioka takeo-cho, Takeo, Japan

    Introduction: CHDF using a polymethylmethacrylate membrane is currently widely applied for non-renal indications in Japan; this technique is used in the treatment not only of patients with sepsis but also of those with cytokine-induced critical illness such as ARDS and pancreatitis. This study aimed to investigate the clinical efficacy of PMMA-CHDF in the treatment of patients with sepsis and ARDS.

    Methods: Seventy-five patients diagnosed with sepsis (ARDS [n=30], pyelonephritis [n=10], cholangitis [n=10], scrub typhus (Tsutsugamushi disease) [n=1], Mamushi snake bite [n=1], haemophagocytic syndrome [n=1], anti-neutrophil cytoplasmic antibody (ANCA) lung disease [n=1], beriberi heart disease [n=1] and unknown causes [n=18]) were enrolled in this study between August 2010 and March 2017. The most common cause of ARDS in elderly patients was aspiration pneumonia.

    Results: Following initiation of PMMA-CHDF treatment, early improvement of haemodynamics was observed, along with an increase in urine output. The overall survival rate was 75.6%. The lowest survival rate (35%) was in the unknown-cause group; the highest was in patients with ARDS (95%). Moreover, urine output increased significantly in the survival group.

    Conclusions: The present study suggests that cytokine-oriented critical care using PMMA-CHDF might be effective in the treatment of sepsis and ARDS, particularly in the treatment of ARDS associated with aspiration pneumonia in elderly patients.

    P111 Polymyxin B immobilized fiber column direct hemoperfusion is effective in septic shock but not in sepsis: a cohort study and propensity-matched analysis

    K Hoshino1, H Ishikura1, Y Irie1, K Muranishi1, F Kiyomi1, M Hayakawa2, Y Kawano1, Y Nakamura1

    1Fukuoka University Hospital, Fukuoka, Japan; 2Hokkaido University Hospital, Sapporo, Japan

    Introduction: Many patients with sepsis receive polymyxin B immobilized fiber column direct hemoperfusion (PMX-HP) as a rescue therapy. Recently, we reported that PMX-HP reduces all-cause hospital mortality in patients with septic shock [1]. However, it is unclear whether PMX-HP benefits patients with sepsis as well as those with septic shock. The purpose of this study was to clarify the effect of PMX-HP on the prognosis of patients with sepsis.

    Methods: Data from patients admitted to Japanese ICUs for severe sepsis (including septic shock) were retrospectively collected from Jan 2011 to Dec 2013 through the Japan Septic Disseminated Intravascular Coagulation (J-SEPTIC DIC) study database. We analyzed the potential benefit of PMX-HP using a propensity score-matched (1:1) cohort analysis in patients with sepsis.
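    One common way to build a 1:1 propensity-matched cohort of this kind is greedy nearest-neighbor matching on the estimated propensity scores (the scores themselves typically come from a logistic regression on baseline covariates). A minimal stdlib sketch with hypothetical scores and an arbitrary caliper, not the study's actual procedure:

```python
def greedy_match_1to1(treated, control, caliper=0.05):
    """Greedy 1:1 nearest-neighbor matching on propensity scores.

    treated, control: dicts mapping patient id -> propensity score.
    Returns a list of (treated_id, control_id) pairs; each control is
    used at most once, and candidate matches farther apart than the
    caliper are rejected.
    """
    pairs = []
    available = dict(control)
    # Heuristic: match high-score (hardest-to-match) treated patients first.
    for t_id, t_score in sorted(treated.items(), key=lambda kv: -kv[1]):
        if not available:
            break
        c_id = min(available, key=lambda cid: abs(available[cid] - t_score))
        if abs(available[c_id] - t_score) <= caliper:
            pairs.append((t_id, c_id))
            del available[c_id]
    return pairs

treated = {"T1": 0.80, "T2": 0.42, "T3": 0.10}
control = {"C1": 0.79, "C2": 0.45, "C3": 0.60}
print(greedy_match_1to1(treated, control))
# -> [('T1', 'C1'), ('T2', 'C2')]  (T3 has no control within the caliper)
```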

    Results: Of 2,952 eligible patients, 664 underwent PMX-HP. Propensity score matching created a matched cohort of 740 patients (370 pairs with and without PMX-HP). There was no significant difference between the two matched cohorts in hospital or ICU mortality [odds ratio (OR) 1.20, 95% confidence interval (CI) 0.93-1.52, p=0.150; OR 0.98, 95% CI 0.79-1.21, p=0.828, respectively].

    Conclusions: This study demonstrated that PMX-HP had no benefit on hospital or ICU survival when compared with conventional management (non-PMX-HP) in matched patients with sepsis. We conclude that PMX-HP is effective in septic shock but not in sepsis.


    1. Nakamura Y et al. Crit Care 21(1):134, 2017

    P112 A real-world experience of novel extracorporeal cytokine adsorption therapy (CytoSorb) to manage sepsis and septic shock patients at a tertiary care hospital in India

    SK Garg, D Juneja, O Singh

    Max Hospital, New Delhi, India

    Introduction: Sepsis and septic shock, with a very high mortality rate (30-50%), are associated with an inflammatory cascade responsible for multiple organ dysfunction [1]. The extracorporeal cytokine adsorption device (CytoSorb) is an adjunctive therapy to modulate systemic inflammation in sepsis and septic shock patients. This retrospective real-world data analysis provides more insight into the management of septic shock, as it reflects the management of patients in heterogeneous routine clinical settings.

    Methods: In this retrospective study, data from 30 septic shock patients with a SOFA score >10 admitted to the ICU and treated with hemoadsorption (CytoSorb) therapy were collected and analysed.

    Results: 30 patients (22 male and 8 female; mean age 59.33 yrs) were administered CytoSorb in addition to standard of care; an average of 1.1 cartridges was used per patient for 4-6 hrs. Out of 30, 13 patients showed a substantial (40%) reduction in SOFA score. 7 out of 30 patients had a MAP above 70 mmHg after CytoSorb treatment, with vasopressors reduced to 50% of baseline. 10 and 15 patients showed good improvement in serum lactate (44.2%; mean 6.64 vs 3.37) and serum creatinine (43.5%; 2.80 vs 1.58), respectively.

    Conclusions: We conclude that a substantial difference was seen in serum lactate, serum creatinine and vasopressor requirement after CytoSorb therapy; however, multi-organ failure had already developed in all patients before initiating CytoSorb therapy, hence the above-mentioned outcomes were not demonstrated in all patients.


    1. Kogelmann et al. Crit Care 21:74, 2017

    P113 Early cytokine adsorption in septic shock (ACESS trial): results of a proof-of-concept, pilot study

    N Öveges1, F Hawchar1, I László1, M Forgács1, T Kiss1, P Hankovszky1, P Palágyi1, A Bebes1, B Gubán1, I Földesi1, Á Araczki1, M Telkes1, Z Ondrik1, Z Helyes2, Á Kemény2, Z Molnár1

    1University of Szeged, Szeged, Hungary; 2University of Pécs, Pécs, Hungary

    Introduction: Overwhelming cytokine release often referred to as “cytokine storm” is a common feature of septic shock, resulting in multiple organ dysfunction and early death. Attenuating this cytokine storm early by eliminating cytokines may have some pathophysiological rationale. Our aim was to investigate the effects of extracorporeal cytokine removal (CytoSorb) therapy on organ dysfunction and inflammatory response within the first 48 hours from the onset of septic shock.

    Methods: Patients with sepsis of medical origin, on mechanical ventilation, with noradrenaline >10 μg/min, procalcitonin >3 ng/mL and no need for renal replacement therapy were randomized into CytoSorb and Control groups. CytoSorb therapy lasted for 24 hours. In addition to detailed clinical data collection, blood samples were taken to determine IL-1, IL-1ra, IL-6, IL-8, IL-10, TNF-α, PCT and CRP levels. At this stage of the study, only PCT and CRP levels were analyzed. Data were recorded on enrollment (T0), then at T12, T24, and T48 hours. For statistical analysis, the Mann-Whitney test was used.

    Results: Twenty patients were randomized into CytoSorb (n=10) and Control groups (n=10). Overall organ dysfunction as monitored by SOFA and MODS scores did not differ between the groups. Noradrenaline requirement showed a significant reduction in the CytoSorb group (T0=76±63, T24=48±43, T48=23±24 μg/min, p=0.016) but not in the Control group. There was no difference in CRP, but PCT decreased significantly in the CytoSorb group (T0=147.8±216.3, T48=78.9±140.1 ng/mL, p=0.004). Lactate decreased significantly in both groups.

    Conclusions: These results suggest that a 24-hour CytoSorb treatment in the early stages of septic shock has significant beneficial effects on noradrenaline requirement and PCT concentrations within the first 48 hours. Based on the results of this pilot study, we are planning a prospective randomized multicenter trial.

    P114 Usage pattern of Polymyxin B in clinical practice for critical care management in Indian patients: a multicentric prospective observational study

    Y Mehta1, K Zirpe2, R Pande3, S Katare4, A Khurana4, S Motlekar4, A Qamra4, A Shah4

    1Medanta The Medicity, Gurgaon, India; 2Ruby Hall Clinic, Pune, India; 3BLK Super Speciality Hospital, New Delhi, India; 4Wockhardt Limited, Mumbai, India

    Introduction: The emergence of bacterial pathogens with acquired resistance to almost all available antimicrobial agents has severely jeopardized therapeutic choices in the last decade. Furthermore, treatment of MDR gram-negative infections has been a major challenge in developing countries like India. A weak research pipeline has led to the re-emergence of older antibiotics like Polymyxin B, with limited clinical data [1]. This study aims to evaluate the utilization profile of Polymyxin B in intensive care practice in India.

    Methods: This ongoing prospective, observational multi-centric study has been approved by IRBs and is being conducted in 3 tertiary care centers in India. The interim data of 101 patients who received Polymyxin B as part of intensive care management were analysed for utilization profile in terms of demographics, indication, dosage regimen, safety and clinical outcomes.

    Results: Patients (mean age 53.1±19.3 years, 67.7% males) had baseline APACHE II, SOFA and GCS scores of 17.2±5.3, 13.7±4.9 and 9.8±5.2, respectively. The majority of patients had sepsis involving the pulmonary (51.6%), neurological (25.3%) and cardiovascular (21.1%) systems (Fig. 1, Table 1). The commonest organisms isolated were K. pneumoniae and Acinetobacter spp. Major reasons for intensive care management were hemodynamic instability (52.5%), respiratory failure (42.6%), renal failure (14.9%) and trauma/surgery (37.6%). The mean daily Polymyxin B dose was 1.1±0.4 MIU, administered for up to 14 days in 73.2% of patients. A clinical response was observed in 67.2% of patients. Clinically significant deterioration in serum creatinine and all-cause mortality were seen in 28.8% and 32.7% of patients, respectively.

    Conclusions: In light of the good clinical response, Polymyxin B can be considered a feasible option in the intensive care management of gram-negative infections.


    1. Garg SK et al. Critical Care Research and Practice, 3635609:1-10, 2017

    Table 1 (abstract P114). Distribution of body systems involved in patients receiving Polymyxin-B therapy
    Fig. 1 (abstract P114).

    Distribution (Percentage) of Supportive interventions required

    Fig. 2 (abstract P114).

    Distribution (percentage) of severity of bacterial infection

    P115 Determinants and outcomes of Polymyxin B use: single-center critical care experience, India

    K Zirpe, A Deshmukh, S Patil

    Ruby Hall Clinic, Pune, India

    Introduction: Polymyxin B, though available since the 1950s, was side-lined due to the availability of safer antimicrobials. However, the surge of multi-drug resistant gram-negative infections has triggered a resurgence of these older antimicrobials, despite the paucity of available clinical data [1]. This paper reports clinical experience of the determinants and outcomes associated with Polymyxin B therapy at our institute.

    Methods: This study prospectively captures the clinical and drug-usage profiles of patients receiving Polymyxin B, from their medical records, at our tertiary care hospital in Pune, India. The analysis of the first 28 completed patients is summarized here.

    Results: The analysed patients (n=28) included 78.6% males, with a mean age of 45.7 (±19.9) years. Polymyxin B was most commonly used in sepsis involving the respiratory (63%) and abdominal (37%) systems. All patients were treated in an intensive care setup, among whom 96.4% required mechanical ventilation (Fig. 1). Most patients were initiated on Polymyxin B presumptively, and Klebsiella pneumoniae and Acinetobacter spp were the most commonly isolated organisms. Polymyxin B was initiated with a bolus dose (equivalent to the total daily dose) and administered at a mean daily dose of 0.97±0.33 MIU in two divided doses. The majority (66.7%) of patients received Polymyxin B therapy for <7 days (mean duration of therapy, 5.37±4.22 days) (Fig. 2). Meropenem (85.7%) was the most commonly co-administered antimicrobial. No unlisted adverse drug reactions were reported. The all-cause mortality rate was 28.6%.

    Conclusions: Our experience suggests Polymyxin B to be a favourable choice for the management of MDR gram-negative infection. Further systematic evaluations are required to cement the therapeutic status of Polymyxin B in multi-drug resistant gram-negative infections in the ICU set-up.


    1. Garg SK et al. Critical Care Research and Practice, 3635609:1-10, 2017

    Fig. 1 (abstract P115).

    Distribution of supportive interventions required in Intensive Care Setup

    Fig. 2 (abstract P115).

    Distribution of duration of Polymyxin B therapy

    P116 Polymyxin B usage and outcomes: real-world experience from an intensive care unit of a tertiary care hospital in northern India

    Y Mehta, C Mehta, S Nanda, J Chandel, J George

    Medanta The Medicity, Gurgaon, India

    Introduction: Limited antimicrobial agents and a dry development pipeline have compelled intensivists to revisit older antibiotics such as polymyxins and fosfomycin for MDR/XDR gram-negative infections. With limited clinical data available, especially from India, we planned to assess the drug utilization pattern of Polymyxin B in our ICU settings.

    Methods: After Institutional Ethics Committee approval, this prospective observational study was initiated to collect the usage, demography, clinical presentations, indications, bacterial species isolated, treatment regimens, concomitant antimicrobials used and clinical outcomes of Polymyxin B-based regimens. Data are collected using structured case record forms and summarized using descriptive statistics. This is the interim analysis of the first 73 patients of the ongoing study.

    Results: The mean age of patients (n=73, 62.5% males) was 56.0 (±18.26) years with a mean baseline APACHE II score of 17.1 (±5.44). Polymyxin B was most commonly prescribed for sepsis (72.1%) and the most commonly involved system was respiratory (47.1%). The most common bacterial species isolated was Klebsiella pneumoniae (23.1%) followed by Acinetobacter spp and Pseudomonas aeruginosa (5.5% each) (Table 1). The mean daily dose of Polymyxin B was 1.19 (±0.39) MIU with a mean duration of therapy of 15.36 (±14.2) days (Fig. 1). A clinical response was observed in 63.6% of patients (bacteriological cure 60%) (Fig. 2). All-cause mortality was 34.2%. 71.2% of patients did not have a clinically significant increase in serum creatinine levels, though 31.1% of these had raised serum creatinine at baseline. No unexpected treatment-emergent adverse events were reported.

    Conclusions: These data suggest Polymyxin B to be an effective antimicrobial agent with a good clinical response and acceptable all-cause mortality in an Indian intensive care setup.

    Table 1 (abstract P116). Profile of bacterial species isolated in patients receiving Polymyxin-B based antimicrobial regimen
    Fig. 1 (abstract P116).

    Duration of Polymyxin B Therapy (Days)

    Fig. 2 (abstract P116).

    Response rate with Polymyxin B based antimicrobial therapy

    P117 Effect of selective plasma exchange for treatment of patients with severe sepsis due to multiorgan failure

    R Iscimen, P Rahimi, HZ Dundar, N Kelebek Girgin, F Kahveci

    Uludag University, Bursa, Turkey

    Introduction: The mortality rate in severe sepsis ranges between 30% and 50%; liver and renal dysfunction independently and significantly affect intensive care mortality, and failure of 4 or more organs causes mortality to exceed 90%. Selective plasma exchange (SPE) is the most commonly used artificial liver support system; it effectively removes albumin-bound toxins and eliminates low- and medium-molecular-weight substances such as cytokines. SPE is therefore thought to decrease sepsis mortality by preventing multiple organ failure. In this study, we investigated the effect of selective plasma exchange in the treatment of patients with multiorgan failure and septic shock.

    Methods: Selective plasma exchange was performed with Evaclio™ (2c-20) in patients diagnosed with septic shock due to multiple organ failure. Pre- and post-treatment laboratory values, Sequential Organ Failure Assessment (SOFA) scores and 28-day ICU mortality were calculated. Demographic data of the patients are shown in Table 1.

    Results: After ethics committee approval, 64 patients (29 female/35 male) diagnosed with septic shock were included in the study, and a total of 125 sessions of selective plasma exchange were performed. Value changes before and after the first SPE are shown in Table 2. There was a statistically significant decrease in respiratory, coagulation, renal and liver SOFA scores with SPE. The ICU mortality rate was 65%.

    Conclusions: Selective plasma exchange is a technique developed to eliminate toxins and cytokines and appears to be useful in reducing severe sepsis mortality by decreasing SOFA scores and preventing multiple organ failure.

    Table 1 (abstract P117). Demographic data
    Table 2 (abstract P117). SOFA scores before and after SPE

    P118 Troponin elevation in septic shock: a study of the prognosis

    A Chaari, W Assar, K Abdelhakim, K Bousselmi, M El Koumy, V Kumar, V Kauts, M Al Ansari

    King Hamad University Hospital, Bussaiteen, Bahrain

    Introduction: The aim of the current study is to assess the prognostic value of increased troponin on the outcome of patients with septic shock.

    Methods: Retrospective study conducted between 01/01/2017 and 30/07/2017. All adult patients admitted with septic shock were screened for inclusion. Patients with known ischemic heart disease were excluded. Demographic data and baseline clinical and laboratory findings were recorded. Troponin I level on admission and the highest troponin level during the intensive care unit (ICU) stay were collected. The echocardiographic findings on admission were reviewed. Two groups (survivors and non-survivors) were compared.

    Results: Thirty-one patients were included in the study. Median age was 71 [62-78] years. Median APACHE II score was 19 [16-26]. Troponin I level on admission was 0.06 [0-0.43] ng/ml. The highest troponin level during the ICU stay was 0.09 [0.36-1.11] ng/ml. Nine patients (29%) had wall motion abnormalities. Median left ventricular ejection fraction (LVEF) was 55 [30-60]%. Median duration of ICU stay was 4 [2.8-12.8] days. ICU mortality was 22.6%. Troponin level on admission was comparable between survivors and non-survivors (respectively 0.08 [0-0.6] and 0.01 [0-0.16]; p=0.242), whereas the highest troponin during the ICU stay was significantly higher in non-survivors (1 [0.36-11.1] versus 0.31 [0.06-0.65]; p=0.036). LVEF was comparable between survivors and non-survivors (respectively 55 [29-60] versus 56 [55-58]%; p=0.547). The highest troponin was not identified by multivariate analysis as an independent factor predicting ICU mortality (OR=1.1, 95% CI [0.92-1.16]; p=0.489). The only factor identified by the analysis was acute kidney injury requiring renal replacement therapy (OR=105, 95% CI [5.5-198]; p=0.002).

    Conclusions: Increased serum troponin I level is common in patients with septic shock. Our study suggests that the rise in troponin is greater in non-survivors.

    P119 Real time needle visualisation pericardiocentesis in acute traumatic pericardial tamponade

    O Adi, A Azma Haryaty

    Hospital Raja Permaisuri Bainun, Perak, Malaysia

    Introduction: Blind pericardiocentesis has a low success rate and high complication rates, including ventricular wall or oesophageal perforation, pneumothorax and upper abdominal organ injury. Real-time needle visualisation allows these major complications to be avoided [1].

    Methods: We present 2 cases of acute traumatic cardiac tamponade secondary to severe chest injury. Both patients presented with haemodynamic instability and echocardiographic features of pericardial tamponade. Pericardiocentesis under ultrasound guidance at the left parasternal area, with the needle directed using a medial-to-lateral technique, was performed (Fig. 1). Real-time needle tip visualisation was maintained throughout the procedure (Fig. 2a). Needle placement in the pericardial space was confirmed with agitated saline and guidewire visualisation (Fig. 2b). A pigtail catheter was inserted and blood was aspirated until the patients improved haemodynamically. Repeat ultrasound was performed to confirm the absence of ultrasonographic features of tamponade and of complications.

    Results: We demonstrated successful real-time needle visualisation ultrasound-guided pericardiocentesis in 2 cases of acute traumatic pericardial tamponade. Procedural time (from the needle piercing the skin to the needle entering the pericardium) was less than 1 minute in both cases. Post-procedural ultrasound confirmed no major complications.

    Conclusions: Real-time needle visualisation using ultrasound is important to reduce major complications during pericardiocentesis. The safety of this highly invasive procedure can be improved with real-time needle visualisation.


    1. Osman A et al. Eur J Emerg Med (in press), 2017

    Fig. 1 (abstract P119).

    Parasternal approach with needle directed from medial to lateral technique.

    Fig. 2 (abstract P119).

    a: Real time needle visualisation. b: Agitated saline or micro bubble test for tip placement confirmation

    P120 A novel method for early identification of cardiac tamponade in patients with continuous flow left ventricular assist devices by use of sublingual microcirculatory imaging

    S Akin, C Ince, C Den Uil, A Struijs, R Muslem, I Ocak, G Guven, AA Constantinescu, OI Soliman, F Zijlstra, AJJC Bogers, K Caliskan

    Erasmus MC, University Medical Center Rotterdam, Rotterdam, Netherlands

    Introduction: Diagnosis of cardiac tamponade after continuous-flow left ventricular assist device (cf-LVAD) implantation is challenging due to the absence of pulsatility. A recent case study of sublingual microcirculation with incident dark-field (IDF) imaging provided a new, improved imaging approach for the clinical assessment of cardiac tamponade in a patient with a cf-LVAD. We sought to examine the changes in microvascular flow index (MFI) as a sign of cardiac tamponade following LVAD implantation.

    Methods: Off-site quantitative analysis of sublingual microcirculation clips was performed with Automated Vascular Analysis software (AVA; MicroVision Medical©), and velocity distributions were followed from admission until discharge in patients with end-stage heart failure treated with cf-LVAD complicated by cardiac tamponade.

    Results: Eleven of thirty LVAD implantations (9 males, mean age 58 ± 10 years, April 2015 to January 2017; 8 HeartMate 3 (HM 3) and 3 HeartMate II (HM II), Thoratec Corp., CA) were complicated by rethoracotomy due to early postoperative cardiac tamponade within 1 week. Their sublingual microcirculation was examined by novel incident dark-field (IDF) imaging before and daily after LVAD implantation. The pre-LVAD microcirculation was typical for heart failure, characterized by slow, sludging movement of red blood cells (RBCs) (Fig. 1A, arrows). Directly after implantation, normal microcirculatory flow was seen, with a high RBC velocity (Fig. 1B). On the day of tamponade the patients were stable except for severe failure of the microcirculation, as reflected by a drop in MFI (Fig. 1C) and congestion in venules (* in Fig. 1C). In 8 of 11 patients there was a significant drop in MFI before tamponade was clinically recognized (p<0.05). Shortly after rethoracotomy, a quick restoration of microcirculatory flow was observed.

    Conclusions: Sublingual microcirculation imaging is a simple and sensitive non-invasive tool in early detection of cardiac tamponade.

    Fig. 1 (abstract P120).

    See text for description

    P121 Survey on the use of cardiovascular drugs in shock (UCARDS) – inotropes

    T Kaufmann1, I Van der Horst1, JL Teboul2, T Scheeren1

    1University Medical Centre Groningen, Groningen, Netherlands; 2Hôpitaux Universitaires Paris-Sud, Paris, France

    Introduction: Treatment decisions on patients with shock lack consensus. In an international survey we aimed to evaluate the indications, current practice, and therapeutic goals on the use of inotropes in the treatment of shock states.

    Methods: From November 2016 to February 2017, an anonymous 27-question web-based survey was accessible to members of the European Society of Intensive Care Medicine (ESICM). The questions focused on the profile of respondents and on the triggering factors, first-line choice, dosing, timing, targets, additional treatment strategies, and suggested effects of cardiovascular drugs.

    Results: A total of 827 physicians responded. As detailed in Table 1, respondents considered dobutamine the first-line inotrope to increase cardiac pump function (N=695, 84%) and thought it should be started when signs of hypoperfusion or hyperlactatemia persist despite adequate use of fluids and vasopressors in the context of a low left ventricular ejection fraction (N=359, 43%). The most accepted target was an adequate cardiac output (N=369, 45%). The combination of noradrenaline and dobutamine was preferred to single treatment with adrenaline, mainly because of the possibility of titrating each drug individually (N=366, 44%). The main reason for adding another inotrope was to exploit the synergistic effects of two different mechanisms of action (N=229, 27%). According to respondents, phosphodiesterase inhibitors should be used in the treatment of predominant right heart failure because of their prominent vasodilatory effect on the pulmonary circulation (N=360, 44%). Respondents also believed levosimendan to be the only inotrope that does not increase myocardial oxygen demand (N=350, 42%). Vasodilators are used in cardiogenic shock to decrease left ventricular afterload (N=244, 30%). Many respondents had no experience with, or no opinion about, the use of ß-blockers in shock states (N=268, 32%).

    Conclusions: This web-based survey provided the latest trends in inotrope use in shock states and showed considerable diversity of opinion among respondents.

    Table 1 (abstract P121). Survey questions on inotropes and other cardiovascular drugs with the most frequent response

    P122 Thermoshock: thermography assessment of thermal patterns in patients with shock: preliminary results

    M Tosi1, A Andreotti1, M Girardis1, F Despini2, A Muscio2, P Tartarini2

    1University Hospital of Modena, Modena, Italy; 2University of Modena, Modena, Italy

    Introduction: Recent literature clearly indicates that in patients with shock, resuscitation of the macrocirculation often does not match improvement in the microcirculation and tissue perfusion.

    Unfortunately, bedside assessment of regional perfusion remains difficult, particularly in critically ill patients. In recent years thermography has been used in different medical fields, but no studies have been performed on the use of this technique in critically ill patients.

    The aim of this study was to evaluate whether thermography is feasible and may provide useful data during resuscitation of patients with septic shock.

    Methods: In 4 patients with septic shock we collected central systemic temperature and infrared images (FLIR-T640 digital camera) of the limbs at 0, 3, 6 and 24 hours after shock occurrence. The thermal pattern distribution of the limbs was obtained by a specific analysis of the images (ThermaCAM™ Researcher P). A systemic-to-peripheral temperature gradient, the “∆ systemic-limb temperature”, was calculated for each temperature measurement collected.

    Results: Macrocirculatory and perfusion parameters improved in all patients throughout the study period: the mean noradrenaline dose decreased from 0.21 to 0.13 μg/kg/min, mean MAP increased from 65 to 81 mmHg and mean blood lactate decreased from 6.6 to 4.2 mmol/L. The “∆ systemic-limb temperature” pattern showed a heterogeneous time course in the 4 patients, with a mean overall increase at 6 and 24 hours (Fig. 1).

    Conclusions: As expected, the regional data obtained by thermography did not match macrocirculatory and systemic perfusion parameters. The significance of the observed data, and their relationship to treatments, will be investigated in appropriate studies.

    Fig. 1 (abstract P122).

    See text for description

    P123 Regional differences in the treatment of refractory septic shock – an analysis of the ATHOS-3 data

    M Abril1, A Khanna2, C McNamara2, D Handisides3, L Busse4

    1Emory University, Atlanta, GA, USA; 2Cleveland Clinic, Cleveland, OH, USA; 3La Jolla Pharmaceutical Company, La Jolla, CA, USA; 4Emory St. Joseph’s Hospital, Atlanta, GA, USA

    Introduction: Vasodilatory shock is a common syndrome with high mortality. Despite established care protocols, regional differences in treatment remain. We sought to characterize these differences using data from the recently published ATHOS-3 study [1].

    Methods: Individual patient data were analyzed at baseline and at 48h for regional differences in demographics, clinical characteristics, and treatment patterns, and grouped according to four geographical areas: the United States (US), Canada (CA), Europe (EU) and Australasia (AU). P-values were calculated by Kruskal-Wallis tests for continuous data and chi-square tests for categorical data. Subsequent temporal analysis compared changes in the treatment of shock, indexed by changes in patient acuity level.

    Results: Regional differences existed with respect to BMI (p=0.0076), albumin (p<0.0001), CVP (p=0.0383), MELD score (p=0.0191), APACHE II score (p=0.0007) and SOFA score (p=0.0076). Baseline norepinephrine (NE) and NE equivalent doses were significantly higher in EU (p<0.0001 and p=0.0494, respectively), and utilization of vasopressin was correspondingly lower (p<0.0001). At baseline, stress dose steroids were utilized to a greater extent in the US and CA (p=0.0011). Temporal analysis revealed differences in the utilization of vasopressin and steroids with changes in patient acuity: in EU, increasing acuity was associated with a lower utilization of vasopressin, and in CA, increased acuity was associated with a lower utilization of steroids. Steroid utilization was higher with increased level of acuity in AU and the US.

    Conclusions: Significant differences in the treatment of vasodilatory shock exist globally, with important implications: (a) there are widespread differences in best practices, (b) heterogeneity may render global studies of shock difficult to interpret, and (c) outcomes may be improved through appropriate use of adjunctive therapies such as non-catecholamine vasopressors or corticosteroids.


    1. Khanna A et al. N Engl J Med 377(5):419-430, 2017

    P124 Association of angiotensin II dose with all-cause mortality in patients with vasodilatory shock

    M McCurdy1, LW Busse2, MN Gong3, DW Boldt4, SN Chock5, R Favory6, KR Ham7, K Krell8, XS Wang9, LS Chawla10, GF Tidmarsh10

    1University of Maryland, School of Medicine, Baltimore, MD, USA; 2Emory University, Atlanta, GA, USA; 3Albert Einstein College of Medicine, Montefiore Medical Center, Bronx, NY, USA; 4University of California, Los Angeles, Los Angeles, CA, USA; 5Sunrise Hospital, Las Vegas, NV, USA; 6CHU Lille, Critical Care Center and University of Lille School of Medicine, Lille, France; 7Regions Hospital, University of Minnesota, St. Paul, MN, USA; 8Eastern Idaho Regional Medical Center, Idaho Falls, ID, USA; 9Duke University Medical Center, Durham, NC, USA; 10La Jolla Pharmaceutical Company, San Diego, CA, USA

    Introduction: Vasodilatory shock is associated with high risk of mortality. In the ATHOS-3 study, the addition of angiotensin II (Ang II) to standard vasopressors significantly increased mean arterial pressure (MAP) and decreased vasopressor utilization, with a trend towards improved survival to day 28. In this analysis of Ang II recipients in the ATHOS-3 study, we assess the relationship between different dose ranges of Ang II with MAP response and mortality.

    Methods: Patients with persistent vasodilatory shock despite receiving >0.2 μg/kg/min of norepinephrine-equivalent dose vasopressors were randomized to receive IV Ang II, titrated per study protocol, or placebo with other vasopressors held constant for hours 0-3; for hours 3-48, all vasopressors could be titrated to maintain MAP. Ang II responsiveness was defined as the Ang II dose required to achieve a MAP of 75 mmHg. In this analysis, 28-day all-cause mortality was evaluated per prespecified categories based on the Ang II dose received 30 min after the start of dosing. These categories included “super-responders,” defined as patients who required physiological doses of Ang II (<=5 ng/kg/min), and patients who required higher doses.

    Results: Among the 163 Ang II recipients, 79 (48.5%) were super-responders. Mortality at day 28 was 32.9% in super-responders vs 58.6% in patients requiring Ang II doses >5 ng/kg/min (n=84) and 53.9% in the placebo group (n=158). The hazard ratio for all-cause mortality in super-responders vs Ang II patients requiring higher doses was 0.45 (95% CI 0.28-0.72), p=0.0007; and 0.50 (95% CI 0.32-0.78), p=0.0018 for super-responders vs placebo.

    Conclusions: Ang II super-responders, a large subgroup of Ang II recipients in ATHOS-3, had significantly reduced all-cause mortality at day 28 vs patients receiving placebo or higher doses of Ang II. Ang II doses <=5 ng/kg/min equate to physiological levels of Ang II and may reflect a novel means of restoring normal homeostatic mechanisms to correct vasodilatory shock.

    P125 Outcomes in patients with acute respiratory distress syndrome receiving angiotensin II for vasodilatory shock

    L Busse1, T Albertson2, M Gong3, BT Thompson4, R Wunderink5, D Handisides6, G Tidmarsh6, L Chawla6

    1Emory St. Joseph’s Hospital, Atlanta, GA, USA; 2University of California, Davis, Davis, CA, USA; 3Montefiore Medical Center, Bronx, NY, USA; 4Harvard Medical School, Boston, MA, USA; 5Northwestern University, Feinberg School of Medicine, Chicago, IL, USA; 6La Jolla Pharmaceutical Company, San Diego, CA, USA

    Introduction: ATHOS-3 was a randomized, placebo-controlled, double-blind study of patients with severe vasodilatory shock (VS), which demonstrated that the addition of angiotensin II (Ang II) to standard vasopressors significantly increased mean arterial pressure (MAP) and decreased vasopressor utilization, with a trend towards improved survival to day 28. Given the location of angiotensin-converting enzyme in the pulmonary endothelium, we analyzed the effect of Ang II treatment in the subset of ATHOS-3 patients with acute respiratory distress syndrome (ARDS), in whom the pulmonary endothelium may be damaged.

    Methods: Patients with persistent VS despite >0.2 μg/kg/min norepinephrine equivalent dose vasopressors were randomized to receive either IV Ang II or placebo. Patients with ARDS were classified based on the PaO2/FIO2 ratio, per the Berlin definition, as mild, moderate, and severe. Clinical outcomes, including MAP response were compared between groups.

    Results: In the Ang II cohort, 34%, 30%, and 11% of the 163 patients were classified with mild, moderate, and severe ARDS, respectively, compared to 32%, 32% and 12% of the 158 patients in the placebo cohort. MAP response, defined as an increase from baseline of at least 10 mmHg or an increase to at least 75 mmHg, was achieved by 24%, 25%, and 16% of patients in the placebo group for mild, moderate and severe ARDS, respectively. In the Ang II group, MAP responses were achieved by 67% (OR=6.7, p<0.001), 69% (OR=6.6, p<0.001), and 61% (OR=8.4, p=0.005), respectively. In the placebo group, 28-day mortality increased with ARDS severity, while this trend towards worse survival was attenuated in the Ang II cohort compared to placebo for mild and severe ARDS (Table 1).

    Conclusions: In patients with ARDS, MAP response was significantly improved in each ARDS subgroup, and the trend for increased 28-day mortality with an increase in ARDS severity was reduced in the Ang II group compared to the placebo group.

    Table 1 (abstract P125). 28-day mortality % (95% Confidence Interval)

    P126 Perioperative use of levosimendan: harmful or beneficial?

    F Guarracino1, S Bouchet2, M Heringlake3, P Pollesello4

    1Azienda Ospedaliero-Universitaria Pisana, Pisa, Italy; 2University Hospital, Ghent, Ghent, Belgium; 3Universitätsklinikum Schleswig-Holstein, Lübeck, Germany; 4Orion Pharma R&D, Espoo, Finland

    Introduction: Levosimendan is a calcium sensitizer and KATP-channel opener exerting sustained hemodynamic and symptomatic effects. Over the past fifteen years, levosimendan has also been used in clinical practice to stabilize at-risk patients undergoing cardiac surgery. Recently, three randomized, placebo-controlled, multicenter studies, LICORN [1], CHEETAH [2] and LEVO-CTS [3], tested the perioperative use of levosimendan in patients with compromised ventricular function. More than 40 smaller trials conducted in the past [4] suggested beneficial outcomes with levosimendan in perioperative settings. In contrast, the latest three studies were neutral or inconclusive. We aimed to understand the reasons for this dissimilarity.

    Methods: We re-analyzed the results of the latest trials in the light of the previous literature to identify sub-settings in which levosimendan can be shown to be harmful or beneficial.

    Results: None of the three latest studies raised any safety concern, which is consistent with the findings of the previous smaller studies. In LEVO-CTS, mortality was significantly lower in the levosimendan arm than in the placebo arm in the subgroup of isolated CABG patients (Fig. 1) [3]. The trend towards both hemodynamic and long-term mortality benefits is maintained in recent meta-analyses [5,6] that include the three larger recent studies.

    Conclusions: Although the null hypothesis could not be rejected in the recent trials, we conclude that levosimendan can still be viewed as a safe and effective inodilator in cardiac surgery. Statistically significant mortality benefits seem to be limited to subgroups, such as isolated CABG procedures and/or low-EF patients.


    1. Cholley B et al. JAMA 318:548-556, 2017

    2. Landoni G et al. N Engl J Med 376(21):2021-2031, 2017

    3. Mehta RH et al. N Engl J Med 376(21):2032-2042, 2017

    4. Harrison RH et al. J Cardiothorac Vasc Anesth 27:1224-1232, 2013

    5. Sanfilippo F et al. Crit Care 21(1):252, 2017

    6. Chen QH et al. Crit Care 21(1):253, 2017

    Fig. 1 (abstract P126).

    Ninety-day mortality among patients in the LEVO-CTS trial [3] in the subgroup of isolated CABG patients (n=563)

    P127 Effects of levosimendan on weaning from mechanical ventilation of patients with left ventricular dysfunction

    I Kaltsi, C Gratsiou, S Nanas, C Routsi

    National and Kapodistrian University of Athens, Athens, Greece

    Introduction: This study aims to assess the effects of levosimendan, a calcium sensitizer, in the treatment of patients with impaired left ventricular function and difficult weaning from mechanical ventilation.

    Methods: Patients who were difficult to wean from mechanical ventilation [>= 3 consecutive failed spontaneous breathing trials (SBTs)] and had left ventricular dysfunction, defined as a left ventricular ejection fraction (LVEF) of less than 40%, were studied. For each patient, 2 SBTs were studied: the first before and the second after a continuous infusion of levosimendan over 24 h (Control day and Study day, respectively). On both days, transthoracic echocardiography (TTE) was performed on mechanical ventilation and at the end of the SBT. Serum levels of troponin I were also measured at the same time points.

    Results: Eleven patients (7 men; mean age 72 ± 9 years) were studied. Whereas weaning failed in all patients on the Control day, levosimendan administration enabled a successful spontaneous breathing trial in 8 of 11 patients on the Study day. Compared to the Control day, left ventricular ejection fraction significantly increased on the Study day (from 27.5±9.6% to 36.9±2.8%, p<0.05), whereas E/A significantly decreased both on mechanical ventilation and at the end of the SBT (from 1.14±0.76 to 0.92±0.26, p<0.05, and from 1.29±0.73 to 0.88±0.12, p<0.05, respectively). Troponin I also decreased significantly on mechanical ventilation (from 217±268 to 163±194 mg/l) and at the end of the SBT (from 196±228 to 146±180 mg/l). At the end of the SBT, levosimendan treatment resulted in a smaller decrease in arterial PO2 (96±35 vs. 66±27 mmHg, p<0.05) and similarly in ScvO2 (67±10 vs. 60±8, p<0.05), compared to the values on mechanical ventilation.

    Conclusions: Levosimendan may provide significant benefit to difficult- to- wean patients with impaired left ventricular function.

    P128 Levosimendan for weaning veno-arterial ECMO (VA ECMO)

    G Haffner, G Ajob, M Cristinar, S Marguerite, W Oulehri, B Heger, M Kindo, PM Mertes, A Steib

    Hôpitaux Universitaires, Strasbourg, France

    Introduction: VA ECMO weaning is a challenging process. The aim of the study was to evaluate the putative benefit of levosimendan, for VA ECMO weaning.

    Methods: This retrospective study, from 2014 to 2016, included patients referred to our ICU for primary cardiogenic shock or following cardiotomy, supported with VA ECMO, in whom an attempt was made to wean mechanical support (death under VA ECMO and bridge to a long-term device or transplantation were excluded). The incidence of weaning failure, VA ECMO support duration, length of ICU stay and duration of mechanical ventilation were compared between patients who did and did not receive levosimendan, in the whole population and in the post-cardiotomy subgroup. Levosimendan was used at the physician's discretion. Independent factors associated with weaning failure were determined. Statistics were performed within a Bayesian paradigm.

    Results: 27 patients were included in the levosimendan group and 36 in the control group. In the whole population, weaning failure incidence and mortality were comparable between the 2 groups (respectively 24% vs 20%, Pr 0.34, and 36% vs 38%, Pr 0.6). Longer assistance duration, longer mechanical ventilation and longer critical care stay were observed in the levosimendan group. In the post-cardiotomy subgroup (Table 1), weaning failure was lower in the levosimendan group (12% vs 29%, Pr 0.9) and levosimendan was an independent protective factor against weaning failure (OR 0.073, Pr 0.92). The positive impact of levosimendan may be explained in part by its calcium-sensitizing effect and by facilitation of the recovery of myocardial calcium homeostasis in post-cardiotomy cardiac stunning.

    Conclusions: Levosimendan failed to reduce the incidence of ECMO weaning failure, except in the post-cardiotomy population.

    Table 1 (abstract P128). Multivariate analysis for weaning failure in the post-cardiotomy subgroup (IACPB: intra-aortic counterpulsation balloon)

    P129 Renal outcomes of vasopressin and its analogues in distributive shock: a systematic review and meta-analysis of randomized trials

    T Rech, W Nedel, J Pellegrini, R Moraes

    Hospital de Clínicas de Porto Alegre, Porto Alegre, Brazil

    Introduction: Previous trials have suggested a lower incidence of the need for renal replacement therapy (RRT) and acute kidney injury (AKI) in patients with shock treated with vasopressin or its analogues (VA) compared to other vasopressors [1, 2]. The aim of the present study was to systematically review the literature and synthesize evidence concerning the effects of VA compared to other vasopressors in distributive shock, focusing on renal outcomes.

    Methods: MEDLINE, Embase and Cochrane CENTRAL databases were searched through June 2017 without language restrictions. Randomized clinical trials that compared VA with other vasopressors and reported renal outcomes in adult patients with distributive shock were included. Paired reviewers independently screened citations, conducted data extraction and assessed risk of bias. Odds ratios (OR) and weighted mean differences (WMD) with 95% confidence intervals (CI) were used to pool effect estimates from trials. Four prespecified subgroup analyses were conducted. Sensitivity analysis was applied to explore heterogeneity. The quality of evidence for intervention effects was summarized using GRADE methodology.

    Results: 3,026 potentially relevant studies were identified and 30 papers were reviewed in full. Sixteen studies met the inclusion criteria, including a total of 2,054 individuals. Of these, 10 studies (1,840 individuals) were suitable for quantitative meta-analysis. Overall, the evidence was of low to moderate quality. Patients who received VA had a reduced need for RRT (OR 0.5 [0.33 - 0.75]; I2 = 7%, P for heterogeneity = 0.37) and a lower AKI incidence (OR 0.58 [0.37 - 0.92]; I2 = 63%, P for heterogeneity = 0.004).

    Conclusions: In patients with distributive shock, VA use is associated with a reduced need for RRT and lower AKI incidence, although these results are supported by low quality evidence.


    1. Hajjar LA et al. Anesthesiology 126:85-93, 2017

    2. Russell JA et al. N Engl J Med 358:877-87, 2008

    P130 The effect of norepinephrine on venous return during VA-ECMO

    A Hana1, PW Moller2, PP Heinisch1, J Takala1, D Berger1

    1Inselspital, Bern University Hospital, Bern, Switzerland; 2Institute of Clinical Sciences at the Sahlgrenska Academy, University of Gothenburg, Sahlgrenska University Hospital, Gothenburg, Sweden

    Introduction: Venous return (VR) is driven by the difference between mean systemic filling pressure (MSFP) and right atrial pressure (RAP) and determines the maximum ECMO flow. MSFP depends on stressed volume and vascular compliance. It can be modified by absolute blood volume changes and by shifts between stressed and unstressed volume. Norepinephrine (NE) may increase stressed volume by constricting venous capacitance vessels while at the same time increasing the resistance to systemic flow. We therefore studied the effects of NE on MSFP, maximum ECMO flow and the ECMO pressure head (MAP-RAP).
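    For clarity, the driving relationship described above can be written as the classical venous return equation (a sketch added here for illustration; the resistance to venous return, R_VR, is not named explicitly in the abstract):

    ```latex
    \mathrm{VR} = \frac{\mathrm{MSFP} - \mathrm{RAP}}{R_{\mathrm{VR}}}
    ```

    Under this relationship, an NE-induced rise in MSFP shifts the VR curve rightward and permits a higher maximum ECMO flow, provided the resistance to venous return does not rise proportionally.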

    Methods: MSFP was measured with blood volume at euvolemia and at NE doses 1 to 3 (0.05, 0.125 and 0.2 μg/kg/h) in a closed-chest porcine VA-ECMO model (n=9, central cannulation with left atrial vent and av-shunt) in ventricular fibrillation. The responses of RAP and VR (measured as ECMO flow, QECMO) were studied at variable pump speeds, including the maximum possible speed without clinically apparent vessel collapse, at constant airway pressure.

    Results: The ECMO pump speed and QECMO showed a strictly linear relationship (r2 0.95 to 0.995, range over all conditions) despite the increased pressure head, indicating that maximum QECMO was determined by VR alone. NE led to dose-dependent increases in both MSFP and QECMO, indicating a rightward shift of the VR plot (Fig. 1) via recruitment of stressed from unstressed volume (Table 1, Fig. 2). This resulted in an increased MSFP during NE despite a decreased absolute blood volume (3.9±0.4 L vs. 3.3±0.3 L, p=0.009). The reduced blood volume was associated with hemoconcentration, suggesting plasma leakage.

    Conclusions: NE shifts the VR curve to the right, allowing a higher maximum ECMO flow. The NE-induced increase in MSFP results from recruitment of unstressed volume to stressed volume, which may be modified by changes in vascular compliance. The effects on pump afterload were not limiting.

    Table 1 (abstract P130). Effects of NE on MSFP, QECMO, pressure head and hemoglobin as compared to euvolemia
    Fig. 1 (abstract P130).

    Effect of NE on VR curve as compared to euvolemia

    Fig. 2 (abstract P130).

    Effects of NE on blood volume, MSFP and vascular elastance
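The Guytonian relationship underlying this experiment (venous return driven by the MSFP−RAP gradient across the resistance to venous return) can be sketched as follows; all numbers are purely illustrative, not the study's measurements:

```python
def venous_return(msfp, rap, rvr):
    """Guytonian venous return: flow driven by the gradient between mean
    systemic filling pressure (MSFP) and right atrial pressure (RAP),
    divided by the resistance to venous return (RVR)."""
    return max(0.0, (msfp - rap) / rvr)

# Illustrative (hypothetical) numbers, in mmHg and mmHg/(L/min):
# raising MSFP at constant RVR shifts the VR curve rightward and
# permits a higher maximum flow, as observed under NE.
baseline_flow = venous_return(msfp=12, rap=4, rvr=2)  # L/min
ne_flow = venous_return(msfp=18, rap=4, rvr=2)
```

This is the rightward shift of the VR plot described in the results: at any given RAP, a higher MSFP yields a higher flow, so the maximum achievable ECMO flow rises.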

    P131 Predictive value of right heart hemodynamics for acute kidney injury after heart transplantation

    G Guven, M Brankovic, AA Constantinescu, JJ Brugts, DA Hesselink, S Akin, A Struijs, O Birim, C Ince, OC Manintveld, K Caliskan

    Erasmus MC, University Medical Center Rotterdam, Rotterdam, Netherlands

    Introduction: Acute kidney injury (AKI) is a major complication after heart transplantation (HTx), but its relation to preoperative right heart hemodynamics (RHH) remains largely unknown. The aim of this study was to determine whether routine and novel RHH parameters could predict the development of AKI and 1-year patient survival after HTx.

    Methods: Data of all consecutive HTx patients (n=595) in our tertiary referral center, between 1984 and 2016, were collected and analyzed for the occurrence of AKI and survival at 1 year.

    Results: AKI developed in 430 (72%) patients: 278 (47%) stage 1, 66 (11%) stage 2, and 86 (14%) stage 3. Renal replacement therapy (RRT) was needed in 41 (7%) patients, with a subsequently increased risk of chronic RRT-dependency at 1 year (odds ratio: 3.3 [95% CI: 1.6–6.6], p=0.001). Patients with higher AKI stages also had higher baseline right atrial pressure (RAP) (median: 7, 7, 8, 11 mmHg, p-trend=0.021), higher RAP/PCWP ratio (0.37, 0.36, 0.40, 0.47, p-trend=0.009), and lower pulmonary artery pulsatility index (PAPi) values (2.83, 3.17, 2.54, 2.31, p-trend=0.012). Patients with higher AKI stages had significantly worse 1-year survival (mortality 5, 7, 15, 14%, log-rank p-trend=0.021), with the worst outcome in RRT patients (1-year mortality with vs without RRT: 22% vs 8%, log-rank p=0.001).

    Conclusions: AKI is highly frequent early after HTx and is inversely associated with 1-year patient survival. The routinely collected preoperative PAPi and RAP predict the development and severity of AKI early after HTx and could be used to intervene in a timely manner and prevent the development of AKI.

    P132 Haemorrhage versus sepsis in maternity HDU

    A Taylor, S Stevenson, J Gardner, K Lake, K Litchfield

    Princess Royal Maternity Unit, Glasgow Royal Infirmary, Glasgow, UK

    Introduction: Haemorrhage and sepsis are the most common reasons for admission to our maternity HDU.

    Methods: Data were gathered over a 26-month period from July 2015 to September 2017 on maternity HDU admissions with sepsis or haemorrhage. Length of stay, cardiovascular instability, monitoring, vasoactive drug use and whether the baby remained with the mother were analysed using Excel. Local ethical approval was granted.

    Results: The total number of patients admitted was 514, of whom 259 (50.4%) had a primary diagnosis of haemorrhage or sepsis. Haemorrhage was separated into antepartum, 14 (9%), and postpartum, 137 (90.7%); the most common cause of postpartum haemorrhage was uterine atony, 62 (45.5%). Of the haemorrhage patients, 1 (0.6%) was discharged to surgical HDU and 150 (99.3%) to the ward. Of the sepsis admissions, 38 (35%) were antepartum and 70 (65%) postpartum; the most common source was gynaecological infection, 29 (26%). 46 (42%) of sepsis admissions were from home, 26 (24%) antepartum and 20 (18%) postpartum; most, 62 (57%), were via the surgical or labour ward or post-theatre. 5 (3.9%) were discharged to medical or surgical HDU, 3 (2.7%) to ICU and 100 (92%) to the ward. See Table 1.

    Conclusions: Most admissions to HDU, 206 (79.5%) are in the postpartum period with haemorrhage accounting for the majority. These patients are more cardiovascularly unstable, require more invasive monitoring and blood pressure treatment but have a shorter overall stay. They are generally discharged straight to ward. This reflects the expected quicker improvement after haemostasis in well young women. The discrepancy between mum with baby (62% in sepsis cf 84% in bleeding) may reflect the higher likelihood of baby needing NICU in the sepsis group. Septic patients stay longer with higher risk of deterioration and escalation of care. This mirrors the recent MBRRACE [1] report stating sepsis as a major cause of morbidity.


    1. Knight M et al. MBRRACE-UK Saving Lives, Improving Mothers’ Care 2016.

    Table 1 (abstract P132). Cardiovascular instability and monitoring of maternity HDU patients with either sepsis or haemorrhage

    P133 Real-time guidance or prelocation using ultrasound for pediatric central venous catheterization: a systematic review and network meta-analysis

    K Hosokawa, N Shime, M Kyo, Y Iwasaki, Y Kida

    Hiroshima University Hospital, Hiroshima, Japan

    Introduction: To locate vessels for percutaneous central venous catheterization, both real-time ultrasound (US) guidance and US-assisted vein prelocation may be helpful. The aim of this study was to evaluate whether these two US methods are superior to surface landmark methods by reviewing randomized controlled trials (RCTs).

    Methods: Updating an earlier systematic review [1], we searched PubMed and CENTRAL in November 2017. We included RCTs that compared the failure rates of internal jugular or femoral venous cannulation among 1) real-time US guidance, 2) US-assisted vein prelocation and 3) surface landmark methods. A frequentist network meta-analysis was conducted using the netmeta package in R.

    Results: Out of 1395 citations, 11 RCTs (935 patients) were eligible. The numbers of studies comparing real-time US guidance vs. surface landmark methods, US-assisted vein prelocation vs. surface landmark methods, and real-time US guidance vs. US-assisted vein prelocation were 7, 3 and 1, respectively. For cannulation failure rate, network meta-analysis in a fixed-effect model showed a lower p-score for real-time US guidance than for US-assisted vein prelocation (0.61 vs. 0.88), with surface landmark methods as reference; for arterial puncture, the p-score was likewise lower for real-time US guidance (0.64 vs. 0.83).

    Conclusions: In this network meta-analysis of RCTs, p-scores for cannulation failure and arterial puncture were lower for real-time US guidance, suggesting that US-assisted vein prelocation is superior to real-time US guidance; both achieved lower rates of failure and arterial puncture than landmark methods. We speculate that the inferiority of real-time guidance is associated with the difficulty of manipulating the needle together with an echo probe when targeting relatively small veins in children.


    1. Shime N et al. Pediatr Crit Care Med 16(8):718-25, 2015.

    P134 A beriberi bad heart - a case report

    M Charalambos, A Revill, D Howland

    Torbay Hospital, Torquay, UK

    Introduction: We present a case report of ‘Shoshin beriberi’ in a young woman who was ‘fussy with food’ and developed an acutely progressive metabolic acidosis and multi-organ failure requiring intensive care support.

    Methods: Our patient was a 36-year-old British woman who presented to the emergency department (ED) with a ten-day history of diarrhea, vomiting and increasing fatigue. She had a past medical history of gastroparesis, polycystic ovary syndrome (on metformin), laparoscopic cholecystectomy and hysteropexy. She lived with her husband and two children who had viral gastroenteritis two weeks previously.

    Results: The patient had a metabolic acidosis (pH 6.9) with raised lactate (>16) on the initial blood gas in the ED. A 1.26% sodium bicarbonate infusion and hemofiltration were commenced overnight. The patient’s pH and lactate remained static, with increasing work of breathing over this period. By morning she had developed flash pulmonary oedema and hypotension, the first signs of acute cardiac failure. An echocardiogram showed severely impaired left ventricular function with an ejection fraction of 17%. The patient was intubated and inotropic support was commenced. It was thought that a micronutrient deficiency may have caused rapid-onset cardiac failure. Pabrinex (containing 250 mg of thiamine hydrochloride) was commenced and within 9 hours the patient’s metabolic acidosis markedly improved (Fig. 1). Complete reversal of the cardiac failure occurred over 96 hours.

    Conclusions: Shoshin beriberi is a rare clinical manifestation of thiamine deficiency [1]. It is an important differential diagnosis to bear in mind after excluding more common aetiologies of heart failure, especially as our patient had no obvious risk factors at the time of presentation. We suggest that empiric thiamine should be considered in treatment algorithms for young patients presenting with acute cardiac failure. The patient provided informed consent for publication.


    1. Naidoo DP et al. Clinical diagnosis of cardiac beriberi. S Afr Med J 77(3):125-7, 1990

    Fig. 1 (abstract P134).

    Serum lactate and pH with thiamine administration

    P135 Wells score is not a reliable predictor of the risk of pulmonary embolism in critically ill patients: a retrospective cohort study

    T Rech, A Girardi, R Bertiol, M Gazzana

    Hospital de Clínicas de Porto Alegre, Porto Alegre, Brazil

    Introduction: Pulmonary embolism (PE) is a commonly missed deadly diagnosis [1]. At the emergency department, the pretest probability of PE can be assessed using clinical prediction tools. However, critically ill patients are at a high risk for PE and prediction rules have not been validated in this population. The aim of the present study was to assess the Wells scoring system as a predictor of PE in critically ill patients.

    Methods: Computed tomographic (CT) pulmonary angiographies performed for suspected PE in adult critically ill patients during their intensive care unit stay were identified by the radiology information system. Wells score was retrospectively calculated based on medical records and its reliability as a predictor of PE was determined using a receiver operator characteristic (ROC) curve.

    Results: Of the 144 patients evaluated, 39 (27%) were positive for PE based on CT pulmonary angiography. The mean Wells score was 3.9 ± 2.7 in patients with PE versus 2.4 ± 1.5 in patients without PE (P <0.001). Sixty patients (41.6%) were considered low probability for PE (Wells score <2). Of these, 13 patients (22%) presented with filling defects on CT scan, including two patients with main stem pulmonary artery embolism and one patient with lobar artery embolism. The area under the ROC curve was 0.656. When a Wells score >4 was used to predict risk of PE, sensitivity was 43%, specificity 88%, PPV 59% and NPV 88.6%.

    Conclusions: In this population of critically ill patients, Wells score was not a reliable predictor of risk of PE.


    1. Wiener RS et al. BMJ 347: F3368, 2015
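The accuracy figures reported for the Wells >4 cut-off follow from the standard 2x2 contingency-table definitions. A minimal sketch, using hypothetical counts rather than the study's raw table:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard 2x2 diagnostic-accuracy metrics for a binary test."""
    return {
        "sensitivity": tp / (tp + fn),  # positives detected among diseased
        "specificity": tn / (tn + fp),  # negatives among non-diseased
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }

# Hypothetical counts for illustration (not the study's raw 2x2 table)
m = diagnostic_metrics(tp=40, fp=20, fn=10, tn=130)
```

Note that PPV and NPV, unlike sensitivity and specificity, depend on the prevalence of PE in the cohort studied.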

    P136 Acute kidney injury in cardiogenic shock syndrome: prevalence and outcome

    C Facciorusso, M Bottiroli, A Calini, D De Caria, S Nonini, R Pinciroli, F Milazzo, M Gagliardone

    ASST Grande Ospedale Metropolitano Niguarda, Milan, Italy

    Introduction: Cardiogenic shock (CS) is a syndrome due to acute heart failure. Acute kidney injury (AKI) can be secondary to tissue congestion and hypoperfusion. While many studies have evaluated the incidence of AKI in post-ischemic CS, there is a paucity of data on CS of non-ischemic etiology. The aim of this study was to evaluate the clinical and prognostic relevance of AKI in this setting.

    Methods: Monocentric, retrospective observational study on patients with CS. Study period: 2010-2016. Exclusion criteria: ischemic and postcardiotomy etiology of CS. Demographic, clinical and biochemical variables were collected at baseline and during hospital stay. AKI was defined according to AKIN criteria. Continuous variables are presented as median (IQR). Data were analyzed using comparative statistics and multivariate analysis was performed with Cox regression. Survival analysis was performed with Kaplan-Meier.

    Results: We recruited 71 patients. The etiologies of CS were: 44 acute decompensations of chronic cardiomyopathies (CCM), 14 acute myocarditis, and 13 other causes. AKI occurred in 47 (66%) patients; AKIN stages 1, 2 and 3 occurred in 16, 6 and 25 patients, respectively. CRRT was required in 14 patients. Patient characteristics at baseline are summarized in Table 1 (* = p <0.05). Hospital mortality was 48% (n=23) in AKI patients vs 16% (n=4) in non-AKI patients (p=0.01). 90-day cumulative survival was stratified by AKIN stage (Fig. 1, log-rank p < 0.003). AKIN stage 3 and lactate level were independent predictors of death at 90 days, with HRs of 3.1 (1.3-7.2) and 1.1 (1-1.2) per unit, respectively.

    Conclusions: In patients with CS of non-ischemic etiology, AKI is a frequent clinical complication associated with poor prognosis.

    Table 1 (abstract P136). Baseline data
    Fig. 1 (abstract P136).

    Kaplan-Meier 90 days survival
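The Kaplan-Meier survival analysis used above rests on the product-limit estimator. A minimal self-contained sketch, with hypothetical follow-up data rather than the study's:

```python
def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) survival estimate.

    times  - follow-up time for each patient
    events - 1 if death observed at that time, 0 if censored
    Returns (event_times, survival_probabilities)."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv, out_t, out_s = 1.0, [], []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(e for tt, e in data if tt == t)
        # advance past every subject with this follow-up time
        j = i
        while j < len(data) and data[j][0] == t:
            j += 1
        if deaths > 0:
            surv *= 1 - deaths / n_at_risk
            out_t.append(t)
            out_s.append(surv)
        n_at_risk -= j - i
        i = j
    return out_t, out_s

# Hypothetical follow-up: deaths at t=10, 20, 30; censoring at 20 and 40
km_t, km_s = kaplan_meier([10, 20, 20, 30, 40], [1, 1, 0, 1, 0])
```

Censored patients contribute to the at-risk denominator until their censoring time, which is why the curve differs from simple cumulative mortality.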

    P137 Takotsubo syndrome in critically ill patients

    A Shilova, M Gilyarov, E Gilyarova, A Nesterov, A Svet

    Moscow City Hospital #1 n.a. N. Pirogoff, Moscow, Russia

    Introduction: Takotsubo syndrome (TS) is an acute, transient cardiac condition accompanied by acute heart failure. TS is often triggered by critical illness but has rarely been studied in ICU practice. Furthermore, although catecholamines are known to directly induce TS, worsen LVOT obstruction, and delay spontaneous recovery in TS patients, it is nearly impossible to avoid their administration in the critically ill [1].

    Methods: We analyzed the medical records of 23 patients with TS identified in our hospital during 2017. TS was defined according to the Mayo criteria, including transient regional wall motion abnormalities, mildly elevated troponin level and no signs of obstructive CAD on coronary angiography.

    Results: Of the 23 patients who developed TS in the ICU or ICCU, hemodynamic instability occurred in the acute phase of TS in 12 (52%) cases. 9 (39%) of the patients were admitted to the ICU due to septic shock (2 patients), major bleeding (1), cerebral mass lesion (1) or ARDS (2) and required treatment with catecholamines. The overall mortality rate in TS patients was 7 (30%), and 5 (55%) in critically ill TS patients. Mean duration of noradrenaline infusion was 7.2 days, and of dobutamine infusion 4.3 days. Patients with TS needed more ICU resources and a longer ICU stay. The mortality rate was higher in TS patients (55%) vs the ICU population (28%), p = 0.02.

    Conclusions: TS appears to be a frequent cause of LV dysfunction and acute heart failure in the critically ill, and may be a predictor of worse prognosis in critically ill patients. Although catecholamine administration may worsen prognosis and induce further AHF in critically ill patients, it can rarely be avoided.


    1. Templin C et al. N Engl J Med 373:929–938, 2015

    P138 Institutional case-volume and in-hospital mortality after orthotopic heart transplantation: a nationwide study between 2007 and 2016 in Korea

    S Choi1, E Jang2, K Nam1, G Kim3, H Ryu1

    1Seoul National University College of Medicine, Seoul, South Korea, 2Andong National University, Andong, South Korea, 3Kyungpook National University, Daegu, South Korea

    Introduction: The positive effect of case volume on patient outcome seen in complex surgical procedures such as coronary artery bypass graft surgery has not been shown in heart transplantation (HT). We analyzed the relationship between institutional case volume and patient outcome in adult HTs performed in Korea.

    Methods: The Health Insurance Review and Assessment Service (HIRA) data from 2007 to 2016 was analyzed for in-hospital and long-term mortality, ICU length of stay, and hospital length of stay in patients undergoing HT, depending on the case volume of the institution.

    Results: A total of 852 heart transplantations were performed between 2007 and 2016. Operative mortality after HT was 8.6% (73/852): 3.5% (13/367) in institutions performing more than 20 cases/year, compared with 8.0% (23/287) in institutions performing 10-19 cases/year and 18.7% (37/198) in institutions performing fewer than 10 cases/year. After adjusting for other potential factors, HT at intermediate-volume centers (OR 2.41, 95% CI 1.19–4.87, p=0.014) and low-volume centers (OR 6.74, 95% CI 3.41–13.31, p<0.001) was identified as a risk factor for in-hospital mortality.

    Conclusions: Our study results showed that HTs performed at institutions with higher case volume were associated with lower mortality.

    P139 Intensive care unit readmission following left ventricular assist device implantation: causes, associated factors, and association with patient mortality

    J Hui1, W Mauermann1, J Stulak2, A Hanson3, S Maltais2, D Barbara1

    1Department of Anesthesiology and Perioperative Medicine, Rochester, USA; 2Department of Cardiovascular Surgery, Rochester, USA; 3Department of Biostatistics, Rochester, USA

    Introduction: Previous studies on readmission following LVAD implantation have focused on hospital readmission after dismissal from the index hospitalization. Since few data exist, the purpose of this study was to examine intensive care unit (ICU) readmission in patients during their initial hospitalization for LVAD implantation and to determine the reasons for, factors associated with, and mortality following ICU readmission.

    Methods: This was a retrospective, single center, cohort study in an academic tertiary referral center. All patients at our institution undergoing first time LVAD implantation from February 2007 to March 2015 were included. Patients dismissed from the ICU who then required ICU readmission prior to hospital dismissal were compared to those not requiring ICU readmission prior to hospital dismissal.

    Results: Among 266 LVAD patients, 45 (16.9%) required ICU readmission. The most common reasons for readmission were bleeding and respiratory failure (Fig. 1). Factors significantly associated with ICU readmission were preoperative hemoglobin less than 10 g/dL, preoperative estimated glomerular filtration rate <35 mL/min/1.73m2, preoperative atrial fibrillation, preoperative dialysis, longer cardiopulmonary bypass times, and higher intraoperative allogeneic blood transfusion requirements. Mortality at 1 year was 30.2% in patients requiring ICU readmission vs. 11.9% in those not requiring ICU readmission (age-adjusted OR=3.0, 95% CI 1.4 to 6.6, p=0.005).

    Conclusions: ICU readmission following LVAD implantation occurred relatively frequently and was associated with significant one-year mortality. These data can be used to identify LVAD patients at risk for ICU readmission and implement practice changes to mitigate ICU readmission. Future larger and prospective studies are warranted.


    1. Hasin T et al. J Am Coll Cardiol 61(2):153-63, 2013

    2. Tsiouris A et al. J Heart Lung Transplant 33(10):1041-7, 2014

    3. Forest SJ et al. Ann Thorac Surg 95(4):1276-81, 2013

    Fig. 1 (abstract P139).

    Reasons for ICU readmission during index hospitalization after LVAD placement. Of the 18 patients requiring ICU readmission for bleeding, 8 involved chest bleeding (e.g. hemothorax, cardiac tamponade), 7 gastrointestinal bleeding, 1 retroperitoneal bleeding, 1 bilateral subdural hemorrhage, and 1 tracheostomy site bleeding. Other reasons for ICU readmission included non-hemorrhagic cerebrovascular accident (2), LVAD malfunction (2), hyperactive psychosis (1), hyperkalemia (1), right ventricular failure (1), renal failure (1), syncope (1), and acute arterial thrombosis (1)

    P140 Atrial fibrillation and infection among acute patients in the emergency department: a multicentre cohort study of prevalence and prognosis

    T Graversgaard

    Odense University Hospital, Odense, Denmark

    Introduction: Patients with infection presenting with atrial fibrillation (AF) are frequent in emergency departments (ED). This combination is probably associated with a poorer prognosis than lone AF or infection alone, but existing data are scarce.

    Aim: To describe the prevalence and prognosis of AF and infection, individually and concomitantly, in an ED setting.

    Methods: Cohort study in adult (>=18 years) ED patients with ECG performed on presentation at Odense University Hospital and Hospital of South West Jutland, Denmark, from March 13 2013 to April 30 2014. AF was identified by electronic ECG records, and infection was identified based on discharge diagnoses. The absolute 30-day mortality and stroke rate were calculated for all patients, for those with AF, infection and for those with both.

    Results: Among 39393 contacts to the ED, 27879 patients (median age 66, 50% women) had an ECG recorded and were included in the study. 2341 (8.4%) had AF, 5672 (20.3%) had an infection and 670 (2.4%) had both infection and AF, of which 230 (34.3%) had no previous AF diagnosis or AF identified by ECG in the past 10 years (new-onset AF). In these groups, 30-day mortality was 11.3% in patients with infection, 10.4% in patients with AF and 22.6% in patients with new-onset AF and infection. The one-year stroke rate in patients with AF was 61.7/1000 person-years (95% CI, 49.6 to 76.7), 21.2/1000 person-years (95% CI, 17.2 to 26.2) in patients with infection and 62.5/1000 person-years (95% CI, 39.1 to 120.2) in patients with new-onset AF and infection. Among patients with new-onset AF and infection, 42.6% had further AF episodes registered within one year after discharge, compared to 36.4% of patients with new-onset AF without infection.

    Conclusions: Compared to ED patients with lone AF or infection, patients with concomitant new-onset AF and infection show an increased 30-day mortality, one-year stroke rate, and increased risk of further AF episodes.
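The stroke rates above are incidence rates per 1000 person-years. A minimal sketch of the rate and an approximate 95% CI (normal approximation on the log scale; the study may have used an exact method), with purely hypothetical counts:

```python
import math

def rate_per_1000_py(events, person_years):
    """Incidence rate per 1000 person-years with an approximate 95% CI
    (normal approximation on the log scale, SE = 1/sqrt(events);
    requires events > 0)."""
    rate = events / person_years * 1000
    se_log = 1 / math.sqrt(events)
    lower = rate * math.exp(-1.96 * se_log)
    upper = rate * math.exp(1.96 * se_log)
    return rate, lower, upper

# Hypothetical counts: 50 strokes over 810 person-years of follow-up
rate, lo, hi = rate_per_1000_py(50, 810)
```

The log-scale interval is asymmetric around the point estimate, which is consistent with the asymmetric CIs reported in the results.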

    P141 Effect of positive end expiratory pressure on left ventricular contractility

    M Gruebler, O Wigger, S Bloechlinger, D Berger

    Inselspital, University Hospital Bern, Bern, Switzerland

    Introduction: Its afterload-reducing effects make PEEP the treatment of choice for cardiogenic pulmonary edema. Studies indicate that PEEP may lower coronary blood flow, but its effects on left ventricular contractility are unclear. Most surrogate measures of cardiac contractility are afterload-dependent, so contractility assessment under PEEP may be biased. We therefore investigated cardiac contractility under PEEP using the end-systolic pressure volume relationship (ESPVR) as a load-independent measure of contractility.

    Methods: 23 patients scheduled for coronary angiography were ventilated with CPAP and a full face mask at three levels of PEEP (0, 5 and 10 cmH2O) in random order. Structural cardiac pathologies were excluded with echocardiography. At every PEEP level, left ventricular pressure volume loops (Millar conductance catheter with INCA System, Leycom, Netherlands) were obtained. The end-systolic elastance was derived from a PV-loop family under preload reduction with an Amplatzer sizing balloon in the inferior caval vein. All participants gave written informed consent. The study was approved by the Bernese ethics committee.

    Results: 5 women and 18 men with a mean age of 59±6 years were studied. Ejection fraction was 70±8% at baseline. Mean end-systolic elastance at PEEP levels of 0, 5 and 10 cmH2O was 2.64±1.3, 2.56±1.18 and 2.33±0.88 mmHg/mL, respectively (p = 0.318, repeated-measurements ANOVA). dP/dt and ejection fraction did not differ between the PEEP levels (p=0.138 and 0.48).

    Conclusions: Moderate levels of PEEP did not influence end-systolic elastance. Higher PEEP levels and patients in cardiogenic shock should be investigated.

    P142 Biventricular 3D volumetric analysis with transthoracic echocardiography in the critically ill

    S Orde1, M Slama2, N Stanley3, SJ Huang4, AS Mclean4

    1ICU, Sydney, NSW, Australia; 2Amiens University Hospital, Amiens, France; 3Midland Hospital, Perth, Australia; 4Nepean Hospital, Sydney, Australia

    Introduction: We sought to assess the feasibility of 3D volumetric analysis with transthoracic echocardiography in critically ill patients. We chose a cohort typical of the ICU, where accurate volumetric analysis is important: hypoxic, mechanically ventilated patients. 3D analysis is enticing in its simplicity and the wealth of data available. It is accurate in cardiology patients [1] but has not been assessed in the ICU.

    Methods: Patients were imaged within 24 hours of admission. Inclusion criteria: adult, hypoxic (P:F <300), mechanically ventilated, Doppler stroke volume (SV) assessment possible. Echocardiography: Siemens SC2000 real-time volumetric analysis with standard B-mode and Doppler assessment. Images were unacceptable if >2 segments could not be seen in 2 volumetric planes. 3D left ventricle (LV) and right ventricle (RV) analyses were performed with Tomtec Imaging and Siemens Acuson software, respectively, and compared to Doppler-derived SV. A 30% limit of agreement was considered clinically acceptable [2]. Imaging was optimised for volumetric analysis (20-45 vols/sec).

    Results: 92 patients, 83 in sinus rhythm, 9 in AF. No significant difference was seen between Doppler vs 2D Simpson’s biplane, 3D LV or 3D RV SV estimation. Feasibility, SV values and bias are reported in Table 1 and Fig. 1. Limits of agreement for corrected Doppler vs LV 3D SV = -48% to 55%; RV 3D SV = -62.7% to 84.3%.

    Conclusions: 3D LV and RV volumetric analysis is feasible in the majority of patients requiring mechanical ventilation, but lacks agreement with Doppler-derived stroke volume assessment. Although images may appear sufficient, the semi-automated software appears to underestimate stroke volume. Further larger studies using thermodilution are warranted.


    1. Pedrosa J et al. Curr Pharm Des 22:105-21, 2016

    2. Critchley L et al. J Clin Monit Comput 15:85-91, 1999

    Table 1 (abstract P142). Doppler vs 2D and 3D stroke volume assessment
    Fig. 1 (abstract P142).

    Bland-Altman plots for Doppler vs 3D biventricular stroke volume assessment
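The bias, limits of agreement and 30% acceptability threshold used above [2] rest on Bland-Altman analysis with Critchley's percentage error. A minimal sketch, using hypothetical paired stroke-volume values rather than the study's measurements:

```python
import math

def bland_altman(method_a, method_b):
    """Bland-Altman bias, 95% limits of agreement, and Critchley
    percentage error (1.96 * SD of differences / mean of all
    measurements) for two paired measurement methods."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    loa = (bias - 1.96 * sd, bias + 1.96 * sd)
    grand_mean = sum(method_a + method_b) / (2 * n)
    pct_error = 1.96 * sd / grand_mean * 100
    return bias, loa, pct_error

# Hypothetical paired stroke volumes (mL), not the study's measurements
bias, loa, pct_error = bland_altman([60, 65, 70, 75], [58, 68, 69, 80])
```

Under the Critchley criterion, a new method is considered interchangeable with the reference when the percentage error is below 30%; the limits of agreement reported above far exceed this.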

    P143 Determinants of venous return during Trendelenburg position and effect of hepatic vascular waterfall

    S Liu1, P Moller2, A Kohler1, G Beldi1, D Obrist3, A Hana1, D Berger1, J Takala1, S Jakob1

    1Inselspital, Bern University Hospital, University of Bern, Bern, Switzerland; 2Institute of Clinical Sciences at the Sahlgrenska Academy, University of Gothenburg, Sahlgrenska University Hospital, Gothenburg, Sweden; 3University of Bern, Bern, Switzerland

    Introduction: Body position changes such as leg raising are used to determine fluid responsiveness. We hypothesized that the Trendelenburg position increases resistance to venous return. Together with abolishment of the hepatic vascular waterfall, this may limit the increase in regional blood flow.

    Methods: Inferior vena cava (IVC), portal vein (PV), hepatic, superior mesenteric (SMA) and carotid artery blood flows and arterial, right atrial (RA) and hepatic (HV) and portal venous blood pressures were measured in anesthetized and mechanically ventilated pigs in supine and 30° Trendelenburg positions. All hemodynamic parameters were measured during end-expiration at 5 cmH2O PEEP, and at inspiratory hold with increasing airway pressures (AWP) of 15, 20, 25 and 30 cmH2O, respectively. Paired t test was used to compare pressures and flows in different positions during end-expiration. Repeated measures ANOVA was performed to evaluate the effects of AWP on hemodynamic parameters.

    Results: The Trendelenburg position significantly increased RA, HV and PV blood pressures at end-expiration, while Qpv and Qsma remained unchanged, Qha increased, and Qivc showed a trend toward a decrease (Table 1). In both positions, all blood flows decreased with increasing AWP, and the difference between Ppv and Qsma became smaller, indicating splanchnic blood pooling (Table 2). In the Trendelenburg position, splanchnic blood pooling was less severe than in the supine position.

    Conclusions: The Trendelenburg position tended to decrease venous return from the inferior vena cava. Further increases in RAP by augmenting AWP led to a decrease in all flows and signs of an abolished hepatic vascular waterfall. Passive manoeuvres to assess fluid responsiveness evoke complex hemodynamic reactions which are not fully understood.

    Table 1 (abstract P143). Paired t test was used to evaluate the differences between groups
    Table 2 (abstract P143). Repeated measures ANOVA was used to assess the effects of AWP and body position

    P144 Link between bioelectrical impedance analysis derived phase angle and late mortality in cardiac surgery patients

    D Ringaitiene, D Grazulyte, D Zokaityte, L Puodziukaite, V Vicka, J Sipylaite

    Department of Anesthesiology and Intensive Care, Institute of Clinical Medicine, Faculty of Medicine, Vilnius University, Vilnius, Lithuania

    Introduction: The increasing age and frailty of patients undergoing cardiac surgery complicate patient selection. The aim of this study was to determine whether the bioelectrical impedance analysis (BIA) phase angle is linked to long-term outcomes after cardiac surgery and could be used as a predictor.

    Methods: This observational retrospective study included all patients who underwent any STS-defined type of elective cardiac surgery from 2013 to 2014 at Vilnius University Hospital. Patients who died in hospital during the first postoperative month were excluded. BIA was performed prior to surgery; demographic and comorbidity data were gathered in the perioperative period. We evaluated the 3-5 year all-cause mortality rate. Patients were categorized according to the BIA-derived phase angle (PhA) value, standardized for age and gender; long-term predictors were determined by Cox regression analysis.

    Results: In the cohort of 642 patients undergoing cardiac surgery, the median age was 67.8 [59 - 73] years and most patients were men (67.8%). The long-term mortality rate was 12.3% (n=79). Most cases were low risk, with a median EuroSCORE II value of 1.78 [1.07 - 2.49]. The rates of standardized PhA were as follows: <5th percentile 10.4% (n=67), <10th 17.3% (n=111), <15th 22.7% (n=146), <20th 27.6% (n=177), <25th 35.5% (n=228), <30th 42.2% (n=271), <35th 47.4% (n=304), <40th 52% (n=334), <45th 58.6% (n=386). Cox regression analysis of all percentiles revealed the most potent predictor to be a phase angle below the 25th percentile of the reference range (OR 2.42, 95% CI: 1.49-3.94, p<0.001), with a mean difference in survival of 13.22 months (64.60 vs 51.38, p<0.001). This relation persisted after adjustment for the EuroSCORE II value.

    Conclusions: The BIA-derived phase angle value can be used for long-term survival estimation before cardiac surgery. However, further studies are needed to confirm the independence of this effect.

    P145 Screening ultrasound compression testing in critically ill patients performed by general nurses - validation study

    R Skulec1, A Kohlova2, L Miksova1, J Benes1, V Cerny1

    1Masaryk Hospital Usti nad Labem, JEP University, Usti nad Labem, Czech Republic; 2J.E. Purkinje University, Usti nad Labem, Czech Republic

    Introduction: Despite preventive measures, the incidence of deep venous thrombosis (DVT) in ICU patients is estimated to range from 5-31%. While clinical diagnosis is unreliable, the ultrasound compression test (UCT) has proven to be a highly sensitive and specific modality for the recognition of lower extremity DVT [1]. Delegating this competence to ICU nurses could increase UCT availability and enable preventive DVT screening. We therefore conducted a clinical study to evaluate the sensitivity and specificity of UCT performed by general ICU nurses compared with investigation by an ICU physician certified in ultrasound.

    Methods: Prior to the study, each participating nurse completed one hour of training in UCT and examined 5 patients under supervision. ICU patients without known DVT then underwent UCT of the femoral and popliteal regions of both lower extremities, performed by a trained general ICU nurse. On the same day, the examination was repeated by an ICU physician. Each investigator was blinded to the other's findings until both tests were performed. In case of a positive test, the nurse immediately reported the result to the ICU physician. The sensitivity and specificity of the test performed by the general nurse were calculated against the specialist's examination.

    Results: A total of 80 patients were examined, with both lower extremities examined in all patients. A DVT prevalence of 11.25% was found. The examination performed by the general nurse had an overall sensitivity of 90.0% and specificity of 100%, with a negative predictive value of 98.61%, a positive predictive value of 100% and an accuracy of 98.77%.
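    The reported metrics all follow from a standard 2×2 confusion matrix. As an illustration only — the abstract does not report the underlying true/false positive counts, so the numbers below are hypothetical and chosen merely to approximate the reported prevalence and sensitivity — the calculations can be sketched as:

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Standard diagnostic accuracy measures from a 2x2 confusion matrix."""
    return {
        "sensitivity": tp / (tp + fn),          # true positive rate
        "specificity": tn / (tn + fp),          # true negative rate
        "ppv": tp / (tp + fp),                  # positive predictive value
        "npv": tn / (tn + fn),                  # negative predictive value
        "accuracy": (tp + tn) / (tp + fp + tn + fn),
    }

# Hypothetical counts for an 80-subject screen: 10 true DVT cases,
# of which the nurse detects 9, with no false positives.
m = diagnostic_metrics(tp=9, fp=0, tn=70, fn=1)
```

With these invented counts, sensitivity is 9/10 = 90% and specificity 70/70 = 100%; the slight differences from the reported NPV and accuracy suggest the study used a different (e.g. per-limb) denominator.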

    Conclusions: The results of our study have shown that general ICU nurses are able to perform bedside screening of DVT by compression ultrasound test with a high degree of reliability after a brief training.


    1. Minet C et al. Crit Care 19:287, 2015.

    P146 Effect of a cytoadsorbent device on coagulation factors during cardio-pulmonary bypass

    E Poli, L Alberio, A Bauer-Doerries, C Marcucci, A Roumy, M Kirsch, E De Stefano, C Gerschheimer, L Liaudet, A Schneider

    Centre Hospitalier Universitaire Vaudois (CHUV), Lausanne, Switzerland

    Introduction: Cytokine hemoadsorption (HA) might improve outcomes of patients undergoing cardiac surgery with cardiopulmonary bypass (CPB). However, the effect of HA on coagulation factors remains unknown. This substudy, nested within a randomized controlled trial comparing an HA device with standard of care (NCT02775123), aims at evaluating the effect of a cytoadsorbent device on coagulation factor activity.

    Methods: A Cytosorb® (Cytosorbents, New Jersey, USA) HA device was inserted within the CPB circuit in ten patients undergoing elective cardiac surgery. One hour after CPB onset, the activity of coagulation factors (antithrombin (AT), von Willebrand factor (vWF), and factors II, V, VIII, IX, XI and XII) was measured before and after the device. Pre- and post-device measurements were compared using Student's t-test; a p value <0.05 was considered statistically significant.

    Results: Patients’ mean age was 60.6 ± 21.4 years, 20% were female, and the mean EuroSCORE II was 6.2 ± 8.1. Procedures were: coronary artery bypass graft (CABG) (2/10), aortic root replacement (6/10) and CABG combined with aortic valve replacement (2/10). Mean CPB duration was 161.8 ± 52.3 min. Pre- and post-HA measurements of coagulation factor activity are presented in Fig. 1. Post-device AT and FII activity was significantly lower than pre-device (from 70.4 to 66.6, p=0.01, and from 61.5 to 57.1, p=0.03, respectively). There was no statistically significant difference between pre- and post-HA measurements for any other coagulation parameter.

    Conclusions: Pre- and post-HA Cytosorb® measurements of coagulation factor activity did not differ except for a small decrease in AT and FII activity. This might be related to intra-device consumption or adsorption. Further analyses accounting for CPB fluid balance, the entire study population and all timepoints are pending.

    Fig. 1 (abstract P146).

    Comparison of mean activity levels of coagulation parameter factors pre and post device. Error bars correspond to standard deviation

    P147 Changes in microvascular perfusion during blood purification with cytosorb in septic shock

    S Zuccari, V Monaldi, S Vannicola, R Castagnani, A Damia Paciarini, A Donati

    Università Politecnica delle Marche, Ancona, Italy

    Introduction: The aim of this study is to evaluate changes in hemodynamics and microvascular perfusion during extracorporeal blood purification with Cytosorb in patients with septic shock requiring renal replacement therapy.

    Methods: Eight adult patients with septic shock requiring continuous renal replacement therapy for acute renal failure were enrolled and underwent a 24-hour treatment with the hemoadsorption cartridge Cytosorb. Measurements were taken at baseline before starting Cytosorb, after 6h (t1) and 24h (t2), and included: blood gases, macro-hemodynamic parameters (PiCCO2), vasopressor and inotropic dose, plasma levels of cytokines (interleukin [IL]-1, IL-6, IL-8, IL-10, tumor necrosis factor alpha) and parameters of microvascular density and perfusion (sublingual sidestream dark field videomicroscopy). Procalcitonin was measured at baseline and after 24h of treatment.

    Results: A non-significant decrease in plasma levels of cytokines was observed over time. Hemodynamic parameters and vasopressor requirement remained stable. The microvascular flow index increased significantly at t2, total vessel density and perfused vessel density increased at t1 and t2 (Figs. 1 and 2).

    Conclusions: In patients with septic shock requiring continuous renal replacement therapy for acute renal failure, blood purification with Cytosorb was associated with an improvement in sublingual microvascular perfusion.

    Fig. 1 (abstract P147).

    Microvascular Flow Index

    Fig. 2 (abstract P147).

    Perfused Vessel Density

    P148 Renal replacement therapy with the oXiris filter decreases inflammatory mediators and improves cardiorenal function in septic patients better than CVVHDF: a cohort study and a propensity-matched analysis

    F Turani1, S Busatti1, S Martini1, M Falco2, F Gargano2, R Barchetta2, L Weltert2, F Leonardis3

    1Aurelia Hospital, Rome, Italy; 2European Hospital, Rome, Italy; 3University of Tor Vergata, Rome, Italy

    Introduction: Renal replacement therapy (RRT) with the oXiris filter is used in sepsis/septic shock with AKI, but few clinical studies compare the adsorbing effect of the oXiris filter on inflammatory mediators with that of standard RRT.

    The aims of this study were: (1) to confirm whether oXiris decreases cytokines and procalcitonin in sepsis/septic shock; (2) to determine whether this effect is superior to that of standard RRT; and (3) to determine whether this translates into a better cardiorenal response.

    Methods: A cohort study and a propensity-matched analysis included 73 patients admitted to three intensive care units (Aurelia Hospital, European Hospital, Tor Vergata - Rome) with a diagnosis of septic shock. 50 patients underwent RRT with the oXiris filter and 23 patients standard RRT. IL-6, procalcitonin, cardiorenal indices and SOFA score were compared before (T0) and at the end of treatment (T1). All data are expressed as mean±SD. One-way ANOVA was used to compare changes in the variables over time. P<0.05 was considered statistically significant.

    Results: Of the 50 patients submitted to RRT with the oXiris filter, 32 could be matched to 22 septic patients who received standard RRT. IL-6 and procalcitonin decreased in the oXiris group (p<0.01) but not in the RRT group. MAP increased (p<0.01) and noradrenaline dosage decreased (p<0.01) in the oXiris group, but not in the RRT group. PaO2/FiO2 ratio, diuresis and SOFA score also improved only in the oXiris group (p<0.05).

    Conclusions: In sepsis/septic shock patients with AKI, IL-6 and procalcitonin decreased more in the oXiris group than in the RRT group. This was associated with an improvement in cardiorenal function and clinical condition. The study confirms that RRT with the oXiris filter may be useful in sepsis/septic shock when other convective/diffusive techniques fail.


    1. J Hepatol 63:634-642, 2015.

    P149 ADVOS reduces liver and kidney disease markers and corrects acidosis: the Hamburg experience

    VH Fuhrmann, D Jarczak, O Boenisch, S Kluge

    University Medical Center Hamburg-Eppendorf, Hamburg, Germany

    Introduction: ADVOS (Hepa Wash GmbH, Munich, Germany) is a recently developed CE-certified albumin-based hemodialysis procedure for the treatment of critically ill patients. In addition to the removal of water-soluble and albumin-bound substances, acid-base imbalances can be corrected thanks to an automatically regulated dialysate pH ranging from 7.2 to 9.5.

    Methods: Patients treated with the ADVOS procedure in the Department of Intensive Care Medicine of the University Medical Center Hamburg-Eppendorf were retrospectively analyzed. Overall, 102 treatments in 34 critically ill patients (mean SOFA score 16) were evaluated. Additionally, subgroup analyses for hyperbilirubinemia, respiratory acidosis and non-respiratory acidosis were conducted.

    Results: Severe hyperbilirubinemia (>6 mg/dl) was present in 60 treatments, while 26 and 14 treatments were performed to treat respiratory (PaCO2>45 mmHg) and non-respiratory (PaCO2<45 mmHg) acidosis (pH<7.35), respectively. Mean treatment duration was 16 h.

    The ADVOS procedure corrected acidosis and significantly reduced bilirubin, BUN and creatinine levels. The subgroup analysis showed an average bilirubin reduction of 21% per ADVOS multi treatment in the hyperbilirubinemia group (15.24 mg/dL vs 11.77 mg/dL, p<0.05). Moreover, pH (7.23 vs. 7.35, p<0.001) and PaCO2 (65.88 vs. 53.61 mmHg, p<0.001) were corrected in the respiratory acidosis group, while in the non-respiratory acidosis group an improvement in pH (7.19 vs. 7.37, p<0.001), HCO3 (15.21 vs. 20.48, p=0.002) and base excess (-12.69 vs. -5.10, p=0.004) was observed.

    There were no treatment-related adverse events during therapy.

    Conclusions: ADVOS is a safe and effective hemodialysis procedure, which is able to remove water soluble and protein bound markers and correct severe acidosis in critically ill patients.

    P150 Score for timely prescribing (STOP) renal replacement therapy in the intensive care unit - preliminary study of a mnemonic approach

    E Leal Bastos de Moura, A Alves de Sousa, F Ferreira Amorim, J Rodrigues Trindade Jr, M De Oliveira Maia

    Hospital Santa Luzia Rede D’Or São Luiz, Brasília, Brazil

    Introduction: The timing of initiation of renal replacement therapy (RRT) in critically ill patients remains a matter of debate, with no objective criteria to indicate it. The objective of this study was to propose a score to help identify the ideal time for the initiation of RRT, and to determine whether this score correlates with intensive care unit (ICU) length of stay and mortality.

    Methods: Patients admitted to the intensive care unit, >18 years old, for whom RRT was indicated by the intensivist, were included. The study protocol was approved by the Hospital das Forças Armadas Ethical Committee, and written informed consent was obtained from all patients. The STOP score was assigned according to the presence or absence of each of its items (Fig. 1). Patients were classified into groups A and B according to Fig. 2, and group change was recorded.

    Results: 80 patients were admitted to the ICU in the period; 2 were excluded for limitation of therapeutic efforts. 78 entered the study, with a mean age of 75.2 years; 64.1% were male (n=50). Distribution among the groups: A1 (n=1, 1.2%), A2 (31, 39.7%), A3 (5, 6.4%), B1 (6, 7.6%), B2 (35, 44.8%) and B3 (no patients). There was a statistically significant correlation between group change and mortality (p=0.02), and between the STOP score and nephrologist agreement (p=0.01). There was no correlation between the STOP value and ICU LOS (p=0.75) or between STOP and mortality (p=0.8).

    Conclusions: In the study population, the STOP value correlated with agreement on hemodialysis indication between intensivists and nephrologists, but not with ICU LOS or mortality. Group change was correlated with increased mortality. The significance of STOP as a tool for determining the moment of initiation of renal replacement therapy remains a work in progress.

    Fig. 1 (abstract P150).

    STOP items description

    Fig. 2 (abstract P150).

    Description of the study

    P151 Intraoperative continuous renal replacement therapy in liver transplantation: a pilot randomized controlled trial (INCEPTION study)

    C Karvellas, S Taylor, T Ozelzel, E Bishop, D Cave, D Bigam, S Bagshaw

    University of Alberta, Edmonton, Canada

    Introduction: Liver transplant (LT) in patients with renal dysfunction presents intraoperative challenges and portends postoperative morbidity. Continuous renal replacement therapy (CRRT) is increasingly used for intraoperative support; however, there is a paucity of data to support this practice.

    Methods: Pilot randomized open-label controlled trial in adults receiving cadaveric LT with a Model for End-Stage Liver Disease (MELD) score >=25 and preoperative acute kidney injury (KDIGO stage 1) and/or estimated glomerular filtration rate <60 mL/min/1.73m2. Patients were randomized to intraoperative CRRT (iCRRT) or standard of care. Primary endpoints were feasibility and adverse events. Secondary endpoints were changes in intraoperative fluid balance, complications, and hospital mortality. Analysis was intention-to-treat.

    Results: Sixty patients were enrolled, of whom 32 (53%) were randomized (17 to iCRRT; 15 to control). Mean (SD) age was 49 (13) years and MELD was 36 (8); 75% (n=24) had cirrhosis, 63% (n=20) received preoperative RRT, and 66% (n=21) were transplanted from the ICU. One patient allocated to iCRRT did not receive LT. Seven (41%) allocated to control crossed over intraoperatively to iCRRT (high central venous pressure [n=4]; abdominal distension [n=1]; massive transfusion [n=1]; hyperkalemia [n=1]). No adverse events occurred. Operating time was similar (513 [140] vs. 463 [115] min, p=0.30). CRRT duration was 379 (137) min, with only 3 interruptions (all due to access). iCRRT fluid removal was 2.8L (range 0-14.5). Fluid balance was 5.3L (2.9) for iCRRT vs. 4.3L (6.1) for control (p=0.57). Postoperative CRRT use was similar (77% vs. 50%, p=0.25). There were no differences in reexploration (p=0.36), mechanical ventilation time (p=0.87), reintubation (p=0.18), sepsis (p=0.56), or mortality (p=0.16).

    Conclusions: In this pilot trial of high acuity LT patients, iCRRT was feasible and safe. These data will inform the design of a large trial to define the role of iCRRT during LT.

    Trial registration: NCT01575015.

    P152 The uptake of citrate anticoagulation for continuous renal replacement therapy in intensive care units across the United Kingdom

    A Soni, M Borthwick

    Oxford University Hospitals NHS Foundation Trust, Oxford, UK

    Introduction: The purpose of this descriptive study is to report the trend of citrate anticoagulation uptake, used for continuous renal replacement therapy (CRRT), in intensive care units (ICUs) across the United Kingdom (UK). Citrate anticoagulation has been used in the UK since 2008, but its uptake since then is unknown [1].

    Methods: A survey questionnaire targeted pharmacists working in UK adult ICUs providing CRRT. Invitations to participate were distributed utilising the United Kingdom Clinical Pharmacy Association online forum as a platform for access. Survey administration was by self-completion and submissions were accessible over a total of six weeks. Basic demographic data, ICU specifications, the citrate system in use and implementation details were sought. A descriptive statistical analysis ensued.

    Results: 70 responses were received, of which 67 were analysed after removal of duplicates. 45 trusts in the UK, encompassing a total of 67 units, confirmed use of citrate anticoagulation for CRRT. Units reported a mean of 71 days to implement a citrate system (range 0 to 645 days). Prismaflex® (Baxter) and Multifiltrate (Fresenius) were the most commonly used citrate systems, reported by 32 (47.8%) and 28 (41.8%) units respectively.

    Conclusions: There are 279 ICUs in the UK [2]. We conclude that a minimum of 67 units (24%) use citrate anticoagulation for CRRT in UK critical care centres. Citrate anticoagulation is becoming an increasingly popular choice for regional anticoagulation, in line with international guidance [3]. These guidelines were introduced in 2012, which corresponds with the increased national uptake.


    1. Borg R et al. Intensive Care Soc 18:184-192, 2017.

    2. Borthwick M et al. Int Pharm Pract (in press), 2017.

    3. Kidney Disease: Improving Global Outcomes Acute Kidney Injury Working Group. Kidney Int Suppl 2:1-138, 2012.

    P153 Is vasopressor/inotrope requirement affected by ultrafiltrate volume and 24-hour fluid balance in intensive care patients undergoing acute renal replacement therapy?

    A Lal1, M Shaw2, A Puxty1

    1Glasgow Royal Infirmary, Glasgow, UK; 2University of Glasgow, Glasgow, UK

    Introduction: Patients requiring renal replacement therapy (RRT) whilst on significant doses of vasoactive medications have often been deemed unsuitable for ultrafiltration (UF). However, with a better understanding of the pathophysiology of renal injury [1] in intensive care patients, we hypothesised that vasopressor/inotrope requirement would not significantly increase with UF or with a more negative fluid balance (FB).

    Methods: Data were retrospectively collected on adult patients requiring acute RRT for acute kidney injury in a general ICU/HDU. Patients on chronic dialysis were excluded. Percentage change in vasopressor index and mean arterial pressure were combined to form the Combined Percentage Change (CPC), which we used as an index of patient stability.

    Results: 38 patients were assessed, undergoing a total of 206 RRT sessions. The mean age was 57 years, with 23 females and 15 males. Mean FB for the 24 hours from the start of RRT was +651 ml (range -2317 to +14850 ml). Using a model correcting for significant covariates and plotting 24-hour FB against CPC, we found no significant effect of FB on stability, p=0.98 (Fig. 1). Mean UF volume was 880 ml (range 0-3009 ml). There was a non-linear relationship between UF and stability, with moderate volumes improving but larger volumes worsening stability (Fig. 2). This did not reach statistical significance (p=0.074), so it may be due to chance, but is more likely due to a lack of power.

    Conclusions: Fluid balance had no effect on cardiovascular stability during RRT in our cohort, but there may be a varying effect of UF depending on volume.


    1. Perner A et al. Intensive Care Med 43:807-815, 2017.

    Fig. 1 (abstract P153).

    See text for description

    Fig. 2 (abstract P153).

    See text for description

    P154 Hemostasis parameter changes induced by filter change in infants on continuous renal replacement therapy

    A Akcan Arikan, PR Srivaths, N Tufan Pekkucuksen, JR Angelo, TM Mottes, MC Braun

    Baylor College of Medicine, Houston, TX, USA

    Introduction: Exposure of blood to a foreign surface such as a continuous renal replacement therapy (CRRT) filter could lead to activation of platelets (plt) and trapping of fibrinogen (fib). Thrombocytopenia has been reported in adults on CRRT, but data in pediatrics are scarce. Our institution uses regional citrate anticoagulation (RCA) as standard of care, with prefilter hemodilution and HF1000 filters (polysulfone, surface area (SA) 1.1 m2) regardless of patients’ (pts) age and size. As filter SA is relatively larger in younger pts, we aimed to investigate the impact of CRRT filter change on hemostasis parameters in infants on CRRT over up to the first three filter changes.

    Methods: Retrospective chart review.

    Results: 30 patients <10 kg were included (age 4.3 (0.5-8) months, weight 5.4±2.4 kg), with 88 filters. Metabolic disease was the most common principal diagnosis (7/30, 23%); liver failure (LF) was the most common comorbidity (12/30, 40%). All patients received prefilter continuous venovenous hemodiafiltration with a minimum dose of 2000 ml/1.73m2/h. Thrombocytopenia was common at CRRT start (28/30, 93%). Plt decreased in 74% of filter changes (65/88), by 15±70% (pre vs post plt 71 (44-111) vs 50 (30-83), p<0.001). Fibrinogen also decreased, from 201 (152-261) to 170 (134-210), p<0.001; there was no change in PTT, PT or INR values before and after filter changes. Bleeding events were seen in 13/30 (43%) of pts (8/12 of LF pts vs 5/18 others, p=0.04), but were not more common in pts who had a decrease in plt or fib with filter changes (41% with drop in plt vs 57% without, p=0.66; 47% with drop in fib vs 75% without, p=0.58).

    Conclusions: Thrombocytopenia is common in infants on CRRT. Further decreases in plt and fibrinogen can be seen with CRRT filter changes if the filters are relatively large compared to patient size. Bleeding events seem more related to underlying comorbidity, and less to the changes in hemostasis parameters observed with filter change, but this would need to be confirmed in further studies.

    P155 Intensive monitoring of post filter ionized calcium concentrations during CVVHD with regional citrate anticoagulation: is it still required?

    D Khadzhynov, F Halleck, O Staeck, L Lehner, T Slowinski

    Charite Universitaetsmedizin, Berlin, Germany

    Introduction: The aim of the present study was to evaluate the role of postfilter calcium concentrations (pfCa) in terms of safety and efficacy in large retrospective cohort of patients treated with CVVHD and regional citrate anticoagulation.

    Methods: Retrospective, observational study at a university hospital with 6 ICUs. All patients treated with RCA-CRRT were included in the study.

    Results: Among 1070 patients treated with RCA-CVVHD, pfCa at the start of CVVHD was available in 987 pts. The pfCa concentration was in the target range (0.25-0.35 mmol/L) in the majority of patients (70%), whereas 17% and 13% of patients had a pfCa below or above the target range, respectively. Over the following 72h of CVVHD treatment, the proportion of patients with targeted pfCa increased to 86% and remained stable. At the start of RCA-CVVHD there was a significant but weak correlation between pfCa and systemic ionized Ca (iCa), with a Spearman rank-order correlation coefficient (rho) of 0.374 (p<0.001). The coefficient of variation of pfCa concentrations was significantly higher than that of iCa concentrations. Using per-protocol adaptations, the incidence of severe hypocalcemia (<0.9 mmol/L) was low and present only during the first 12 hours of therapy: 4% and 2% of patients with pfCa below the target range and 0.7% and 0.4% of patients with pfCa in the target range, at 0h and 12h respectively (p<0.001). There was no correlation between pfCa concentrations and filter lifetime.

    Conclusions: The results of the present study support previous reports of higher measurement variation in pfCa compared to systemic iCa [1]. Nevertheless, given the weak correlation of iCa and pfCa as well as the low number of patients with a severe metabolic complication, our results question the necessity of intensive pfCa monitoring during RCA-CRRT. These results need to be validated in further trials.


    1. Schwarzer P et al. Crit Care 19:321, 2015.



    P157 Association of pain in the critically ill patient with acute kidney injury risk in the intensive care unit (ICU)

    JM Vieira Junior, LB Herranz, LC Pontes de Azevedo, I Castro

    Hospital Sírio Libanês, Sao Paulo, Brazil

    Introduction: In critically ill patients, pain is frequent and usually correlates with worse outcomes, such as prolonged ICU length of stay (LOS) and mechanical ventilation. Pain leads to sympathetic activation and release of inflammatory mediators, and therefore potentially to organ dysfunction. The aim of this study was to evaluate the association between acute pain in critically ill patients and acute kidney injury (AKI).

    Methods: Retrospective cohort of 6345 adult patients admitted between June 2013 and June 2016 to the ICU of Hospital Sírio-Libanês in Sao Paulo (Brazil). Main exclusion criteria were: length of stay <48h, coma and previous AKI. The predictor, pain, was obtained from daily electronic records according to a numerical verbal scale (0-10). The outcome was defined as a serum creatinine elevation equal to or greater than 0.3 mg/dl and/or greater than 50% at any time after the first 48 hours in the ICU. Multivariate analysis was performed by binary logistic regression on distinct groups of early and late predictive factors in relation to AKI.

    Results: After the exclusion of 3220 patients, the incidence of pain with a numerical verbal scale score of 3 points or more was 23.6%. The outcome occurred in 31.7% of the cohort. In the binary regression using the early predictive factors, sex and pain were independently associated with the outcome: adjusted OR 1.24 (1.12-1.36) and 1.63 (1.34-1.98), respectively (p<0.001). In the analysis of late factors, mechanical ventilation over 3 days (OR 4.71, 3.01-7.36), use of a strong opioid (OR 2.70, 1.58-4.60) and C-reactive protein over 5.2 mg/dl (OR 2.27, 1.15-4.47) showed the strongest positive association with AKI (p<0.001).

    Conclusions: Poor management of ICU pain is associated with worse outcomes, including an increased risk of AKI. The search for a better pain management strategy in the ICU should therefore be reinforced.

    P158 Incidence and outcomes of acute kidney injury in 3 large inner city hospitals

    S Channon1, B Girling1, B Trivedi2, S DeFreitas2, C Kirwan2, J Prowle1

    1Queen Mary University, London, UK; 2Barts Health NHS Trust, London, UK

    Introduction: Acute Kidney Injury (AKI) is a common complication in hospitalised patients, strongly associated with adverse outcomes [1]. A lack of baseline incidence and outcome data limits our ability to assess local strategies aimed at improving AKI care.

    Methods: In an audit across three linked inner London hospitals, we interrogated our electronic patient data warehouse (Cerner Millennium PowerInsight electronic data warehouse) with a specially written query to identify cases of AKI, defined by KDIGO creatinine criteria, in patients aged over 18 years admitted for >24h during January to June 2016. We excluded palliative care and obstetric patients. In the absence of a premorbid baseline (median of values 7-365 days pre-admission), the admission creatinine value was used. End-stage renal disease (ESRD) and primary sepsis diagnoses were obtained from ICD-10 coding.
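    The core of such a query is the KDIGO creatinine staging rule. A simplified illustrative reconstruction is sketched below; this is our own sketch, not the audit's actual query — the function name is invented, and the real criteria also apply the 48-hour window for the 0.3 mg/dl rise plus urine-output and RRT criteria, which are omitted here.

```python
def kdigo_stage(baseline, creatinines):
    """Simplified KDIGO creatinine-based AKI staging (mg/dl).
    Compares the peak creatinine in a series against a baseline.
    Ignores the 48-h window, urine output and RRT criteria."""
    peak = max(creatinines)
    ratio = peak / baseline
    if ratio >= 3.0 or peak >= 4.0:   # stage 3: >=3x baseline or >=4.0 mg/dl
        return 3
    if ratio >= 2.0:                  # stage 2: 2.0-2.9x baseline
        return 2
    if ratio >= 1.5 or peak - baseline >= 0.3:  # stage 1
        return 1
    return 0                          # no AKI by creatinine criteria
```

In the audit itself, when no premorbid value existed the admission creatinine served as `baseline`, which is why patients with a single result could not be staged.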

    Results: Of 28872 admissions, we excluded 1052 with pre-existing ESRD (hospital mortality 6.0%) and 8833 with fewer than one creatinine result, who could not be assigned AKI status (mortality 1.1%). Of the remaining 18987, there were 3145 with AKI (16.6%), with mortality increasing from the no-AKI group (2.4%) to AKI stage 1 (12.6%) and further to AKI stages 2-3 (22.4%) (p<0.001) (Table 1). Patients with AKI were older (p<0.001), more likely to be medical than surgical (p<0.001), more likely to have a primary sepsis diagnosis (p<0.001) and had a higher baseline creatinine (median 91 vs 79, p<0.001). No known baseline was found in 29.7% of patients with AKI, but their mortality did not significantly differ from that of those with a baseline (14.2% vs 16.6%, p=0.093).

    Conclusions: An electronic query identified the local burden of AKI and its associated hospital mortality; such baseline data are essential to assess the effect of quality improvement interventions in AKI prevention and care.


    1. Coca SG et al. Am J Kidney Dis 53(6):961-73, 2009

    Table 1 (abstract P158). Incidence and associations of AKI

    P159 Loop diuretics to treat acute kidney injury in critically ill patients: a systematic review and meta-analysis

    K Rosas1, D Gutierrez2

    1Hospital Espanol de Mexico, Ciudad de Mexico, Mexico; 2Hospital Angeles Acoxpa, Ciudad de Mexico, Mexico

    Introduction: Acute kidney injury (AKI) is a common condition in critically ill patients [1, 2]. Loop diuretics are generally used as first-line treatment; however, controlled trials show conflicting results. We therefore conducted a systematic review and meta-analysis on the matter.

    Methods: An electronic search was conducted for randomized clinical trials in adult patients treated with diuretics for AKI compared with standard treatment or a control group. The primary objective of the analysis was to assess recovery of renal function. Secondary endpoints included time to recovery of renal function, need for renal replacement therapy (RRT), mortality in the intensive care unit (ICU) and complications.

    Results: The search yielded 7 studies for the analysis. A total of 853 patients were included, 446 in the intervention group and 407 in the control group. Comparing those treated with a diuretic vs control, the analysis showed a relative risk (RR) of 1.11 for renal recovery (95% CI [0.74 - 1.67], p = 0.62), RR 1.29 for recovery time (95% CI [-3.30 - 0.72], p = 0.21), RR 0.96 for the need for RRT (95% CI [0.75 - 1.23], p = 0.74) and RR 0.80 for mortality in the ICU (95% CI [0.48 - 1.31], p = 0.52) (Fig. 1). The intervention group had an increased risk of complications compared to control (RR 1.83, 95% CI [1.40 - 2.40], p < 0.0001).
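    Pooled relative risks of this kind come from weighting each trial by the precision of its effect estimate. A minimal sketch of fixed-effect inverse-variance pooling on the log-RR scale follows; the abstract does not state which pooling model was used, and the example trial counts below are invented purely for illustration.

```python
import math

def pool_rr(studies):
    """Fixed-effect (inverse-variance) pooling of relative risks.
    Each study is a tuple: (events_treated, n_treated, events_control, n_control).
    Returns (pooled RR, 95% confidence interval)."""
    wsum = wlog = 0.0
    for a, n1, c, n2 in studies:
        log_rr = math.log((a / n1) / (c / n2))
        var = 1 / a - 1 / n1 + 1 / c - 1 / n2  # delta-method variance of log RR
        w = 1 / var                             # inverse-variance weight
        wlog += w * log_rr
        wsum += w
    est = wlog / wsum
    se = math.sqrt(1 / wsum)
    return math.exp(est), (math.exp(est - 1.96 * se), math.exp(est + 1.96 * se))

# Invented example data: two small trials of diuretic vs control
rr, ci = pool_rr([(12, 60, 10, 55), (20, 100, 18, 95)])
```

With heterogeneous trials a random-effects model (e.g. DerSimonian-Laird) would widen the interval; the fixed-effect form above is only the simplest case.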

    Conclusions: The use of loop diuretics to treat AKI showed no difference in recovery of renal function, need for RRT or ICU mortality, but was associated with a higher risk of complications.


    1. Hoste EAJ et al. Intensive Care Med 41(8):1411-1423, 2015

    2. Hoste AJ et al. Crit Care 10(3):R73, 2006

    Fig. 1 (abstract P159).

    Forest plot

    P160 Alterations in portal vein flow and intra-renal venous flow are associated with acute kidney injury after cardiac surgery: a prospective cohort study

    W Beaubien-Souligny1, A Benkreira1, P Robillard1, Y Lamarche1, J Bouchard2, A Denault1

    1Montreal Heart Institute, Montréal, Canada; 2Hôpital Sacré-Coeur, Montréal, Canada

    Introduction: Increased venous pressure is one of the mechanisms leading to acute kidney injury (AKI) after cardiac surgery. Portal flow pulsatility and discontinuous intra-renal venous flow are potential ultrasound markers of the impact of venous hypertension on organs. The main objective of this study was to describe these signs after cardiac surgery and to determine whether they are associated with AKI.

    Methods: This single-center prospective cohort study (NCT02831907) recruited adult patients able to give consent. Ultrasound studies were performed before cardiac surgery and repeated on post-operative days (POD) 0, 1, 2 and 3. Abnormal portal and renal venous flow patterns are defined in Fig. 1. The association between the studied markers and the risk of new-onset AKI in the 24-hour period following an assessment was tested using logistic regression with a 95% confidence interval. Clinical variables associated with detection of the signs were tested using generalized estimating equation models. The study was approved by the local ethics committee.

    Results: During the study period, 145 patients were included. The presence of the studied ultrasound signs is presented in Fig. 2. During the week following cardiac surgery, 49 patients (33.8%) developed AKI, most often on POD 1 (71.4%). The detection of portal flow pulsatility and severe alterations in renal venous flow (Pattern 3) at ICU admission (POD 0) were associated with AKI in the subsequent 24-hour period and were independently associated with AKI in multivariable models including EUROSCORE II and baseline creatinine (Table 1). Detection of abnormal portal and renal patterns was associated with lower perfusion pressure, higher NT-pro-BNP and larger inferior vena cava measurements (Table 2).

    Conclusions: Abnormal portal and intra-renal venous patterns are associated with early AKI after cardiac surgery. These Doppler features must be further studied as potential treatment targets to personalize management.

    Table 1 (abstract P160). Assessment of echographic parameters on post-operative day 0 and the risk of AKI in the subsequent 24 hours period
    Table 2 (abstract P160). Association between the studied echographic markers and clinical parameters after surgery
    Fig. 1 (abstract P160).

    Portal vein and intra-renal venous Doppler assessment using trans-thoracic ultrasound. Probe positioning is shown using the Vimedix simulator (CAE Healthcare, St-Laurent, Canada). Abnormal patterns of intra-renal venous flow can be separated into a) Pattern 2, defined as discontinuous with flow in both systole and diastole, and b) Pattern 3, defined as discontinuous with flow only in diastole.

    Fig. 2 (abstract P160).

    Portal and intra-renal venous flow patterns during the peri-operative period. (PF: pulsatility fraction, POD: post-operative day)

    P161 A clinical prediction model for severe AKI after pediatric cardiac surgery

    I Scharlaeken1, M Flechet2, D Vlasselaers1, L Desmet1, G Van den Berghe1, F Güiza2, G Meyfroidt1

    1University Hospitals Leuven, Leuven, Belgium; 2KU Leuven, Leuven, Belgium

    Introduction: Acute kidney injury (AKI) is very prevalent after cardiac surgery in children, and associated with poor outcomes [1]. The present study is a preplanned sub-analysis of a prospective blinded observational study on the clinical value of the Foresight near-infrared spectroscopy (NIRS) monitor [2]. The purpose of this sub-analysis was to develop a clinical prediction model for severe AKI (sAKI) in the first week of PICU stay.

    Methods: sAKI was defined as serum creatinine (SCr) ≥2 times the baseline, or urine output <0.5 ml/kg/h for ≥12h. Predictive models were built using multivariable logistic regression. Data collected during surgery, upon PICU admission, as well as monitoring and lab data until 6h before sAKI onset, were used as predictors. Relevant predictors with a univariate association with sAKI were included in the models. Accuracy of the models was tested using bootstrapping, by AUROC and decision curves.

    Results: 177 children were enrolled, admitted to the PICU of the Leuven University Hospitals after cardiac surgery, between October 2012 and November 2015. 5 patients were excluded. 70 children (40.7%) developed sAKI in the first week of PICU stay. A multivariate model with 5 admission parameters (maximum lactate during surgery, duration of CPB, baseline sCr, RACHS1 and PIM2 scores), and 4 postoperative measurements (average heart rate, average blood pressure, hemoglobin, lactate), was most predictive for sAKI (Fig. 1).

    Conclusions: The risk of sAKI in children after congenital cardiac surgery could be predicted with high accuracy. Future models will also include medication data. These models will be compared against and combined with NIRS oximetry data to investigate the independent and added predictive value of the Foresight monitor.


    1. Kaddourah A et al. N Engl J Med 376:11-20, 2017

    2. Identifier: NCT01706497

    Fig. 1 (abstract P161).

    Performance of the multivariate LR model for sAKI

    P162 Renal perfusion, function and oxygenation in early clinical septic shock

    J Skytte Larsson, V Krumbholz, A Enskog, G Bragadottir, B Redfors, S Ricksten

    Institution for clinical sciences, Göteborg, Sweden

    Introduction: Acute kidney injury (AKI) occurs in over 50% of patients in the intensive care unit (ICU). The predominant etiology of AKI is septic shock, the most common diagnosis in the ICU. AKI significantly increases the risk of both morbidity and mortality [1].

    Methods: 8 ICU patients with septic shock were studied within 24 hrs of admission. 58 patients after cardiac surgery served as the control group. All patients were sedated and mechanically ventilated. Renal blood flow (RBF) and glomerular filtration rate (GFR) were obtained by the infusion clearance of para-aminohippuric acid (PAH) and by extraction of 51Cr-ethylenediaminetetraacetic acid (51Cr-EDTA). N-acetyl-β-D-glucosaminidase (NAG) was also measured.

    Results: RBF was 19% lower, renal vascular resistance 19% higher and the ratio of RBF to cardiac index 29% lower in patients with septic shock compared to the control group. GFR (32%, p=0.006) and renal oxygen delivery (RDO2) (24%) were both significantly lower in the study group (Table 1). There was no difference between the groups in renal oxygen consumption (RVO2), but renal oxygen delivery was almost 30% lower in septic shock patients. Renal oxygen extraction was significantly higher in the study group than in the control group. In the study group, NAG was 5.4 ± 3.4 units/μmol creatinine higher, i.e. 5 times the value in patients undergoing cardiac surgery [2].

    Conclusions: Sepsis-related AKI is caused by renal afferent vasoconstriction resulting in a reduced RBF and lowered RDO2. In combination with an unchanged RVO2, this results in a renal oxygen supply/demand mismatch.


    1. Hoste EA, et al. Intensive Care Med 41(8):1411-23, 2015

    2. Lannemyr L et al. Anesthesiology 126(2):205-213, 2017

    Table 1 (abstract P162). Renal variables in early clinical septic shock

    P163 Utility of daily creatine kinase measurement within adult intensive care

    P Henderson1, J Adams2, A Blunsum1, M Casey3, N Killeen1, S Linnen1, J McKechnie3, A Puxty1

    1Glasgow Royal Infirmary, Glasgow, UK; 2Royal Alexandra Hospital, Paisley, UK; 3Queen Elizabeth University Hospital, Glasgow, UK

    Introduction: The primary aim was to determine if the addition of daily creatine kinase (CK) measurement was usefully guiding decision making in intensive care units within Greater Glasgow and Clyde.

    Methods: After a change to the daily blood ordering schedule to include CK, a retrospective audit was carried out covering a 5-month period within 3 intensive care units. All patients with CK >870 units/litre were included. Basic demographics, APACHE 2 score and admitting diagnosis were recorded. Utility of CK was assessed by determining the associated diagnosis and whether the diagnosis was first considered (diagnostic trigger) due to the CK level, clinical suspicion or haematuria. Additionally, it was determined whether, and what, actions had been taken based on the raised CK and associated diagnoses.

    Results: Data were collected from 01/08/2016 to 31/12/2016. 276 patients were captured with CK >870 units/litre from an average combined admission rate of 200 patients/month [1]. 191 patients (69.2%) were male and 85 (30.8%) female. Age ranged from 17 to 95 years (mean 54.7). APACHE 2 scores ranged from 0 to 45 (mean 20.9) with an estimated mean mortality of 36.7%. 176 patients (63.8%) had diagnoses associated with elevated CK, including: burns 2 (0.7%), compartment syndrome 7 (2.5%), myocardial infarction 20 (7.2%), myositis/myocarditis 2 (0.7%), neuroleptic malignant syndrome 1 (0.4%), rhabdomyolysis 61 (22.1%), serotonin syndrome 7 (2.5%) and surgical procedure 76 (27.5%). As outlined in Fig. 1, the diagnostic trigger was the routine CK measurement in 65 patients (23.6%), prior clinical suspicion in 108 (39.1%), haematuria in 1 (0.4%) and unclear in 102 (36.9%). Action was required based on the CK result/associated diagnosis in 95 patients (34.4%), with the specific actions outlined in Table 1.

    Conclusions: With a raised CK resulting in a new diagnosis rate of 23.6% and a change in treatment rate of 34.4%, the introduction of this test has proven useful within this cohort.


    1. Cole S et al. SICSAG Audit of critical care in Scotland 10:30-32, 2017

    Table 1 (abstract P163). Action taken
    Fig. 1 (abstract P163).

    Diagnostic trigger

    P164 Creatinine-based formula (MDRD) versus cystatin c-based formula (simple cystatin c formula) for estimation of GFR in critically ill patients

    R Marinho1, T Neves2, M Santos3, A Marinho1

    1Centro Hospitalar do Porto, Porto, Portugal; 2Instituto de Ciencias Biomedicas Abel Salazar – Universidade do Porto, Porto, Portugal; 3Faculdade de Ciencias da Nutricao e Alimentacao da Universidade do Porto, Porto, Portugal

    Introduction: Plasma or serum creatinine is the most commonly used diagnostic marker for the estimation of glomerular filtration rate (GFR) in clinical routine. Equations to estimate GFR based on serum creatinine have been introduced, and the most validated and widely applied is the MDRD equation. Lately, the low-molecular-weight protein cystatin C was introduced as a GFR estimate (eGFR) superior to creatinine. However, there are conflicting reports regarding the superiority of cystatin C (Cys C) over serum creatinine (Cr), with a few studies suggesting no significant difference. The aim of our study was to compare the MDRD formula against the Simple Cystatin C (Scys) formula for estimation of GFR in critically ill patients.

    Methods: 71 critically ill patients (56.3% women, mean age 66.23±13.77 years, with a mortality rate of 18.3%) were enrolled. In each patient, GFR on the first day of admission to the ICU was calculated using the MDRD and Scys formulas. Statistical analysis was performed using MedCalc software.
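
    For reference, the 4-variable MDRD study equation can be sketched as below. The abstract does not spell out the Simple Cystatin C formula, so the eGFR ≈ 100/CysC form used in the second function is an assumption for illustration only:

```python
def egfr_mdrd(scr_mg_dl, age, female=False, black=False):
    """4-variable MDRD study equation (IDMS-traceable), mL/min/1.73 m^2."""
    egfr = 175.0 * scr_mg_dl ** -1.154 * age ** -0.203
    if female:
        egfr *= 0.742
    if black:
        egfr *= 1.212
    return egfr

def egfr_simple_cystatin(cys_mg_l):
    """Assumed 'simple' cystatin C estimate: eGFR ~ 100 / CysC (mg/L)."""
    return 100.0 / cys_mg_l

# Illustrative inputs using the cohort means (not a per-patient calculation)
print(round(egfr_mdrd(1.39, 66, female=True), 1))
print(round(egfr_simple_cystatin(1.655), 1))
```

    Note that applying a formula to cohort-mean inputs does not reproduce the cohort-mean eGFR, since the equations are non-linear in creatinine and age.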

    Results: The mean serum creatinine was 1.39±0.95 mg/dl and mean GFR (MDRD) was 77.23±44.4 mL/min/1.73m2. The mean serum cystatin C was 1.655±1.06 mg/L and mean GFR (Scys) was 62.23±36.41 mL/min/1.73m2. The correlation coefficient (r value) between GFR calculated by the MDRD method and by the Simple Cystatin C (Scys) formula was 0.716 (P = 0.01).

    Conclusions: The correlation analysis showed that the eGFRs from both formulas could to some extent reflect glomerular function accurately. The GFR (Scys) formula was a quick and accurate method for estimating GFR and may be applied clinically in critically ill patients.

    P165 Perioperative chloride levels and acute kidney injury after liver transplantation: a retrospective observational study

    S Choi1, C Jung1, H Lee1, S Yoo1, H Ryu1, H Jang2

    1Seoul National University College of Medicine, Seoul, South Korea; 2Samsung Medical Center, Sungkyunkwan University College of Medicine, Seoul, South Korea

    Introduction: The risk of developing acute kidney injury (AKI) after liver transplantation in the immediate postoperative period ranges from 17% to 95%. Most studies in critically ill and surgical patients evaluated the link between chloride-rich resuscitation fluids, not serum chloride levels, and the incidence of AKI. The association between preoperative chloride level, or the difference in perioperative chloride levels, and the incidence of postoperative AKI after liver transplantation was evaluated.

    Methods: Adult patients (>=18 years old) who underwent liver transplantation at Seoul National University Hospital between 2004 and 2015 were included in the retrospective analysis. Intraoperative chloride loading was defined as the difference between the preoperative serum chloride level and the immediate postoperative serum chloride level. Postoperative AKI within 7 days of liver transplantation was diagnosed according to the RIFLE criteria. Patients were divided into normochloremia (96-106 mEq/L), hypochloremia (<96 mEq/L), or hyperchloremia (>106 mEq/L) groups according to their preoperative chloride level.
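
    The preoperative grouping rule described above amounts to a simple three-way classifier on serum chloride (cut-offs in mEq/L, as stated in the methods):

```python
def chloride_group(cl_meq_l):
    """Classify preoperative serum chloride into the study's three groups."""
    if cl_meq_l < 96:
        return "hypochloremia"
    if cl_meq_l > 106:
        return "hyperchloremia"
    return "normochloremia"  # 96-106 mEq/L inclusive

# The boundary values 96 and 106 fall in the normochloremia group
print(chloride_group(94), chloride_group(100), chloride_group(110))
```

    Treating the boundary values as normochloremic is an assumption; the abstract gives the range as 96-106 mEq/L without stating whether the endpoints are inclusive.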

    Results: AKI developed in 58.8% (630/1071) of the patients. AKI was more frequent in patients with hyperchloremia (adjusted OR 1.44 [95% CI 1.08-1.90], P=0.01) and hypochloremia (adjusted OR 1.25 [95% CI 1.03-1.53], P=0.03) compared to patients with preoperative normochloremia. MELD scores > 11 and age >56 years were also associated with increased risk of AKI. Intraoperative chloride loading was not a significant risk factor for AKI after liver transplantation.

    Conclusions: Preoperative hyperchloremia and hypochloremia were both associated with an increased risk of developing AKI in the immediate postoperative period after liver transplantation.

    P166 Urinary strong ion difference as an early marker of acute kidney injury in septic patients

    M Cicetti, A Dell’anna, C Dominedò, A Ionescu, C Sonnino, E Tarascio, I Barattucci, SL Cutuli, M Antonelli

    Fondazione Policlinico A. Gemelli Università Cattolica Sacro Cuore, Rome, Italy

    Introduction: Sepsis is a major cause of acute kidney injury (AKI), which is associated with increased morbidity and mortality. An increase in serum creatinine (sCr) is a late marker of AKI. The urinary strong ion difference (SIDu = [Na+]u + [K+]u - [Cl-]u) reflects the kidney’s ability to compensate for blood pH variations. The aim of this study was to evaluate the role of SIDu as an early marker of septic AKI.
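
    The SIDu definition above is simple electrolyte arithmetic; a minimal sketch (the example values are illustrative only, not study data):

```python
def sid_urinary(na_u, k_u, cl_u):
    """Urinary strong ion difference (mEq/L): SIDu = [Na+]u + [K+]u - [Cl-]u."""
    return na_u + k_u - cl_u

# Illustrative spot-urine electrolytes (mEq/L)
print(sid_urinary(na_u=36.5, k_u=40.0, cl_u=51.0))
```

    A low or negative SIDu indicates net urinary chloride excretion exceeding sodium plus potassium, i.e. renal acid excretion.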

    Methods: This prospective, observational, single-center study included adult patients admitted to the ICU with sepsis and preserved kidney function. We excluded patients treated with diuretics before enrollment. Patients were evaluated daily to assess AKI according to KDIGO criteria. We studied the biochemical profile of blood and urine and calculated the plasmatic apparent SID (SIDa), effective SID (SIDe) and SIDu.

    Results: Fifty-five patients (33 men, age 58±17) were included in the analysis from September 2016 to June 2017 (Table 1), 19 (34%) of whom developed AKI. There was no significant difference in SIDu values between patients who developed AKI (AKIyes) and those who did not (AKIno). No association was found between SIDu and sCr, or between SIDu and creatinine clearance (Table 2). Urinary Na+ ([Na+]u) and Cl- ([Cl-]u) were not correlated with their plasmatic values. [Na+]u and [Cl-]u were significantly lower in the AKIyes group (baseline Na+: 36.5 [13-76] vs 91 [51-139]; Cl-: 51 [33-70] vs 110 [66-170]; p=0.01) and these alterations occurred before the onset of derangements in urinary output and sCr (Figs. 1 and 2). [Na+]u and [Cl-]u showed a better AUC in predicting AKI compared to sCr (sCr: 0.54; [Na+]u: 0.76; [Cl-]u: 0.77; p=0.047 and 0.036 respectively).

    Conclusions: SIDu was not found to be a reliable marker of AKI but further studies are needed to evaluate its diagnostic value. Conversely, [Na+]u and [Cl-]u could be simple and inexpensive markers of renal dysfunction to be used in AKI diagnosis and management. Written informed consent was obtained from all patients.

    Table 1 (abstract P166). General characteristics at baseline
    Table 2 (abstract P166). Arterial blood gas analysis at baseline for patients with and without AKI
    Fig. 1 (abstract P166).

    ANOVA urinary Na during the first 7 days since admission

    Fig. 2 (abstract P166).

    ANOVA urinary Cl during the first 7 days since admission

    P167 Urinary electrolytes as early indicators of acute kidney injury after major abdominal surgery

    D Marouli, E Papadakis, P Sirogianni, G Papadopoulos, E Lilitsis, E Pediaditis, A Papaioannou, D Georgopoulos, H Askitopoulou

    University Hospital Heraklion, Heraklion, Greece

    Introduction: Perioperative Acute Kidney Injury (AKI) is associated with significant morbidity and mortality [1]. Certain urinary biochemical parameters seem to have a standardized behavior during AKI development and may act as surrogates of decreased glomerular filtration rate (GFR), aiding in early AKI diagnosis [2]. The aim of this prospective observational study was to evaluate urinary biochemical parameters as early indicators of AKI in a cohort of major surgery patients.

    Methods: 68 patients were studied. AKI was defined according to AKIN criteria within 48 hrs after surgery [3]. At pre-defined time points (preoperatively, in the recovery room [RR] and on postoperative days [POD] 1 to 3), simultaneous serum and urine samples were analyzed for urea, creatinine, Na, K and Cl, while fractional excretions of Na (FENa), urea (FEUrea) and K (FEK), urinary strong ion difference (SIDU) and estimated GFR (eGFR) were calculated.
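
    The fractional excretions named above all follow the same clearance-ratio form, FEx (%) = (Ux × PCr) / (Px × UCr) × 100. A minimal sketch (the plasma values in the example are assumed for illustration, not taken from the study):

```python
def fractional_excretion(u_x, p_x, u_cr, p_cr):
    """Fractional excretion of solute x (%): (U_x * P_cr) / (P_x * U_cr) * 100."""
    return (u_x * p_cr) / (p_x * u_cr) * 100.0

# Illustrative FENa: urinary Na 82.7 mEq/l (as reported for the AKI group),
# with assumed plasma Na 140 mEq/l, urine creatinine 100 mg/dl, plasma
# creatinine 1.0 mg/dl
fena = fractional_excretion(u_x=82.7, p_x=140.0, u_cr=100.0, p_cr=1.0)
print(f"FENa = {fena:.2f}%")
```

    The same function serves for FEUrea and FEK by substituting the urea or potassium concentrations; only consistent units within each solute pair are required, since the ratio is dimensionless.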

    Results: 16 patients (23.5%) developed AKI. While there was no difference in preoperative eGFR between AKI and non-AKI patients (75.3±16 vs 83.9±15.2 ml/min/m2, p=0.09), RR eGFR was already lower in AKI patients (69.5±18.7 vs 85.7±15.6 ml/min/m2, p=0.001). This was accompanied by significantly lower NaU (82.7±26.8 vs 108.1±41.9 mEq/l, p=0.002) and ClU (94.7±32.9 vs 114.5±33.4 mEq/l, p=0.041) values, as well as significantly higher FEK (62.5±41.5 vs 24.8±16.5%, p=0.002). FENa and FEUrea differed significantly between the two groups on POD 1, whereas SIDU did not differ.

    Conclusions: In a general surgery population low NaU and ClU values, as well as high FEK values were already evident immediately after surgery, probably representing GFR impairment preceding formal AKI diagnosis. Additional studies must confirm these findings and reevaluate these simple parameters as potential AKI monitoring tools.


    1. Hobson C et al. Ann Surg 261:1207-14, 2015

    2. Maciel AT et al. Renal Failure 38(10):1607-15, 2016

    3. Acute Kidney Injury Work Group: Kidney Int; 2:1-138, 2012

    P168 Urinary liver-type fatty acid-binding protein is a novel biomarker for diagnosis of acute kidney injury secondary to sepsis

    T Komuro, T Ota

    Shonan Kamakura General Hospital, Kamakura, Kanagawa, Japan

    Introduction: Acute kidney injury (AKI) is a predictor of poor prognosis in patients with sepsis and septic shock. Several diagnostic criteria for AKI are used in clinical settings, but no useful biomarker is yet established. Urinary liver-type fatty acid-binding protein (L-FABP) is associated with kidney function and AKI [1], but its role in AKI secondary to sepsis has not yet been established. We therefore conducted a study of the association between urinary L-FABP and AKI secondary to sepsis.

    Methods: From May 2017 to October 2017, we enrolled adult sepsis patients admitted to our intensive care unit (ICU). Patients were diagnosed using the Sepsis-3 definition [2]. Kidney Disease: Improving Global Outcomes (KDIGO) criteria were used for diagnosis of AKI. L-FABP was measured when patients were admitted to our ICU. Sensitivity and specificity of L-FABP for diagnosis of AKI were assessed by the AUROC curve.

    Results: Ninety-five patients participated in this study. The Sequential Organ Failure Assessment (SOFA) score was 7 (median, IQR: 5-10). Fifty-seven (60%) patients were diagnosed with AKI by KDIGO criteria. The serum creatinine level of AKI patients was 1.65 mg/dl (median, IQR: 1.40-2.53). The urinary L-FABP level of AKI patients was 109.23 μg/g Cr (median, IQR: 27.58-671.33). Urine output was 1147.5 ml (median, IQR: 725.25-1747.5). The estimated sensitivity of the urinary L-FABP level for diagnosing AKI was 81.1% and specificity was 53.4%. The AUROC was 0.705 (95% CI: 0.6-0.811) (Fig. 1). The cut-off value of L-FABP was 95.71 μg/g Cr.
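
    An AUROC and cut-off of this kind can in principle be reproduced from the raw marker values. A minimal sketch using the Mann-Whitney formulation of the AUROC and Youden's J for the cut-off (the data below are synthetic, not the study's):

```python
import random

def auroc(pos, neg):
    """AUROC as the probability a positive scores above a negative (ties = 0.5)."""
    wins = 0.0
    for p in pos:
        for n in neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos) * len(neg))

def best_cutoff(pos, neg):
    """Cut-off maximising Youden's J = sensitivity + specificity - 1."""
    best_j, best_c = -1.0, None
    for c in sorted(set(pos) | set(neg)):
        sens = sum(p >= c for p in pos) / len(pos)
        spec = sum(n < c for n in neg) / len(neg)
        j = sens + spec - 1.0
        if j > best_j:
            best_j, best_c = j, c
    return best_c, best_j

# Synthetic, right-skewed L-FABP-like values (μg/g Cr), illustration only
random.seed(0)
aki = [random.lognormvariate(4.7, 1.2) for _ in range(57)]
no_aki = [random.lognormvariate(3.0, 1.0) for _ in range(38)]
print(round(auroc(aki, no_aki), 2), round(best_cutoff(aki, no_aki)[0], 1))
```

    The reported sensitivity/specificity pair (81.1%/53.4%) is simply the operating point of the ROC curve at the chosen cut-off.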

    Conclusions: L-FABP may be a novel biomarker for the diagnosis of AKI. Further investigation is needed into the diagnostic value of L-FABP and the usefulness of early intervention for AKI guided by L-FABP.


    1. Susantitaphong P et al. Am J Kidney Dis 61(3):430-9, 2013

    2. Singer M et al. JAMA 23;315(8):801-10, 2016

    Fig. 1 (abstract P168).

    The AUROC curve of L-FABP for diagnosis of AKI

    P169 Vitamin D metabolite concentrations in critically ill patients with acute kidney injury

    L Cameron, U Blanco Alonso, A Bociek, A Kelly, G Hampson, M Ostermann

    Guy’s and St Thomas’ NHS Foundation Trust, London, UK

    Introduction: Biotransformation of 25-hydroxyvitamin D to active 1,25(OH)2D occurs primarily in the kidney. Our aim was to explore whether this process was altered in patients with acute kidney injury (AKI).

    Methods: Consecutive patients admitted to critical care at a tertiary hospital were recruited. The AKI group comprised patients with KDIGO stage II or stage III AKI; the non-AKI group were patients requiring cardiovascular or respiratory support, but with no AKI. Vitamin D metabolite concentrations were measured on days 0, 2 and 5. Statistical analysis included comparison between groups at each time point, and longitudinal profiles of vitamin D metabolites.

    Results: Interim analysis of 55 participants (44% of the recruitment target) showed that 1,25(OH)2D concentrations were significantly lower in patients with AKI at day 2 and day 5. Considering longitudinal changes, 25-hydroxyvitamin D profiles were not different between the groups (Fig. 1) but there was a trend towards a longitudinal increase in 1,25(OH)2D in patients without AKI, which was not seen in AKI patients (Fig. 2).

    Conclusions: Interim analysis indicates significant differences in concentrations of 1,25(OH)2D, but not 25(OH)D, in critically ill patients with AKI. Recruitment is ongoing and further results are awaited.

    Fig. 1 (abstract P169).

    Longitudinal changes in 25(OH)D. No significant difference in the longitudinal profiles of patients with AKI and those with no AKI was found.

    Fig. 2 (abstract P169).

    Longitudinal changes in 1,25(OH)2D. A significant difference in the longitudinal profiles of patients with AKI and those with no AKI was found (p=0.047). This effect was attenuated with a statistical correction for unequal variance between groups.

    P170 Effects of fenoldopam on renal function in critically ill patients undergoing a strictly conservative strategy of fluid management

    S Tribuzi, LP Bucci, C Scaramucci, V Vano, A Naccarato, M Mercieri

    Sant’Andrea, Rome, Italy

    Introduction: Acute renal failure affects from 1% to 25% of patients in intensive care units (ICUs) [1] and is associated with excess mortality. Hydration is a useful preventive measure but is often contraindicated in critically ill patients who, on the contrary, often benefit from a strictly conservative strategy of fluid management. Fenoldopam, a selective dopamine-1 receptor agonist, increases renal blood flow and glomerular filtration rate by selectively vasodilating the afferent arteriole of the renal glomerulus. The aim of our study was to compare the renal effects of fenoldopam and placebo in critically ill patients undergoing restrictive fluid management.

    Methods: We enrolled 130 patients admitted to our ICU. Patients were randomized to two study groups: fenoldopam (n=64) and placebo (n=66). Fenoldopam was infused continuously at 0.1 μg/kg/min, with an equivalent volume for placebo, over a period of seven days.

    Creatinine, cystatin C and creatinine clearance were measured daily as markers of renal function. The incidence of AKI according to the RIFLE criteria (Risk, Injury, Failure, Loss, End-Stage kidney disease) was also calculated.

    Results: Patients with a negative fluid balance at the end of the week (~ -5000 ml, p=0.0001) were included in the analysis, 32 in the placebo group and 38 in the fenoldopam group. There were no significant differences in the trends of creatinine, creatinine clearance and cystatin C, or in the incidence of AKI, between the groups during the week of infusion.

    Conclusions: A continuous infusion of fenoldopam at 0.1 μg/kg/min does not improve renal function and does not prevent AKI in critically ill patients undergoing a strictly conservative strategy of fluid management.


    1. Bellomo R et al. Intensive Care Med 27:1685-1688, 2001

    P171 Dysphagia triage protocol as a tool in the intensive care risk management

    CE Bosso1, PC Dutil Ribeiro2, M Valerio2, O Alves de Souza2, A Pireneus Cardoso1, RD Jorge Caetano3, L De Oliveira3, BR Correa3, F Fernandes Lanziani3

    1Instituto do Coração de Presidente Prudente, Presidente Prudente, Brazil; 2UNOESTE, Presidente Prudente, Brazil; 3Santa Casa de Misericórdia de Presidente Prudente, Presidente Prudente, Brazil

    Introduction: This study aims to evaluate the efficacy of a protocol implemented for dysphagia risk factors [1] in hospitalized patients in a CICU (Coronary Intensive Care Unit).

    Methods: Patients hospitalized in the CICU of a medium-sized hospital in Presidente Prudente, SP, Brazil, were screened for dysphagia during the period from January 2016 to September 2017. Patients with at least one risk factor for dysphagia were evaluated by a speech-language pathologist and are the subject of this study. The data were statistically analyzed using EPI INFO software, with two-tailed P < 0.05 considered significant for the multivariate logistic regressions estimated in the sample.

    Results: For this study 1018 patients were selected, of whom 57.41% were male; the mean age was 71.77 ± 10.96 years. A higher incidence of dysphagia was observed among patients who had at least one of the following risk factors: stroke (Odds Ratio 9.58, p<0.001); brain tumor (OR 4.49, p=0.0013); chronic obstructive pulmonary disease (COPD) (OR 3.45, p=0.023); degenerative diseases (OR 16.76, p<0.001); lower level of consciousness (OR 13.62, p<0.001); ataxic respiration (OR 2.24, p<0.001); aspiration pneumonia (OR 7.04, p<0.001); orotracheal intubation >48h (OR 13.35, p<0.001); tracheostomy (OR 12.99, p<0.001); airway secretion (OR 24.91, p<0.001); nasoenteral tube (OR 14.9, p<0.001); gastrostomy (OR 4.58, p=0.030). There was no statistical significance for age >60 years, traumatic brain injury, oropharyngeal surgery or unfavorable dentition. Four factors appeared fewer than 3 times and could not be analyzed (Chagas disease, human immunodeficiency virus (HIV), orofacial burn and excess saliva).

    Conclusions: We conclude that the introduction of the dysphagia triage protocol was effective in identifying dysphagic patients and can be used as an additional tool in intensive care risk management.


    1. Werle RW et al. CoDAS 28: 646-652, 2016.

    P172 Apnea oxygenation: a novel respiratory system model for physiological studies using high-flow nasal cannula oxygen therapy

    V Masy, B Bihin, J Petit

    CHU UCL Namur - Godinne, Yvoir, Belgium

    Introduction: Since the advent of high-flow nasal cannula (HFNC) oxygen therapy, apnea oxygenation has once again been the subject of numerous clinical studies with conflicting results. The physiological bases of this age-old concept, more recently applied to endotracheal intubation, have never been confirmed by current methods. We therefore decided to study the effects of an apnea oxygenation period under HFNC oxygen therapy by means of a novel model of the respiratory system.

    Methods: Firstly, an airway model was built with anatomical, physical and physiological attributes similar to those of a healthy subject (Fig. 1). This system reproduces the physiological evolution of intrapulmonary gases during apnea by progressively increasing CO2 levels after cutting off the previous O2 supply (FIO2 21%). Secondly, the effects of HFNC apnea oxygenation at 50 l/min with an FIO2 of 100% were analyzed by collecting intrapulmonary gas samples at regular intervals (Fig. 2).

    Results: After 1 minute of apnea oxygenation, intrapulmonary oxygen levels remained stable at 21%. After 5 minutes, the oxygen fraction reached 33%, increasing up to 45% at 10 minutes. Regarding CO2 levels, no significant modifications were observed.

    Conclusions: A novel experimental and physiological model of the respiratory system has been developed; it confirms the existence of an alveolar oxygen supply, as well as the lack of a CO2 washout, during HFNC apnea oxygenation. However, these effects are only observed after a delay of about 1.5 to 2 minutes. Therefore, the clinical interest of this technique in reducing apnea-induced desaturation during intubation of a hypoxemic patient in the ICU seems limited without adequate preoxygenation. A combination of both preoxygenation and apnea oxygenation by HFNC most likely explains the positive results observed in other clinical studies.

    Fig. 1 (abstract P172).

    Respiratory system model

    Fig. 2 (abstract P172).

    Evolution of intrapulmonary gases during apnea; HFNC: high-flow nasal cannula

    P173 Effect of 4% nebulized lignocaine versus 2% nebulized lignocaine for awake fibreoptic nasotracheal intubation in maxillofacial injuries in emergency department

    H Abbas, L Kumar

    King George’s Medical University, Lucknow, India

    Introduction: Topical lignocaine is the most commonly used pharmacological agent for anaesthetizing the upper airway during fibreoptic bronchoscopy. We compared the effectiveness of two different concentrations, 2% and 4% lignocaine, in nebulised form for airway anaesthesia during awake fibreoptic nasotracheal intubation, in terms of patient comfort, optimal intubating conditions and intubation time.

    Methods: The Institutional Ethics Committee approved the study and written informed consent was obtained. Patients of either sex, between 18 and 55 years of age, with an anticipated difficult airway and planned for intubation were included in this study. Patients were randomly allocated into two groups (A and B) using the sealed envelope method; patients and observers were blinded by using prefilled syringes of lignocaine. One group was nebulized with 10 ml of 4% lignocaine (Group A) and the other with 10 ml of 2% lignocaine (Group B) in coded syringes via an ultrasonic nebuliser for 10 minutes, followed by midazolam 0.05 mg/kg IV and fentanyl 1 microgram/kg IV just before the procedure. The fibreoptic bronchoscope was introduced via one nostril, with the other nostril used for oxygen insufflation (3-4 L/min). The fibrescope was advanced through the glottic opening, visualising the tracheal rings and carina. The endotracheal tube was railroaded over the fiberscope and the cuff inflated.

    Results: The primary outcome measure was patient comfort during awake fibreoptic nasotracheal intubation. The mean Puchner comfort scale score of Group A was 1.30 ± 0.08 and of Group B was 2.23 ± 0.12; the mean value for Group B was significantly higher. The mean procedural time of Group B was significantly higher (by 15.1%) compared to Group A (p<0.001). The number of intubation attempts did not differ between the two groups.

    Conclusions: 4% nebulised lignocaine provided adequate airway anaesthesia, optimal intubating conditions, better patient comfort and stable hemodynamics.

    Table 1 (abstract P173). Puchner Comfort Scale
    Fig. 1 (abstract P173).

    Five point Puchner scale. Group A (4% lignocaine) having more patient comfort with mean value of score 1.30 ± 0.08, as compared to Group B (2% lignocaine) having mean value of 2.23 ± 0.12 (p < 0.001)

    Fig. 2 (abstract P173).

    The mean procedural time to secure airway in Group A (4% lignocaine) was 29.67 ± 5.40 minutes and in Group B (2% lignocaine) it was 34.93 ± 5.52 minutes. The difference in mean time duration was statistically significant (p<0.001)

    P174 Video laryngoscopy versus direct laryngoscopy for emergency orotracheal intubation outside the operating room - a systematic review and meta-analysis

    V Bennett1, N Arulkumaran2, J Lowe3, R Ions4, M Mendoza Ruano5, M Dunser6

    1St George’s Hospital, London, UK; 2University College London, London, UK; 3Arrowe Park Hospital, Merseyside, UK; 4Musgrove Park Hospital, Taunton, UK; 5Hospital Universitario y Politecnico La Fe, Valencia, Spain; 6Innsbruck Medical University, Innsbruck, Austria

    Introduction: This systematic review and meta-analysis aims to investigate whether video laryngoscopy (VL) improves the success of orotracheal intubation, when compared with direct laryngoscopy (DL).

    Methods: A systematic search of Pubmed, Embase, and CENTRAL databases was performed to identify studies comparing VL and DL for emergency orotracheal intubations outside the operating room. The primary outcome was rate of first pass intubation. Subgroup analyses by location, device used, clinician experience, and clinical scenario were performed. The secondary outcome was rate of complications.

    Results: The search identified 32 studies with 15,064 emergency intubations. There was no overall difference in first-pass intubation with VL compared to DL. Subgroup analysis showed first-pass intubations were increased with VL in the intensive care unit (ICU) (2.02 (1.43-2.85); p<0.01), but not in the emergency department or pre-hospital setting. The rate of first-pass intubation was similar with the Glidescope® and DL, but improved with the CMAC® (1.32 (1.08-1.62); p=0.007). There was greater first-pass intubation with VL than DL among novice/trainee clinicians (OR=1.95 (1.45-2.64); p<0.001), but not among experienced clinicians or paramedics/nurses. There was no difference in first-pass intubation between VL and DL during cardiopulmonary resuscitation or trauma. VL was associated with fewer oesophageal intubations than DL (OR=0.31 (0.14-0.69); p=0.004), but more arterial hypotension (OR=1.49 (1.00-2.23); p=0.05).

    Conclusions: In summary, compared to DL, VL is associated with greater first-pass emergency intubation in the ICU and among less experienced clinicians. VL is associated with reduced oesophageal intubations but a greater incidence of arterial hypotension.

    P175 Compared success rate between direct laryngoscope and video laryngoscope for emergency intubation, in Emergency Department: Randomized Control Trial

    P Sanguanwit, N Laowattana

    Ramathibodi Hospital, Bangkok, Thailand

    Introduction: Video laryngoscopy has been used as an alternative for intubation in the emergency room and was designed to make tracheal intubation more successful [1, 2].

    Methods: We performed a prospective randomized controlled trial of 158 patients who had signs of respiratory failure or met an indication for intubation from July 2015 to June 2016. Patients were randomized by the SNOSE (sequentially numbered, opaque sealed envelopes) technique and assigned to video laryngoscope first or direct laryngoscope first. We collected demographics, difficult intubation predictors, use of rapid sequence intubation, number of attempts, Cormack-Lehane view and immediate complications. The primary outcome was the first-attempt success rate of intubation.

    Results: The first-attempt success rate with the video laryngoscope (73.1%) showed a trend toward being higher than with the direct laryngoscope (58.8%; p=0.06). A good glottic view (Cormack-Lehane grade 1-2) was achieved more often with the video laryngoscope (88.5% vs 71.3%), which was statistically significant (p=0.03). There was no statistically significant difference in immediate serious complications between the direct and video laryngoscopes.

    Conclusions: Comparing the success rates of the video laryngoscope and the direct laryngoscope for intubation, the video laryngoscope showed a trend toward a higher first-attempt success rate and provided a better glottic view.


    1. Choi HJ et al. Emerg Med J 27(5):380-2, 2010

    2. Mosier JM et al. J Emerg Med 42(6):629-34, 2012.

    P176 10-year cohort of prehospital intubations and rescue airway techniques by helicopter emergency medical service physicians: a retrospective database study

    P De Jong, C Slagt, N Hoogerwerf

    Radboudumc, Nijmegen, Netherlands

    Introduction: In the Netherlands the pre-hospital Helicopter Emergency Medical Service (HEMS) is physician based and an adjunct to ambulance services. All four HEMS stations together cover 24/7 specialist medical care in the Netherlands. In many dispatches the added value is airway related [1]. As part of our quality control cycle, all airway related procedures were analysed. High quality airway management is characterized by high overall and first pass endotracheal intubation (ETI) success [2].

    Methods: The HEMS database was analysed for all patients in whom prehospital advanced airway management was performed in the period 2007-2017. Balloon/mask ventilation, supraglottic airway (SGA) devices, total intubation attempts, Cormack & Lehane (C&L) intubation grades, successful ETI, primary and rescue surgical airway procedures and professional background were reviewed.

    Results: In the 10-year period, there were 17075 dispatch calls. In total 8127 patients were treated in the prehospital setting by our HEMS. Of those, 3233 required a secured airway. ETI was successful in 3078 of 3148 (97.8%). In the remaining 70 patients (Fig. 1) an alternative airway was needed. A rescue surgical airway was performed in 1.4%, 0.5% received a rescue SGA, rescue balloon/mask ventilation was applied in 0.2% of cases, one patient was allowed to regain spontaneous ventilation and in 0.1% of patients all airway management failed. HEMS physicians, ambulance paramedics, HEMS paramedics and others (e.g. German emergency physicians) had ETI first-pass success rates of 83.4%, 59.6%, 62.4% and 84.5% respectively (Fig. 2). Difficult laryngoscopy (no epiglottis visible) was reported in 2.2% of patients (Table 1).

    Conclusions: Our data show that airway management performed by a physician based HEMS operation is safe and has a high overall ETI success rate of 97.8%. The total success rate is accompanied by a high first pass ETI success rate.


    1. Slagt C et al. Air Med J 23:36-7, 2004

    2. Peters J et al. Eur J Emerg Med 22:391-4, 2015

    Table 1 (abstract P176). C&L intubation grade
    Table 2 (abstract P176). Airway attempts
    Fig. 1 (abstract P176).

    Primary and rescue airway techniques. Total number of patients: 3233

    Fig. 2 (abstract P176).

    First pass endotracheal intubation success rates by professional background

    P177 Identification of factors associated with event occurrence due to unsafe management of endotracheal tubes

    E Ishida1, K Kobayashi1, Y Kobayashi1, Y Shiraishi1, M Yamamoto1, T Kuroda1, Y Iesaki1, Y Tsutsumi1, R Hosoya1, H Yasuda2

    1Japanese Red Cross Musashino Hospital, Tokyo, Japan; 2Kameda Medical Center, Chiba, Japan

    Introduction: Incidents associated with endotracheal tubes are frequent during mechanical ventilation (MV) of intensive care unit (ICU) patients and can be associated with poor outcomes for patients and detrimental effects on health care facilities. Here, we aimed to identify factors associated with Event occurrence due to Unsafe Management of Endotracheal Tubes (E-UMET).

    Methods: A retrospective observational study was conducted in three ICUs (one surgical ICU, one stroke ICU, and one emergency department) at a tertiary hospital in Japan from 1 April 2016 to 31 March 2017. Patients requiring MV and oral intubation during their ICU stay were included. The primary outcome was the incidence rate of E-UMET (biting, unplanned extubation, and/or displacement of the endotracheal tube). The patients were divided into two groups: with or without E-UMET. Potential factors possibly related to E-UMET occurrence were obtained from electronic medical records, and univariable and multivariable analyses were conducted to investigate E-UMET factors.

    Results: Of 410 patients, E-UMET occurred in 112 (27.3%). The mean (standard deviation) age and Acute Physiology and Chronic Health Evaluation (APACHE) II score were 66 (17) years and 25 (7), respectively. In a multivariate logistic regression analysis, significant risk factors associated with E-UMET included neurosurgical patients (odds ratio (OR) 3.3; 95% CI, 1.51-7.46; p=0.003), sedative administration (OR 2.9; 95% CI, 1.63-5.32; p<0.001), and higher Richmond Agitation-Sedation Scale (RASS) scores (OR 1.4; 95% CI, 1.24-1.77; p<0.001). The use of a restraint (OR 0.4; 95% CI, 0.22-0.95; p=0.003) was an independent factor associated with a lower probability of E-UMET.

    Conclusions: This study suggests that risk factors associated with E-UMET include neurosurgery, higher RASS scores, and the administration of sedatives. Patients with these factors and longer oral intubation periods might require extra care.

    P178 Critical care extubation in Type II respiratory failure with nasal high flow therapy

    R D’Espiney, J Martin-Lazaro, M Brys, J Grundlingh, J Napier

    Newham University Hospital, London, UK

    Introduction: The use of nasal high flow (NHF) as a respiratory support therapy post-extubation has become increasingly common. NHF has been shown to be non-inferior to NIV and to reduce escalation needs compared with conventional oxygen therapy. Clinical outcomes using NHF in patients with Type II respiratory failure (RF) are less well understood. Our aim was to determine whether NHF can be used successfully when extubating Type II RF patients compared with Type I RF.

    Methods: We conducted a retrospective observational study on the use of NHF as an extubation respiratory support in 56 (n=56) consecutive patients in ICU over a 12-month period. Primary outcome was the need for escalation in therapy (NIV, Intubation and Palliation) post extubation. Patients were categorised as high risk if they scored >=1 from: Age>=75 years, BMI>=30 and >=1 medical comorbidity.

    Results: Analysis was conducted on all fifty-six (n=56) patients. The Type I RF group comprised 25 patients with a mean age of 62.7 (±SD) years; the Type II RF group comprised 31 patients with a mean age of 65.5 (±SD) years. In Type I RF, 22 patients (88%) were successfully extubated with NHF compared with 21 patients (67.7%) in Type II. In Type II RF the outcomes were more variable, with a greater requirement for NIV: 16% of these patients required NIV, 3.2% required intubation and 12.9% received NHF therapy for palliation. A higher average BMI (30.32 vs 27.16 kg/m2) was found in unsuccessfully versus successfully extubated patients in Type II RF. In Type I RF, escalation of therapy was equally distributed, with 4% in each category.

    Conclusions: The use of NHF for respiratory support post-extubation may become standard practice for Type I RF in critical care settings. Our data suggest that NHF can be used, with caution, in Type II RF, and clinicians should risk-stratify patients to identify those at risk of re-intubation and post-extubation respiratory failure.

    P179 High flow nasal oxygen in critical care: an audit of practice and outcome

    T Tasnim1, A Kuravi2

    1University Hospital Birmingham, Birmingham, UK; 2Walsall Manor Hospital, Walsall, UK

    Introduction: High Flow Nasal Oxygen (HFNO) is a relatively new therapy. This retrospective audit reviews the use of HFNO in relation to local guidelines in a critical care unit following its introduction.

    Methods: Patients were identified by reviewing ICU charts between August 2015 and September 2016. Data were collected from electronic patient records and the ICNARC database, including patients' age, indication for and duration of HFNO, mode of oxygen therapy and blood gases before, during and after therapy. HFNO therapy interrupted by >24 hours was analysed as separate episodes. From the 47 patients identified, there were 53 episodes. These were subdivided into a respiratory pathology group (RPG) or a post-extubation elective group (EG) based on the indication for HFNO therapy. Two episodes were excluded from the analysis due to the indication for HFNO.

    Results: The median age of patients was 67 years. The mean duration of HFNO was 2.56 days. The mean APACHE score in the RPG and EG was 17 and 18 respectively. In the RPG, 59.1% were weaned to nasal speculum (NS) or room air (RA), with a further 13.6% to face mask (FM). The mean, median (SD) PO2 before and during HFNO therapy were 10.3, 9.45 (2.58) and 10.07, 9.46 (2.08) kPa. The mean, median (SD) pH was 7.26, 7.45 (1.1) pre-HFNO, changing to 7.44, 7.46 (0.05) on HFNO. In the EG, 57.1% were weaned to NS and 14.3% to FM. The mean, median (SD) PaO2 before and during therapy were 10.72, 10.1 (3.66) and 9.08, 9.2 (1.96) kPa. The mean, median (SD) pH was 7.43, 7.45 (0.05) pre-HFNO, changing to 7.47, 7.47 (0.07) on HFNO. In 15% of episodes HFNO was used despite contraindications, with no adverse events.

    Conclusions: In most episodes, HFNO was used according to the local guidelines with no reported adverse events. More than 50% of patients in both groups were successfully weaned to NS or RA. Only 27.3% in the RPG and 28.6% in the EG were escalated to NIV or intubation. Despite the small cohort and multiple confounding factors, the audit showed a general trend towards benefit from HFNO.

    P180 Infusion of antiseptic in an oral model. An innovative technique for possible prevention of ventilator associated pneumonia

    A Omar, F Teunissen, S Aboulnaga, S Hanoura, S Doiphode, A Alkhulaifi

    Hamad medical corporation, Doha, Qatar

    Introduction: The pathogenesis of ventilator-associated pneumonia (VAP) relies on colonization and microaspiration. Oral topical decontamination reduced the VAP incidence from 18 to 13% [1]. The persistence of the antiseptic effect in the oral cavity is questionable; we hypothesized that continuous oral antiseptic infusion may offer better decontamination.

    Aim of the work: We developed an endotracheal tube that allows continuous oral infusion of chlorhexidine (CHX), and we aimed to test this technique against the conventional one with respect to bacterial colonization (provisional patent: 62359944).

    Methods: Two identical biomodels of the upper airways were manufactured (3DX Diagnostics, USA) to accommodate the modified and the ordinary endotracheal tubes (ETT). The two techniques tested were six-hourly disinfection with CHX (group A) versus disinfection through a 24-hour infusion technique (group B). Five microorganisms plus mixed bacteria were used, and each was tested five times. Normal saline was used constantly to irrigate the biomodels, and a 10 ml aliquot was collected at the end of the procedure. Culturing of the aliquots from decanted broth pre- and post-disinfection was performed. The practitioner time required to apply CHX was also compared.

    Results: There was a trend towards lower bacterial growth in group A in 5 experiments, which reached statistical significance only with Pseudomonas aeruginosa (p=0.045). In one experiment the growth was lower in group B (Fig. 1). Additionally, there was a time-saving advantage in group B (15±3.3 versus 5±1.2 min, p=0.01).

    Conclusions: The novel technique achieved at least non-inferior results, with a time-saving advantage. These results may warrant a future clinical trial.


    1. Koeman M et al. Am J Respir Crit Care Med 173(12):1348-55, 2006

    Fig. 1 (abstract P180).

    Bacterial growth in both groups

    P181 Non invasive measurement of particle flow from the small airways using PExA in vivo and during ex vivo lung perfusion in DCD donation

    E. Broberg, L. Pierre, M. Wlosinska, L. Algotsson, S. Lindstedt

    Skane University Hospital, Lund, Sweden

    Introduction: The optimal mechanical ventilation in the different phases of LTX DCD (Lung Transplantation Donation after Cardio-circulatory Determination of Death) donation (in vivo, post mortem and ex vivo) is under debate. Non-invasive online airway monitoring by analysing particle flow from the airways has never been done before. In the present study we used a new technology for airway monitoring based on mass spectrometric analysis of particle flow and particle size distribution (PExA, Particles in Expired Air). The exhaled particles are collected onto a substrate and are available for subsequent chemical analysis for biomarkers. Our hypothesis was that by analysing the particle flow online, we could optimise mechanical ventilation, and that a small particle flow would probably be gentler on the lung than a large particle flow, in which the lung is squeezed out and the majority of the small airways are open.

    Methods: In the present study we analysed the particle flow from the airways in vivo, post mortem and during ex vivo lung perfusion using different ventilation modes, Volume Controlled Ventilation (VCV) and Pressure Controlled Ventilation (PCV), comparing small tidal volumes (1) versus big tidal volumes (2) at different PEEP (Positive End-Expiratory Pressure) levels and after administration of different drugs in six domestic pigs.

    Results: We found that VCV resulted in a significantly lower particle flow than PCV in vivo, but in the ex vivo setting the opposite was found (Fig. 1). In both the in vivo and ex vivo settings we found that big tidal volumes resulted in a larger particle flow than small tidal volumes.

    Conclusions: The opening and the closure of the small airways reflect the particle flow from the airways. We found that different ventilation modes resulted in different particle flow from the airways. We believe this technology will be useful for monitoring mechanical ventilated patients to optimise ventilation and preserve the lung quality and has a high potential to detect new biomarkers in exhaled air.

    Fig. 1 (abstract P181).

    DCD PExA

    P183 Incentive to better practices in non-invasive ventilation

    N Matsud, G Correa, D Tucunduva, L Silva, V Veiga, T Alvarisa, N Postalli, P Travasos, R Vale, S Rojas

    Hospital BP - A Beneficência Portuguêsa de São Paulo, São Paulo, Brazil

    Introduction: Protocols for the use of non-invasive ventilation are associated with better outcomes in ICUs due to the reduction of the need for invasive ventilation and associated complications. The objective of this study is to evaluate the adherence to the noninvasive ventilation protocol in a large hospital intensive care unit.

    Methods: We included all patients who used a non-invasive ventilation device from February 2016 to May 2017, based on the institutional protocol of Noninvasive Ventilation Indication.

    Results: In the period, 4963 patients were admitted to the unit, and 641 (12.91%) used noninvasive ventilation according to the institutional protocol. The mean SAPS3 in the period was 43.9 points, with an expected mortality of 22.3%; the actual mortality rate was 11.2%. The average adherence to the protocol was 88.45% in 2016, rising to 98.4% in 2017. This increase was associated with an organizational culture of improvement, training of the professionals involved (physicians and physiotherapists), and monthly feedback of results with established action plans. The main nonconformities were related to failures of record-keeping, indication of the resource or choice of interface, and time of therapeutic response.

    Conclusions: The adoption of protocols for the indication of non-invasive ventilation was shown to be safe and effective in highly complex patients, making it possible to reduce the number of patients on invasive ventilation and its complications.

    P184 Can non-invasive ventilation change the result of malaria with pulmonary dysfunction?

    A Costa e Silva, R Barbara, L Figueirôa

    Clínica Multiperfil, Luanda, Angola

    Introduction: Malaria is a common problem in underdeveloped countries, with an estimated mortality of more than one million people per year. Pulmonary involvement is one of the most serious manifestations of Plasmodium falciparum malaria. Non-invasive ventilation (NIV) decreases muscular work and improves gas exchange through recruitment of hypoventilated alveoli. In this context, we analyzed the impact of the use of non-invasive ventilation in malaria with pulmonary dysfunction.

    Methods: This was a retrospective cohort study. We analyzed electronic records of patients diagnosed with malaria and acute respiratory failure who underwent respiratory therapy with NIV between 2015 and 2016 within the intensive care unit (ICU). The study variables were ICU mortality, length of hospital stay, NIV time and outcome groups. Statistical analysis was performed with the Pearson correlation coefficient, with a significance level of p<0.01, using the BioEstat 3.0 program.
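    The correlation measure used above can be sketched in a few lines; this is a generic illustration of a Pearson coefficient with a Fisher-z 95% confidence interval, run on hypothetical paired values rather than the study's records.

```python
import math

def pearson_r_with_ci(x, y, z_crit=1.96):
    """Pearson correlation coefficient with a Fisher-z 95% CI."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    r = sxy / math.sqrt(sxx * syy)
    z = 0.5 * math.log((1 + r) / (1 - r))  # Fisher transformation
    se = 1.0 / math.sqrt(n - 3)            # standard error on the z scale
    lo = math.tanh(z - z_crit * se)        # back-transform the CI bounds
    hi = math.tanh(z + z_crit * se)
    return r, lo, hi

# Hypothetical paired observations (not the study data)
r, lo, hi = pearson_r_with_ci(list(range(1, 11)),
                              [1, 2, 3, 4, 5, 6, 7, 8, 9, 11])
```

    The Fisher transformation is what makes a confidence interval like the 0.24-0.83 reported below possible, since r itself is not normally distributed.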

    Results: Thirty-one patients were included in the study. Four outcomes were analyzed, as shown in Table 1 and Fig. 1; 94% of the patients were discharged from the hospital. Pearson correlation analysis showed statistical significance in the NIV/Discharge group in the analysis of patients hospitalized versus NIV (95% CI 0.24 to 0.83; p = 0.0036).

    Conclusions: The use of NIV as a first-line resource against respiratory decompensation in malaria was positive, with improvement of symptoms.

    Table 1 (abstract P184). Serial analysis of intervention groups
    Fig. 1 (abstract P184).

    Outcome groups.

    P185 Factors predicting postoperative noninvasive ventilation failure in a tertiary care university hospital: a retrospective study

    S Chatmongkolchart, B Choochot

    Faculty of Medicine, Prince of Songkla University, Songkla, Thailand

    Introduction: Noninvasive ventilation (NIV) is a therapeutic choice in the management of acute respiratory failure in the postoperative period. NIV failure is associated with poor outcomes and increased mortality. The aim of this study was to determine the factors associated with NIV failure in the postoperative period.

    Methods: Data were obtained from a total of 206 surgical patients who experienced postoperative respiratory complications and were managed with NIV between January 2012 and December 2016 at the surgical intensive care unit of Songklanagarind Hospital. Predictive factors for NIV failure were determined using multivariate logistic regression models.
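    In the single-predictor case, a logistic model of this kind reduces to estimating a log-odds coefficient whose exponential is the reported OR. The sketch below is a toy gradient-descent logistic fit on hypothetical smoking/failure counts, not the study's model or data.

```python
import math

def fit_logistic(xs, ys, lr=1.0, epochs=10000):
    """Plain gradient-descent logistic regression (intercept + one slope)."""
    b0, b1 = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += p - y            # gradient of the negative log-likelihood
            g1 += (p - y) * x
        b0 -= lr * g0 / n
        b1 -= lr * g1 / n
    return b0, b1

# Hypothetical cohort: 2/10 non-smokers vs 6/10 smokers fail NIV
xs = [0] * 10 + [1] * 10
ys = [0] * 8 + [1] * 2 + [0] * 4 + [1] * 6
b0, b1 = fit_logistic(xs, ys)
odds_ratio = math.exp(b1)  # crude OR for smoking in this toy example
```

    With a single binary predictor the fitted OR equals the cross-table odds ratio, here (6/4)/(2/8) = 6; a multivariate model adjusts each such coefficient for the others.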

    Results: The incidence of postoperative NIV failure was 39.3%. In multivariate analysis, factors independently predicting NIV failure were smoking (odds ratio (OR) 3.24, 95% confidence interval (CI) 1.63-6.44), independent functional status (OR 2.05, 95% CI 1.05-4.03), American Society of Anesthesiologists (ASA) status 4-5 vs 2-3 (OR 2.72, 95% CI 1.36-5.46), and intracranial surgery (OR 2.66, 95% CI 1.19-5.96) (Table 1). NIV failure was also associated with a higher incidence of postoperative pulmonary complications such as pneumonia (81.2% vs 18.8%, P<0.001), prolonged hospital stay (41.2±35.5 vs 26.1±21.0 days, P<0.001) and cardiac arrest (88.2% vs 11.8%, P=0.004) compared with NIV success.

    Conclusions: Pre- and perioperative predictors of NIV failure were smoking, independent functional status, ASA status 4-5, and intracranial surgery. Management of NIV in patients at risk of failure requires close monitoring, because NIV failure may be associated with a higher risk of postoperative complications.

    Table 1 (abstract P185). Final logistic regression model of factors predicting postoperative noninvasive ventilation failure

    P186 Noninvasive ventilation (NIV) failure in critically ill patients with COPD and influenza infection: risk factors and outcomes

    C Domínguez-Curell1, A Rodríguez1, L Reyes2, M Bodi1, A Esteban3, F Gordo3, I Martín-Loeches3, J Solé-Violán4, E Díaz5, A Anzueto2, M Restrepo2

    1Hospital Universitari de Tarragona Joan XXIII, Tarragona, Spain; 2University of Texas Health Science Center at San Antonio and South Texas Veterans Health Care System, San Antonio, Texas, USA; 3Multidisciplinary Intensive Care Research Organization (MICRO), St.James’s University Hospital, Trinity Centre for Health Science, Dublin, Ireland; 4Hospital Dr. Negrín, Gran Canaria, Las Palmas, Spain; 5Hospital Parc Tauli, CIBERES, Sabadell, Spain

    Introduction: The effectiveness of NIV in COPD patients with acute respiratory failure (ARF) due to influenza infection remains controversial. The aim of this study was to characterize COPD patients at risk of NIV failure (NIVf) and its impact on ICU mortality.

    Methods: Secondary analysis of a prospective, observational, multi-center study of COPD subjects admitted to the ICU with ARF due to influenza infection. Demographic data, clinical and laboratory variables and severity of illness were recorded. Three groups were studied: (1) COPD subjects who received NIV at ICU admission and failed (NIVf group); (2) COPD subjects who received NIV at ICU admission and succeeded (NIVs group); and (3) COPD subjects who received invasive mechanical ventilation at ICU admission (IMV group). Univariate and multivariate analyses were performed to determine factors independently associated with NIVf and ICU mortality.

    Results: Of 476 patients, 211 (44.3%) required IMV and 265 (55.6%) NIV. NIV failed in 118 (44.5%) patients, who were more likely to have higher severity of illness (SOFA 7 vs. 4, p<0.001), shock (72.9% vs. 13.6%, p<0.05), acute renal failure (31.4% vs. 13.6%, p<0.05), bacterial co-infection (25.4% vs. 10.9%, p<0.05) and hematological disease (5.9% vs. 1.4%, p<0.05) compared with the NIVs group. Shock (OR=13.4 [6.36-28.8]) was independently associated with NIVf. The overall ICU mortality was 31% in the IMV group, 39% in the NIVf group and 5% in the NIVs group (p<0.001 comparing NIV groups). In the multivariate analysis, only the number of quadrants with infiltrates (OR=1.42 [1.03-1.94], p<0.02), hematological disease (OR=6.27 [1.09-35.9], p=0.04), chronic renal failure (OR=7.3 [2.1-24.4], p=0.02) and NIVf (OR=9.27 [2.9-28.7], p<0.001) were variables independently associated with ICU mortality.

    Conclusions: NIVf is a frequent complication in COPD patients admitted to the ICU with influenza infection. The presence of shock should alert clinicians to consider IMV due to the high risk of NIV failure and ICU mortality.

    P187 Non-invasive ventilation and oxygen therapy after extubation in patients with acute respiratory failure: a meta-analysis of randomized controlled trials

    S Bhattacharjee, S Maitra

    All India Institute of Medical Sciences, New Delhi, New Delhi, India

    Introduction: The role of non-invasive ventilation (NIV) following extubation in patients with acute respiratory failure is debatable. NIV may provide benefit in postsurgical patients [1], but its role in non-surgical patients is controversial.

    Methods: PubMed and Cochrane Central Register of Controlled Trials (CENTRAL) were searched (from 1946 to 20th November 2017) to identify prospective randomized controlled trials, where post extubation NIV has been compared with standard oxygen therapy in adult patients with acute respiratory failure. For continuous variables, a mean difference was computed at the study level, and a weighted mean difference (MD) was computed in order to pool the results across all studies. For binary outcomes, the pooled odds ratio (OR) with 95% confidence interval (95% CI) was calculated using the inverse variance method.

    Results: Data from 1525 patients in 11 randomized trials were included in this meta-analysis. Two trials used NIV to manage post-extubation respiratory failure. Pooled analysis found that the mortality rate at longest available follow-up [OR (95% CI) 0.84 (0.50, 1.42); p=0.52; Fig. 1] and the re-intubation rate [OR (95% CI) 0.75 (0.51, 1.09); p=0.13] were similar between NIV and standard oxygen therapy. NIV did not decrease the intubation rate when used as a preventive modality [OR (95% CI) 0.65 (0.40, 1.06); p=0.08]. Duration of ICU stay was also similar in the two groups [MD (95% CI) 0.46 (-0.43, 1.36) days; p=0.31; Fig. 2].

    Conclusions: Post-extubation NIV in non-surgical patients with acute respiratory failure does not provide any benefit over conventional oxygen therapy.


    1. Faria DA et al. Cochrane Database Syst Rev CD009134, 2015

    Fig. 1 (abstract P187).

    Forest plot for pooled analysis of mortality at longest available follow-up

    Fig. 2 (abstract P187).

    Forest plot for pooled analysis of need for re-intubation

    P188 A randomized comparative study of helmet CPAP versus facemask CPAP in acute respiratory failure (ARF)

    O Adi, S Salleh

    Raja Permaisuri Bainun Hospital, Ipoh, Perak, Malaysia

    Introduction: CPAP is used to improve oxygenation in patients with ARF. We aimed to determine the non-inferiority (NI) of helmet CPAP to facemask CPAP in ARF based on physiological (heart rate (HR) and respiratory rate (RR)) and blood gas (PaO2 and PaCO2) parameters. We also compared patients' perception of dyspnea improvement after CPAP using a dyspnea scale (visual analogue scale (VAS)) and a Likert score.

    Methods: We randomized 123 patients to helmet (n=64) or facemask (n=59) CPAP; 71.7% of ARF was due to acute pulmonary edema. CPAP was applied for 60 minutes. Patients' physiological and blood gas parameters were recorded before and after the intervention, and patients then marked the dyspnea scale and Likert score. NI of the helmet would be declared if the confidence interval (CI) of the mean difference between groups (helmet mean minus facemask mean) in improving physiological parameters, blood gas parameters and the dyspnea scale was no worse than a predetermined non-inferiority margin (NIM). The secondary outcome was to compare the incidence of discomfort and mucosal dryness between groups.
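    The decision rule above can be sketched as a normal-approximation check of the one-sided 97.5% bound against the NIM; the standard error and margin values in the example are hypothetical, as the abstract does not state them.

```python
def noninferiority_check(mean_diff, se, nim, higher_is_worse=True, z=1.96):
    """One-sided 97.5% CI bound of (helmet mean - facemask mean) vs. the NIM.

    For outcomes where higher values are worse (e.g. heart rate), the helmet
    is non-inferior if the upper CI bound stays below the margin `nim`.
    """
    if higher_is_worse:
        return mean_diff + z * se < nim
    return mean_diff - z * se > -nim

# Hypothetical: HR difference of -4.43 bpm with SE ~3.0 against a 5 bpm margin
hr_noninferior = noninferiority_check(-4.43, se=3.0, nim=5.0)
```

    The key design choice in an NI trial is that the whole CI, not just the point estimate, must stay on the acceptable side of the margin.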

    Results: Both the intention-to-treat and per-protocol (PP) analyses showed that the mean differences for HR, RR and the dyspnea scale met the non-inferiority criterion relative to the NIM, concluding that the helmet was non-inferior to the facemask. In the PP analysis, the HR mean difference was -4.43 beats per minute (upper bound of 97.5% CI 1.43), the RR mean difference was -0.41 breaths per minute (upper bound of 97.5% CI 0.48) and the dyspnea scale mean difference was 10.98 mm (lower bound of 97.5% CI 1.94) (Fig. 1). Both PaO2 and PaCO2 improved with the helmet, but the helmet was inferior for these parameters. Analysis of the Likert score for dyspnea improvement also concluded that the helmet was better than the facemask (mean rank 67.81 vs 55.69, p=0.04). The incidence of discomfort and mucosal dryness was significantly lower with the helmet.

    Conclusions: CPAP delivered by helmet improves HR, RR and dyspnea in ARF with less discomfort and mucosal dryness, and it is an alternative for patients who are unable to tolerate a facemask.

    Fig. 1 (abstract P188).

    Mean difference between helmet and facemask for heart rate, respiratory rate and dyspnea scale

    P189 Effects of varying ventilatory parameters on FiO2 delivery of a Boussignac CPAP in a mechanical model of acute respiratory distress

    M Bibombe1, A Penaloza Baeza2, G Liistro2, N Delvau2

    1Université Catholique de Louvain, Brussels, Belgium; 2Cliniques Universitaires St-Luc, Université Catholique de Louvain, Brussels, Belgium

    Introduction: Acute cardiogenic pulmonary edema is a sudden-onset respiratory distress. Its management includes drug therapy, oxygen and airway support by spontaneous ventilation with continuous positive airway pressure (CPAP). The Boussignac CPAP is powered by pure oxygen; its 'open' character allows the patient to increase inspiratory flow on demand, which may influence the delivered FiO2, since the patient also breathes ambient air with 21% O2. Previous studies assessed the FiO2 delivered by the device under conditions of respiratory distress but did not focus on inspiratory flow [1]. The aim of this study was to measure the FiO2 actually delivered by the device under simulated conditions of respiratory distress.
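    The dilution mechanism described above can be illustrated with a simplified flow-weighted mixing model: when inspiratory demand exceeds the O2 supply flow, the shortfall is entrained as room air at 21% O2. This is an idealised sketch, not the device's measured behaviour, which the study quantified on the bench.

```python
def delivered_fio2(o2_flow_lpm, inspiratory_flow_lpm):
    """Flow-weighted FiO2 of an open CPAP system (simplified mixing model)."""
    if inspiratory_flow_lpm <= o2_flow_lpm:
        return 1.0  # demand fully met by pure oxygen
    entrained_air = inspiratory_flow_lpm - o2_flow_lpm
    return (o2_flow_lpm * 1.0 + entrained_air * 0.21) / inspiratory_flow_lpm
```

    In this model a patient drawing 90 L/min against a 25 L/min O2 supply would receive roughly 43% O2, reproducing the direction (though not necessarily the magnitude) of the dilution the study set out to measure.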

    Methods: In this bench study, FiO2 was measured while varying the respiratory rate (RR, from 10 up to 45/min), tidal volume (Vt, between 150 and 750 mL) and inspiratory flow (between 30 and 90 L/min) at the target pressure of 8 cmH2O. The assembly included a Boussignac CPAP fed with 100% O2 via a flowmeter up to 30 L/min, a sealed double Vygon mask, a Vygon manometer, a Michigan test lung driven by a Dräger Evita 4 ventilator, and a Citrex analyzer. Each measurement was performed 3 times and the average of the 3 values was retained.

    Results: The O2 flow required to maintain the target pressure of 8 cmH2O was <= 25 L/min. Depending on RR and inspiratory flow, for a Vt <= 250 mL the delivered FiO2 ranged between 70 and 99%. For 350 mL <= Vt <= 500 mL, the FiO2 was between 57 and 90%. For a given Vt, FiO2 decreased when RR and/or inspiratory flow were increased (Fig. 1).

    Conclusions: The Boussignac CPAP delivered high values of FiO2 at a flow rate of O2 <= 30L/min. However, FiO2 varied with FR, Vt and inspiratory flow. These changes in FiO2 observed during simulated severe respiratory distress conditions should be taken into account and compared to other CPAP devices.


    1. Templier F et al. Eur J Emerg Med 10(2):87-93, 2003

    Fig. 1 (abstract P189).

    Effect of varying RR and inspiratory flow on FiO2 at a Vt of 0.250 L

    P190 Recurrent respiratory deterioration events during intensive care unit stay and mortality among mechanically ventilated patients

    D Stavi1, Y Lichter1, H Artsi1, U Keler2, AM Lipsky3, N Adi1, I Matot1

    1Tel Aviv Sourasky Medical Center, Tel Aviv, Israel; 2Intensix, Netanya, Israel; 3Rambam Health Care Campus, Haifa, Israel

    Introduction: Recurrent respiratory deteriorations in Mechanically Ventilated (MV) patients may occur during Intensive Care Unit (ICU) stay. Knowledge of the association of such events and mortality is limited.

    Methods: This is a single-center retrospective study performed in the ICU of Tel Aviv Medical Center, Israel, a tertiary academic referral hospital. Using the electronic medical record system and the Intensix predictive critical care system for analysis, all patients admitted to the ICU between January 2007 and December 2014 were assessed. Respiratory deterioration in MV patients was defined as an acute FiO2 increase >20% or a PEEP increase >5 cmH2O that persisted for at least 2 hours. The primary outcome was ICU mortality; the secondary outcome was length of ICU stay (LOS). A Chi-square test for trend was used to assess the significance of mortality data and one-way ANOVA for LOS.
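    The event definition above is essentially an algorithm over a time series of ventilator settings. A minimal sketch of how it could be operationalised is shown below; the sample format (`VentSample`) and the exact persistence check are our assumptions, not part of the study's published methods.

```python
from dataclasses import dataclass

# Hypothetical ventilator-settings sample: time in hours from admission,
# FiO2 as a fraction (0-1), PEEP in cmH2O.
@dataclass
class VentSample:
    t_hours: float
    fio2: float
    peep: float

def count_deteriorations(samples, min_duration_h=2.0):
    """Count deterioration events per the abstract's definition: an acute
    FiO2 increase >20% (absolute) or PEEP increase >5 cmH2O versus the
    previous sample, persisting for at least `min_duration_h` hours."""
    events = 0
    i = 1
    while i < len(samples):
        prev, cur = samples[i - 1], samples[i]
        if cur.fio2 - prev.fio2 > 0.20 or cur.peep - prev.peep > 5:
            # Persistence check: settings stay at/above the new level.
            start = cur.t_hours
            j = i
            while (j < len(samples)
                   and samples[j].fio2 >= cur.fio2
                   and samples[j].peep >= cur.peep):
                j += 1
            if samples[j - 1].t_hours - start >= min_duration_h:
                events += 1
                i = j  # resume scanning after the event ends
                continue
        i += 1
    return events
```

A transient FiO2 bump that is reverted within two hours would not be counted under this rule.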

    Results: 5376 MV patients were admitted to the ICU with an overall mortality of 16.5%. Mortality and LOS were tripled in patients who experienced at least one respiratory deterioration when compared to no events (33.8% vs. 9.9 %, p<0.0001 and 10.7 vs. 2.2 days, p<0.0001 respectively) (Fig. 1). Increased events of respiratory deteriorations showed significant trend of increased mortality (p<0.0001).

    Conclusions: In MV patients, a single respiratory deterioration event carries a threefold higher mortality rate and LOS; any additional event further increases both parameters.

    Table 1 (abstract P190). Results Table

    P191 Association of lung ultrasound score with mortality in mechanically ventilated patients

    J Taculod, JT Sahagun, Y Tan, V Ong, K See

    National University Hospital Singapore, Singapore, Singapore

    Introduction: Lung ultrasound is an important part of the evaluation of critically ill patients. It has been shown to predict recruitability in acute respiratory distress syndrome. However, little is known about the application of lung ultrasound in predicting mortality in mechanically ventilated patients.

    Methods: Observational study of mechanically ventilated patients admitted to the medical intensive care unit (ICU) of a tertiary hospital (National University Hospital, Singapore) in 2015 and 2016. Only the first ICU admissions of these patients were studied. Lung ultrasound was done at six points per hemithorax and scored according to Soummer (Crit Care Med 2012): normal aeration = 0; multiple, well-defined B lines =1; multiple coalescent B lines = 2; lung consolidation = 3. The Lung Ultrasound (LUS) score was calculated as the sum of points (score range 0-36). We analysed the association of LUS score with ICU/hospital mortality, using logistic regression, adjusted for age and Acute Physiology and Chronic Health Evaluation (APACHE) II score.
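    The scoring rule described above (six points per hemithorax, each region graded 0-3, total range 0-36) can be sketched as a simple function; the input format is our assumption, purely for illustration.

```python
# Regional aeration scores per Soummer: 0 = normal aeration, 1 = multiple
# well-defined B lines, 2 = multiple coalescent B lines, 3 = consolidation.
# Six points per hemithorax -> 12 regions; total LUS score ranges 0-36.
def lus_score(regional_scores):
    """Sum the 12 regional aeration scores (illustrative sketch; the input
    format is an assumption, not specified by the abstract)."""
    if len(regional_scores) != 12:
        raise ValueError("expected 12 regional scores (6 per hemithorax)")
    if any(s not in (0, 1, 2, 3) for s in regional_scores):
        raise ValueError("each regional score must be 0, 1, 2 or 3")
    return sum(regional_scores)
```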

    Results: 247 patients were included (age 62.0 ± 16.2 years; 89 female [36.0%]; APACHE II 29.7 ± 7.9; 88 sepsis diagnosis [35.6%]). ICU and hospital mortality were 16.2% and 29.6% respectively. LUS score was associated with increased ICU (OR 1.04, 95% CI 1.00-1.09, P=0.07) and hospital (OR 1.04, 95% CI 1.00-1.08, P=0.045) mortality, adjusted for age and APACHE II score.

    Conclusions: LUS score was associated with increased ICU/hospital mortality and may be useful for risk stratification of mechanically ventilated patients admitted to ICU.

    P192 A survey on mechanical ventilator practice and synchrony management in Thailand

    N Kongpolprom

    King Chulalongkorn Memorial Hospital, Bangkok, Thailand

    Introduction: Ventilator asynchrony results in morbidity and mortality. The aim of this study was to explore whether and how physicians used patient-ventilator interactions (PVI) to set mechanical ventilators (MV) in Thailand.

    Methods: Thai physicians treating MV patients were asked to respond to questionnaires distributed at conferences and sent by e-mail. The types of asynchronies encountered and the frequency of MV adjustment guided by PVI were evaluated. In addition, correlations between physicians’ knowledge and 1) their confidence to manage asynchronies and 2) their experience were analyzed.

    Results: Two hundred and eleven physicians answered the questionnaires. Most of them were medical residents and ICU specialists. 82% of them set and adjusted MV by asynchrony guidance, and the majority used waveform analysis in more than half of their patients. The most and least common asynchronies encountered were double triggering and reverse triggering, respectively, while the most difficult-to-manage and most easily managed asynchronies were periodic/unstable breathing and flow starvation, respectively. Lack of confidence and knowledge of PVI were the major reasons given by physicians who did not perform asynchrony assessment. On knowledge evaluation, more than 50% of physicians managed asynchrony incorrectly. Chest and ICU fellows had the greatest skill in waveform interpretation and asynchrony management, with a mean score of 2.62 out of 5, compared with specialists (2.58), medical residents (1.85), internists (1.84) and general practitioners (0.85). There were poor correlations between years of experience in MV management and skill in waveform interpretation (r=0.15, p=0.034) and between physicians’ confidence in PVI management and clinical skill (r=0.27, p<0.001).

    Conclusions: The majority of Thai physicians recognized the importance of PVI, but their skill in asynchrony management was moderate. Intensive training programs should be provided to improve clinical performance.

    P193 Single step versus intermittent evacuation of simple pleural effusion in ventilated patients

    N Makhoul

    Galilee Medical Center, Naharia, Israel

    Introduction: Early and expeditious evacuation of simple pleural effusion (SPE) in ventilated patients may improve their respiratory condition. The use of ultrasonography (US) and pigtail catheters for SPE management has shown beneficial results and is widely accepted. Nevertheless, methods of SPE evacuation are still under discussion. The objective of this study was to evaluate the efficacy and safety of single-step versus intermittent SPE evacuation in ventilated patients.

    Methods: Our retrospective study included 81 adult ventilated ICU patients with SPE treated from May 2015 to August 2017. US was used for diagnosis and to guide pigtail catheter insertion. In the a-group (40 patients), SPE evacuation was done in a single step over one hour. In the b-group (41 patients), SPE evacuation was performed intermittently over 24 hours, as usually recommended. In both groups SPE was evacuated by gravity drainage. The outcomes of the intervention were evaluated by chest US and mobile chest x-ray.

    Results: Of 81 patients (52 males, 29 females; mean age 68.9), the main causes of SPE were congestive heart failure (30%), pneumonia (22%) and malignancy (20%). The median evacuated volume was 781 ml (range 300-2400 ml). Catheters were removed if minimal fluid discharge (<100 ml) was observed during the following 48 hours in the a-group and 72 hours in the b-group. In both groups, US and chest x-ray demonstrated re-expansion of the compressed lung. There were no complications in either group.

    Conclusions: Single-step evacuation provided safe, less time-consuming management of any SPE volume, with the same efficacy as the intermittent technique, in ventilated patients.

    P194 Positional effects on the distributions of ventilation and end-expiratory gas volume in the asymmetric chest – a quantitative lung computed tomographic analysis

    GA Cortes-Puentes1, K Gard2, A Adams2, D Dries3, R Oeckler1, M Quintel4, L Gattinoni4, JJ Marini3

    1Mayo Clinic, Rochester, MN, USA; 2Regions Hospital, Saint Paul, MN, USA; 3University of Minnesota, Minneapolis, MN, USA; 4University of Göttingen, Göttingen, Germany

    Introduction: Body positioning affects the configuration and dynamic properties of the chest wall and therefore may influence decisions made to increase or decrease ventilating pressures and tidal volume. We hypothesized that unlike global functional residual capacity (FRC), component sector gas volumes and their corresponding regional tidal expansions would vary markedly in the setting of unilateral pleural effusion (PLEF), owing to shifting distributions of aeration and collapse as posture changed.

    Methods: Six deeply anesthetized swine underwent tracheostomy, thoracostomy and experimental PLEF with 10 ml/kg of radiopaque saline randomly instilled into either pleural space. Animals were ventilated at VT=10 ml/kg, frequency=15 bpm, I/E=1:2, PEEP=1 cmH2O, and FiO2=0.5. Quantitative lung computed tomographic (CT) analysis of regional aeration and global FRC measurements by the nitrogen wash-in/wash-out technique were performed in each of these randomly applied positions: semi-Fowler’s (inclined 30° from horizontal in the sagittal plane); prone, supine, and lateral positions with dependent PLEF and non-dependent PLEF (Fig. 1).

    Results: No significant differences in FRC were observed among the horizontal positions, either at baseline (p=0.9037) or with PLEF (p=0.58) (Fig. 2A). However, component sector total gas volumes in each phase of the tidal cycle differed within all studied positions, with and without PLEF (p<0.01). Compared to the other positions, prone and lateral positioning with non-dependent PLEF produced a more homogeneous VT distribution among quadrants (p=0.051, Fig. 2B). Supine was associated with the most dependent collapse (Fig. 2C) and the greatest tendency for tidal recruitment (48% vs ~22%, p=0.0073, Fig. 2D).

    Conclusions: Changes in body position in the setting of effusion-caused chest asymmetry markedly affected the internal distributions of gas volume, collapse, ventilation, and tidal recruitment, even when commonly used global FRC measurements provided little indication of these important positional changes.

    Fig. 1 (abstract P194).

    Nomenclature For Analysis of Regional Aeration: I. Supine and II. Prone, where quadrants were defined as: Non-PLEF Dorsal (A); Non-PLEF Ventral (B); PLEF Dorsal (C); and PLEF Ventral (D). III. Lateral position with “Dependent Pleural Effusion”, where quadrants were defined as: Non-PLEF, Non-Dependent Dorsal (A); Non-PLEF, Non-Dependent Ventral (B); PLEF Dependent Dorsal (C); and PLEF Dependent Ventral (D). IV. Lateral position with “Non-Dependent Pleural Effusion”, where quadrants were defined as: Non-PLEF, Dependent Dorsal (A); Non-PLEF, Dependent Ventral (B); PLEF, Non-Dependent Dorsal (C); and PLEF Non-Dependent Ventral (D). PLEF: Pleural Effusion, *: anatomical distribution of PLEF

    Fig. 2 (abstract P194).

    A. Global FRC response to body position; B. Distribution of tidal ventilation; C. Positional Changes in End-Expiratory Collapsed Volume; D. Positional Changes in Tidal Recruitment

    P195 Intensive care unit doctors’ self-reported preferences for arterial oxygen tensions in mechanically ventilated patients

    OL Schjørring1, AP Toft-Petersen2, KH Kusk1, EE Sørensen1, P Mouncey2, A Perner3, J Wetterslev3, BS Rasmussen1

    1Aalborg University Hospital, Aalborg, Denmark; 2Intensive Care National Audit & Research Centre (ICNARC), London, UK; 3Rigshospitalet, Copenhagen University Hospital, Copenhagen, Denmark

    Introduction: Optimal oxygenation targets in critically ill patients treated in the intensive care unit (ICU) are not established. As high oxygen levels may be harmful, doctors’ preferences for oxygen administration in mechanically ventilated patients are of significant interest, as these patients have the greatest exposure to high fractions of inspired oxygen (FiO2). To quantify a multinational segment of ICU doctors’ preferences for oxygen administration in a variety of mechanically ventilated ICU patients, we conducted an international questionnaire-based survey.

    Methods: Based on a previous survey [1] we constructed and validated a 17-part questionnaire on oxygen administration. The questionnaire was electronically distributed through spring and summer 2016 to 1080 ICU doctors at various educational levels in 233 hospitals in Denmark, Finland, Norway, Sweden, England, Wales and Northern Ireland. Repeated reminders were sent.

    Results: In total, 681 doctors responded yielding a response rate of 63%. Of the respondents, 80% were affiliated with multidisciplinary ICUs, 14% with thoracic and/or cardiac ICUs and 6% with neuro-ICUs. Most respondents (79%) had completed their specialist training. Overall, arterial oxygen tension (PaO2) was the preferred parameter for the evaluation of oxygenation (Fig. 1). The proportions of doctors’ preferences for increasing, decreasing or not changing an FiO2 of 0.50 in two (out of six) patient categories at different PaO2 levels are presented in Table 1 and Table 2.

    Conclusions: This is the largest survey of preferred oxygenation targets among ICU doctors. PaO2 seems to be the preferred parameter for evaluating oxygenation. The characterisation of PaO2 target levels in various clinical scenarios provides valuable information for future clinical trials on oxygenation targets in critically ill ICU patients.


    1. Mao C et al. Crit Care Med 27(12):2806-2811, 1999

    Table 1 (abstract P195). Doctors’ preferences for adjusting an FiO2 of 0.50 in a patient with acute respiratory distress syndrome (n = 654 to 655)
    Table 2 (abstract P195). Doctors’ preferences for adjusting an FiO2 of 0.50 in a patient with chronic obstructive pulmonary disease with known habitual hypercapnia (n = 649)
    Fig. 1 (abstract P195).

    Preferred parameter in the evaluation of oxygenation (n = 677). *Most doctors preferred PaO2 in their evaluation of oxygenation in mechanically ventilated patients (p < 0.001, simple binomial test against the equally distributed proportion)

    P196 Diaphragm ultrasound to predict weaning outcomes in mechanically ventilated patients

    S Abdallah, A Ben Souissi, W Yaakoubi, S Kamoun, I Ben Naoui, MS Mebazaa

    Mongi Slim university Hospital, Tunis, Tunisia

    Introduction: Sonographic assessment of diaphragmatic excursion and muscle thickening fraction has been suggested for evaluating diaphragm function during weaning trials [1]. The purpose of this study was to compare these two parameters in predicting extubation success.

    Methods: This prospective study was carried out over 9 months, from March to November 2017. We enrolled patients who were mechanically ventilated for more than 48 h and met all criteria for extubation. Non-inclusion criteria were age <18 years and a history of neuromuscular disease or severe chronic respiratory failure. We excluded subjects who needed reintubation for upper airway obstruction or for neurological or hemodynamic alteration. The US exam was performed during a spontaneous breathing test or pressure support trial, measuring both diaphragm excursion (DE) and diaphragmatic thickening fraction (DTF) within 24 hours before extubation.

    Results: Among 36 enrolled patients, four were excluded; 78.1% were successfully extubated, whereas 21.9% needed reintubation or noninvasive ventilation within 48 h of extubation. Median DE was higher in patients with extubation success than in those with failure (17.8 mm vs. 11.9 mm, p=0.01). Patients with extubation success also had a greater DTF (41.7% vs. 29.7%, p=0.007). The area under the ROC curve was 0.800 for DE, 95% CI [0.578-1], and 0.717 for DTF, 95% CI [0.453-0.981], p=0.001. Cut-off values associated with successful extubation were 12 mm for DE and 31% for DTF, giving sensitivities of 84% and 96%, specificities of 71% and 57%, and likelihood ratios of 7.6 and 9.7, respectively. Combining both parameters decreased sensitivity to 79% but increased specificity to 100%.
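    The reported cut-offs define a simple decision rule. A sketch is given below; note that the abstract does not state how the two parameters were combined, so the AND rule (consistent with the reported gain in specificity at the cost of sensitivity) is our assumption.

```python
def predicts_success(de_mm, dtf_percent, combine=True):
    """Apply the abstract's cut-offs: diaphragm excursion (DE) >= 12 mm and
    diaphragmatic thickening fraction (DTF) >= 31%. With combine=True both
    criteria must be met (an AND rule - our assumption about how the authors
    combined the parameters); otherwise either criterion alone suffices."""
    de_ok = de_mm >= 12
    dtf_ok = dtf_percent >= 31
    return (de_ok and dtf_ok) if combine else (de_ok or dtf_ok)
```

Requiring both criteria can only shrink the set of patients flagged as likely successes, which is why specificity rises while sensitivity falls.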

    Conclusions: US may be a valuable tool in the evaluation of diaphragm dysfunction. Diaphragm excursion seems more accurate than the diaphragm thickening fraction for predicting extubation success.


    1. Mayo P et al. Intensive Care Med 42:7, 2016

    P197 Impact of adherence to a ventilatory weaning protocol on the incidence rate of ventilator-associated pneumonia in the tracheostomized patient

    P Travassos1, E Teixeira2, L Freitas2, P Darin2, L Junqueira2, T Kawagoe2, N Postalli2, R Vale2, V Veiga2, S Rojas2

    1Sao Paulo, Sao Paulo, Brazil; 2Hospital BP - A Beneficência Portuguesa de São Paulo, São Paulo, Brazil

    Introduction: Ventilatory weaning protocols are important for reducing ventilator-associated pneumonia in tracheostomized patients. The objective of this study was to evaluate the impact of adherence to a ventilatory weaning protocol on the incidence rate of ventilator-associated pneumonia in tracheostomized patients in the neurological intensive care unit of a large hospital.

    Methods: Tracheostomized patients were retrospectively assessed from January 2015 to May 2017, correlating ventilatory weaning time and ventilator-associated pneumonia.

    Results: In the period, 8,485 patients were admitted to the unit, with a mean age of 66.5 years and an average stay of 5.6 days; 56% of admissions were surgical, with a SAPS3-expected mortality of 22.3% and an actual mortality of 11.2%. In this group, 497 patients were tracheostomized and 276 were eligible for ventilatory weaning according to the institutional protocol. Ninety-six percent of patients completed ventilatory weaning. Prior to protocol initiation, the mean ventilatory weaning time was 12.8 days, which decreased to 5.2 days in 2015 and 1.6 days in 2016. The incidence rate of ventilator-associated pneumonia was 1.09 in 2015, 1.49 in 2016 and 0.75 in 2017.

    Conclusions: Implementation of the ventilatory weaning protocol safely standardized the weaning process in the unit, reduced mechanical ventilation time and was accompanied by a low rate of ventilator-associated pneumonia.

    P198 Evaluation of diaphragmatic thickness with ultrasound in patients undergoing controlled and assisted mechanical ventilation

    C Abbruzzese1, D Ferlicca2, S Francesconi2, V Ormas3, E Lupieri3, S Calcinati4, S Cenci3, E Chiodaroli5, A Facchini2, A Pesenti1, G Foti4, G Bellani3

    1Fondazione IRCCS Ca’ Granda Ospedale Maggiore Policlinico, Milano, Italy; 2ASST Monza, Desio Hospital, Desio, Italy; 3University of Milano-Bicocca, Monza, Italy; 4ASST Monza, San Gerardo Hospital, Monza, Italy; 5Università degli Studi di Milano, Milan, Italy

    Introduction: Ventilator Induced Diaphragmatic Dysfunction is known to be a contributor to weaning failure. Some data suggest that assisted ventilation might protect from diaphragmatic thinning. Aims of this study are to evaluate, by ultrasound (US), the change in diaphragm thickness and thickening in patients undergoing controlled and assisted mechanical ventilation (MV) and clinical factors associated with this change.

    Methods: We enrolled patients who underwent either controlled MV (CMV) for 48 cumulative hours or 48 hours of pressure support (PSV) if ventilation was expected to last for at least 5 days. Patients < 18 years old, with neuromuscular diseases, phrenic nerve injury, abdominal vacuum dressing system and poor acoustic window were excluded. Diaphragm thickness and thickening were measured with US as described by Goligher and clinical data were collected every 48 hours until ICU discharge.

    Results: We enrolled 44 patients; 13 were excluded because they had fewer than 4 measurements and 2 for low-quality images, leaving 29 patients for analysis. As expected, during CMV diaphragm thickening was almost absent and significantly lower than during PSV (p<0.001). Diaphragm thickness did not decrease significantly during CMV (p=0.201) but increased significantly during PSV (p=0.048) (Fig. 1, where “day 0” represents the first day of PSV). During CMV, diaphragm thickness showed a >=10% reduction in 10/21 patients; these patients had a significantly higher fraction of days spent in CMV (p=0.005) and a longer neuromuscular blocking drug (NBD) infusion (p=0.043). During PSV, 17/26 patients showed an increase in diaphragm thickness >=10%; duration of hospital stay was significantly lower for these patients (p=0.048). Differences between the two groups are reported in Table 1.

    Conclusions: Longer time spent in CMV and on NBD infusion appears associated with a decrease in diaphragm thickness. Assisted ventilation promotes an increase in diaphragm thickness, associated with a reduction in the length of hospitalization.

    Table 1 (abstract P198). Characteristics of patient subgroups according to increase in diaphragm thickness during PSV
    Fig. 1 (abstract P198).

    Changes in diaphragm thickness over time. Day “0” is the first day of PSV; days before “0” are days of CMV, days after “0” are days of PSV. The dashed line represents the 10% cut-off value selected a priori to define clinically relevant increases in diaphragm thickness. *: p value < 0.05 versus day 0.

    P199 Prediction of intrinsic positive end-expiratory pressure using diaphragmatic electrical activity in neurally-triggered and pneumatically-triggered pressure support

    F Xia

    Nanjing Zhongda Hospital, Southeast University, Nanjing, China

    Introduction: Intrinsic positive end-expiratory pressure (PEEPi) may substantially increase inspiratory effort during assisted mechanical ventilation. The purpose of this study was to assess whether the electrical activity of the diaphragm (EAdi) can be reliably used to estimate PEEPi in patients undergoing conventional pneumatically-controlled pressure support (PSP) and neurally-controlled pressure support (PSN), and whether PSN is beneficial in comparison with PSP in patients affected by PEEPi.

    Methods: Twelve intubated and mechanically ventilated COPD patients with static PEEPi >= 5 cmH2O underwent PSP and PSN at different levels of extrinsic PEEP (PEEPe) (0%, 40%, 80%, and 120% of static PEEPi, for 12 minutes at each level on average), at matching peak airway pressure. We simultaneously recorded EAdi, airway pressure, esophageal pressure, and flow. Tracings were analyzed to measure dynamic PEEPi (the decrease in esophageal pressure required to generate inspiratory flow), intrinsic-EAdi (the EAdi value at the onset of inspiratory flow), and IDEAdi (the inspiratory delay between the onset of EAdi and the onset of inspiratory flow).

    Results: Mean airway pressure was comparable for PSP and PSN at the same target levels. The pressure necessary to overcome PEEPi, intrinsic-EAdi, and IDEAdi were significantly lower in PSN than in PSP and decreased with increasing PEEPe, although the effect of external PEEP was less pronounced in PSN. Intrinsic-EAdi at the different PEEPe levels in PSP correlated with dynamic PEEPi (r2=0.46, 0.56, 0.71 and 0.62, respectively).

    Conclusions: In COPD patients with PEEPi, PSN, compared with PSP, led to a decrease in the pressure necessary to overcome PEEPi, which could be reliably monitored in PSP by the electrical activity of the diaphragm before inspiratory flow onset (intrinsic-EAdi).

    P200 Distribution of ventilation in obese and non-obese patients: obesity greatly attenuates PEEP-induced hyperdistension

    G Alcala1, C Morais1, R De Santis2, M Tucci1, M Nakamura1, E Costa1, M Amato1

    1University of Sao Paulo School of Medicine, Sao Paulo, Brazil; 2Massachusetts General Hospital, Boston, MA, USA

    Introduction: Atelectasis develops in critically ill obese patients submitted to mechanical ventilation. The pressure exerted by the abdominal weight on the diaphragm causes maldistribution of ventilation, with increased pleural pressure and a diminished response to PEEP. Our objective was to analyze the effects of PEEP on the distribution of ventilation in obese and non-obese patients classified by BMI (obese: >= 30 kg/m2; non-obese: 20 to 29.9 kg/m2), using electrical impedance tomography (EIT).

    Methods: We assessed the regional distribution of ventilation in surgical and clinical patients submitted to a decremental PEEP titration monitored by EIT. We calculated the percent ventilation to the non-dependent (anterior) lung regions at the highest and lowest PEEP applied. The highest compliance of the respiratory system was consistently observed at intermediate PEEP values (between those extremes), indicating that the highest PEEP caused pulmonary overdistension, whereas the lowest PEEP likely caused dependent lung collapse.

    Results: Thirty-seven patients were enrolled: 15 non-obese (25.7±2 kg/m2) and 22 obese (32.4±1.7 kg/m2). All patients presented progressively decreased ventilation to the dependent (posterior) lung regions as PEEP was lowered (P<0.001). Obese patients consistently presented higher ventilation to the anterior lung zones than non-obese patients (Fig. 1) at equivalent PEEP levels (68±2% vs 59±1%, P<0.001 at the lowest PEEP; 40±0.1% vs 32±0.1%, P<0.001 at the highest PEEP). The highest and lowest PEEP levels applied were similar in obese vs. non-obese patients.

    Conclusions: Obese patients present a displacement of ventilation to the anterior region when compared to non-obese patients.

    Fig. 1 (abstract P200).

    Distribution of ventilation to non-dependent lung regions. Red= obese patients; Blue = non-obese patients

    P201 Low flow CO2 removal in combination with renal replacement therapy effectively reduces ventilation requirements in hypercapnic patients – results of a pilot study

    J Nentwich1, S Kluge2, D Wichmann2, H Mutlak3, S Lindau3, S John1

    1Klinikum Nürnberg, Nuremberg, Germany; 2University Medical Center Hamburg-Eppendorf, Hamburg, Germany; 3University Hospital Frankfurt, Frankfurt, Germany

    Introduction: Lung-protective ventilation is the mainstay of mechanical ventilation in critically ill patients [1]. Extracorporeal CO2 removal (ECCO2R) can enhance such strategies [2] and has been shown to be effective in low-flow circuits based on renal replacement platforms [3, 4, 5]. We report the results of a pilot study using a membrane lung in combination with a hemofilter based on a conventional renal replacement platform (Prismalung™) in mechanically ventilated hypercapnic patients requiring renal replacement therapy (NCT02590575).

    Methods: The system incorporates a membrane lung (0.32 m2) in a conventional renal replacement circuit downstream of the hemofilter. 26 mechanically ventilated patients requiring renal replacement therapy were included in the study. Patients had to be hypercapnic at inclusion under protective ventilation. Changes in blood gases were recorded after implementation of the extracorporeal circuit. Thereafter ventilation was intended to be decreased per protocol until baseline PaCO2 was reestablished and changes in VT and Pplat were recorded. Data from 20 patients were included in the final analysis.

    Results: The system achieved an average CO2 removal rate of 43.4±14.1 ml/min which corresponded to a PaCO2 decrease from 68.3±11.8 to 61.8±11.5 mmHg (p<0.05) and a pH increase from 7.18±0.09 to 7.22±0.08 (p<0.05) [Fig. 1]. After adaption of ventilator settings we recorded a decrease in VT from 6.2±0.9 to 5.4±1.1 ml/kg (p<0.05) and a reduction of Pplat from 30.6±4.6 to 27.7±4.1 cmH2O (p<0.05). These effects were even more pronounced in the “per protocol” analysis [Fig. 2].

    Conclusions: Low-flow ECCO2R in combination with renal replacement therapy provides partial CO2 removal at a rate of over 40 ml/min and can significantly reduce the invasiveness of mechanical ventilation in hypercapnic patients.


    [1] ARDSNet 2000
    [2] Bein 2013
    [3] Terragni 2009
    [4] Forster 2013

    Fig. 1 (abstract P201).

    30 minutes after implementation of the combined renal replacement and ECCO2R circuit a moderate decrease in PaCO2 (-6.5 mmHg) corresponding to a slightly higher pH (0.04) was observed

    Fig. 2 (abstract P201).

    Reestablishment of initial PaCO2 resulted in a significant decrease in tidal volume (-0.8 ml/kg, p<0.05) and plateau pressure (-2.9 cmH2O, p<0.05)

    P202 Is the filter setting relevant in terms of performance during low-flow (LF) ECCO2R in combination with CRRT?

    S Mantovani1, F Biagi1, R Baldini1, A Gmerek1, C Barth1, VM Ranieri2

    1B. Braun Avitum AG, Melsungen, Germany; 2Sapienza University, Rome, Italy

    Introduction: In ECCO2R-CRRT, the efficiency of CO2 removal is higher when the oxygenator (OXY) is positioned upstream rather than downstream of the haemofilter, owing to the higher achievable blood flow (BF) [1]. We tested whether this effect was due to lower pre-filter pressure (PFP).

    Methods: The ECCO2R-CRRT circuit was tested in vitro (n=10) with the following settings: 5 L bovine blood; BF 450 ml/min; OXY 1.81 m2 (Euroset); CVVH post-dilution mode; substitution flow 2500 ml/h; UF rate function off; 1.5 m2 haemofilter (Diapact®, B.Braun Avitum); sweep air flow 4.5 l/min. PFP was evaluated at baseline, 24, 48 and 72 hours. CO2 extraction was measured at BF of 100, 300 and 500 ml/min, with a sweep air flow/blood flow ratio of 1:10. CO2 was added to obtain a PaCO2 of 80 mmHg. The CO2 removal rate was calculated [2] as:
    
    CO2 removal rate = (CO2 at ECCO2R inlet − CO2 at ECCO2R outlet) × blood flow (Eq. 1)
    
    With the CO2 molar volume at 25 °C (24 l/mol), the solubility of CO2 at 37 °C (0.03 mmol/(l×mmHg)), HCO3i/HCO3o the inlet/outlet HCO3 concentrations [mmol/l], and PiCO2/PoCO2 the inlet/outlet CO2 partial pressures [mmHg], Eq. 1 becomes:
    
    CO2 removal rate = 24 × ((HCO3i + 0.03 × PiCO2) − (HCO3o + 0.03 × PoCO2)) × blood flow (Eq. 2)
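    Eq. 2 above is straightforward to evaluate numerically; a minimal sketch follows, assuming concentrations in mmol/l, partial pressures in mmHg and blood flow in l/min (so that mmol/min × 24 ml/mmol yields ml/min of gaseous CO2).

```python
ALPHA_CO2 = 0.03   # solubility of CO2 at 37 °C [mmol/(l·mmHg)]
MOLAR_VOL = 24.0   # CO2 molar volume at 25 °C [l/mol] = [ml/mmol]

def co2_removal_rate(hco3_in, pco2_in, hco3_out, pco2_out, blood_flow_l_min):
    """Eq. 2: difference in total CO2 content (bicarbonate + dissolved CO2)
    across the membrane lung, converted to ml/min of gaseous CO2.
    Inputs: HCO3 in mmol/l, PCO2 in mmHg, blood flow in l/min."""
    content_in = hco3_in + ALPHA_CO2 * pco2_in     # mmol/l
    content_out = hco3_out + ALPHA_CO2 * pco2_out  # mmol/l
    return MOLAR_VOL * (content_in - content_out) * blood_flow_l_min  # ml/min
```

For example, inlet HCO3 24 mmol/l at PCO2 80 mmHg and outlet HCO3 22 mmol/l at PCO2 60 mmHg at a blood flow of 0.5 l/min gives a removal rate of about 31 ml/min (these input values are illustrative, not from the study).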

    Results: A BF of 450 ml/min was always reached with the upstream configuration. BF had to be reduced to 400 ml/min with the downstream configuration because of high-PFP alarms (Table 1). CO2 removal increased from 34.5±13.9 to 69.1±29.5 and 126.0±28.4 ml/min at BF of 100, 300 and 500 ml/min, respectively (p<0.05).

    Conclusions: A BF of 500 ml/min can be reached only with the upstream configuration, owing to lower circuit PFPs. BF correlates directly with CO2 removal efficiency. We may speculate that the simultaneous use of CRRT and LF-ECCO2R with activation of the UF rate function in the downstream setting would further increase PFP, forcing a greater reduction of BF and less effective CO2 removal.


    1. Crit Care Med 2015;43(12):2570-81

    2. Anaesth Crit Care Pain Med 34;(2015):135-140

    Table 1 (abstract P202). Pre-filter pressures dependent on OXY position (ANOVA, p<0.001)

    P203 Low-flow combined extracorporeal renal-pulmonary support: a retrospective observational case series

    G Consales1, L Zamidei1, G Michelagnoli1, B Ammannati2, G Boscolo3, G Dipalma4, P Isoni5, F Turani6

    1Santo Stefano Hospital, Prato, Italy; 2University of Florence, Florence, Italy; 3Ospedale dell’Angelo, Maestre, Italy; 4IRCCS Policlinico San Donato, San Donato, Italy; 5 Marino Hospital, Cagliari, Italy; 6Aurelia and European Hospital, Rome, Italy

    Introduction: We describe the use of a novel low-flow ECCO2R-CRRT device (PrismaLung-Prismaflex, Baxter Healthcare, Gambro Lundia AB, Lund, Sweden) for simultaneous lung and renal support.

    Methods: A retrospective review of patients treated with PrismaLung-Prismaflex for AKI associated with hypercapnic acidosis between May 2016 and August 2017 at Prato Hospital ICU was performed. Data collected were demographic and physiologic parameters, complications, and outcome. Data are presented as mean ± SD; ANOVA was used to compare changes of parameters over time; significance was set at P<0.05.

    Results: We identified 13 patients (mean age 71 ± 13 yr, mean SOFA 12 ± 3). Causes of hypercapnia were moderate ARDS (n=4) and AE-COPD (n=9). In all patients a 13 Fr double-lumen cannula was positioned and a blood flow of 350 ml/min with 10 l/min oxygen sweep gas flow was maintained; IV heparin aiming to double the aPTT was used. Haemodiafiltration (effluent flow 35 ml/kg/hour) was delivered. In all cases PrismaLung-Prismaflex improved respiratory and metabolic parameters (Figs. 1 and 2) without complications. All patients survived the treatment; nevertheless, 2 patients (1 AE-COPD, 1 ARDS) died during the ICU stay due to irreversible cardiac complications. In the ARDS cases, 3 patients were successfully weaned from IMV; mean duration of treatment was 88 ± 31 hours and mean duration of IMV after ECCO2R-CRRT was 2 ± 2 days. In the AE-COPD cases, intubation was avoided in 3 patients at risk of NIV failure and 6 patients were successfully weaned from IMV; mean duration of treatment was 79 ± 31 hours and mean duration of IMV after ECCO2R-CRRT was 0.1 ± 0.3 days.

    Conclusions: The use of PrismaLung-Prismaflex was safe and effective, possibly owing to the low blood flow used. The positive results of this preliminary study may constitute the rationale for the design of a larger randomized controlled trial.

    Fig. 1 (abstract P203).

    Mean CO2 ± SD (mmHg) during ECCO2R-CRRT

    Fig. 2 (abstract P203).

    Mean pH ± SD during ECCO2R-CRRT

    P204 Systemic IL-18 production and spontaneous breathing trial (SBT) outcome: the effect of sepsis

    A Kyriakoudi, N Rovina, O Koltsida, M Kardara, I Vasileiadis, M Daganou, N Koulouris, A Koutsoukou

    National and Kapodistrian University of Athens Medical School, Athens, Greece

    Introduction: The spontaneous breathing trial (SBT), a routine procedure during ventilator weaning, entails cardiopulmonary distress, which is higher in patients failing the trial. An intense inflammatory response, expressed by increased levels of pro-inflammatory cytokines, is activated during SBT. Sepsis, a common condition in ICU patients, has been associated with increased levels of the pro-inflammatory cytokine IL-18. IL-18, produced among other tissues by skeletal muscle, has been associated with severe muscle wasting and possibly with ICU-acquired weakness. We hypothesised that IL-18 increases during SBT, more evidently in SBT failures, and anticipated this response to be more pronounced in formerly septic patients fulfilling the criteria for SBT.

    Methods: 75 SBTs of 30-min duration were performed and classified as SBT failure or success. Blood samples were drawn before, at the end of the SBT and 24 hours later. Serum IL-18 levels and other inflammatory mediators, commonly associated with distress, were determined and correlated with SBT outcome. Subgroup analysis between septic and non-septic patients was performed.

    Results: SBT failure was significantly more frequent in septic patients compared to non-septic patients (41% vs 13%, OR=4.5, 95% CI 1.16-17.68, p=0.022). Septic patients had significantly elevated IL-18 levels at baseline compared to non-septic patients (p<0.05). A trend towards an increase in IL-18 at 30 min of SBT was observed in both groups, with serum IL-18 levels returning to baseline values after 24 h; these changes, however, were not statistically significant. Increased IL-18 levels in septic patients correlated with a rapid shallow breathing index >75 (p=0.028) and with the APACHE II score (p=0.05). No changes were observed in the remaining inflammatory mediators in either group.

    Conclusions: Elevated IL-18 levels in septic patients were associated with cardiopulmonary distress and disease severity. IL-18 may have a potential role as a predictor of SBT failure in septic patients.

    P205 Assessing the effects of duration of apnea on adequacy of ventilation in the post-operative environment

    JE Freeman1, H Essber2, B Cohen2, M Walters2, D Chelnick2, L Glosser2, C Hanline2, A Turan2

    1Respiratory Motion, Inc., Waltham, MA, USA; 2Cleveland Clinic Foundation, Cleveland, OH, USA

    Introduction: Maintaining respiratory sufficiency is a vital component of post-operative care, yet clinicians often rely on subjective assessments or secondary indicators. Patients with apneic episodes are often considered to have respiratory insufficiency regardless of whether these apneic episodes are compensated by large recovery breaths. We used a non-invasive respiratory volume monitor (RVM) to measure minute ventilation (MV) to assess ventilation in patients experiencing apnea in the PACU and general floor.

    Methods: We used an RVM (ExSpiron1Xi, Respiratory Motion, Inc.) to continuously monitor MV for 48 h following elective abdominal surgery. MV was expressed as percent of predicted MV (MVPRED); Low MV was defined as MV<40% MVPRED. For each apnea (i.e., respiratory pause >10 seconds), we calculated the patient’s corresponding MV over the 30, 60, 90, and 120-s windows following the start of the apnea.
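    As a sketch of this windowed analysis (a hypothetical re-implementation, not the RVM device's own software): given breath timestamps and tidal volumes, MV over each post-apnea window can be computed and flagged against the 40% MVPRED cutoff.

```python
import numpy as np

LOW_MV_CUTOFF = 40.0  # Low MV threshold, % of predicted MV (per the abstract)

def window_mv_percent(breath_times_s, tidal_volumes_l, apnea_start_s,
                      window_s, mv_pred_l_min):
    """Minute ventilation over a window after apnea onset, as % of predicted MV."""
    t = np.asarray(breath_times_s, dtype=float)
    v = np.asarray(tidal_volumes_l, dtype=float)
    in_window = (t >= apnea_start_s) & (t < apnea_start_s + window_s)
    mv = v[in_window].sum() * 60.0 / window_s  # volume in window, scaled to l/min
    return 100.0 * mv / mv_pred_l_min

def is_low_mv(mv_percent):
    return mv_percent < LOW_MV_CUTOFF
```

    For example, a 30-s apnea followed by large recovery breaths yields 0% MVPRED over the 30-s window but a much higher value over the 60-s window, which is the compensation effect the abstract describes.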

    Results: 216 patients (110 males, BMI 26.7 (15.1-41.2) kg/m2) were monitored for 42.0±0.9 hours. 49,985 apneas were identified, ranging from 10-117 s (Fig. 1A). Apneas were observed in 99% of patients, suggesting low predictability of respiratory insufficiency. The average MV was 73±2.4% MVPRED, as patients were often sleeping or mildly sedated. We assessed the effects of each apnea on the temporally associated MV (Fig. 1B). While apneas of 10-18 s decreased MV by as much as 30%, their effect over 1 min was <10%. On a 2-min time scale, even 60-s apneas led to Low MV just 20% of the time (Fig. 1C).

    Conclusions: While apneas were ubiquitous, they seldom led to LowMV over clinically relevant time scales. Large compensatory breaths following an apnea generally restored MV to near pre-apnea levels. Nonetheless, some apneas can become dangerous when ignored, as when subsequent sedation decreases compensatory breath size. RVM data provide a better metric of respiratory competence, driving better assessment of patient risk and individualization of care.

    Fig. 1 (abstract P205).

    The effect of apneas on respiratory status. (A) Distribution of apneas across a surgical population. Apneas >10 sec were recorded in 99% (214 of 216) of patients in the post-operative period, and 92% of the recorded apneas were <20 s long. (B) Sustained MV over various time windows (30 sec: red, 60 sec: green, 90 sec: purple, 2 min: cyan) following the onset of apnea. Note that an apnea of 30 sec will (by definition) drive MV over a 30-sec window down to 0, but will only decrease MV over a 60-sec window to ~35% MVPRED and to less than 60% over a 2-min window. (C) Likelihood of an apnea of a specific length decreasing MV below the Low MV cutoff over various time windows. Note that a single 10-sec apnea has just a 25% chance of decreasing MV below 40% in a 30-sec window and less than a 2% chance of decreasing MV below the cutoff over a 2-min window. Even 60-sec apneas have just a 20% chance of decreasing sustained MV over a 2-min window below the 40% MVPRED cutoff

    P206 Spectrum of histopathological findings and underlying causes in 59 patients with diffuse alveolar hemorrhage

    S Sangli, M Baqir, J Ryu

    Mayo Clinic, Rochester, MN, USA

    Introduction: Diffuse alveolar hemorrhage (DAH) is an acute life-threatening event and recurrent episodes of DAH may result in irreversible interstitial fibrosis. Identifying the underlying cause is often challenging but is needed for optimal treatment. Lung biopsy is often performed in the diagnostic evaluation of patients with suspected DAH. However, the role of lung biopsy in this clinical context is unclear. Hence, we sought to identify the spectrum of histopathologic findings and underlying causes in patients with DAH who underwent lung biopsy, surgical or transbronchial.

    Methods: We identified 59 patients who underwent surgical lung biopsy (n = 25) or bronchoscopic biopsy (n = 34) in the evaluation of DAH over a 19-year period from 1999 to 2017. We extracted relevant clinical, pathologic and laboratory data.

    Results: The median age in our cohort was 67 years, and 51% were female. Serologic evaluation was positive in 47% of patients (n=28). The most common histopathologic findings on surgical lung biopsy were alveolar hemorrhage (AH) with capillaritis in 11 patients, of whom six had necrotizing capillaritis, followed by AH without capillaritis in 7 patients. The most common histopathologic finding on bronchoscopic lung biopsy was AH without vasculitis/capillaritis in 22 patients, followed by AH with capillaritis in 11 patients. There were no procedure-related complications or mortality observed with either method of lung biopsy. The clinico-pathologic diagnoses in these patients are shown in Tables 1 and 2.

    Conclusions: In patients with DAH undergoing lung biopsy, alveolar hemorrhage without capillaritis was the most common histopathologic finding, followed by pulmonary capillaritis. These histopathologic findings contributed to the final clinico-pathologic diagnoses of granulomatosis with polyangiitis and microscopic polyangiitis in a substantial portion of cases. Future studies are needed to ascertain the benefits vs. risks of lung biopsy in patients with suspected DAH.

    Table 1 (abstract P206). Histopathological findings on surgical lung biopsy and common clinico-pathologic diagnoses
    Table 2 (abstract P206). Histopathological findings on bronchoscopic lung biopsy and common clinico-pathologic diagnoses

    P207 Cardiovascular risk in patients with severe COPD in early stages of renal dysfunction: the role of vitamin D

    N Trembach, V Yavlyanskaya

    Kuban State Medical University, Krasnodar, Russia

    Introduction: The aim of research was to study the frequency and structure of acute cardiovascular events (ACE) in patients with severe COPD in the early stages of renal dysfunction with different levels of vitamin D.

    Methods: The study included 102 patients with stage 3-4 COPD. All patients were diagnosed with stage 1-2 chronic kidney disease according to the KDIGO recommendations (2012). In addition to a general clinical examination, the level of vitamin D was assessed in all patients, and the frequency and structure of ACE in the previous 12 months were studied.

    Results: The average level of vitamin D was 7.2 ± 2.1 ng/ml. The incidence of insufficiency, deficiency and severe deficiency was 5%, 33% and 64%, respectively. In total, 42 ACE were recorded; acute coronary syndrome (ACS) and pulmonary thromboembolism (PTE) were observed relatively more frequently. When ranking by the severity of vitamin D reduction, acute cardiovascular events occurred significantly more frequently in patients with severe vitamin D deficiency: 24 cases out of 43, including ACS (5 cases), PTE (2 cases), ischemic stroke (4 cases), severe arterial hypertension (5 cases) and cardiac arrhythmia (8 cases). In patients with vitamin D levels of 10-20 ng/ml, 12 cases of ACE were observed (p<0.05 compared to the group with severe deficiency): 3 cases of ACS, 2 cases of ischemic stroke, 4 cases of severe arterial hypertension and 3 cases of cardiac arrhythmia. Only 6 cases of ACE were observed (p<0.05 compared to the group with severe deficiency) in patients with vitamin D insufficiency (2 cases of ACS, 3 cases of ischemic stroke, 1 case of acute cerebrovascular accident, 1 case of PTE).

    Conclusions: The frequency of acute cardiovascular events is significantly higher in patients with severe COPD and lower values of vitamin D. Thus, vitamin D can be considered as a possible target for the control and therapy of severe COPD.

    P208 Intravenous magnesium sulfate in the treatment of acute exacerbations of COPD: a randomized controlled trial

    E Pishbin, E Vafadar Moradi

    Mashhad University of Medical Sciences, Mashhad, Iran

    Introduction: Intravenous (IV) magnesium has an established role in the treatment of acute asthma attacks. It is attractive as a bronchodilator because it is relatively cheap and has minimal side effects. There are few studies evaluating the effect of magnesium on acute exacerbation of COPD, and their results are contradictory.

    The aim of this study was to investigate the effect of IV magnesium sulfate in the treatment of patients presenting to the emergency department (ED) with acute exacerbation of COPD.

    Methods: In this randomized controlled trial, adult patients presented to the ED with acute exacerbation of COPD were randomly allocated in 2 groups. Patients in the study group received the standard treatment, plus 2 gram of IV magnesium sulfate and patients in the control group received the standard treatment, plus placebo. The outcomes included admission rate, intubation rate, changes in PEFR, SpO2 and dyspnea severity score. PEFR, SpO2 and dyspnea severity score were documented before the treatment and 20 minutes after administration of MgSO4 or placebo.

    Results: 34 patients were included in the study (17 in each group); 16 were men and 18 were women. The study group showed significantly greater improvement in the dyspnea severity score (P=0.001) and SpO2 (P=0.004) compared to the control group, but the differences in PEFR changes, intubation rate and admission rate were not statistically significant between the two groups (P>0.05).

    Conclusions: Adding magnesium sulfate to the standard treatment in patients with acute exacerbation of COPD leads to more improvement in dyspnea severity score and SpO2 although it does not reduce admission rate and intubation rate.

    The small sample size and the dependency of PEFR on patient effort are limitations of the present study. Furthermore, we did not exclude patients with cor pulmonale, which may have affected the response rate to MgSO4.

    P209 Assessment of peripheral chemoreflex sensitivity in patients with COPD

    N Trembach, I Zabolotskikh

    Kuban State Medical University, Krasnodar, Russia

    Introduction: By assessing the sensitivity of the peripheral chemoreflex (SPCR), we can predict the likelihood of developing respiratory and cardiovascular disorders; SPCR is a marker of disease progression and a good prognostic marker [1]. Disturbed respiratory mechanics can make it difficult to evaluate. The breath-holding test may be helpful in this situation, as its results are inversely correlated with peripheral receptor sensitivity to carbon dioxide in healthy people [2]. The aim of the study was to compare the breath-holding test with the single-breath carbon dioxide test in the evaluation of SPCR in subjects with COPD.

    Methods: The study involved 78 patients with COPD with FEV1/FVC <70%; all participants were divided into two groups depending on disease severity (GOLD classification, 2017). In group 1 (mild-to-moderate COPD, n=46) all patients had FEV1>=50% and in group 2 (severe-to-very-severe COPD, n=32) all patients had FEV1<50%. The breath-holding test was performed in the morning before breakfast: voluntary breath-holding duration was assessed three times, at 10-min intervals [2], and the mean of the three durations was calculated. The single-breath carbon dioxide test [3] was performed the next day. The study was approved by the local ethics committee. All subjects provided signed informed consent to both tests. The reported study was funded by RFBR, research project No. 16-34-60147 mol_a_dk.

    Results: The average SPCR measured with the single-breath carbon dioxide test was 0.42±0.12 L/min/mmHg in group 1 and 0.26±0.08 L/min/mmHg in group 2. The average breath-holding duration was 44±13 seconds in group 1 and 37±13 seconds in group 2. Correlation analysis showed a significant negative correlation between the results of the two tests in group 1 (r=-0.79, p<0.05) and a weak negative correlation in group 2 (r=-0.32, p<0.05).

    Conclusions: Peripheral chemoreflex sensitivity to carbon dioxide can be indirectly evaluated by a breath-holding test in patients with mild-to-moderate COPD; its assessment in severe COPD needs further investigation.


    1. Giannoni A et al. J Am Coll Cardiol 53:1975–80, 2009

    2. Trembach N et al. Respir Physiol Neurobiol 235:79–82, 2017

    3. Chua TP et al. Eur Clin Invest 25:887–92, 1995

    P209A Outcomes of patients admitted to the intensive care unit with acute exacerbation of pulmonary fibrosis

    S Sahota, J Paddle, C Powell

    Royal Cornwall Hospital, Truro, UK

    Introduction: We investigated outcomes of patients admitted to our general adult ICU for acute exacerbation of pulmonary fibrosis. British Thoracic Society (BTS) and National Institute for Health and Care Excellence (NICE) guidelines state that these patients should not routinely be admitted to ICU for respiratory support as the mortality is very high [1,2]. We decided to investigate our mortality rates for this cohort of patients and review them in the context of the BTS and NICE guidelines.

    Methods: This retrospective observational study reviewed all patients admitted to the ICU for respiratory support between October 2012 and January 2017. Data were collected from the hospital electronic and paper notes: mortality rate, APACHE II score, ICNARC score, type of respiratory support received, and whether there was documentation of advance decisions in case of acute deterioration.

    Results: There were 12 patients admitted to the ICU with acute respiratory failure as a complication of pulmonary fibrosis. The median APACHE II score was 22 and the ICNARC standardised mortality ratio was 5.2. Nine patients died in the ICU (75%) and hospital mortality was ten patients (83%). Eight patients (67%) received high-flow nasal oxygen, six (50%) received non-invasive ventilation, and two (17%) received invasive ventilation. The median time to death was 3.7 days. Of the 11 patients for whom paper notes were available, none had any documented ceiling of care or end-of-life decisions.

    Conclusions: Our study confirmed a very high mortality in this cohort of patients, supporting national guidance that invasive respiratory support has limited value. We advise that frank discussion with patients and their families should happen early after diagnosis, such that end of life plans are already in place in the event of acute deteriorations.


    1. Wells et al. Thorax 63. Suppl 5 v1-v58, 2008

    2. NICE Clinical Guidelines: Idiopathic pulmonary fibrosis in adults: diagnosis and management (June 2013), accessed 15 Nov 2017

    P210 Acute respiratory failure (ARF): differences in diaphragmatic ultrasonography in medical and surgical patients

    G Zani, F Di Antonio, M Pichetti, V Cricca, A Garelli, C Gecele, M Campi, F Menchise, M Valbonetti, S Mescolini, E Graziani, M Diamanti, C Nencini, FD Baccarini, M Fusari

    Santa Maria delle Croci Hospital, Ravenna, Italy

    Introduction: ARF is common in critically ill patients. We compared diaphragm contractile activity in medical and surgical patients admitted to ICU with a diagnosis of ARF.

    Methods: Adult medical and major abdominal laparotomic surgical patients admitted to a general ICU with a diagnosis of ARF were enrolled. ARF was defined as a PaO2/FiO2 ratio <=300 mmHg and need for mechanical ventilation (MV) for at least 24 hours. Diaphragmatic ultrasound was performed at the bedside when the patient was stable and able to perform a trial of spontaneous breathing. A convex probe was placed in the right midaxillary line (8th-10th intercostal space) to evaluate the right hemidiaphragm. Diaphragmatic respiratory excursion and thickening were evaluated in M-mode on 3 consecutive breaths, and the thickening fraction (TF) was calculated. Anthropometric, respiratory and hemodynamic parameters, SAPS II, SOFA score, duration of MV, need for and timing of tracheotomy, septic state and site of infection, superinfections, ICU and in-hospital length of stay (LOS), and outcome were recorded. Patients with trauma and neuromuscular disorders were excluded. P<0.05 was considered significant.

    Results: We enrolled 30 patients: 40% medical and 60% surgical, without differences in age, sex, BMI, SAPS II, SOFA score, sepsis or superinfections. Moderate ARF was prevalent in both groups. During the diaphragmatic examination, no differences were recorded in respiratory rate, hemodynamic state or fluid balance. Surgical patients showed a lower but not significantly different diaphragm excursion (1.6 vs 1.8 cm), whereas TF was significantly reduced (58 vs 90%, p<0.05). No differences emerged in duration of MV, but the tracheotomy rate was higher in medical patients (30 vs 11%, p<0.05). ICU and in-hospital LOS did not differ between medical and surgical patients, and mortality rates were 17% and 22%, respectively.

    Conclusions: In ARF, surgical patients showed lower diaphragm contractility compared to medical patients, possibly due to the combination of anesthetic and surgical effects, but with no influence on outcome.

    P211 Ultrasound and diaphragmatic dysfunction: impact of surgical approach

    M Luperto1, M Bouard2, S Mongodi1, F Mojoli1, B Bouhemad2

    1Fondazione IRCCS Policlinico San Matteo, University of Pavia, Pavia, Italy; 2Centre Hospitalier Régional Universitaire de Dijon Bourgogne, Dijon, France

    Introduction: Diaphragm ultrasound (DUS) easily identifies diaphragmatic dysfunction (DD) in ICU patients [1], a potential cause of weaning failure (WF).

    Methods: Prospective observational single-center study. We enrolled adult ICU patients on day 1 after surgery, extubated, with no neuromuscular disease. In each patient DUS (caudal displacement (CD) and thickening fraction (TF) [2]) was performed. WF was defined as need for NIV or reintubation within 48 h after extubation.

    Results: We enrolled 43 patients (25 males, age 70.9±10.2 years, BMI 27.1±5, MV length 5 h [4-6], ICU stay 1.5 days [1.0-2.0]). Surgery was performed by laparotomy (La, 28%), sternotomy (St, 53%) or right thoracotomy (Rt, 19%). No differences were noted in patients’ characteristics, MV length or ICU stay (Table 1). WF occurred more frequently in Rt patients than with the other approaches (p=0.0080); right CD and TF were significantly lower in Rt (p<0.05, Table 2). Rt was a risk factor for WF (OR 17.5, 95% CI 1.38-222.62, p=0.0024).

    Conclusions: Postsurgical right DD is significantly affected by surgical approach; Rt has the highest percentage of DD, probably explaining the highest rate of WF.


    1. Zambon et al. Annual Update in Intensive Care and Emergency Medicine 427-438, 2013

    2. Umbrello M et al. Respir Care 2016

    Table 1 (abstract P211). Population characteristics (#thoracotomy significant vs. sternotomy and laparotomy; MV mechanical ventilation, BMI body mass index)
    Table 2 (abstract P211). Ultrasound results (#thoracotomy significant vs. sternotomy and laparotomy)

    P212 Does it make difference to measure diaphragm thickness with M mode or B mode?

    B Kalın, G Gursel

    Gazi University School of Medicine, Ankara, Turkey

    Introduction: Diaphragm dysfunction occurs quickly in mechanically ventilated patients and is associated with prolonged mechanical ventilation and poor outcome. Recent literature suggests that the diaphragm thickening fraction (DTF) measured by ultrasound can be useful to predict weaning outcome. However, there is no standardized approach to the measurement of diaphragm thickness (DT), and limited data exist comparing different measurement techniques (M-mode, MM, or B-mode, BM). For example, movements of the diaphragm relative to the transducer during the respiratory cycle may result in unintended comparison of different points of the diaphragm at end-expiration and end-inspiration during MM measurements. The goal of this study was to compare MM with BM in the measurement of DT and DTF in ICU patients.

    Methods: DT was measured in the right hemidiaphragm during tidal breathing. Three measurements of DT were taken in both MM and BM and averaged to report the mean. DT was measured during inspiration and expiration, and DTF was calculated. Bias and agreement between the 2 measurement methods were evaluated with Bland-Altman analysis.
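    A minimal sketch of the two derived quantities, assuming the conventional DTF definition (end-inspiratory minus end-expiratory thickness, divided by end-expiratory thickness) and standard Bland-Altman bias with 95% limits of agreement; the example values are hypothetical.

```python
import numpy as np

def thickening_fraction(dt_insp_cm, dt_exp_cm):
    """DTF (%) = (end-inspiratory - end-expiratory thickness) / end-expiratory."""
    return 100.0 * (dt_insp_cm - dt_exp_cm) / dt_exp_cm

def bland_altman(method_a, method_b):
    """Bias and 95% limits of agreement between paired measurements."""
    a = np.asarray(method_a, dtype=float)
    b = np.asarray(method_b, dtype=float)
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)  # sample SD of the paired differences
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical: thickening_fraction(0.30, 0.22) -> ~36% DTF
```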

    Results: Forty-two patients were enrolled in the study. The diaphragm could not be visualized or measured in 8 patients, so measurements from 34 patients were analyzed. Despite varying degrees of diaphragm movement relative to the ultrasound transducer during respiration, there was no significant difference between MM and BM measurements, with good agreement and no significant proportional bias between the two modes (p>0.05). BM and MM measurements were 0.29±0.07 vs 0.30±0.07 cm during inspiration (p=0.159) and 0.22±0.06 vs 0.22±0.06 cm during expiration (p=0.9), and DTF was 29±15 vs 32±15% (p=0.217), respectively.

    Conclusions: Our results suggest that there is no significant difference between MM and BM measurements of diaphragm thickness.

    P213 Diaphragmatic dysfunction in critically ill patients and its impact on outcome. Assessment of diaphragmatic thickness and excursion by ultrasound

    Y Nassar, M Amin, M Ghonemy, H Abulwafa

    Cairo University, Giza, Egypt

    Introduction: The diaphragm is considered as the main respiratory muscle, and its dysfunction predisposes to many respiratory complications. Ultrasound (US) is now an accepted method of measuring Diaphragmatic Excursion (DE) and Diaphragmatic thickness (DT). We aimed to detect Diaphragmatic Dysfunction (DD) in critically ill patients and its impact on outcome.

    Methods: We prospectively and consecutively recruited critically ill adults requiring admission to the MICU with SOFA >= 2. Exclusion criteria were diaphragmatic or spine injury, neuromuscular disease, and any use during the hospital stay of paralytic agents, aminoglycosides, sedatives or analgesia other than morphine. The right hemidiaphragm was evaluated by M-mode US for DE and B-mode US for DT with the patients in the supine position. DT, DE and laboratory measurements were taken on admission and every 48 h, for a total of 3 readings (days 0, 2 and 4). Patients were followed up for length of ICU stay and 30-day mortality. DD was diagnosed if DT <=0.2 cm and DE <=1.0 cm.

    Results: The study included 106 subjects. Mean age was 51.4 ± 16.3 years; 49 (46.2%) were males and 57 (53.7%) were females. Mean SOFA was 5.33 ± 3.66 and mean APACHE II was 15.42 ± 7.91. The DD group included 38 (35.8%) subjects vs 68 (64.1%) in the non-DD (NDD) group (p<0.001). PaCO2 was higher in the DD group (48.2±4.1 vs 39.3±6.9 mmHg, p=0.01), as was the WBC count (15.18×10³ ± 7.2×10³ vs 11.78×10³ ± 2.9×10³ cells/ml, p=0.01). LOS was significantly longer in the DD group (9.70±0.4 vs 6.68±1.3 days, p=0.02). Comorbidities in the two groups are shown in Fig. 1. Mortality was significantly higher in the DD group (81.5% vs 23.5%, p<0.001).

    Conclusions: DD was present in nearly one third of ICU patients and was associated with a longer ICU stay and higher mortality. Higher PaCO2 and WBC counts were associated with a negative effect on diaphragmatic function.

    Fig. 1 (abstract P213).

    Comorbidities on admission in Diaphragmatic Dysfunction (DD) and Non Diaphragmatic Dysfunction (NDD)

    P214 Multimodal evaluation of diaphragm function after cardiac surgery

    R Pinciroli1, M Bottiroli2, D Ceriani2, S Checchi1, A Danieli3, A Calini2, MP Gagliardone2, G Bellani1, R Fumagalli1

    1University of Milan-Bicocca, Monza, Italy; 2ASST Grande Ospedale Metropolitano Niguarda, Milan, Italy; 3University of Milan, Milan, Italy

    Introduction: Diaphragmatic dysfunction may occur following open-heart surgery. Diaphragm ultrasound (US) provides a reliable evaluation of diaphragmatic motion. Surface electromyography (sEMG) is a novel non-invasive technique to assess its electrical activity [1]. The aim of this study was to evaluate, through both sEMG and US, postoperative changes in diaphragm function in patients undergoing cardiac surgery.

    Methods: US measurements of right and left hemidiaphragm excursion during Quiet Breathing (QB) and Deep Breathing (DB) were obtained before surgery, and postoperatively, at the first spontaneous breathing trial. We simultaneously recorded bilateral sEMG traces. Values of Diaphragmatic Excursion (DE) and sEMG amplitude were analyzed and compared. Unilateral diaphragmatic dysfunction (DD) was defined as an asymmetry index (right/left excursion at DB) of >1.5 (left DD) or <0.5 (right DD), as previously reported [2]. Diaphragmatic paralysis (DP) was defined as an excursion lower than 0.1 cm at DB, or evidence of paradoxical inspiratory motion.
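    The classification rule above can be sketched as a simple helper (hypothetical code; thresholds are those quoted in the abstract, and paradoxical inspiratory motion would be judged separately on the US trace):

```python
def classify_hemidiaphragm(right_de_db_cm, left_de_db_cm):
    """Classify unilateral DD/DP from deep-breathing (DB) excursions [cm].

    Asymmetry index = right/left excursion at DB; >1.5 -> left DD, <0.5 -> right DD.
    DP: excursion lower than 0.1 cm at DB (paradoxical motion not handled here).
    """
    if right_de_db_cm < 0.1:
        return "right DP"
    if left_de_db_cm < 0.1:
        return "left DP"
    asymmetry_index = right_de_db_cm / left_de_db_cm
    if asymmetry_index > 1.5:
        return "left DD"
    if asymmetry_index < 0.5:
        return "right DD"
    return "no unilateral DD"
```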

    Results: 18 adult patients undergoing elective open-heart surgery were enrolled. A significant overall reduction in DE was identified postoperatively, particularly at DB (5.4 [4.5-6.3] cm vs. 3.2 [2.8-3.6] cm, before vs. after surgery, p<0.0001). Likewise, a significant reduction in sEMG amplitude was measured during DB (18.1 [11.7-27.3] vs. 5.1 [3.2-7.4] μV, p<0.0001). DD with evidence of asymmetry was detected by US in 4/18 patients (22%) postoperatively (Fig. 1). Patients with DD showed a higher reintubation rate (2/4 vs. 0/14, DD vs. no DD) and a longer duration of mechanical ventilation, ICU stay and hospital length of stay.

    Conclusions: Compared to baseline, postoperative diaphragmatic function was globally reduced in our patients, as shown by both US and sEMG data. A subgroup of subjects showed a unilateral DD, with an apparent impact on clinical outcome, despite the small sample size.


    1. Lerolle N et al. Chest 2009

    2. Houston JG et al. Clin Radiol 1992

    Fig. 1 (abstract P214).

    Diaphragmatic excursion at deep breathing

    P215 Esophageal pressure and diaphragm

    A Menis, D Makris, V Tsolaki, E Zakynthinos

    University Hospital of Larisa, Larissa, Greece

    Introduction: Proper diaphragmatic function is of great importance for ICU patients under mechanical ventilation in order to facilitate the weaning process.

    Methods: We conducted a physiological study to correlate information derived from esophageal pressure measurements with measurements of diaphragmatic thickness in patients ventilated in pressure support mode, breathing through a T-piece, and during resistive breathing.

    Results: We studied 12 patients hospitalized in the Intensive Care Unit of the University Hospital of Larisa. During pressure support ventilation we found a positive correlation between lung compliance and diaphragmatic thickness during inspiration and expiration (r=0.842, p=0.001 and r=0.777, p=0.003, respectively), and also between diaphragmatic thickness and tidal volume (inspiration r=0.650, p=0.022; expiration r=0.680, p=0.015). Transdiaphragmatic pressure generated when the patients were on a T-piece correlated with diaphragmatic thickness during inspiration and expiration (r=0.550, p=0.001 and r=0.471, p=0.004, respectively), as did tidal volume (inspiration r=0.539, p<0.001; expiration r=0.465, p=0.004). Tidal volume also correlated with diaphragmatic displacement during inspiration (r=0.463, p=0.004). During resistive breathing, diaphragmatic thickness during both inspiration and expiration was positively correlated with the generated tidal volume (r=0.358, p=0.010 and r=0.454, p=0.001, respectively), as was diaphragmatic displacement during inspiration (r=0.533, p<0.0001).

    Conclusions: In this study we found that the output of diaphragmatic function, namely transdiaphragmatic pressure and the generated tidal volume, can be assessed by focusing on the changes in diaphragmatic thickness during the respiratory cycle.


    1. Supinski G et al. Chest, 2017

    P216 The relationship between transpulmonary pressure, end expiratory lung volume and intraabdominal pressure in patients with respiratory and intraabdominal pathologies

    D Diktanaite1, A Kalenka2, M Fiedler1, E Simeliunas1

    1University of Heidelberg, Heidelberg, Germany; 2Hospital Bergstrasse, Heppenheim, Germany

    Introduction: Measuring transpulmonary pressure (PL), end expiratory lung volume (EELV) and intraabdominal pressure (IAP) is a new approach for setting parameters in mechanical ventilation. We analyzed the relationship between these parameters in two challenging patient groups: group 1 with ARDS or severe pneumonia and group 2 with intraabdominal hypertension or BMI>45.

    Methods: Intensive care patients at the Hospital Bergstrasse (Heppenheim, Germany) requiring mechanical ventilation >72 h were included in the study. Esophageal pressure (Pes), EELV and IAP were measured daily at 3 PEEP levels: 15, 10 and 5 cmH2O. Pes was measured via a nasogastric catheter with an esophageal balloon, EELV by the nitrogen washout method, and IAP as bladder pressure through a bladder catheter. Patients were in the supine position and under full muscle relaxation, with a tidal volume of 6 ml/kg. PL and chest wall elastance (Ecw) were calculated using the elastance-derived method [1].
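    The elastance-derived calculation referenced above [1] can be sketched as follows; the function name and the numeric values are illustrative, not data from the study:

```python
def elastance_derived_pl(d_paw, d_pes, vt, paw):
    """Elastance-derived transpulmonary pressure (method of reference [1]).

    d_paw, d_pes: tidal swings of airway and esophageal pressure (cmH2O)
    vt: tidal volume (L); paw: airway pressure at the point of interest (cmH2O)
    """
    ers = d_paw / vt     # respiratory system elastance (cmH2O/L)
    ecw = d_pes / vt     # chest wall elastance (cmH2O/L)
    el = ers - ecw       # lung elastance
    pl = paw * el / ers  # transpulmonary pressure at that airway pressure
    return ers, ecw, el, pl

# Illustrative numbers only: VT 0.4 L, swings 8 and 3 cmH2O, Paw 20 cmH2O
ers, ecw, el, pl = elastance_derived_pl(8.0, 3.0, 0.4, 20.0)
```

    With these example inputs the partitioning yields Ers 20, Ecw 7.5 and EL 12.5 cmH2O/L, so PL is the fraction EL/Ers of the airway pressure.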

    Results: Eighteen patients were included in the study (N1=9; N2=9). Across 53 measurements (N1=25; N2=28) there was no difference between groups regarding EELV (p>0.05). At PEEP of 5, patients in group 2 had higher Pinsp, Pes and IAP (Pinsp 22±4 vs 18±2, p<0.01; Pesmin 11.8±4 vs 9.9±3.5, p=0.04; IAP 11.1±3 vs 8.7±3, p<0.01) (Table 1). There was a strong positive correlation between EELV and Cstat at PEEP 5 (R=0.7, p=0.02), as well as between Cstat and end-expiratory PL (PLexp) (R=0.6, p<0.01) (Fig. 1). In group 2 we saw a strong correlation between IAP and Ecw (R=0.6, p<0.01) and a mild correlation between IAP and Cstat (R=0.4, p=0.02) (Fig. 2).

    Conclusions: Higher end-expiratory transpulmonary pressure was associated with better Cstat. In patients with intraabdominal pathology, chest wall elastance strongly depends on intraabdominal pressure.


    1. Chiumello D et al. Am J Respir Crit Care Med 178:346-55, 2008

    Table 1 (abstract P216). Differences between groups, showed in mean values ± standard deviation
    Fig. 1 (abstract P216).

    A strong correlation between end expiratory transpulmonary pressure (PLendexp) and static compliance (Cstat) at PEEP 5

    Fig. 2 (abstract P216).

    A strong correlation between IAP and Ecw and a mild correlation between IAP and Cstat at PEEP level of 5 in patients with intraabdominal pathologies

    P217 Occlusion test for esophageal manometry: large artifacts caused by fast compression

    C Lima1, C Morais1, T Yoshida2, S Gomes1, G Plens1, O Ramos1, R Roldan3, S Pereira1, M Tucci1, E Costa1, M Amato1

    1Heart Institute (InCor), Hospital das Clínicas, Faculdade de Medicina da Universidade de São Paulo, São Paulo, Brazil; 2University of Toronto, Toronto, Canada; 3Hospital Rebagliati, Lima, Peru

    Introduction: Pleural pressure (Ppl) is estimated by measuring esophageal pressure (Pes) with an esophageal balloon. To validate the correct position and inflation volume of the balloon, a positive or negative pressure occlusion test is used, considered valid when the ratio of the oscillations in Pes (Δ Pes) to those in airway pressure (Δ Paw) is close to unity. We investigated the impact of two different compression/decompression rates during chest compression.

    Methods: We studied 8 pigs with injured lungs, paralyzed and supine. An esophageal balloon was placed at the lower portion of esophagus and a Ppl sensor (wafer-type, flat balloon) was placed at most dependent lung region. Both devices were inflated with minimal (non-stressed) volume derived from in-vitro and in-vivo PV curves. Two rates of thoracic compression (slow and fast) were tested and recorded at different PEEP levels. Plots of Pes/Paw and Ppl/Paw were fitted into a hysteresis and linear-regression model.

    Results: 70 repeated measurements were performed in 8 animals (PCV mode, PEEP = 12±3 cmH2O and VT = 6 ml/kg). The compression time was 0.49 ± 0.10 s for the slow maneuver vs. 0.19 ± 0.04 s for the fast one. Plots of Ppl/Paw showed small hysteresis, with phase angles of 1 vs. 3 grad for slow vs. fast maneuvers. In contrast, plots of Pes/Paw showed greater hysteresis, with phase angles of 3 vs. 32 grad (slow vs. fast) (Fig. 1). The slope of the regression line for Pes/Paw plots was consistently higher for slow compressions (0.98 ± 0.08) than for fast ones (0.84 ± 0.05). Good agreement between Δ Pes and Δ Paw (Fig. 2) was found during slow maneuvers, but not during fast ones.
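    The regression slope reported for the Pes/Paw plots is an ordinary least-squares slope; a minimal sketch, using hypothetical occlusion-test samples rather than study data, is:

```python
def slope(xs, ys):
    """Least-squares slope of y on x (here, Pes regressed on Paw)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    return cov / var

# Hypothetical samples during an occlusion test: a slope near 1.0
# indicates that Delta Pes tracks Delta Paw, as in the slow maneuvers.
paw = [0.0, 2.0, 4.0, 6.0, 8.0]
pes = [0.1, 2.0, 3.9, 6.1, 7.9]
s = slope(paw, pes)
```

    A fast maneuver, per the Results, would yield a visibly lower slope (around 0.84) because Δ Pes is underestimated.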

    Conclusions: Slow chest compressions must be used when checking position/inflation of esophageal balloon. The fast maneuver produces hysteresis and underestimation of Δ Pes (but not in direct Δ Ppl). Pes monitoring at high respiratory rates may be problematic.


    Fig. 1 (abstract P217).

    Representative example of a slow chest compression (compression time 0.61 s) versus a fast chest compression (0.20 s). a) Hysteresis between Paw and Pes. b) Hysteresis between Paw and the direct Ppl measurement.

    Fig. 2 (abstract P217).

    Bland-Altman plots showing the agreement between ∆Pes and ∆Paw in two different strategies: a) slow chest compression and b) fast chest compression

    P218 Electrical impedance tomography guided ventilation

    R Knafelj, M Noc, V Gorjup

    University Medical Center Ljubljana, Ljubljana, Slovenia

    Introduction: In a protective ventilation strategy, a tidal volume of 6 mL/kg IBW, PEEP set to prevent tidal derecruitment, driving pressure <15 cmH2O, plateau pressure <28 cmH2O and FiO2 <0.6 are used. While PEEP can be set individually using techniques such as the quasi-static pressure-volume (PV) curve, expiratory transpulmonary pressure (PtPEEP) and electrical impedance tomography (EIT), real-time visual bedside detection of overdistension is possible only with EIT. We compared different strategies for PEEP optimization during a protective ventilation approach in comatose patients admitted after cardiac arrest with no previously known pulmonary disease, screened for inspiratory overdistension.

    Methods: 20 consecutive comatose post-cardiac arrest patients were ventilated with volume assist ventilation (6 mL/kg IBW, PEEP 5 cmH2O) using an Elisa 800EIT ventilator (Lowenstein Medical, Germany). An orogastric tube (NutriVent, Sidam, Italy) was inserted and an EIT vest (Swisstom AG, Switzerland) was applied in all patients. Measurements were performed 60 min after admission and after 3 hrs (Fig. 1). Optimal PEEP was defined as the lower inflection point of the PV curve (PV), positive PtPEEP (Ptp), and optimal Regional Stretch/Silent Spaces (EIT).

    Results: The methods of determining PEEP using PV, Ptp and EIT were comparable in non-obese patients (p=NS). Measurements after 180 min were consistent with the first measurements. The initially set PEEP was too low in 7 patients and too high in 5 patients. When the highest PEEP (of Ptp, EIT, PV) was selected, anterior hyperinflation was present in 3 patients. In 2 obese patients (BMI 34 and 36) PV suggested a lower PEEP, while EIT and Ptp suggested a higher PEEP (6 vs 12 and 14, respectively; p=0.02).

    Conclusions: Using a protective ventilation strategy in comatose post-cardiac arrest patients with no previously known lung disease does not prevent anterior hyperinflation in some non-obese patients. PV, EIT and Ptp are comparable methods for determining PEEP in non-obese patients, while in obese patients EIT was superior.


    Crit Care 2017;21:183

    Fig. 1 (abstract P218).

    A Ptp, B Flow, C Vt tracing, E, F Silent Spaces

    P219 Validity of esophageal pressure measurement by a commercially available ventilator: a comparison with an in vivo calibration method

    M Kyogoku1, S Mizuguchi2, T Miyasho3, Y Endo3, Y Inata4, T Ikeyama1, K Tachibana4, K Yamashita3, M Takeuchi4

    1Aichi Children’s Health and Medical Center, Obu-shi, Aichi, Japan; 2Kyushu University, Fukuoka-city, Fukuoka, Japan; 3Rakuno Gakuen University, Ebetsu, Hokkaido, Japan; 4Osaka Women’s and Children’s Hospital, Izumi, Osaka, Japan

    Introduction: Assessment of pleural pressure by measuring esophageal pressure (Pes) is useful to guide optimal ventilator settings in patients with respiratory failure. However, the measurement of Pes is affected by several factors, including the filling volume of the esophageal balloon and esophageal wall elastance. Recently, Mojoli et al. reported an in vivo calibration method (M method) that makes reliable the measurement of both absolute values of Pes (abs Pes) and tidal swings of Pes (Δ Pes). In contrast, the validity of Pes measurement by a commercially available ventilator has not been investigated. We therefore evaluated the accuracy of abs Pes and Δ Pes measured by the AVEA ventilator (CareFusion, Yorba Linda, USA), which regulates the filling volume of the balloon automatically and is the only approved equipment for measuring Pes in Japan, by comparison with the M method.

    Methods: Four anesthetized and paralyzed pigs (40.9-42.8 kg) were mechanically ventilated and subjected to lung injury by saline lung lavage. In each pig, two different extrathoracic pressures were applied using a thoraco-abdominal belt. Abs end-expiratory Pes and Δ Pes were measured by the AVEA ventilator and by the M method, and the obtained values were compared for correlation and agreement.

    Results: Comparison of the two methods showed that correlation coefficients of abs end-expiratory Pes and Δ Pes were 0.85 (P = 0.007) and 0.88 (P = 0.004), respectively. By Bland-Altman analysis, the bias and precision of abs end-expiratory Pes were -2.2 and 2.3, and those of Δ Pes were 0.1 and 3.4, respectively (Figs. 1 and 2).
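    The bias and precision reported here are the mean and standard deviation of the paired differences in a Bland-Altman analysis; a minimal sketch, with hypothetical paired Pes readings rather than study data, is:

```python
import statistics

def bland_altman(a, b):
    """Bias (mean difference) and precision (SD of differences) between
    two measurement methods, plus the 95% limits of agreement."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = statistics.mean(diffs)
    precision = statistics.stdev(diffs)  # sample standard deviation
    limits = (bias - 1.96 * precision, bias + 1.96 * precision)
    return bias, precision, limits

# Hypothetical paired abs end-expiratory Pes values (cmH2O)
avea = [10.0, 12.0, 9.0, 11.0]
m_method = [11.5, 13.0, 11.0, 12.5]
bias, precision, limits = bland_altman(avea, m_method)
```

    A negative bias, as in this example and in the abstract's -2.2 cmH2O, means the first method reads lower than the second on average.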

    Conclusions: Abs end-expiratory Pes tended to be higher with the AVEA ventilator than with the M method. Δ Pes values from the two methods were well correlated. In our animal model, the accuracy of Pes measured by the AVEA ventilator was clinically acceptable when compared to the M method.

    Fig. 1 (abstract P219).

    Agreement between the two methods of measuring absolute end-expiratory Pes by Bland-Altman analysis

    Fig. 2 (abstract P219).

    Agreement between the two methods of measuring tidal swings of Pes by Bland-Altman analysis

    P220 Comparison of esophageal pressure and CVP-derived pleural pressure in spontaneously breathing children

    N Okuda1, M Takeuchi2, M Kyogoku3, K Moon2, Y Inata2, T Hatachi2, Y Shimizu2

    1Tokushima University Hospital, Tokushima, Japan; 2Osaka Women’s and Children’s Hospital, Osaka, Japan; 3Aichi Children’s Health and Medical Center, Aichi, Japan

    Introduction: Maintaining inspiratory pleural pressure (Ppl) in an optimal range is important to prevent lung injury, respiratory muscle fatigue and ventilator-induced diaphragmatic dysfunction. While esophageal pressure (Pes) has been used as a gold standard surrogate for Ppl, the measurement of Pes has several challenges including correct positioning of an esophageal catheter. We hypothesized that Ppl in spontaneously breathing children during mechanical ventilation can be estimated by using the change in central venous pressure (Δ CVP).

    Methods: Spontaneously breathing children under mechanical ventilation for acute respiratory failure (PaO2/FIO2 <300), who had a central venous catheter and an esophageal catheter in place for clinical purposes, were enrolled. Correct positioning of the esophageal balloon catheter was ensured by an occlusion test (OT), in which the ratio of the changes in Pes and airway pressure (Δ Pes and Δ Paw, respectively) was confirmed to be close to unity. First, we obtained the ratio (k) of Δ Paw (≈Δ Ppl) to Δ CVP during the OT. Second, assuming k was the same as the ratio of Δ Ppl to Δ CVP during spontaneous breathing, Δ Ppl during pressure support of 10, 5, and 0 cmH2O was calculated from Δ CVP as follows: CVP-derived Δ Ppl = k * Δ CVP. Finally, CVP-derived Δ Ppl was compared to Δ Pes at each ventilator setting.
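    The two-step calibration described above can be sketched as follows (the function name and the pressure values are illustrative, not study data):

```python
def cvp_derived_dppl(d_paw_ot, d_cvp_ot, d_cvp_sb):
    """Estimate the pleural pressure swing from CVP, per the abstract's method.

    k is calibrated during an occlusion test (OT), where Delta Paw
    approximates Delta Ppl; it is then applied to the Delta CVP measured
    during spontaneous breathing (sb).
    """
    k = d_paw_ot / d_cvp_ot  # calibration ratio from the occlusion test
    return k * d_cvp_sb      # CVP-derived Delta Ppl

# Illustrative values (cmH2O): k = 8/5 = 1.6, so a 4 cmH2O CVP swing
# during spontaneous breathing maps to an estimated 6.4 cmH2O Ppl swing.
dppl = cvp_derived_dppl(d_paw_ot=8.0, d_cvp_ot=5.0, d_cvp_sb=4.0)
```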

    Results: Eight patients (median age 4.8±3.3 months; median body weight 4.7±1.3 kg) were included in the analysis. CVP-derived Δ Ppl and Δ Pes were significantly correlated (y=0.59x+3.6, R2=0.56, p<0.01). Bland-Altman analysis showed that the bias and precision of the two methods were -0.53 and 3.5 cmH2O, respectively (Fig. 1).

    Conclusions: In spontaneously breathing children during mechanical ventilation, CVP-derived Δ Ppl correlated well with Δ Pes. It could be used as a guide to estimate pleural pressure without using an esophageal balloon catheter.

    Fig. 1 (abstract P220).

    Bland-Altman analysis of the agreement between CVP-derived ΔPpl and ΔPes

    P221 Driving pressure during assisted ventilation: an observational study

    A Proklou1, E Akoumianaki1, E Pediaditis1, C Psarologakis1, E Koutsiana2, I Chouvarda2, E Kondili1, D Georgopoulos1, K Vaporidi1

    1University Hospital of Heraklion, Heraklion, Greece; 2Medical School, Aristotle University of Thessaloniki, Thessaloniki, Greece

    Introduction: The driving pressure of the respiratory system (DP) reflects the extent of lung stretch during tidal breathing and has been associated with mortality in ARDS patients during controlled mechanical ventilation [1]. The aim of this study was to examine DP during assisted ventilation, and to determine if and when high DP occurs in patients on assisted ventilation with PAV+.

    Methods: Critically ill patients hospitalized in the ICU of the University Hospital of Heraklion and on mechanical ventilation in PAV+ mode were studied. Continuous recordings of all ventilator parameters were obtained for up to three days using dedicated software. DP was calculated from the PAV+-computed compliance (C) [2] and the measured exhaled tidal volume (VT) as DP=VT/C. Periods of sustained DP above 15 cmH2O were identified, and ventilation and clinical variables were evaluated.
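    The DP calculation is a one-line ratio; a sketch with illustrative numbers shows how, for the same tidal volume, a lower compliance pushes DP above the 15 cmH2O threshold used in the study:

```python
def driving_pressure(vt_ml, compliance_ml_per_cmh2o):
    """Driving pressure (cmH2O) from exhaled tidal volume and compliance,
    as DP = VT / C."""
    return vt_ml / compliance_ml_per_cmh2o

# Illustrative values only: identical VT of 480 ml at two compliances
dp_low_c = driving_pressure(480, 30)   # low compliance -> 16 cmH2O, above 15
dp_high_c = driving_pressure(480, 50)  # higher compliance -> 9.6 cmH2O
```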

    Results: Sixty-two patients and 3200 hrs of ventilation were analyzed. In half of the patients, DP was lower than 12 cmH2O for 99% of the recording period, while high DP (>15 cmH2O) for more than 10% of the total time was observed in 10% of patients. ICU non-survivors had more time with high DP than survivors (p=0.04). Periods of sustained high DP (>15 cmH2O for >1 h) were observed in 9 patients. Level of assist, minute ventilation, and respiratory rate did not differ between the periods of high DP and the complete recordings, while VT was higher and C was lower during the high-DP periods. The median compliance was below 30 ml/cmH2O during the high-DP periods and above 50 ml/cmH2O during the complete recordings.

    Conclusions: High DP is not common, but does occur during assisted ventilation, predominantly when compliance is below 30 ml/cmH2O, and may be associated with adverse outcome.


    [1] Amato MB et al. N Engl J Med. 372(8):747-55, 2015

    [2] Younes M et al. Am J Respir Crit Care Med. 164(1):50-60, 2001

    P222 Comparison of respiratory volume monitoring vs. capnography during intravenous propofol-based anesthesia

    JE Freeman1, S Pentakota2, E Blaney2, B Kodali2

    1Respiratory Motion, inc., Waltham, MA, USA; 2Brigham and Women’s Hospital, Boston, MA, USA

    Introduction: Capnography (EtCO2) is the current standard of care for monitoring ventilation in patients under general anesthesia. However, in non-intubated patients EtCO2 monitoring is challenging, and clinicians therefore often rely on pulse oximetry, a late indicator of respiratory depression, or on subjective assessment. A noninvasive respiratory volume monitor (RVM) provides accurate and continuous monitoring of minute ventilation (MV), tidal volume (TV) and respiratory rate (RR). Here, we compared RVM and EtCO2 monitoring in patients receiving propofol-based sedation.

    Methods: In an observational study approved by the Partners IRB, simultaneous data were recorded by an RVM (ExSpiron, Respiratory Motion, Inc.) and capnography (Capnostream 20, SmartCapnoLine, Covidien) during colonoscopy procedures. Baseline MV, TV, and EtCO2 were established prior to sedation. Periods of High EtCO2 (>50 mmHg or >130% of baseline), Low EtCO2 (<20 mmHg or <70% of baseline), and Low MV (<40% of baseline) were identified. The number of High EtCO2 and Low EtCO2 events preceded by Low MV events was quantified.
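    The event definitions above reduce to simple threshold checks; a sketch (the helper name and the sample values are illustrative) is:

```python
def classify(etco2, mv, etco2_base, mv_base):
    """Flag monitoring events using the thresholds stated in the Methods:
    High EtCO2 (>50 mmHg or >130% baseline), Low EtCO2 (<20 mmHg or
    <70% baseline), Low MV (<40% baseline)."""
    events = []
    if etco2 > 50 or etco2 > 1.30 * etco2_base:
        events.append("high_etco2")
    if etco2 < 20 or etco2 < 0.70 * etco2_base:
        events.append("low_etco2")
    if mv < 0.40 * mv_base:
        events.append("low_mv")
    return events

# Illustrative reading: MV has fallen to 35% of a 10 L/min baseline
# while EtCO2 (38 mmHg vs 36 baseline) is still unremarkable.
events = classify(etco2=38.0, mv=3.5, etco2_base=36.0, mv_base=10.0)
```

    This ordering, a Low MV flag minutes before any EtCO2 alarm, is exactly the lead time the study quantifies.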

    Results: 50 patients (22 males; age 52±16 years; BMI 26.1±6.3 kg/m2) were monitored for 33.2±14.7 min. Table 1 summarizes the percentage of monitored time with reported data for the two devices. Figure 1 depicts an MV decrease following propofol and cannula dislodgement following a jaw thrust. Table 2 presents the number of EtCO2 events across all patients. Low MV events preceded all 9 High EtCO2 events by 7.7 min, and 43 of the 47 Low EtCO2 events by 2.9 min. The 4 Low EtCO2 events that did not correspond to a change in MV were potentially false alarms.

    Conclusions: In non-intubated patients, the RVM identified all High EtCO2 events and detected the decrease in MV 7.7 min before EtCO2. The RVM provides reliable measurements when capnography data are unavailable and is not subject to the EtCO2 limitations of nasal cannula placement, dilution with O2, and mouth breathing.

    Table 1 (abstract P222). Percent of monitored time with data reported
    Table 2 (abstract P222). Low Minute Ventilation Events and EtCO2 Alarms
    Fig. 1 (abstract P222).

    MV (top) decreases in response to propofol (purple), resulting in an LMVe at 13:45. EtCO2 (middle) stays relatively constant at the start of the LMVe without triggering either a high or low EtCO2 alarm. A jaw thrust (13:46) results in a dislodged nasal cannula. MV increased slightly but remained low (40% of baseline MV). EtCO2 increased to >50 mmHg and remained high

    P223 Evaluation of the negative arterial to end-tidal co2 pressure gradient as a predictor of survival in acute respiratory failure

    C Ko1, M Yeh1, Y Chen1, C Chen2

    1Chang Gung Memorial Hospital, Yulin, Taiwan; 2National Chung Hsing University, Taichung, Taiwan

    Introduction: Measurement of CO2 concentrations in expired air directly indicates changes in the elimination of CO2 from the lungs. In a healthy person, end-tidal CO2 (the PCO2 eliminated from the alveoli) is lower than PaCO2 (arterial PCO2) by 2-5 mmHg. The arterial to end-tidal CO2 pressure gradient [(a-ET)PCO2] depends on the amount of alveolar dead space. Negative (a-ET)PCO2 values have been observed with increased cardiac output and increased CO2 production, reduced FRC and low compliance. Thus, (a-ET)PCO2 monitoring can be used as a simple index of pulmonary blood flow. The objective of this study was to investigate the association between (a-ET)PCO2 and mortality in patients with acute respiratory failure.
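    The gradient itself is a simple difference; a sketch with illustrative values (not patient data) shows the sign convention used in the study:

```python
def a_et_gradient(paco2, etco2):
    """Arterial to end-tidal CO2 pressure gradient, (a-ET)PCO2 (mmHg).
    Normally +2 to +5 mmHg; negative when EtCO2 exceeds PaCO2."""
    return paco2 - etco2

# Illustrative reading: EtCO2 41 mmHg with PaCO2 38 mmHg gives a
# negative gradient of -3 mmHg, the pattern examined in this study.
g = a_et_gradient(paco2=38.0, etco2=41.0)
```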

    Methods: We retrospectively studied 197 intubated patients undergoing mechanical ventilation for acute respiratory failure between 2014 and 2015 at Chang Gung Memorial Hospital. The PaCO2 to end-tidal CO2 pressure gradient, together with clinical data and outcomes, was measured after admission.

    Results: The forty patients with negative (a-ET)PCO2 values had lower mortality, a lower FiO2 setting, a lower respiratory rate and higher blood pressure (Table 1). Odds ratios (95% confidence intervals) associated with age <65 yr, male gender, PaO2/FiO2 <300 and minute volume >8 L/min were 0.078 (0.010-0.624), 0.148 (0.042-0.521), 0.089 (0.011-0.701) and 0.229 (0.075-0.698), respectively (Table 2). Negative (a-ET)PCO2 was strongly associated with good outcome and was significantly associated with overall survival (Fig. 1).

    Conclusions: The negative arterial to end-tidal CO2 pressure gradient may predict patient survival in some subgroups.

    Table 1 (abstract P223). Comparison of patients with negative (a-ET)PCO2 values and patients with positive (a-ET)PCO2 values
    Table 2 (abstract P223). Associations between negative (a-ET)PCO2 values and mortality in subgroups
    Fig. 1 (abstract P223).

    Kaplan-Meier survival curves of patients

    P224 Predictive indexes in the extubation success or failure: role of the expiratory peak flow

    IR Porto, MM Silva, RS Zaponi, JL Jaskowiak, LR Abentroth, BA Kanezawa, ER Penteado, PS Quadros, TC Schnaufer, EF Osaku, CR Costa, SM Ogasawara, MA Leite, AC Jorge, P Duarte

    Hospital Universitario do Oeste do Parana, Cascavel, Brazil

    Introduction: Cough efficiency may be an important factor in weaning, and expiratory Peak Flow (ePF) could be a marker of risk for extubation failure. Objective: to verify the relationship between predictive indexes and extubation failure in mechanically ventilated patients, analyzing specifically the role of the ePF.

    Methods: Retrospective study conducted from January to December 2016 in a 14-bed general Intensive Care Unit (ICU) of a University Hospital. Predictive indexes, such as Maximal Inspiratory Pressure (MaxIP), Rapid Shallow Breathing Index (RSBI) and ePF, were collected on the day of extubation in adult patients with MV >24 h (tracheostomized patients excluded). For statistical analysis, data were described by mean, standard deviation and percentage. Pearson's Chi-square test was used for correlation, adopting p<0.05.
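    Of the predictive indexes named above, the RSBI is conventionally computed as respiratory rate divided by tidal volume in litres (the Yang-Tobin index); a sketch with illustrative values, not cohort data, is:

```python
def rsbi(rr_breaths_per_min, vt_litres):
    """Rapid Shallow Breathing Index, RR / VT (breaths/min/L).
    Higher values indicate rapid, shallow breathing and suggest a
    higher risk of weaning failure."""
    return rr_breaths_per_min / vt_litres

# Illustrative values: 22 breaths/min at 0.40 L tidal volume
value = rsbi(22, 0.40)  # close to the cohort mean RSBI of 58.7
```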

    Results: 154 patients were included. The most common causes of admission were medical non-neurological (33%), elective postoperative (20%), and Traumatic Brain Injury (TBI) (14%); mean APACHE II was 25.6, first-day SOFA 9.6, mean age 48.2 y, and 58% were female. The durations of MV and sedation were 123.2 and 57.3 hours, respectively. ICU and hospital lengths of stay were 10.7 and 25.1 days. Mean MaxIP was 28.0 ± 14.22 mmHg, RSBI 58.7 ± 35.76, and ePF 57.6 ± 33.86 L/min. Of the patients included in the study, 99% were extubated and, of these, 12% had extubation failure. There was a correlation between MaxIP and ePF (p<0.001), and extubation failure was associated with MaxIP (p<0.001) and ePF (p<0.001).

    Conclusions: MaxIP and ePF (collected on the day of extubation) were correlated with extubation failure, suggesting that they are reliable indices for predicting extubation failure in adult MV patients.

    P225 Expression of proinflammatory and fibrogenetic biomarkers in ards: a case-series of 4 patients

    M Cozzolino1, M Del Re2, E Rofi2, G Rombolà1, A Franci1, M Bonizzoli1, R Danesi2, A Peris1

    1A.O.U. Careggi, Florence, Italy; 2University Hospital, Pisa, Italy

    Introduction: ARDS may result from various diseases and is characterized by diffuse alveolar injury, lung edema formation, neutrophil-derived inflammation and surfactant dysfunction. Various biomarkers have been studied for the diagnosis and prognostication of ARDS. The purpose of this study was to measure the expression of proinflammatory mediators (IL-8 and TNF), a cellular receptor with a role in innate immunity (TLR-2), and a biomarker of fibrogenesis (MMP-7) in different phases of ARDS.

    Methods: We studied 4 patients admitted to our ICU with a diagnosis of ARDS during January 2016. Six ml of blood were prospectively collected at two time points: during the acute phase and in a sub-acute phase before ICU discharge. Blood samples were centrifuged to obtain platelet-rich plasma, and plasmatic RNA (cRNA) was isolated from platelets. IL-8, TNF, TLR-2 and MMP-7 expression in cRNA was determined by Droplet Digital™ PCR as copies/ml.

    Results: All patients showed a decrease in IL-8, TNF, TLR-2 and MMP-7 levels after the acute phase of ARDS (Fig. 1). Patients 1 and 3 were affected by influenza A virus (H3N2), patient 2 was admitted for pneumococcal pneumonia, and patient 4 was affected by Legionella. Adequate etiologic treatment was promptly started in the patients with bacterial infection. Mean duration of mechanical ventilation was 17.5 days. All patients survived their ICU stay and were discharged from hospital.

    Conclusions: IL-8, TNF, TLR-2 and MMP-7 expression, detected in RNA extracted from platelets, may be a novel tool for clinicians, indicating persistent inflammation with resulting progressive alveolar fibrosis and impaired lung function. More data are necessary to understand the real clinical significance of these biomarkers and their role in fibroproliferation and the progression of ARDS.

    Fig. 1 (abstract P225).

    Biomarkers expression (copies/ml)

    P226 Genetic modification of mesenchymal stem cells to overexpress the angiotensin II type 2 receptor increases lung engraftment in LPS-induced acute lung injury mice

    X Xu1, LL Huang1, LL Huang1, SL Hu1, JB Han1, HL He2, JY Xu1, JF Xie1, AR Liu1, SQ Liu1, L Liu1, YZ Huang1, FM Guo1, Y Yang1, HB Qiu1

    1Nanjing Zhongda Hospital, School of Medicine, Southeast University, Nanjing, China; 2Affiliated Hospital of University of Electronic Science and Technology of China & Sichuan Provincial People’s Hospital, Chengdu, China

    Introduction: Although mesenchymal stem cell (MSC) transplantation has been shown to promote lung repair in acute lung injury (ALI) in vivo, its overall restorative capacity appears to be restricted, mainly because of low engraftment in the injured lung. Ang II is upregulated in the injured lung. Our previous study showed that Ang II increased MSC migration in an angiotensin II type 2 receptor (AT2R)-dependent manner [1]. The objective of our study was to determine whether overexpression of AT2R in MSCs augments their migration and engraftment after systemic injection in ALI mice.

    Methods: A human AT2R-expressing lentiviral vector was constructed and introduced into human bone marrow MSCs. We also down-regulated AT2R mRNA expression using a lentiviral vector carrying AT2R shRNA to transduce MSCs. The effect of AT2R regulation on MSC migration was examined in vitro. A mouse model of lipopolysaccharide (LPS)-induced ALI was used to investigate the engraftment of AT2R-regulated MSCs and their therapeutic potential in vivo.

    Results: Overexpression of AT2R dramatically increased Ang II-enhanced human bone marrow MSC migration in vitro. Moreover, MSC-AT2R accumulated in the damaged lung tissue at significantly higher levels than control MSCs 24 h and 72 h after systemic MSC transplantation in ALI mice. Furthermore, MSC-AT2R-injected ALI mice exhibited a significant reduction of pulmonary vascular permeability, improved lung histopathology, and additional anti-inflammatory effects. In contrast, there was less lung engraftment in MSC-ShAT2R-injected ALI mice than in MSC-Shcontrol-injected mice after transplantation. Accordingly, the MSC-ShAT2R-injected group exhibited a significant increase in pulmonary vascular permeability and worse lung inflammation.

    Conclusions: Our results demonstrate that overexpression of AT2R enhances the migration and lung engraftment of MSCs in ALI mice and may provide a new therapeutic strategy for the injured lung.


    1. Xu XP et al. Stem Cell Res Ther 8:164, 2017.

    P227 Hepatocyte growth factor protects against lipopolysaccharide-induced endothelial barrier dysfunction via the Akt/mTOR/STAT-3 pathway

    S Meng, F Guo, X Zhang, W Chang, H Qiu, Y Yang

    Department of Critical Care Medicine, Zhongda Hospital, School of Medicine, Southeast University, Nanjing, China

    Introduction: Reorganization of the endothelial barrier complex is critical for the increased endothelial permeability implicated in the pathogenesis of acute respiratory distress syndrome. We have previously shown that hepatocyte growth factor (HGF) reduces lipopolysaccharide (LPS)-induced endothelial barrier dysfunction. However, the mechanism by which HGF regulates the endothelial barrier remains unclear.

    Methods: Recombinant murine HGF, with or without the mTOR inhibitor rapamycin, was applied to mouse pulmonary microvascular endothelial cells (PMVECs) with barrier dysfunction induced by LPS. Endothelial permeability, the junctional protein occludin, endothelial injury factors (endothelin-1 and von Willebrand factor), cell proliferation and mTOR signaling-associated proteins were then assessed.

    Results: Our study demonstrated that HGF decreased LPS-induced endothelial permeability and endothelial cell injury factors, and attenuated occludin expression, cell proliferation and mTOR pathway activation.

    Conclusions: Our findings indicate that activation of the Akt/mTOR/STAT-3 pathway provides novel mechanistic insight into the protective regulation by HGF of LPS-induced endothelial permeability dysfunction.

    Fig. 1 (abstract P227).

    Fluorescein isothiocyanate-Dextran or fluorescein isothiocyanate-BSA analysis of the effect of HGF on PMVECs permeability

    Fig. 2 (abstract P227).

    Western blot analysis of HGF on mTOR signaling pathway

    P228 The circadian clock protein BMAL1 regulates the severity of ventilator-induced lung injury in mice

    M Felten, LG Teixeira-Alves, E Letsiou, HC Müller-Redetzky, N Suttorp, A Kramer, B Maier, M Witzenrath

    University Medicine Charité, Berlin, Germany

    Introduction: Mechanical ventilation (MV) is a life-saving intervention for critically ill patients, but may also exacerbate pre-existing lung injury, a process termed ventilator-induced lung injury (VILI). Interestingly, we discovered that the severity of VILI is modulated by the circadian rhythm (CR). In this study, we are exploring the role of the myeloid BMAL1, a core clock component, in VILI.

    Methods: We employed mice lacking Bmal1 in myeloid cells (LyzMCre-Bmal1-/-) and LyzMCre mice as controls. At circadian time (CT) 0 or CT12, mice were subjected to high tidal volume MV to induce VILI. Lung compliance, pulmonary permeability, neutrophil recruitment, and markers of pulmonary inflammation were analyzed to quantify VILI. To assess neutrophil inflammatory responses in vitro, myeloid cells from the bone marrow of WT and Bmal1-deficient animals were isolated at dawn (Zeitgeber time 0, ZT0) and dusk (ZT12), incubated with DCFH-DA and stimulated for 15 min with PMA or PBS. Neutrophil activation (Ly6G/CD11b expression) and ROS production (DCFH-DA/Ly6G+ cells) were quantified.

    Results: Injurious ventilation of control mice at CT0 led to a significant worsening of oxygenation, decrease of pulmonary compliance, and increased mortality compared to CT12. LyzMCre-Bmal1-/- mice did not exhibit any significant differences when subjected to MV at CT0 or CT12. Mortality in LyzMCre-Bmal1-/- mice after VILI was significantly reduced compared to LyzMCre controls (CT0). Neutrophils isolated from control mice at ZT0 showed a significantly higher level of activation and increased ROS production after PMA-stimulation compared to ZT12. ROS production of LyzMCre-Bmal1-/- neutrophils did not differ from ZT0 to ZT12.

    Conclusions: The lack of the clock gene Bmal1 in myeloid cells leads to increased survival after injurious ventilation and to loss of circadian variations in neutrophil ROS production. This suggests that the internal clock in myeloid cells is an important modulator of VILI severity.

    P229 Hemodynamic resuscitation with fluid boluses and norepinephrine increases the severity of lung damage in an experimental model of septic shock

    P Guijo Gonzalez1, MI Monge García1, MA Gracia Romero1, A Gil Cano1, M Cecconi2

    1Jerez Hospital SAS, Jerez de la Frontera, Spain; 2St. George’s Healthcare NHS Trust, London, UK

    Introduction: Hemodynamic resuscitation by means of fluids and norepinephrine (NE) is currently considered a cornerstone of the initial treatment of septic shock. However, there is growing concern about the side effects of this treatment. The aim of this study was to assess the relationship between hemodynamic resuscitation and the development of ARDS.

    Methods: Eighteen New Zealand rabbits were studied. Animals received placebo (SHAM, n=6) or lipopolysaccharide (LPS) with (LPS-R, n=6) or without (LPS-NR, n=6) hemodynamic resuscitation (fluids: 20 ml/kg of Ringer’s lactate, followed by NE infusion titrated to restore the initial arterial pressure). Animals were monitored with an indwelling arterial catheter and an esophageal Doppler. Respiratory mechanics were continuously monitored by side-stream spirometry. Pulmonary edema was assessed by the ratio between lung wet and lung dry weight (W/D) and by histopathological findings.

    Results: The SHAM group did not show any hemodynamic or respiratory changes. The administration of LPS led to a fall in cardiac output and arterial hypotension. In the LPS-NR group, animals remained hypotensive until the end of the experiment. Infusion of fluids in the LPS-R group increased cardiac output without changing arterial blood pressure, while norepinephrine reversed arterial hypotension. Compared to the LPS-NR group, the LPS-R group had more alveolar neutrophils and pneumocytes with atypical nuclei, thicker alveolar walls, more non-aerated pulmonary areas and fewer lymphocytes infiltrating the interstitial tissue. In addition, airway pressure increased more in the LPS-R group, and the W/D ratio, although slightly higher in the LPS-R group, did not differ significantly.

    Conclusions: In this model of experimental septic shock, resuscitation with fluid boluses and norepinephrine increased cardiac output and normalized blood pressure but worsened lung damage.



    P231 Titration of positive end-expiratory pressure in severely obese patients with acute respiratory distress syndrome

    J Fumagalli1, R De Santis Santiago2, M Teggia Droghi2, G Larson2, S Kaneki2, S Palma2, D Fisher2, M Amato3, R Kacmarek2, L Berra2

    1Fondazione IRCCS Cà Granda Ospedale Maggiore Policlinico, Milano, Italy; 2Massachusetts General Hospital, Boston, USA; 3University of São Paulo, São Paulo, Brazil

    Introduction: The selection of positive end-expiratory pressure (PEEP) in acute respiratory distress syndrome (ARDS) patients is an ongoing debate. Obese patients have been excluded from most of the clinical trials testing the effects of PEEP in ARDS. We hypothesized that in morbidly obese patients the massive load of the abdomen/chest further increases lung collapse thus aggravating the severity of respiratory failure due to ARDS.

    Methods: We performed a clinical crossover study to investigate the contribution of lung collapse to the severity of respiratory failure in obese ARDS patients and to determine the specific contribution of titrated PEEP levels and lung recruitment to changes in lung morphology, mechanics and gas exchange. Patients were studied at the PEEP level selected at our institution (PEEPICU), at the PEEP level establishing a positive end-expiratory transpulmonary pressure (PEEPINC), and at the PEEP level yielding the lowest lung elastance during a decremental PEEP trial (PEEPDEC) following a recruitment maneuver (RM).

    Results: Thirteen patients were studied. At PEEPICU, end-expiratory transpulmonary pressure was negative, lung elastance was increased and hypoxemia was present (Table 1). Regardless of the titration technique, there was no difference in the PEEP level obtained. At PEEPINC, end-expiratory lung volume increased and lung elastance decreased, improving oxygenation. Setting PEEP according to a PEEPDEC trial after a RM further improved lung elastance and oxygenation. At PEEPDEC after a RM, lung collapse and overdistension were minimized (Fig. 1). All patients maintained the titrated PEEP levels for up to 24 hours without complications.

    Conclusions: In severely obese patients with ARDS, setting PEEP according to a PEEPINC trial or PEEPDEC trial following a RM identifies the same level of optimal PEEP. The improvement of lung mechanics, lung morphology and oxygenation at PEEPDEC after a RM suggests that lungs of obese ARDS patients are highly recruitable and benefit from a RM and high PEEP strategy.

    Table 1 (abstract P231). Ventilator settings, respiratory mechanics, gas exchange
    Fig. 1 (abstract P231).

    Lung Collapse and Lung Overdistension. PEEP = Positive End-Expiratory Pressure; ICU = Intensive Care Unit; EIT = Electrical Impedance Tomography. *p<0.05 compared to PEEPICU; #p<0.05 compared to PEEPINCREMENTAL

    P232 hMSCs subjected to cyclic mechanical stretch alter LPS-induced endothelial cell damage

    J Li, C Pan, H Qiu

    Nanjing Zhongda Hospital, School of Medicine, Southeast University, Nanjing, China

    Introduction: Mechanical stretch could change the paracrine function of mesenchymal stem cells (MSCs), and possibly change the damage of endothelial cell induced by LPS. Although previous studies have focused intensively on the effects of chemical signals that regulate MSC commitment, the effects of physical/mechanical cues of the microenvironment on MSC fate determination have long been neglected.

    Methods: Human bone marrow MSC proliferation was measured using CCK-8 assays after treatments of different durations and stretch magnitudes. To uncover the effect of stretch on MSC pro- and anti-inflammatory function, we measured inflammatory factors after stimulation with stretch. Additionally, we performed morphological examination by Wright-Giemsa staining to investigate the role of stretch on hMSCs. VE-cadherin protein expression was assessed by immunofluorescence.

    Results: Cyclic mechanical stretch significantly changed the morphology and paracrine function of hMSCs but did not alter surface marker expression. Furthermore, stretched hMSCs worsened endothelial permeability.

    Conclusions: Our results showed that cyclic mechanical stretch significantly regulates human bone marrow MSC paracrine function. More consideration should therefore be given to MSC engraftment in the lung, where cells are exposed to cyclic stretch during breathing. This study provides insights into the mechanisms by which MSCs can be changed by mechanical stretch.

    P233 Tidal volume reduction on ICU: a quality improvement project

    R Samanta, V Wong, R Talker, A Ecole, C Summers

    Addenbrooke’s Hospital, Cambridge, UK

    Introduction: Lung protective ventilation (LPV) strategies, principally focused on the use of tidal volumes <6 ml/kg predicted body weight (PBW), remain an enduring standard of care for ventilated patients. However, implementation of and compliance with LPV is highly variable. We assessed whether ‘nudge’-based interventions can improve LPV.

    Methods: Analysis of ventilation data over 2 years (186,000 hours in 685 patients) showed patients had been ventilated with a median tidal volume of 7.4 ml/kg PBW, with a significant proportion receiving over 8 ml/kg PBW (Fig. 1), an effect more pronounced in female patients and those with higher BMI. We then introduced four interventions:


    1) Creation of a software tool to easily identify and monitor patients receiving tidal volumes that were too high for their PBW

    2) Laminated reference guides attached to each ventilator for calculating PBW

    3) Presentations, opportunistic education and verbal prompts to relevant clinical care staff regarding the importance of LPV and the use of PBW rather than actual body weight

    4) Checking of tidal volumes by junior clinical members incorporated into daily ward rounds

    Results: We collected hourly ventilation data over a 2-week period following our interventions (2,479 hours in 22 patients). Overall, there was a statistically significant reduction in tidal volume (p<0.001). Ventilation of male patients improved (p<0.001), but female patients continued to receive higher tidal volumes. The picture across BMI grades was mixed.

    Conclusions: Tidal volumes in mechanically ventilated patients can be reduced through a mix of behavioural and educational interventions, as well as technological shortcuts. These reduce the effort required of clinical staff to adhere to best practice and should ultimately improve patient outcomes.

    Fig. 1 (abstract P233).

    Distribution of tidal volumes recorded over a period of 2 years on the unit

    P234 Size matters: how can we improve intra-operative ventilation?

    L Raman, M Grover

    Northwick Park Hospital, London, UK

    Introduction: Lung protective ventilation (LPV) using a tidal volume (VT) of 6-8mL/Kg ideal body weight (IBW) is recommended in the intensive care unit and theatres to reduce the incidence of pulmonary complications. The aim of this audit was to assess the extent to which LPV is used in theatres in a busy district general hospital and to implement measures to promote adherence to the recommendations.

    Methods: Anaesthetists completed questionnaires for all patients undergoing general anaesthesia at Northwick Park Hospital over 1 week. Demographics, actual body weight (ABW), height, American Society of Anesthesiologists (ASA) score, and procedural information were recorded. Ventilatory parameters included the ventilation mode, VT, and positive end expiratory pressure (PEEP). The body mass index (BMI), IBW and VT (expressed in mL/Kg of ABW and IBW) were calculated for each patient. A Mann-Whitney U test was used to compare ABW and IBW, and a Chi-squared test was used to identify associations between VT and other variables.

    Results: 129 patients were included; 65 males and 64 females. Mean age was 51 years. 73 patients were overweight (BMI >=25) and 88% of patients received PEEP. IBW was calculated in 106 patients and was significantly lower than ABW (61Kg [54-71] vs 72Kg [62-85], p<0.05) (Table 1). VT was higher when calculated from IBW than from ABW (8.7mL/Kg [7.1-9.3] vs 7.5mL/Kg [5.8-7.9], p<0.05). 52 patients (49%) received LPV with VT <8mL/Kg IBW in accordance with the recommendations (Fig. 1). Significantly more females (75%) than males (29%) received VT >=8mL/Kg (p<0.01) (Fig. 2). VT was independent of age, ASA, BMI, ventilation mode, speciality, and patient position.

    Conclusions: Over half of the patients received VT >=8mL/Kg IBW, and females were more likely to be over-ventilated. A likely contributing factor is the disparity between ABW and IBW in this cohort. We organised staff teaching and constructed IBW charts with the corresponding tidal volumes, displayed in all theatres, to promote the use of LPV.
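    The IBW charts described above can be generated programmatically. The sketch below assumes the widely used Devine formula (the abstract does not name the formula actually used) and the 6-8 mL/kg IBW window cited in the Introduction:

```python
def ideal_body_weight(height_cm: float, male: bool) -> float:
    """Ideal body weight in kg via the Devine formula.
    Assumed here; the abstract does not specify its IBW formula."""
    base = 50.0 if male else 45.5
    return base + 0.91 * (height_cm - 152.4)

def protective_tidal_volume_range(ibw_kg: float) -> tuple:
    """Lung-protective tidal volume window of 6-8 mL/kg IBW."""
    return (round(6 * ibw_kg), round(8 * ibw_kg))

# Example chart rows for theatre display
for height in (150, 160, 170, 180, 190):
    for male in (True, False):
        ibw = ideal_body_weight(height, male)
        lo, hi = protective_tidal_volume_range(ibw)
        print(f"{height} cm {'M' if male else 'F'}: IBW {ibw:.1f} kg, VT {lo}-{hi} mL")
```

    Such a table makes the ABW/IBW gap visible at the bedside: a 170 cm male has an IBW of about 66 kg regardless of his actual weight.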

    Table 1 (abstract P234). Patient characteristics
    Fig. 1 (abstract P234).

    Lung protective vs Non lung protective ventilation

    Fig. 2 (abstract P234).

    Sex specific tidal volumes

    P235 Protective mechanical ventilation to prevent acute respiratory distress syndrome

    A Kuzovlev1, A Shabanov2, T Smelaya1, A Goloubev1, V Moroz1

    1Federal Research and Clinical Center of Intensive Care Medicine and Rehabilitology, V.A. Negovsky research institute of general reanimatology, Moscow, Russia; 2N.V. Sklifosofsky Moscow City Research Institute of Emergency Care, Moscow, Russia

    Introduction: Mechanical ventilation (MV) in a protective mode seems the most reasonable way to prevent acute respiratory distress syndrome (ARDS) in ventilator-associated pneumonia (VAP). The aim was to evaluate the efficacy of protective MV in preventing ARDS in VAP.

    Methods: This retrospective study was done in 2013-2017. 102 patients with abdominal sepsis and VAP were enrolled in the study. Patients were split into 2 groups: 1) protective MV: VAP patients ventilated in a protective mode (tidal volume (TV) 6-8 ml/kg); 2) standard MV: VAP patients ventilated with TV 8-10 ml/kg. ARDS incidence was assessed as the primary endpoint. Secondary endpoints were duration of MV, length of ICU stay, and 30-day mortality. Statistical analysis was done with Statistica 7.0 (data as 25-75 percentile interquartile range (IQR); significance at p<0.05).

    Results: There were significant differences in ARDS incidence between groups: ARDS developed in 12.4% of the protective MV group vs. 68.3% of the standard MV group (p=0.0001, Fisher’s exact test). VAP patients ventilated in a protective mode had a shorter duration of MV (12.2±4.2 days) and ICU stay (16.1±3.2 days) than patients on standard MV (17.2±5.2 and 20.1±5.5 days). Mortality also differed significantly between groups: 24.1% with protective MV vs. 47.2% with standard MV (p=0.0043, Fisher’s exact test).

    Conclusions: Protective MV prevents the development of ARDS in VAP septic patients.

    P236 Ultra-protective ventilation with coaxial endotracheal tube and moderately high respiratory rates reduces driving pressure

    N Carvalho1, C Moraes1, A Beda2, M Nakamura1, M Volpe3, S Gomes1, O Stenqvist4, M Amato1

    1University of São Paulo, São Paulo, Brazil; 2Federal University of Minas Gerais, Belo Horizonte, Brazil; 3Federal University of São Paulo, Guarujá, Brazil; 4Sahlgrenska Universty Hospital, Goteborg, Sweden

    Introduction: Reduction of tidal volume (TV) below 6 mL/kg combined with low driving pressure (dP) might improve lung protection in patients with acute respiratory distress syndrome (ARDS). The current study tests the combination of a coaxial double-lumen endotracheal tube (to reduce instrumental dead space) and moderately high respiratory rates (RR; <80 bpm) to maintain CO2 at clinically acceptable levels while using ultraprotective TV. The objective is to considerably reduce dP, which has been proposed as an index more strongly associated with survival than TV per se in ARDS patients. The ultraprotective ventilation setup proposed here keeps the original tracheal tube and requires nothing more than a standard ventilator circuit and monitoring.
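    The rationale can be illustrated with the alveolar ventilation relationship VA = RR × (VT − VD): as VT approaches the dead space VD, the RR needed to hold CO2 clearance constant explodes, unless VD is reduced. The numbers below are illustrative assumptions, not values from the study:

```python
def required_rr(va_target_ml_min: float, vt_ml: float, vd_ml: float) -> float:
    """Respiratory rate needed to keep alveolar ventilation constant:
    VA = RR * (VT - VD). Illustrative model, not the study's data."""
    assert vt_ml > vd_ml, "tidal volume must exceed dead space"
    return va_target_ml_min / (vt_ml - vd_ml)

weight = 30       # kg, juvenile pig (assumed)
va_target = 4000  # mL/min alveolar ventilation to hold PaCO2 roughly constant (assumed)
for vt_per_kg, vd in [(6, 70), (3, 70), (3, 30)]:
    vt = vt_per_kg * weight
    rr = required_rr(va_target, vt, vd)
    print(f"VT {vt_per_kg} mL/kg, dead space {vd} mL -> RR {rr:.0f} bpm")
```

    With these assumed numbers, VT of 3 mL/kg against an unchanged 70 mL dead space would demand an RR far above 80 bpm, whereas halving the instrumental dead space brings the required RR back under the 80 bpm ceiling, which is the effect the coaxial tube is designed to exploit.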

    Methods: 8 juvenile pigs were anesthetized, intubated and mechanically ventilated. Severe lung injury (P/F<100) was induced using a double-hit model: repeated surfactant wash-out followed by injurious mechanical ventilation with low positive end-expiratory pressure and high dP (~40 cmH2O) for 3 hours. VTs of 6, 4, and 3 ml/kg were then used in random sequence for 30 min each, with both a standard and a coaxial endotracheal tube. At each VT level, RR was adjusted to achieve PaCO2=60 mmHg without exceeding 80 bpm. Lung functional parameters and blood gases were measured at each VT level. Statistical analysis was performed using a linear mixed model.

    Results: The coaxial endotracheal tube, but not the conventional tube, allowed decreasing VT to 4 and 3 ml/kg while keeping PaCO2 at approximately 60 mmHg and RR <80 bpm, reducing dP by 4.0 and 6.0 cmH2O, respectively, compared to the conventional VT of 6 ml/kg (Fig. 1).

    Conclusions: In this ARDS model, coaxial-tube ventilation combined with moderately high RR allowed ultraprotective ventilation (VT=3 ml/kg) and reduced dP while maintaining PaCO2 at acceptable levels. This strategy might have a significant impact on mortality in severe ARDS patients.

    Fig. 1 (abstract P236).

    Arterial pressure of CO2 and respiratory rate at each studied tidal volume (6, 4, and 3 ml/kg) and group (with and without coaxial ventilation)

    P237 The use of acute cor pulmonale risk score for the diagnosis of acute cor pulmonale in acute respiratory distress syndrome

    H Elsayed, T Zaytoun, O Hassan, A Sarhan

    Faculty of medicine - Alexandria university, Alexandria, Egypt

    Introduction: Acute cor pulmonale (ACP) is a common sequel of acute respiratory distress syndrome (ARDS) and represents the most severe presentation of right ventricular dysfunction secondary to pulmonary vascular dysfunction. Although most previous studies adopted trans-oesophageal echocardiography for the diagnosis of ACP in ARDS, transthoracic echocardiography (TTE) appears an attractive alternative, being noninvasive and more widely available, with continuously improving expertise in its use by intensivists. Our study aimed to test the accuracy of the ACP risk score using TTE.

    Methods: Our study was carried out on 45 mechanically ventilated ARDS patients. The patients were mechanically ventilated using lung protective approach. TTE was performed within the first 72 hours of ARDS diagnosis. ACP was diagnosed when the ratio of right ventricular end-diastolic area/left ventricular end-diastolic area >0.6 in apical 4 chamber view (indicating right ventricular diastolic overload) associated with interventricular septum dyskinesia at end-systole (indicating right ventricular systolic overload). ACP risk score parameters were checked and scored (1 point for each parameter) (pneumonia as a cause of ARDS, driving pressure >= 18 cmH2O, PaCO2 >= 48 mmHg, PaO2/FiO2 ratio < 150 mmHg).
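    The four-item score described above can be written out as a minimal sketch (thresholds from the abstract; the function name and boolean style are ours):

```python
def acp_risk_score(pneumonia: bool, driving_pressure: float,
                   paco2: float, pf_ratio: float) -> int:
    """ACP risk score: 1 point per criterion, as described in the Methods."""
    score = 0
    score += pneumonia                # pneumonia as the cause of ARDS
    score += driving_pressure >= 18   # cmH2O
    score += paco2 >= 48              # mmHg
    score += pf_ratio < 150           # PaO2/FiO2, mmHg
    return int(score)

# A score of 2 was the cut-off evaluated in this study
patient = acp_risk_score(pneumonia=True, driving_pressure=19, paco2=45, pf_ratio=160)
print(patient, "-> high risk" if patient >= 2 else "-> low risk")
```

    A hypothetical patient with pneumonia and a driving pressure of 19 cmH2O already reaches the cut-off of 2 even with acceptable gas exchange, which is consistent with the score's high sensitivity but only moderate specificity.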

    Results: The ACP risk score showed high sensitivity (100%), moderate specificity (51.43%) and good overall accuracy (62.2%) when 2 was used as the cut-off value. Hypercapnia, hypoxia, pneumonia, high plateau pressure and high positive end-expiratory pressure were associated with an increased incidence of ACP and were independent predictors of mortality in ARDS patients.

    Conclusions: The ACP risk score is highly sensitive for predicting and diagnosing ACP in ARDS patients. The disease process, as well as ventilatory maneuvers, may contribute to the development of ACP in ARDS.

    P238 Expiratory ventilation assistance improves arterial oxygenation in ARDS – a randomized controlled study in pigs

    J Schmidt, C Wenzel, S Spassov, S Wirth, S Schumann

    Medical Center - University of Freiburg, Freiburg, Germany

    Introduction: Mechanical ventilation aggravates ARDS. Expiratory ventilation assistance (EVA) improved oxygenation in lung-healthy pigs at similar tracheal pressure (Ptrach) amplitude and tidal volume (VT). We hypothesized that EVA improves gas exchange and attenuates ventilator-induced lung injury in a porcine model of ARDS.

    Methods: 19 pigs with oleic acid-induced moderate ARDS (initial Horovitz index (HI) 100-150 mmHg) were randomly allocated to volume-controlled ventilation or EVA ventilation with identical ventilation parameters (FiO2 0.8, VT 7 ml/kg body weight, PEEP 9 mbar, respiratory rate set to maintain arterial blood pH >7.2). PaO2 and Ptrach were measured every 30 min. After 3 h, lung tissue was excised and stained, and alveolar wall thickness was measured. Statistics were performed with linear mixed model analyses and unpaired t-tests.

    Results: 5 pigs were excluded due to HI <100 mmHg (n=2), malignant arrhythmia (n=1) and software error (n=2). EVA elevated PaO2 (control vs. EVA: 107±11.3 vs. 164±21 mmHg, p=0.04) and mean Ptrach (16.7±1.6 vs. 21.5±1.1 mbar, p<0.0001). Alveolar walls were thinner in the EVA group (7.8±0.2 vs. 5.5±0.1 μm, p<0.0001).

    Conclusions: EVA ventilation improves gas exchange due to elevated mean Ptrach in experimental ARDS. Reduced alveolar wall thickness indicates potential lung protective effects.

    Funding: European Union, Horizon 2020 research and innovation programme, Grant #691519.

    P239 Application of transpulmonary pressure measurement in severe ARDS

    S Jog, J Mulchandani, A Garg, P Kalyani, P Rajhans, P Akole, B Pawar, B Bhurke, S Chavan, P Dalvi, S Sable, A Kulkarni, V Giri, N Kunjir, N Mahale, G Ranade, R Desai

    Deenanath Mangeshkar Hospital, Pune, India

    Introduction: Transpulmonary pressure (TPP) measurement is a promising tool for ventilatory adjustment in ARDS patients, but its usefulness and application in the management of severe ARDS remain unclear. The aim of this study was to measure TPP in severe ARDS patients and to describe the possible interventions in ventilatory settings.

    Methods: Prospective observational study. TPP was measured with an esophageal balloon catheter (Nutrivent, Sidam, Italy) and a suitable ventilator (Hamilton G5, Switzerland). Inclusion criteria: adult patients with severe ARDS (PaO2/FiO2 ratio <100) with PEEP >=14 cmH2O, airway plateau pressure >30 cmH2O, and deep sedation with neuromuscular blockade. Exclusion criteria: 1) patients receiving ECMO; 2) patients in whom the position of the esophageal balloon and the measurement of TPP could not be confirmed by the cardiac oscillation and abdominal pressure technique. Ventilatory and blood gas parameters were documented before and after TPP measurement. Ventilatory adjustments were: i) increasing PEEP in hypoxic patients (SpO2 <88%) until oxygen saturation remained >90%, provided plateau TPP remained <23 cmH2O and end-expiratory TPP remained >0 cmH2O; ii) increasing tidal volume (>6 ml/kg) in hypercapnic acidotic patients (pH <7.20 and PaCO2 >80 mmHg), provided plateau TPP remained <23 cmH2O.
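    The adjustment rules in the Methods form a small decision procedure. A minimal sketch, with thresholds from the abstract and function names of our own choosing (TPP is airway minus esophageal pressure, the standard definition):

```python
def tpp(airway_cmH2O: float, esophageal_cmH2O: float) -> float:
    """Transpulmonary pressure: airway minus esophageal (pleural surrogate) pressure."""
    return airway_cmH2O - esophageal_cmH2O

def may_increase_peep(spo2: float, plateau_tpp: float, end_exp_tpp: float) -> bool:
    """Protocol rule: raise PEEP in hypoxic patients (SpO2 < 88%)
    only while plateau TPP stays < 23 and end-expiratory TPP stays > 0 cmH2O."""
    return spo2 < 88 and plateau_tpp < 23 and end_exp_tpp > 0

def may_increase_vt(ph: float, paco2: float, plateau_tpp: float) -> bool:
    """Protocol rule: raise VT (>6 ml/kg) in hypercapnic acidosis
    (pH < 7.20 and PaCO2 > 80 mmHg) provided plateau TPP stays < 23 cmH2O."""
    return ph < 7.20 and paco2 > 80 and plateau_tpp < 23

# e.g. an airway plateau of 32 cmH2O with esophageal pressure 12 cmH2O
# gives a plateau TPP of only 20 cmH2O, leaving room to raise PEEP
print(tpp(32, 12), may_increase_peep(86, 20, 1.5))
```

    The example illustrates the central point of the abstract: an airway plateau pressure above 30 cmH2O can coexist with a safe plateau TPP below 23 cmH2O when pleural pressure is high, as in obese patients.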

    Results: Data from 10 patients were analysed; 6 of 10 were obese. Average plateau TPP was 21.1±2.91 cmH2O and average end-expiratory TPP was 1.2±1.1 cmH2O. None of the 10 patients developed pneumothorax, even at an average PEEP of 20.7±2.53 cmH2O, the highest PEEP being 24 cmH2O (Table 1). 5 of 10 patients died of sepsis and MODS.

    Conclusions: TPP measurement is useful for setting appropriate PEEP and tidal volume in severe ARDS patients in whom airway plateau pressure exceeds 30 cmH2O.


    1. Talmor D et al. N Engl J Med, 2008

    2. Grasso S et al. Intensive Care Med, 2012

    Table 1 (abstract P239). Ventilatory settings and blood gas variables before and after TPP measurement

    P240 Impact of a stepwise increase in PEEP (PEEP-trial) on haemodynamics, static compliance, functional residual capacity and tidal volume: a study in patients with picco-monitoring

    W Huber, J Wettstein, S Rasch, T Lahmer, A Herner, M Heilmaier, G Batres-Baires, R Schmid, U Mayr

    Klinikum rechts der Isar, Munich, Germany

    Introduction: Advanced ventilator technologies provide automated stepwise PEEP increases in parallel with measurement of functional residual capacity (FRC) and static pulmonary compliance (C_stat). While this PEEP-trial (PT) might facilitate optimal PEEP-setting, it also carries a risk of haemodynamic instability due to reduced venous return. We hypothesized that a standard PT with stepwise PEEP increases totalling 8 cmH2O might result in substantial haemodynamic changes. Since heart-lung interactions have been used to diagnose volume responsiveness, the PT could serve as a combined approach to optimize both ventilation and haemodynamics.

    Methods: In 28 mechanically ventilated patients (Carescape R860, GE; pressure-controlled ventilation; PF-ratio 249±83) with PiCCO monitoring, an automated five-step PEEP-trial was performed. PEEP was increased from 4 cmH2O below to 4 cmH2O above the preset PEEP level. PiCCO data, ventilator settings, FRC and C_stat were recorded at baseline and after each increase.

    Results: None of the haemodynamic parameters (heart rate, MAP, CVP, global end-diastolic volume index (GEDVI), cardiac index, stroke volume variation (SVV), extravascular lung water index (EVLWI), cardiac power index (CPI)) differed significantly in mean value between the end and the start of the PT. PEEP (12.2±1.3 vs. 4.4±1.4 cmH2O; p<0.001) and FRC (1884±850 vs. 1546±612 mL; p<0.001) were significantly higher after the PT, while all other respiratory data, including tidal volume and C_stat, were unchanged. Individual maximum values during the PT were significantly higher than baseline for tidal volume (516±126 vs. 467±146 mL; p<0.001) and C_stat (50±24 vs. 40±23 mL/cmH2O; p<0.001).

    Conclusions: 1) The PT did not impair haemodynamics; while haemodynamically safe, its value for detecting volume deficiency appears limited. 2) Adjusting PEEP according to the data of the PT substantially improves C_stat and tidal volume.

    P242 Restoration of homogenous ventilation does not impair the right heart function

    M Teggia Droghi1, R De Santis Santiago2, J Fumagalli3, S Kaneki2, G Galli2, S Palma2, G Larson2, M Bottiroli4, R Pinciroli4, F Marrazzo2, A Bagchi2, K Shelton2, A Sonny2, M Amato5, R Kacmarek2, L Berra2

    1University of Milano-Bicocca, Monza, Italy; 2Massachusetts General Hospital, Boston, MA, USA; 3Fondazione IRCCS Cà Granda Ospedale Maggiore Policlinico, Milan, Italy; 4ASST Grande Ospedale Metropolitano Niguarda, Milan, Italy; 5University of São Paulo, Heart Institute, São Paulo, Brazil

    Introduction: High levels of PEEP during mechanical ventilation can impair right heart function by increasing afterload [1]. Due to increased pleural pressure, patients with morbid obesity and acute respiratory distress syndrome (ARDS) need a recruitment maneuver (RM) and a higher level of PEEP to restore and maintain homogeneous ventilation, thereby avoiding atelectasis [2]. We hypothesized that optimizing PEEP would not impair right ventricular (RV) function, even if high levels of PEEP are needed in morbidly obese patients with ARDS.

    Methods: A clinical longitudinal study was performed in mechanically ventilated morbidly obese patients with ARDS admitted to the Surgical or Medical ICU. Echocardiographic indices of right heart function, TAPSE (tricuspid annular plane systolic excursion) and S’ (velocity of the tricuspid annulus during systole), were measured in the study cohort at 3 different levels of PEEP:

    1. Baseline: PEEP based on bedside oxygen saturation (SatO2 >92%).

    2. Incremental: PEEP set to achieve a transpulmonary pressure of +2 cmH2O.

    3. Decremental: PEEP titrated to the best compliance of the respiratory system after a RM.

    During each phase of the study ventilatory parameters and oxygenation were also recorded.

    Results: 17 patients were enrolled (age: 50.4±14.2 years; BMI: 55.8±13.1 kg/m2). Table 1 shows oxygenation and respiratory mechanics; Figure 1 shows echocardiographic measurements of right heart function.

    Conclusions: In morbidly obese mechanically ventilated patients with ARDS an increase in PEEP by 9 cmH2O (from 12.5±1.5 cmH2O to 21.2±3.3 cmH2O) did not impair right heart function, but improved respiratory mechanics and oxygenation.


    [1] Repessé et al. Chest 147.1: 259-265, 2015.

    [2] Fumagalli et al. Crit Care Med 45.8: 1374-1381, 2017.

    Table 1 (abstract P242). Ventilatory setting, oxygenation, mechanics and echocardiography measurements
    Fig. 1 (abstract P242).

    Mean and SD of TAPSE (Tricuspid annular plane systolic excursion) and S’ (Velocity of the tricuspid annular systolic motion) at the three different phases of the study: Baseline, Incremental, Decremental.

    P243 Heparin-binding protein in ventilator induced lung injury

    J Tyden1, N Larsson1, H Herwald2, M Hultin1, J Wallden1, J Johansson1

    1Umea university, Umea, Sweden; 2Lund university, Lund, Sweden

    Introduction: Mechanical ventilation, while lifesaving, can also injure the lungs. The injury is caused by high pressures and mechanical forces but also by inflammatory processes that are not fully understood [1]. Heparin-binding protein (HBP), released by activated granulocytes, has been implicated as a possible mediator of increased vascular permeability in the lung injury associated with trauma and sepsis [2, 3]. We investigated whether HBP levels were increased in bronchoalveolar lavage (BAL) fluid or plasma in a pig model of ventilator-induced lung injury.

    Methods: Anaesthetized pigs were surfactant-depleted by saline lavage and randomized to ventilation with either tidal volumes of 8 ml/kg and a PEEP of 8 cmH2O (controls, n=6) or 20 ml/kg and a PEEP of 0 cmH2O (ventilator-induced lung injury (VILI) group, n=6). Plasma and BAL samples for HBP analysis were taken at 0, 1, 2, 4 and 6 hours (Fig. 1).

    Results: Characteristics of pigs by study group are shown in Table 1. Plasma levels of HBP did not differ significantly between pigs in the control and VILI group at any time of sampling. HBP levels in BAL fluid were significantly higher in the VILI group after 1 (p=0.04), 2 (p=0.03), 4 (p<0.01) and 6 (p=0.02) hours of ventilation (Fig. 2).

    Conclusions: In a pig model of ventilator-induced lung injury, levels of heparin-binding protein in BAL fluid increased significantly over time compared to controls, whereas plasma levels did not differ significantly between groups.


    1. Slutsky AS et al. N Engl J Med 369(22):2126-2136, 2013

    2. Johansson J et al. Acta Anaesthesiol Scand 57(5):580-586, 2013

    3. Bentzer P et al. Intensive Care Med Exp 4(1):33, 2016

    Table 1 (abstract P243). Characteristics of pigs by study groups. Data presented as median (interquartile range)
    Fig. 1 (abstract P243).

    Schematic overview of protocol

    Fig. 2 (abstract P243).

    Levels of heparin-binding protein (HBP) (ng/ml) in bronchoalveolar lavage (BAL) fluid. HBP levels were significantly higher in the group receiving harmful ventilation at 2 (p=0.03), 4 (p<0.01) and 6 (p=0.02) hours of ventilation.

    P244 Efficacy and safety of corticosteroids for acute respiratory distress syndrome: an updated meta-analysis

    N Owattanapanich1, N Phaetthayanan2

    1Faculty of Medicine, Siriraj hospital, Bangkok, Thailand; 2Nakhonpathom hospital, Nakhonpathom, Thailand

    Introduction: The pharmacological treatment options for acute respiratory distress syndrome are limited, and the use of corticosteroids remains controversial. In this study, we aimed to analyze the efficacy and safety of corticosteroid treatment.

    Objectives: The primary outcome of this study was the mortality benefit in the corticosteroid group. The secondary outcomes were improvement of lung function, ventilator days and nosocomial infection rate.

    Methods: PubMed, Medline-Ovid, Scopus and EMBASE were searched. Additional searches were performed by reviewing relevant primary literature and review articles. Randomized controlled trials and cohort studies reporting mortality associated with corticosteroid treatment were selected. A random-effects model was used to estimate mortality and other outcomes.
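    The random-effects pooling used in meta-analyses like this one is commonly the DerSimonian-Laird method. A minimal sketch with made-up log risk ratios and variances (NOT the trials in this analysis; the abstract does not state which estimator its software used):

```python
import math

def dersimonian_laird(effects, variances):
    """DerSimonian-Laird random-effects pooling of per-study log risk ratios.
    Returns the pooled estimate and its 95% CI, both on the log scale."""
    k = len(effects)
    w = [1 / v for v in variances]                       # inverse-variance weights
    sw = sum(w)
    fixed = sum(wi * y for wi, y in zip(w, effects)) / sw
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, effects))  # Cochran's Q
    tau2 = max(0.0, (q - (k - 1)) / (sw - sum(wi ** 2 for wi in w) / sw))
    w_star = [1 / (v + tau2) for v in variances]         # add between-study variance
    pooled = sum(wi * y for wi, y in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1 / sum(w_star))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Hypothetical studies (log-RR, variance), purely for illustration
log_rr, var = [-0.30, 0.10, -0.15, 0.05], [0.04, 0.02, 0.05, 0.03]
est, (lo, hi) = dersimonian_laird(log_rr, var)
print(f"pooled RR {math.exp(est):.2f} (95% CI {math.exp(lo):.2f}-{math.exp(hi):.2f})")
```

    A CI that straddles 1 after exponentiation corresponds to the non-significant pooled effect reported in the Results.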

    Results: Of the 1,254 initially reviewed studies, 20 were selected for this meta-analysis. There was no statistically significant decrease in mortality in the corticosteroid group (95% CI: 0.53, 1.11) (Fig. 1). Heterogeneity was observed (I2 = 69%, df = 17, p<0.00001). The corticosteroid group showed improved lung function in terms of PaO2/FiO2 ratio (95% CI: 19.60, 73.74). However, there was no significant difference in ventilator days (95% CI: -4.56, 0.27). In terms of safety, there was no evidence of a significant increase in nosocomial infection in the corticosteroid group (95% CI: 1.00, 1.87) (Fig. 2).

    Conclusions: This meta-analysis found that corticosteroid treatment in ARDS provided no benefit in decreasing mortality. In addition, it was not associated with an increased risk of nosocomial infection.


    Meduri GU et al. Chest 131(4):954-63, 2007

    Fig. 1 (abstract P244).

    Forest plot showed effect of steroid on mortality

    Fig. 2 (abstract P244).

    Forest plot showed incidence of nosocomial infection

    P245 The effect of surfactant administration on outcomes of adult acute respiratory distress syndrome patients: a systematic review and meta-analysis of randomized controlled trials

    S Meng, W Chang, Z Lu, J Xie, H Qiu, Y Yang, F Guo

    Department of Critical Care Medicine, Zhongda Hospital, School of Medicine, Southeast University, Nanjing, China

    Introduction: Acute respiratory distress syndrome (ARDS) patients usually lack surfactant, so surfactant administration may be a useful therapy in adult ARDS. The purpose of this study was to perform a systematic review and meta-analysis of the effect of surfactant administration on outcomes of adult ARDS patients.

    Methods: PubMed, EMBASE, Medline, the Cochrane database, Elsevier and Web of Science were searched up to December 2016. Randomized controlled trials comparing surfactant administration with general therapy in adults with acute respiratory distress syndrome were included. The primary outcome was mortality (7-10 days, 28-30 days and 90-180 days); the secondary outcome was the change in oxygenation (PaO2/FiO2 ratio). Demographic variables, surfactant administration, and outcomes were retrieved. Internal validity was assessed using the risk of bias tool, random errors were evaluated with trial sequential analysis, and quality of evidence was assessed using Grading of Recommendations Assessment, Development, and Evaluation (GRADE) methodology.

    Results: Eleven RCTs enrolling 3038 patients were identified. Surfactant administration did not improve mortality in adult patients [RR (95% CI) = 1.02 (0.93-1.12), p=0.65]. Subgroup analysis revealed no difference in 7-10-day mortality [RR (95% CI) = 0.86 (0.52-1.43), p=0.56], 28-30-day mortality [RR (95% CI) = 1.00 (0.89-1.12), p=0.98] or 90-180-day mortality [RR (95% CI) = 1.11 (0.94-1.32), p=0.22] between the surfactant and control groups (Fig. 1). The change in the PaO2/FiO2 ratio was significant [RR (95% CI) = 0.29 (0.12-0.46), p=0.0008] (Fig. 2). Finally, trial sequential analysis and GRADE indicated a lack of firm evidence for a beneficial effect.
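The risk ratios and confidence intervals quoted above follow the standard log-normal approximation for a 2x2 table. A minimal sketch of that computation; the event counts below are hypothetical illustrations, not data from any included trial.

```python
import math

def risk_ratio_ci(a, n1, c, n2, z=1.96):
    """Risk ratio for events a/n1 (treatment) vs c/n2 (control), with a
    95% CI from the log-normal approximation: the SE of log(RR) is
    sqrt(1/a - 1/n1 + 1/c - 1/n2)."""
    rr = (a / n1) / (c / n2)
    se = math.sqrt(1 / a - 1 / n1 + 1 / c - 1 / n2)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Hypothetical single-trial counts: 50/100 deaths vs 49/100 deaths.
rr, lo, hi = risk_ratio_ci(50, 100, 49, 100)
print(f"RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")  # RR = 1.02 (95% CI 0.77-1.35)
```

A CI that straddles 1.0, as here, is exactly the "no significant difference in mortality" pattern the pooled analyses report.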

    Conclusions: Surfactant administration may improve oxygenation but has not been shown to improve mortality in adult ARDS patients. Large rigorous randomized trials are needed to explore the effect of surfactant in adult ARDS patients.

    Fig. 1 (abstract P245).

    Forest plots of subgroup analyses on the effect of surfactant based on different days mortality. CI Confidence interval, M-H Mantel-Haenszel.

    Fig. 2 (abstract P245).

    Forest plots of the effect of surfactant based on PaO2/FiO2. CI Confidence interval, M-H Mantel-Haenszel.

    P246 Moderate to severe acute respiratory distress syndrome in a population of primarily non-sedated patients, an observational cohort study

    L Bentsen, T Strøm, P Toft

    Odense University Hospital, Odense C, Denmark

    Introduction: Acute Respiratory Distress Syndrome (ARDS) is a common condition in the intensive care unit (ICU). It is characterized by hypoxemia, bilateral opacities on thoracic radiography and the need for mechanical ventilation. The current standard treatment for patients with ARDS is lung-protective ventilation with deep sedation. Non-sedation of ICU patients can reduce ICU length of stay and time on mechanical ventilation, but this approach has not been investigated in ARDS.

    Methods: We conducted a retrospective study of all patients with ARDS admitted to the ICU of Odense University Hospital from 1st January 2012 to 31st December 2016. All patients with moderate to severe ARDS as defined by the Berlin Definition were included. We searched the ICU database for patients at least 18 years of age who were mechanically ventilated, had an ICU stay of at least 72 hours, had at least two PaO2/FiO2 ratios of <200 mmHg, and were intubated at some point during admission. Patients in this ICU are not sedated but are treated with morphine bolus injections of 2.5 or 5 mg until tolerance of the tracheal tube or tracheostomy. If the non-sedation strategy is insufficient, sedation is targeted to a Richmond Agitation-Sedation Scale (RASS) score of -2 to -3 with a daily wake-up call.

    Results: We evaluated 1446 patients in the ICU database, of whom 306 had moderate or severe ARDS. The median age was 67 (57-74) and 201 patients (65.7%) were male. 30-day mortality was 39.9%, and 25% of the patients were mobilized out of bed within 14 days after admission to the ICU. Median APACHE II was 27 (23-31) and median SAPS II was 50 (41-62). The median RASS was 0 (-3 to 0) at days 1 and 7 of ICU admission. Most patients were ventilated using pressure support.

    Conclusions: Non-sedation was used in patients with ARDS, and the 30-day mortality was 39.9%. Compared with other studies, our patients were more severely ill yet had similar mortality.

    P247 A new quantitative CT parameter as a one-stop shop to describe status and outcome of patients with acute respiratory distress syndrome

    P Leiser1, H Haubenreisser1, S Schönberg1, C Weiß2, M Hagmann2, J Schoettler3, FS Centner3, T Kirschning3, M Thiel3, J Krebs3

    1Institute of Clinical Radiology and Nuclear Medicine, Mannheim, Germany; 2Department for Medical Statistics and Biomathematics, Mannheim, Germany; 3Department of Anaesthesiology and Intensive Care Medicine, Mannheim, Germany

    Introduction: The aim of this study was to establish quantitative CT (qCT) parameters for pathophysiological understanding and clinical use in patients with ARDS. The most promising parameter is introduced.

    Methods: 28 intubated patients with ARDS underwent a conventional and a dual-energy CT scan during an end-expiratory hold manoeuvre. Following manual segmentation, 138 volume-, perfusion- and lung weight-related qCT parameters were correlated with 71 anaesthesiological parameters such as applied ventilation pressures (PEEP, Pdrive), the patients’ oxygen supply (SO2, PaO2/FiO2) and established status and prognosis scores (SOFA, SAPS II). Multiple regression analysis was then performed to enable the prediction of these scores by a single CT scan.

    Results: Of all examined qCT parameters, excess lung weight (ELW) displayed the most significant results [1]. ELW correlates positively with the amount of extravascular lung water (r=0.72), atelectatic lung volume (r=0.92) and applied PEEP (r=0.37), and negatively with the lung’s mean CM enhancement (r=-0.65; all p<0.05). It correlates with the patients’ SOFA (p<0.0001, r=0.69) and SAPS II scores (p=0.0005, r=0.62) more strongly than any other anaesthesiological parameter. A combination of ELW, mean CM and Pdrive can predict SOFA with up to r2=87.95%.
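The correlations above are plain Pearson r values, with r2 giving the share of variance explained (the quantity reported as 87.95% for the combined model). A self-contained sketch of the computation; the sample values below are hypothetical, not the study's measurements.

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sx = math.sqrt(sum((xi - mx) ** 2 for xi in x))
    sy = math.sqrt(sum((yi - my) ** 2 for yi in y))
    return cov / (sx * sy)

# Hypothetical ELW-like values (g) against SOFA-like scores.
elw = [200, 350, 500, 650, 800]
sofa = [4, 6, 9, 10, 13]
r = pearson_r(elw, sofa)
print(round(r, 3), round(r * r, 3))  # r and the variance explained r^2
```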

    Conclusions: ELW constitutes the best parameter to assess the pathophysiology and status of patients with ARDS. It can help the clinician predict outcome and mortality with higher accuracy than the current standard Horowitz index (PaO2/FiO2) and should therefore be considered a first-line diagnostic tool during the first hours of ICU treatment.


    1. Cressoni M et al. Critical Care 17:R93, 2013

    P248 Bronchoalveolar lavage as a diagnostic and prognostic tool in acute respiratory failure

    Y Iwasaki, Y Kida, M Kyo, K Hosokawa, S Ohshimo, N Shime

    Hiroshima University, Hiroshima, Japan

    Introduction: Bronchoalveolar lavage (BAL) is a useful tool for diagnosing diffuse pulmonary diseases. However, the utility of BAL for acute respiratory failure in the intensive care unit (ICU) setting has not been well investigated.

    Methods: We retrospectively enrolled 91 consecutive patients diagnosed with acute respiratory failure with diffuse pulmonary involvement in the emergency and critical care unit of a university-affiliated hospital from 2010 to 2017. We investigated the correlations between BAL analysis, clinical diagnosis and prognosis in these patients.

    Results: There were 63 males and 28 females (median age, 68 years [IQR 60-75]; PaO2/FiO2 (P/F) ratio, 124 [IQR 75-176]). All patients were diagnosed as having acute respiratory distress syndrome (ARDS) based on radiological and laboratory findings. The major findings of BAL included alveolar hemorrhage (n=34), neutrophilia (n=84), lymphocytosis (n=13), increased total cell counts (n=49) and positive polymerase chain reaction for Pneumocystis jirovecii DNA (n=7). Diagnoses considering the BAL results included bacterial pneumonia (n=10), viral pneumonia (n=5), fungal pneumonia (n=1), acute exacerbation of interstitial pneumonia (n=29), diffuse alveolar hemorrhage (n=34) and pneumocystis pneumonia (n=10). No significant difference was found in the P/F ratio before and after BAL (172 vs. 174, p=0.33). In the receiver operating characteristic (ROC) curve analysis, a low lymphocyte fraction in BAL was a poor prognostic factor (AUC, 0.81; 95% CI, 0.71-0.91). In the multivariate analysis, a low lymphocyte fraction in BAL was an independent poor prognostic factor (p=0.0004; hazard ratio, 0.92; 95% CI, 0.88-0.99) after adjustment for age, sex, APACHE II score and SOFA score.
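The reported AUC of 0.81 has a direct probabilistic reading: the chance that a randomly chosen poor-outcome patient ranks higher on the risk marker than a randomly chosen good-outcome patient. A minimal sketch via the Mann-Whitney formulation; the scores below are made-up illustrations, not study data.

```python
def roc_auc(scores_pos, scores_neg):
    """ROC AUC as the Mann-Whitney probability that a positive case
    scores higher than a negative one (ties count half)."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Because a LOW lymphocyte fraction predicts poor outcome, score each
# patient by the negated fraction so that higher scores mean higher risk.
died = [-2.0, -3.5, -1.0, -4.0]        # negated lymphocyte %, hypothetical
survived = [-8.0, -6.5, -2.5, -12.0]
print(roc_auc(died, survived))
```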

    Conclusions: BAL provided additional information for clinically diagnosed ARDS patients. The lymphocyte fraction in BAL samples could be a useful prognostic factor.

    P249 Oxygen delivery, carbon dioxide removal, energy transfer to the lungs and pulmonary hypertension behavior during venous-venous extracorporeal membrane oxygenation support: a mathematical modeling approach

    BA Besen1, LM Melro1, TG Romano2, PV Mendes1, M Park1

    1Hospital das Clínicas HCFMUSP, Faculdade de Medicina, Universidade de São Paulo, São Paulo, Brazil; 2ABC Medical School, Santo Andre, Brazil

    Introduction: Our objective is to describe and analyze, in a hypothetical patient with severe acute respiratory distress syndrome (ARDS), the following: (1) the energy transfer from the ventilator to the lungs; (2) venous-venous extracorporeal membrane oxygenation (VV-ECMO) oxygen transfer to patient oxygen consumption (VO2) match; (3) ECMO carbon dioxide removal, and (4) the potential effect of systemic venous oxygenation on pulmonary artery pressure.

    Methods: Mathematical modeling approach with hypothetical scenarios, using computer simulation.

    Results: The transition from protective ventilation to ultraprotective ventilation in a severe ARDS patient with a static respiratory compliance of 20 mL/cmH2O reduced the energy transfer from the ventilator to the lungs from 35.3 to 2.6 Joules/minute. A hypothetical patient with VO2 = 200 mL/minute, high cardiac output and mild anemia can reach an arterial oxygen saturation of 80% while keeping the match between ECMO oxygen transfer and patient VO2 (Fig. 1). Carbon dioxide is easily removed and a normal PaCO2 is frequently reached. The venous blood oxygenation through the ECMO circuit drives the PO2 stimulus of pulmonary hypoxic vasoconstriction to normal values (Fig. 2).
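The "match" between ECMO oxygen transfer and VO2 rests on the standard oxygen-content equation. A hedged sketch under assumed values; the hemoglobin, saturations and flow below are hypothetical illustrations, not the model's actual parameters.

```python
def o2_content(hb_g_dl, sat, po2_mmhg):
    """O2 content (mL O2/dL blood): Hb-bound O2 (1.34 mL/g) plus dissolved O2."""
    return 1.34 * hb_g_dl * sat + 0.0031 * po2_mmhg

def ecmo_o2_transfer(q_ecmo_l_min, hb, pre_sat, pre_po2, post_sat, post_po2):
    """O2 added by the membrane lung (mL/min): blood flow in dL/min times
    the pre-to-post oxygenator gain in O2 content."""
    gain = o2_content(hb, post_sat, post_po2) - o2_content(hb, pre_sat, pre_po2)
    return q_ecmo_l_min * 10 * gain

# Hypothetical slightly anemic patient: Hb 9 g/dL, ECMO flow 4 L/min,
# drainage blood 60% saturated (PO2 35 mmHg), return blood fully
# saturated (PO2 300 mmHg).
transfer = ecmo_o2_transfer(4.0, 9.0, 0.60, 35.0, 1.00, 300.0)
print(round(transfer))  # mL O2/min, to be compared against a VO2 of 200
```

If this transfer falls below VO2 (for example at low ECMO flow or high recirculation), the patient desaturates even with a well-functioning circuit, which is the mechanism behind the 80%-saturation scenario above.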

    Conclusions: Ultraprotective ventilation massively reduces the energy transfer from the ventilator to the lungs. Severe hypoxemia on VV-ECMO support may occur despite matching between ECMO oxygen transfer and patient’s VO2. Normal range of PaCO2 is easy to reach. VV-ECMO support potentially relieves hypoxic pulmonary vasoconstriction.

    Fig. 1 (abstract P249).

    SatO2 according to progressive QECMO increment, in different VO2 levels and a QLshunt fixed at 95%. Panel A shows this relation with QCO of 10 L/minute and Panel B shows this relation with QCO of 5.5 L/minute. VO2: oxygen consumption; QCO: Cardiac Output; QECMO: ECMO blood flow; QLshunt: Shunt fraction

    Fig. 2 (abstract P249).

    Oxygen partial pressure responsible for the inhibition of hypoxic pulmonary vasoconstriction in four different clinical scenarios. The dotted line at a PstimulusO2 of 19.2 mmHg represents the partial pressure during breathing in a healthy person. The other clinical scenarios reproduce mechanical ventilation in a severe ARDS patient with hypercapnia, a pulmonary shunt fraction of 45%, and PvO2 = 20 mmHg (closed triangles); the closed squares represent the same patient after ECMO initiation, when the pulmonary shunt increased to 60% (whitening-up phenomenon) and the PvO2 increased to 180 mmHg; finally, the open circles represent the same patient with a pulmonary shunt of 95% and a PvO2 of 300 mmHg

    P250 Early recovery from acute kidney injury during venovenous ECMO associates with improved survival in severe ARDS

    S Gaião, C Meng, R Vilares-Morgado, R Roncon-Albuquerque, J Paiva

    São João Hospital Center, Oporto, Portugal

    Introduction: Acute kidney injury (AKI) is frequently observed in patients with severe acute respiratory distress syndrome (ARDS) rescued with veno-venous extracorporeal membrane oxygenation (VV-ECMO) and has been associated with a negative impact on patient outcome. In the present study, we analyzed the renal function of ARDS patients requiring VV-ECMO support and its association with clinical outcomes.

    Methods: Single-center retrospective study of patients (n=147; 45±11.9 years; 63% males) undergoing VV-ECMO for severe ARDS. Renal function was evaluated before VV-ECMO initiation and at ECMO days 1, 3 and 7, using the Kidney Disease: Improving Global Outcomes (KDIGO) AKI classification.
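The KDIGO creatinine criteria used here can be expressed as a small classifier. This is a simplified sketch: it covers only the serum-creatinine criteria of the published KDIGO staging, and the urine-output and renal-replacement-therapy criteria are deliberately omitted.

```python
def kdigo_stage(creat_mg_dl, baseline_mg_dl):
    """Simplified KDIGO AKI stage from serum creatinine alone.
    Stage 3: >=3x baseline, or creatinine >=4.0 mg/dL;
    Stage 2: >=2x baseline;
    Stage 1: >=1.5x baseline, or an absolute rise >=0.3 mg/dL.
    Urine-output and RRT criteria are intentionally omitted."""
    ratio = creat_mg_dl / baseline_mg_dl
    if ratio >= 3.0 or creat_mg_dl >= 4.0:
        return 3
    if ratio >= 2.0:
        return 2
    if ratio >= 1.5 or creat_mg_dl - baseline_mg_dl >= 0.3:
        return 1
    return 0

# Staging a series of creatinine values against a baseline of 1.0 mg/dL.
print([kdigo_stage(c, 1.0) for c in (1.1, 1.6, 2.2, 3.5)])  # [0, 1, 2, 3]
```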

    Results: At intensive care unit admission, the median Simplified Acute Physiology Score II (SAPS-II) and Sequential Organ Failure Assessment (SOFA) scores were 45±15.9 and 9±3.1, respectively. Hospital mortality was 29.0%. At VV-ECMO initiation 86 patients (58.5%) had AKI, of whom 54 (62.8%) improved renal function in the first week of VV-ECMO support. Patients with early recovery from AKI had lower SOFA (9±2.4 vs. 12±3.4), lactate (1.9±0.86 vs. 3.2±3.04 mM), renal replacement therapy use (3.7% vs. 75.0%) and hospital mortality (11.3% vs. 43.8%), compared with patients without early AKI recovery. Regarding mechanical ventilation parameters before VV-ECMO initiation, patients with early recovery from AKI presented lower plateau pressure (30±6.1 vs. 35±8.9 cmH2O), lower driving pressure (18±5.9 vs. 22±8.1 cmH2O) and higher static respiratory system compliance (31±16.5 vs. 21±9.9 mL/cmH2O), compared with patients without early AKI recovery.

    Conclusions: In severe ARDS, AKI is frequently observed at VV-ECMO initiation. Nevertheless, patients showing early recovery from AKI during VV-ECMO have low hospital mortality. The potential impact of mechanical ventilation parameters on AKI recovery in patients with severe ARDS requiring VV-ECMO deserves further investigation.

    P251 Feasibility and safety of low-flow extracorporeal CO2 removal managed with a renal replacement platform to enhance lung-protective ventilation of patients with mild-to-moderate ARDS

    M Schmidt1, S Jaber2, E Zogheib3, T Godet4, G Capellier5, A Combes6

    1APHP, Pitie Salpetrière, Paris, France; 2CHU de Montpellier, Montpellier, France; 3CHU Amiens, Amiens, France; 4Centre Hospitalier Universitaire (CHU) Clermont-Ferrand, Clermont-Ferrand, France; 5Besançon University Hospital, Besançon, France; 6APHP, Paris, France

    Introduction: Extracorporeal carbon-dioxide removal (ECCO2R) might allow ultraprotective mechanical ventilation with lower tidal volume (VT) (<6 mL/kg predicted body weight), plateau pressure (Pplat) (<30 cmH2O) and driving pressure to limit ventilator-induced lung injury. This study was undertaken to assess the feasibility and safety of ECCO2R managed with a renal replacement therapy (RRT) platform to enable ultraprotective ventilation of patients with mild-to-moderate ARDS.

    Methods: 20 patients with mild (n=8) or moderate (n=12) ARDS were included. VT was gradually lowered from 6 to 5, 4.5 and 4 mL/kg, and PEEP adjusted to reach 23<=Pplat<=25 cm H2O. Stand-alone ECCO2R (PRISMALUNG, no hemofilter associated with the RRT platform) was initiated when arterial PaCO2 increased by >20% from its initial value. Ventilation parameters (VT, RR, PEEP), respiratory system compliance, Pplat and driving pressure, arterial blood gases, and ECCO2R-system characteristics were collected during at least 24 hours of ultraprotective ventilation. Complications, day-28 mortality, need for adjuvant therapies, and data on weaning off ECCO2R and mechanical ventilation were also recorded.

    Results: While VT was reduced from 6 to 4 mL/kg and Pplat kept <25 cmH2O, PEEP was significantly increased from 13.4±3.6 at baseline to 15.0±3.4 cm H2O, and the driving pressure was significantly reduced from 13.0±4.8 to 7.9±3.2 cm H2O (both p<0.05). The PaO2/FiO2 ratio and respiratory-system compliance were not modified after VT reduction. Mild respiratory acidosis occurred, with mean pH decreasing from 7.39 ± 0.1 to 7.32 ± 0.10 from baseline to 4-mL/kg VT. Mean extracorporeal blood flow, sweep-gas flow and CO2 removal were 421±40 mL/min, 10±0.3 L/min and 51±25 mL/min, respectively. Mean treatment duration was 31±22 hours. Day-28 mortality was 15%.

    Conclusions: A low-flow ECCO2R device managed with an RRT platform easily and safely enabled ultraprotective mechanical ventilation in patients with mild-to-moderate ARDS. (ClinicalTrials: NCT02606240)

    P252 Thromboelastography-based anticoagulation management during extracorporeal membrane oxygenation: a safety and feasibility pilot study

    M Panigada1, G Iapichino2, M Brioni2, G Panarello3, A Protti1, G Grasselli1, G Occhipinti3, C Novembrino1, D Consonni1, A Arcadipane3, L Montalbano2, L Gattinoni4, A Pesenti1

    1Fondazione IRCCS Ca’ Granda Ospedale Maggiore Policl