37th International Symposium on Intensive Care and Emergency Medicine (part 3 of 3)



Introduction
Imbalance in cellular energetics has been suggested to be an important mechanism for organ failure in sepsis and septic shock. We hypothesized that such energy imbalance would either be caused by metabolic changes leading to decreased energy production or by increased energy consumption. Thus, we set out to investigate if mitochondrial dysfunction or decreased energy consumption alters cellular metabolism in muscle tissue in experimental sepsis.

Methods
We submitted anesthetized piglets to sepsis (n = 12) or placebo (n = 4) and monitored them for 3 hours. Plasma lactate and markers of organ failure were measured hourly, as was muscle metabolism by microdialysis. Energy consumption was manipulated locally by infusing ouabain through one microdialysis catheter, inhibiting the major energy-consuming enzyme, Na+/K+-ATPase, to block the major energy expenditure of the cells. Similarly, energy production was blocked by infusing sodium cyanide (NaCN) in a different region to inhibit cytochrome oxidase in muscle tissue mitochondria.

Results
All animals submitted to sepsis fulfilled sepsis criteria as defined in Sepsis-3, whereas no animals in the placebo group did. Muscle glucose decreased during sepsis independently of Na+/K+-ATPase or cytochrome oxidase blockade. Muscle lactate did not increase during sepsis under naïve metabolism. However, during cytochrome oxidase blockade, there was an increase in muscle lactate that was further accentuated during sepsis. Muscle pyruvate did not decrease during sepsis under naïve metabolism. During cytochrome oxidase blockade, there was a decrease in muscle pyruvate, independently of sepsis. The lactate to pyruvate ratio increased during sepsis and was further accentuated during cytochrome oxidase blockade. Muscle glycerol increased during sepsis and decreased slightly without sepsis, regardless of Na+/K+-ATPase or cytochrome oxidase blockade. There were no significant changes in muscle glutamate or urea during sepsis in the absence or presence of Na+/K+-ATPase or cytochrome oxidase blockade.

Conclusions
These results indicate increased metabolism of energy substrates in muscle tissue in experimental sepsis. Our results do not indicate the presence of energy depletion or mitochondrial dysfunction in muscle; should a similar physiologic situation be present in other tissues, other mechanisms of organ failure must be considered.

Introduction
Bone mineral density (BMD) is reduced in critical care survivors [1], and long-term follow-up has shown increased fracture risk [2]. It is unclear if these changes are a consequence of acute critical illness or of reduced activity afterwards. Bone health assessment during critical illness is challenging, and direct bone strength measurement is not possible. We used a rodent sepsis model to test the hypothesis that critical illness causes early reduction in bone strength and changes in bone architecture.

Methods
20 Sprague-Dawley rats (350 ± 15.8 g) were anesthetised and randomised to receive cecal ligation and puncture (CLP) (50% cecum length, 18G needle single pass through anterior and posterior walls) or sham surgery (cecum mobilised, no CLP), and then returned to their cages. 10 rodents (5 CLP, 5 sham) were sacrificed at 24 hours, and the remaining 10 at 96 hours. Femurs were harvested and bone strength testing was conducted using the Instron 5543 (Instron Corp, USA). Trabecular bone strength was measured using a femoral neck break, and cortical bone strength was tested using a femoral shaft 3-point bending test. Bone architecture was assessed using micro-computerised tomography (microCT) imaging (PerkinElmer, USA), and images were analysed with BoneJ [3].

Results
All 20 rats survived to the end of the protocol. The load required to fracture the femoral neck and shaft was not significantly different between CLP and sham groups at 24 hours (97 ± 19 N vs 81 ± 10 N, p = 0.12, and 127 ± 8 N vs 119 ± 18 N, p = 0.35, respectively). However, at 96 hours there was a significant reduction in the fracture force at both the femoral neck and shaft in the CLP group compared to sham (75 ± 11 N vs 97 ± 13 N, p = 0.02, and 102 ± 20 N vs 139.9 ± 28 N, p = 0.04). In contrast, there were no differences in bone architecture, as measured by bone volume/total volume, trabecular thickness/separation, connectivity density, anisotropy and BMD (all p > 0.20) using microCT at 24 or 96 hours.

Conclusions
In this rodent model of sepsis, there is a significant reduction in trabecular and cortical bone strength at 96 hours. In the absence of changes in bone architecture, these findings suggest sepsis may induce early biochemical changes affecting bone strength. We plan further rodent experiments to confirm these results, increase our power, assess nano-mechanics and complete a histological analysis.

Introduction
Increasing evidence implicates mitochondrial dysfunction and endoplasmic reticulum (ER) stress, which activates the unfolded protein response (UPR), as contributors to critical illness-induced organ failure. Both can be alleviated by autophagy, a cellular defense mechanism. However, a phenotype of insufficiently activated autophagy has been observed during critical illness. We hypothesized that insufficient hepatic autophagy during critical illness aggravates liver damage/failure, hallmarked by mitochondrial dysfunction and ER stress.

Methods
In a centrally catheterized mouse model of critical illness, induced by cecal ligation and puncture, the effect of genetic inactivation of hepatic autophagy (via inducible deletion of autophagy gene 7 in liver) on survival, markers of organ damage, apoptosis, UPR and mitochondrial content and function was evaluated in the acute (30 hrs) and prolonged (3 days) phase. For each time point, 2 groups of critically ill mice and 2 groups of healthy pair-fed mice were included (at least 10 surviving mice per group), where each time autophagy was inactivated in one group but not in the other.

Results
Hepatic autophagy deficiency during critical illness did not affect survival, but increased hepatic damage/dysfunction. In the acute phase, this was illustrated by higher plasma ALT (P = 0.0001), elevated markers of apoptosis (P = 0.001) and more mitochondrial dysfunction (Complex V activity, P = 0.02) in liver. In the prolonged phase, hepatic autophagy inactivation increased apoptosis (P = 0.01) and aggravated mitochondrial dysfunction (Complex V activity, P = 0.005) in liver. Autophagy deficiency did not affect mitochondrial DNA content (day 1 P = 0.98, day 3 P = 0.57). Autophagy deficiency time-dependently modulated several branches of the UPR in liver. On day 1, it decreased activation of the IRE1alpha-XBP1s (P = 0.003) and ATF6-CREB3L3 (P = 0.003) pathways, coinciding with a diminished inflammatory response as shown by lower C-reactive protein gene expression (P = 0.006), but did not affect the p-eIF2alpha pathway (P = 0.26). At day 3, autophagy deficiency increased activation of the p-eIF2alpha pathway (P = 0.03), but not of the IRE1alpha-XBP1s (P = 0.37) or ATF6-CREB3L3 (P = 0.14) pathways.

Conclusions
Insufficient hepatic autophagy during critical illness aggravates liver damage, coinciding with more mitochondrial dysfunction and a time-dependent modulation of the UPR, thereby likely aggravating liver failure.

Introduction
Obesity increases the risk of sepsis, but how obesity shapes the immune response to infection is unknown. Similar to patients, we previously demonstrated that Western-diet-fed obese mice have reduced lung inflammation during early sepsis. In this study we explore potential mechanisms to explain this finding. Proprotein convertase subtilisin/kexin type 9 (PCSK9) is a protein involved in cholesterol homeostasis that is implicated in sepsis survival. Leptin is a hormone produced by adipocytes that regulates energy homeostasis and is increased in obesity and sepsis. We hypothesized that either

Fig. 1 (abstract P351). See text for description

Introduction
The concept of resuscitating patients with septic shock by aiming at normalization of oxygen delivery (DO2), to limit tissue dysoxia and organ failure, has not been confirmed in recent trials. Elevated plasma lactate in septic shock is considered a key marker of inadequate DO2. We hypothesized that, apart from severely decreased levels, DO2 is not associated with plasma lactate in a model of septic shock.

Methods
We investigated the effects of circulatory shock and inflammation on plasma lactate in a retrospective analysis of 105 anesthetized endotoxemic (N = 61) or bacteremic (N = 44) piglets in shock. Tumor necrosis factor alpha (TNF-α) and interleukin-6 (IL-6) were measured hourly during 6 hours (h) of shock. Muscle metabolism was monitored by microdialysis. The animals were stratified by degree of shock according to DO2. The primary analysis sought the DO2 breakpoint below which plasma lactate became elevated. ANOVA and regression models were used.
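A breakpoint analysis of this kind can be sketched as a two-segment regression: fit separate lines below and above each candidate breakpoint and keep the split with the smallest summed squared error. The code below is a hypothetical illustration on synthetic data; the function name, grid and values are not taken from the study.

```python
import numpy as np

def fit_breakpoint(do2, lactate, candidates):
    """Grid-search the DO2 breakpoint minimising the summed squared error
    of two independent least-squares lines fitted below and above it."""
    best = (None, np.inf)
    for bp in candidates:
        lo, hi = do2 < bp, do2 >= bp
        if lo.sum() < 3 or hi.sum() < 3:
            continue  # need enough points on each side of the split
        sse = 0.0
        for mask in (lo, hi):
            coef = np.polyfit(do2[mask], lactate[mask], 1)
            resid = lactate[mask] - np.polyval(coef, do2[mask])
            sse += float(resid @ resid)
        if sse < best[1]:
            best = (bp, sse)
    return best

# Synthetic data with a true breakpoint built in at DO2 = 250
rng = np.random.default_rng(0)
do2 = rng.uniform(150, 450, 120)                       # mL/min/m^2
lactate = np.where(do2 < 250, 2.5 + (250 - do2) * 0.02, 2.5)
lactate += rng.normal(0, 0.1, do2.size)
bp, _ = fit_breakpoint(do2, lactate, np.arange(180, 420, 10))
print(bp)  # estimated breakpoint
```

The grid search is deliberately simple; formal segmented-regression packages estimate the breakpoint and its confidence interval jointly.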

Results
All animals developed macrocirculatory shock, elevated plasma and muscle lactate levels, elevated levels of cytokines in plasma, as well as renal and pulmonary failure. At 3 h, DO2 was 289 ± 68 mL x min-1 x m-2 (mean ± SD) and plasma lactate was 2.7 (2.0-3.6) mmol x L-1 (median (IQR)). Mixed venous saturation (SvO2) decreased and oxygen extraction increased linearly with decreasing DO2 (p < 0.001). Oxygen consumption (VO2) was not DO2-dependent. Plasma lactate increased at DO2 < 250 mL x min-1 x m-2 (p < 0.001). Urinary output decreased at DO2 < 250 mL x min-1 x m-2 (p < 0.01), but static lung compliance was not DO2-dependent. Muscle glucose, lactate, pyruvate, urea and glutamate were not DO2-dependent. Muscle glycerol was DO2-dependent without a breakpoint. Plasma lactate correlated with mean arterial blood pressure (MAP), DO2 and peak IL-6, but not with systemic vascular resistance index (SVRI) or peak TNF-α. Urinary output correlated with DO2 and MAP. Static lung compliance did not correlate with any of the parameters above. Over time, muscle pyruvate increased and muscle glycerol and glucose decreased, but no changes in muscle lactate or glutamate were seen. Muscle pyruvate correlated with MAP. Muscle glycerol correlated with MAP and with TNF-α.

Conclusions
In porcine experimental sepsis, elevated plasma lactate was only associated with very low DO2 while oxygen consumption was unaffected by low DO2 despite development of organ failure. Tissue metabolism was associated with both inflammatory and circulatory changes. Our findings suggest that the current concepts of resuscitation focusing on restoration of oxygen delivery must be combined with measures to limit the inflammatory response.

Introduction
We used a porcine sepsis model to investigate pulmonary hypoxia as an explanation for lactate elevation in sepsis, measuring pulmonary lactate production and shunt fraction during bacteremia. Sepsis is a condition characterized by severe organ failure resulting from a dysregulated response to infection. Central to many pathophysiological theories is decreased delivery or utilization of oxygen by tissues. Normal physiology dictates that hypoxia leads to lactate production, a prognostic marker in sepsis. Thus, hypoxia has been used as an explanation for both organ dysfunction and elevated plasma (p-)lactate in sepsis. However, newer research has implicated other mechanisms. Previous studies have reported increased pulmonary lactate production in sepsis [1], the lung being a generally well-oxygenated organ. However, these studies did not measure pulmonary shunt fraction and hence could not estimate whether the entire lung was ventilated.

Methods
We used 13 anesthetized pigs, of which 9 were randomized to a sepsis group and 4 to a sham group. All pigs received a pulmonary artery catheter and an arterial line. Pigs in the sepsis group were infused with live Escherichia coli for 3 hours (h), and those in the sham group with NaCl. Blood cultures were used to confirm bacteremia. Blood gases, blood tests and physiological parameters were collected hourly. Lactate production was calculated from the p-lactate gradient from the pulmonary artery to a systemic artery and the cardiac index. Shunt fraction was estimated at 3 h after ventilation with 100% O2 for 5 minutes.
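The two derived quantities can be sketched as follows. This is an illustrative reconstruction under stated assumptions (a Fick-type flux for lactate and the classic shunt equation), not the study's actual code, and the input values are invented for demonstration.

```python
def pulmonary_lactate_production(art_lactate, pa_lactate, cardiac_index):
    """Net lactate flux across the lungs: (systemic arterial - pulmonary
    arterial) lactate gradient times cardiac index. Positive values mean
    net pulmonary production, negative values net uptake."""
    return (art_lactate - pa_lactate) * cardiac_index

def shunt_fraction(cc_o2, ca_o2, cv_o2):
    """Classic shunt equation Qs/Qt = (CcO2 - CaO2) / (CcO2 - CvO2), with
    end-capillary, arterial and mixed venous O2 contents (mL O2/dL)."""
    return (cc_o2 - ca_o2) / (cc_o2 - cv_o2)

# Invented example values, mmol/L for lactate and L/min/m^2 for cardiac index
flux = pulmonary_lactate_production(2.3, 2.1, 4.5)
qs_qt = shunt_fraction(20.0, 19.0, 14.0)
print(round(flux, 2), round(qs_qt, 3))  # 0.9 0.167
```

A positive `flux` would indicate the lung as a lactate source, a negative one as a scavenger, matching the bidirectional gradients reported in the Results.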

Results
Sepsis occurred in all pigs in the sepsis group, according to the criteria from Sepsis-3, and in no pigs in the sham group (p = 0.03). Global oxygen delivery (DO2) remained equal in both groups. Arterial lactate was higher (p = 0.003) in the sepsis group after 1 hour with a median value of 2.3 mmol/L vs. 0.95 mmol/L. Negative and positive lactate production occurred over the lungs in both groups (Fig. 2). There was no difference in pulmonary lactate production or pulmonary shunt fraction between groups. Neither group had significant shunt formation.

Conclusions
In this study we found a high p-lactate in septic pigs despite a high DO2. In absence of pulmonary shunts, the lung was not a major source, nor a major scavenger, of plasma lactate.

Introduction
Systemic lactate clearance (LaCl) higher than 10% during the first hours of sepsis resuscitation is associated with better outcomes, but the mechanisms are unclear. We aimed to investigate the relationship between lactate clearance, inflammatory response, and mitochondrial respiration.
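The 10% threshold above assumes the conventional definition of lactate clearance as the percent decrease from the initial value over the resuscitation interval; the sketch below illustrates that definition with invented numbers, not data from the study.

```python
def lactate_clearance(initial, later):
    """LaCl (%) = (initial - later) / initial * 100.
    Positive values indicate clearance, negative values a rising lactate."""
    return (initial - later) / initial * 100.0

# Invented example: lactate falls from 4.0 to 3.4 mmol/L over the interval
print(round(lactate_clearance(4.0, 3.4), 1))  # 15.0 -> exceeds the 10% threshold
```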

Introduction
Endotoxin released during Gram-negative bacterial infections induces the production of pro-inflammatory cytokines and may accentuate the development of septic shock [1]. Gram-negative bacteria exposed to β-lactam antibiotics in vitro release endotoxin, but to a lesser extent when the β-lactam antibiotic is combined with an aminoglycoside [2]. The primary purpose of the study was to investigate the dynamics of endotoxin and interleukin-6 (IL-6) concentrations, as well as leukocyte activation and subsequent organ dysfunction, in a large-animal intensive care sepsis model, in order to explore the relevance of antibiotic-induced endotoxin liberation and inflammatory response in vivo. Whether the addition of an aminoglycoside to a β-lactam antibiotic results in reduced endotoxin release and systemic inflammation constituted a secondary aim.

Methods
A prospective placebo-controlled study was conducted on anesthetized pigs in an intensive care setting. All pigs were administered Escherichia coli as a 3 h intravenous infusion. At 2 h the animals were subjected to antibiotic treatment (n = 18), receiving either cefuroxime alone (n = 9) or the combination of cefuroxime and tobramycin (n = 9), whereas controls received saline (n = 18). During the 4 h after administration of antibiotics/saline, plasma endotoxin, IL-6, leukocytes and organ dysfunction variables were recorded hourly, and differences from the values before treatment were calculated.

Results
All animals developed sepsis. Antibiotic-treated animals demonstrated a higher IL-6 response (p < 0.001), stronger leukocyte activation (p < 0.001) and a more pronounced deterioration in pulmonary static compliance (p < 0.01) over time in comparison with controls. Animals treated with the combination demonstrated only a trend towards less inflammation in comparison with animals treated with cefuroxime alone. No differences in plasma endotoxin concentration were observed between the groups.

Conclusions
Treatment with antibiotics elicits an inflammatory IL-6 response that is associated with leukocyte activation and pulmonary organ dysfunction, whereas no differences were observed in plasma endotoxin concentration. The reduction in cefuroxime-induced endotoxin release after the addition of an aminoglycoside seen in vitro could not be reproduced in vivo.

Introduction
Recent studies have revealed that inflammation mediated by CD4+ T cells may contribute to the pathogenesis of sepsis. The role of the Th (T helper) 1/Th2 balance in sepsis remains largely unknown. The aim of this study was to investigate the Th2/Th1 pattern and its impact on disease severity and outcomes in patients with new-onset community-acquired severe sepsis.

Methods
This was a prospective observational study. Patients with community-acquired severe sepsis admitted to the ICU within 24 hours were included. Blood samples were collected on the day of admission (Day 0, D0) and on the 3rd day (D3) and 7th day (D7) after admission. Th2 and Th1 cells among lymphocytes were measured by flow cytometry. An increased Th2/Th1 ratio (>0.22) indicated immunosuppression. According to the change in Th2/Th1, patients were divided into 3 groups: immunosuppression recovered early (Th2/Th1 began to decrease on D3, Group 1), immunosuppression recovered late (Th2/Th1 began to decrease on D7, Group 2), and immunosuppression worsened (Th2/Th1 kept increasing throughout the week, Group 3). Organ dysfunction, hospital-acquired infection (HAI) and 28-day prognosis were recorded. All patients or their legal representatives provided written informed consent. The study is registered with ClinicalTrials.gov, NCT02883218.
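The three-group rule can be sketched as a simple comparison of the serial Th2/Th1 ratios. This is hypothetical code (function name and example ratios are illustrative, and the 0.22 immunosuppression threshold is handled separately in the study), not the investigators' analysis script.

```python
def classify(d0, d3, d7):
    """Assign the trajectory group from Th2/Th1 ratios at D0, D3 and D7:
    group 1 if the ratio starts falling by D3, group 2 if only by D7,
    group 3 if it keeps increasing all week."""
    if d3 < d0:
        return 1  # immunosuppression recovered early
    if d7 < d3:
        return 2  # immunosuppression recovered late
    return 3      # immunosuppression worsened

print(classify(0.30, 0.25, 0.20),  # falls by D3 -> group 1
      classify(0.30, 0.35, 0.25),  # falls only by D7 -> group 2
      classify(0.30, 0.35, 0.40))  # keeps rising -> group 3
```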

Results
Seventy-four patients were eligible for the study between Sept 18, 2014 and Sept 30, 2016. There were 34 cases in Group 1, 19 cases in Group 2, and 21 cases in Group 3. Baseline characteristics (age, sex, source of bacteremia, presence of comorbidities) were similar across groups. There were no significant differences in the incidence of HAI or organ dysfunction among groups. The area under the receiver operating characteristic curve (AUC) for Th2/Th1 on D7 was 0.875 (Fig. 4). Using a Th2/Th1 cutoff value on D7 of >2.74 to determine 28-day mortality, the sensitivity was 76.2% with 96.1% specificity.

Conclusions
A persistent shift from Th1 to Th2 in the week after diagnosis of community-acquired severe sepsis may be a predictor of immunosuppression. Patients with a persistently increasing Th2/Th1 ratio have poor outcomes.

Introduction
The product of the AQP5 gene belongs to the family of aquaporins (AQPs), membrane proteins responsible for the selective transmembrane transport of water. However, the role of polymorphic variants of AQP5 in the development and progression of pulmonary edema in severe lung infection has scarcely been studied so far. The aim of the investigation was to determine the value of genetic variants of the single nucleotide polymorphic site rs3736309 in intron 3 of the aquaporin-5 (AQP5) gene in the course of critical illness in patients with documented pulmonary infection.

Results
The distribution of the frequencies of genotypes AA, GA and GG (AQP5, rs3736309) in the patient cohort corresponded to Hardy-Weinberg equilibrium (P = 0.923) and was similar to the allele frequencies determined in healthy Caucasian individuals (literature data) (P > 0.05). In the subgroup of patients with septic shock and the AQP5 AA (rs3736309) genotype, lower EVLWI values were found compared to patients with genotypes GG and GA with septic shock, in spite of the same approach to treatment. The genetic variant AQP5 G+ (rs3736309) contributed to the development of pulmonary edema resistant to treatment (odds ratio, OR = 6.75; P = 0.032). Only the subgroup of patients with septic shock and genotype G+ (but not all patients or the subgroup of patients without

Introduction
Sepsis causes impairment of innate and adaptive immunity by multiple mechanisms, including depletion of immune effector cells and T cell exhaustion. Although lymphocyte dysfunction is associated with increased mortality and potential reactivation of latent viral infection in patients with septic shock, the relation between viral reactivation and lymphocyte dysfunction is obscure. The objectives of this study were 1) to determine the relation of lymphocyte dysfunction to viral reactivation and mortality, and 2) to evaluate recovery of lymphocyte function during septic shock, including T cell receptor (TCR) diversity and the expression of programmed death 1 (PD-1).

Methods
In 18 patients with septic shock and latent cytomegalovirus infection, serial blood samples were obtained on days 1, 3, and 7 after the onset of shock, and immune cell subsets and receptor expression were characterized by flow cytometry. TCR diversity of peripheral blood mononuclear cells was analyzed by Multi-N-plex PCR, and cytomegalovirus DNA was quantified using a real-time PCR.

Results
In the early stage of septic shock, TCR diversity and monocyte HLA-DR expression decreased, while CD4+ T cells displayed increased PD-1 expression. Normalization of TCR diversity and PD-1 expression was observed by day 7, except in patients who died. Cytomegalovirus reactivation was detected in 3 of the 18 patients during the first week of their ICU stay, and all 3 patients died.

Conclusions
These changes are consistent with the early stage of immune cell exhaustion and indicate the importance of normal lymphocyte function for recovery from septic shock. Ongoing lymphocyte dysfunction is associated with cytomegalovirus reactivation and dissemination, as well as with unfavorable outcomes.

Introduction
Vasopressin is a safe and effective 'catecholamine-sparing' vasopressor in septic shock [1]. It also has anti-inflammatory properties, including inhibition of endotoxin-induced inflammatory cytokine release from macrophages in vitro [2]. To further assess vasopressin's anti-inflammatory effects in sepsis, we developed an in vitro assay of monocyte priming and deactivation to model the pro- and anti-inflammatory responses, respectively.

Results
Pre-treatment of monocytes with LPS resulted in a primed phenotype of higher HLA-DR expression and TNF release on stimulation, whereas IL-10 reduced HLA-DR, CD86 and TNF release, indicating a deactivated phenotype. Under normal and priming conditions, co-incubation with vasopressin alone or with noradrenaline significantly reduced TNF release (Fig. 5), but not HLA-DR/CD86 expression. In contrast, neither vasopressin nor noradrenaline affected the IL-10-induced deactivated phenotype.

Conclusions
The vasopressin-mediated suppression of TNF release in normal or primed monocytes, but not in deactivated monocytes, suggests a selective immune-modulatory activity that may be beneficial in septic patients and warrants further investigation.

Introduction
Programmed death antigen 1 (PD-1) and its ligand (PD-L1) are inducible negative regulators on the leukocyte surface. The PD-1/PD-L1 pathway contributes to lymphocyte exhaustion and immunosuppression in sepsis [1]. A clinical trial is currently evaluating the safety of an anti-PD-L1 antibody in sepsis patients [2]. However, serum levels and differences in PD-1/PD-L1 expression by B and T cell subsets are unknown and are likely to influence the efficacy of this intervention. We tested the hypothesis that surface PD-1/PD-L1 expression differs between B and T cell subsets, and that serum PD-1/PD-L1 levels are high in sepsis.

Methods
A prospective observational cohort study of 22 critically ill adult sepsis patients, excluding those with immune deficiency states, was conducted with ethics approval and informed consent. Blood was taken on the day of ICU admission and PBMCs were isolated. Patients (11 survivors and 11 non-survivors) were compared with contemporaneously collected and analysed samples from 11 healthy controls. Cell surface staining was performed with antibodies to CD3, CD19 [BD Biosciences], CD4, CD27, PD-1, PD-L1 and PD-L2 [Biolegend], and the live/dead stain AmCyan [Invitrogen]. FACS analyses were performed on a FACSCalibur flow cytometer [BD Biosciences] using Tree Star FlowJo software. Serum PD-1 and PD-L1 in the same cohort were measured by ELISA [Proteintech]. Data were analysed using PRISM.

Results
The percentages of B and CD4+ T cells expressing PD-1 and PD-L1 were significantly higher in survivors and non-survivors of sepsis compared to healthy controls. There were no significant differences between survivors and non-survivors in expression of PD-1 or PD-L1 by any lymphocyte subset. CD4+CD27− memory T cells had significantly higher mean fluorescence intensity (MFI) and percentage positivity of PD-1 than CD4+CD27+ T cells. The PD-1 MFI was significantly higher in CD19+CD27+ memory B cells than in CD19+CD27− B cells. Serum PD-1 and PD-L1 concentrations were not different in sepsis patients compared to controls, and values did not correlate with surface expression of PD-1 or PD-L1 by any lymphocyte subset in sepsis patients.

Conclusions
We show higher expression of PD-1 and PD-L1 by B cells and CD4+ T cells in sepsis compared to health, and higher expression of PD-1 by memory compared to naïve B cells and CD4+ T cells. Further research to identify patients likely to benefit from PD-1/PD-L1 blockade in sepsis is required.

Introduction
Reduced activity of proprotein convertase subtilisin/kexin type 9 (PCSK9), by increasing the density of low-density lipoprotein (LDL) receptors on hepatic cells, may decrease the systemic inflammatory response to sepsis through increased clearance of pathogen lipids incorporated into LDL [1]. The purpose of this study was to determine the relationship between PCSK9 loss-of-function (LOF) variants and the risk of short- and long-term death and/or hospital readmission within a year following an episode of sepsis, in two distinct cohorts.

Methods
This was a retrospective observational study involving two cohorts from St. Paul's Hospital in Vancouver, Canada: Cohort 1 comprised 189 patients with septic shock admitted to the intensive care unit between July 2000 and January 2004 who survived at least 60 days post-hospitalization; Cohort 2 included 185 patients admitted to the Emergency Department from January 2011 to July 2013 with a clinical diagnosis of sepsis. The R46L, A53V and I474V PCSK9 missense LOF SNPs were genotyped in all patients, who were classified into 3 groups (WT, 1 LOF, and 2 or more LOF) according to the number of LOF alleles.

Results
Cohort 1: Time-to-event curves for 5-year mortality showed a trend towards a lower probability of death within 5 years in patients with PCSK9 LOF (overall p = 0.062), and the presence of 1 PCSK9 LOF allele (Cox model) was independently associated with a decreased hazard ratio for 5-year mortality (0.578, 95% CI 0.36-0.93, p = 0.024).
Cohort 2: Patients in the 2 or more LOF group had a lower probability of death within 90 days in comparison to WT (p = 0.010) or 1 LOF patients (p = 0.028), and the presence of 1 PCSK9 LOF allele (Cox model) was independently associated with a decreased HR for 90-day mortality (0.46, 95% CI 0.23-0.92, p = 0.030). Patients in the 2 or more PCSK9 LOF group also had a lower probability of death or infection-related readmission when compared to WT (p = 0.015) or 1 LOF allele (p = 0.002), and the presence of 2 or more PCSK9 LOF alleles had the lowest adjusted HR for this outcome (HR = 0.32, 95% CI 0.11-0.92, p = 0.035).

Conclusions
PCSK9 LOF genotype is associated with a decreased risk of 5-year and 90-day mortality, and of all-cause 1-year death or readmission due to infection, after an episode of sepsis.
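The genotype grouping described in the Methods can be sketched as counting LOF alleles across the three missense SNPs. This is hypothetical illustration code, not the study's pipeline; it assumes genotypes are supplied as per-SNP LOF allele counts (0, 1 or 2).

```python
def lof_group(r46l, a53v, i474v):
    """Bin a patient into WT / 1 LOF / 2 or more LOF from the number of
    loss-of-function alleles carried at R46L, A53V and I474V (0-2 each)."""
    n = r46l + a53v + i474v  # total LOF alleles across the three SNPs
    if n == 0:
        return "WT"
    return "1 LOF" if n == 1 else "2 or more LOF"

print(lof_group(0, 0, 0), lof_group(1, 0, 0), lof_group(1, 1, 0))
```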

Introduction
Some critically ill patients are at high risk of severe sepsis due to infections with Pseudomonas aeruginosa (PSA) in the lung or abdomen, which are difficult to treat. The present study was performed to find out whether critically ill patients on a surgical intensive care unit with sepsis due to PSA have characteristic monocyte surface receptor expression and cytokine secretion patterns.

Methods
The surface markers CD163 (hemoglobin scavenger receptor; clearance of hemoglobin, adhesion to endothelial cells, tolerance induction, tissue regeneration; soluble form: anti-inflammatory) and CD206 (mannose receptor for mannose on the surface of microorganisms), intracellular levels of IFN-γ, and CXCR1 (IL-8α chemokine receptor) and CXCR2 (IL-8β chemokine receptor) on monocytes of critically ill patients with PSA sepsis and of healthy controls were analyzed by flow cytometry. Furthermore, IL-8 secretion levels in vivo and ex vivo after LPS stimulation were determined by ELISA.

Results
20 surgical patients with severe sepsis/septic shock with underlying PSA infections and 22 healthy controls were monitored. The monocytes of the patients showed differences in IL-8 secretion levels. In line with high or low IL-8 secretion in serum (965 ± 139 pg/ml vs. 232 ± 15 pg/ml) as well as in culture after LPS stimulation (2838 ± 259 pg/ml vs. 1097 ± 356 pg/ml), the expression of the markers IFN-γ, CXCR1, CXCR2 and CD163 was high or low, respectively, but always above that of the healthy control group (p < 0.05). Only CD206 showed the opposite behavior: CD206 was highly expressed (3657 ± 279 MFI) on IL-8-low cells, whereas low expression (17 ± 6 MFI) was observed on IL-8-high cells (p < 0.001). The IL-8-low group had markedly higher severity-of-disease scores (SAPS II 34 ± 8) than the IL-8-high group (SAPS II 17 ± 6), and worse outcome (p < 0.001).

Conclusions
Different monocyte surface expression patterns are associated with low or high IL-8 secretion by monocytes in patients with PSA sepsis. Low IL-8 expression and the respective surface pattern on monocytes may be associated with worse outcome.

Introduction
The goal of this investigation was to characterize the effect of acute influenza infection on the host metabolome. Metabolomics is an emerging field of research studying small molecules, or metabolites, which provide a profile of the physiologic status of an organism at a point in time. The human metabolomic response to acute influenza infection is not well characterized. We hypothesized that acute influenza infection induces a distinct metabolic response in the host, which may enable the identification of host mediators in influenza pathogenesis.

Methods
We are conducting a randomized clinical trial administering atorvastatin or placebo to patients with acute influenza infection. As an exploratory aim, we assessed the metabolomic profile of enrolled patients at baseline, prior to study drug administration, compared to healthy controls. T-tests were used to compare 117 metabolites. Raw data were entered into the MetaboAnalyst statistical software for analysis. We report findings based on Partial Least Squares-Discriminant Analysis (PLS-DA), which uses multivariate regression techniques to predict class membership from the original variables.
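As a rough, dependency-light illustration of the univariate screening step (the study itself used MetaboAnalyst; the data below are simulated and `welch_t` is our own helper, not part of that software), comparing 117 metabolites between 49 influenza samples and 25 controls might look like:

```python
import math
import numpy as np

def welch_t(x, y):
    """Welch's t statistic with a two-sided p-value from the normal
    approximation (adequate here since both groups have n > 20)."""
    t = (x.mean() - y.mean()) / math.sqrt(x.var(ddof=1) / len(x) +
                                          y.var(ddof=1) / len(y))
    p = math.erfc(abs(t) / math.sqrt(2.0))  # equals 2 * (1 - Phi(|t|))
    return t, p

rng = np.random.default_rng(0)
# simulated data: 49 influenza samples, 25 controls, 117 metabolites
flu = rng.normal(size=(49, 117))
ctl = rng.normal(size=(25, 117))
flu[:, :17] += 1.5  # pretend the first 17 metabolites truly differ

pvals = np.array([welch_t(flu[:, j], ctl[:, j])[1] for j in range(117)])
flagged = np.flatnonzero(pvals < 0.05)  # metabolites passing the screen
```

Multivariate follow-up such as PLS-DA would then operate on the full metabolite matrix rather than on one variable at a time.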

Results
We performed metabolomic analysis on serum samples of 49 statin-naïve patients with acute influenza and 25 healthy controls. We found 17 individual metabolites that differed significantly between groups at a threshold of p < 0.05. The PLS-DA score plot demonstrated a distinct difference between influenza subjects and controls (Fig. 6), and PLS-DA model validation by permutation tests was highly significant (p < 5e-04). The most significant discriminating metabolites between influenza patients and controls were phosphocholine, phosphoethanolamine, nicotinamide, taurine, ADP, tryptophan, threonine, proline and citrulline (lower in influenza samples) and kynurenine, acetoacetate, 3-hydroxybutyrate and hypoxanthine (higher in influenza samples). Each of these metabolites had a VIP score of >1.4.

Conclusions
In our metabolomics analysis of ED patients with acute influenza, we found a statistically significant difference in 17 metabolites as compared to healthy controls. We believe these data represent a potentially unique metabolic fingerprint in influenza infection. Further study is needed to elucidate potential metabolic pathways and host mediators that may contribute to influenza pathogenesis.

Introduction
The size and functionality of the multimeric von Willebrand factor (VWF) molecule are regulated by its cleaving protease ADAMTS-13. While VWF and ADAMTS-13 levels correlate with disease course and outcome in the heterogeneous population of septic patients, animal models have been inconclusive and have mainly focused on gram-negative abdominal sepsis.

Conclusions
In conclusion, this is the first study that consistently shows the relation of VWF, ADAMTS-13 and their ratio to disease severity in patients and mice with S. aureus sepsis. Not only is the balance of VWF and its cleaving protease implicated in primary adhesion and bacterial retention, but the VWF/ADAMTS-13 ratio also regulates the amount of organ microthrombi containing platelets, neutrophils and bacteria, and thus potentially end-organ failure. Targeting VWF multimers and/or the relative ADAMTS-13 deficiency that occurs in sepsis should be explored as a potential new therapeutic approach in S. aureus endovascular infections.

Introduction
During infection, there is an activation of the L-arginine–nitric oxide pathway, with a shift from nitric oxide synthesis to degradation of L-arginine to its metabolites, asymmetric and symmetric dimethylarginine (ADMA and SDMA). We investigated the association of L-arginine, ADMA and SDMA with adverse clinical outcomes in a well-defined cohort of patients with community-acquired pneumonia (CAP).

Methods
We measured L-arginine, ADMA, and SDMA in 268 CAP patients from a Swiss multicenter trial by mass spectrometry and used Cox regression models to investigate associations between blood marker levels and disease severity as well as mortality over a period of 6.1 years.

Results
Six-year mortality was 44.8%. Admission levels of ADMA and SDMA (μmol/L) correlated with CAP severity as assessed by the pneumonia severity index (r = 0.32, p < 0.001 and r = 0.56, p < 0.001 for ADMA and SDMA, respectively) and were higher in 6-year non-survivors than in survivors (median 0.62 vs. 0.48; p < 0.001 and 1.01 vs. 0.85; p < 0.001 for ADMA and SDMA, respectively). Both ADMA and SDMA were significantly associated with long-term mortality (hazard ratios [HR] 4.44 [95% confidence interval (CI) 1.84 to 10.74] and 2.81 [95% CI 1.45 to 5.48], respectively). No association of L-arginine with severity or outcome was found.

Conclusions
Both ADMA and SDMA show a severity-dependent increase in patients with CAP and are strongly associated with mortality. This association is mainly explained by age and comorbidities.

P369
Decreased functional protein C levels may be predictive of the severity of sepsis-associated coagulopathy and DIC
D Hoppensteadt1, A Walborn1, M Rondina2, K Tsuruta3, J Fareed1

Introduction
Sepsis is a severe systemic inflammatory response to infection that manifests with widespread inflammation as well as endothelial and coagulation dysfunction, which may lead to hypotension, organ failure, shock and death. Disseminated intravascular coagulation (DIC) is a complication of sepsis involving systemic activation of the fibrinolytic and coagulation pathways that can lead to multi-organ dysfunction, thrombosis and bleeding, with a two-fold increase in mortality. Several studies have reported that low levels of protein C predict outcome in patients with severe sepsis. Protein C helps to regulate coagulation by controlling the activation of factors Va and VIIIa; in addition, activated protein C has anti-inflammatory functions. The purpose of this study was to determine functional protein C levels in this cohort of patients over the 8-day study period and to correlate protein C levels with survival.

Methods
De-identified serial plasma samples from patients diagnosed with sepsis-associated coagulopathy (n = 137) were obtained from the University of Utah under an IRB-approved protocol. Citrated plasma samples were collected from adult ICU patients upon admission and on ICU days 4 and 8. In addition, plasma samples from healthy volunteers (n = 50) were purchased from George King Biomedical (Overland Park, KS). Patients were assigned a DIC score based on the International Society on Thrombosis and Haemostasis (ISTH) criteria and categorized as having sepsis without DIC, non-overt DIC or overt DIC. Plasma samples were analyzed for functional protein C levels using a clot-based method (Diagnostica Stago, Parsippany, NJ).

Results
Functional protein C levels on days 0 and 4 were decreased in septic patients compared to normal controls (p < 0.001). In addition, functional protein C levels decreased with increasing severity of DIC. On day 8, functional protein C levels were decreased in both the non-overt and overt DIC groups compared to normal controls and to patients with sepsis without DIC. There was also a significant decrease in functional protein C levels in non-survivors compared to survivors (p < 0.01).

Conclusions
These results underscore the importance of functional protein C in the regulation of hemostasis. Furthermore, these studies demonstrate that decreased functional protein C contributes to the pathogenesis of sepsis-associated coagulopathy, as evidenced by the observed relationship between severity of sepsis and decreased protein C levels. Thus, functional protein C levels may be a useful prognostic marker to risk-stratify patients with sepsis and DIC.

Introduction
According to the third international consensus, sepsis is defined as life-threatening organ dysfunction caused by a dysregulated host response to infection. The aim of the study was to investigate the possibility of using cholesterol as an additional laboratory marker for early detection of multiple organ dysfunction syndrome (MODS) after abdominal surgery.

Methods
After approval by the ethics committee of the Mogilev Regional Hospital, this prospective observational study included 58 patients aged 18 to 85 years. All patients underwent laparotomic abdominal surgery and were subsequently hospitalized in the intensive care unit. Group R (n = 30) comprised patients who developed MODS in the postoperative period; group C (n = 28) comprised patients without MODS. Cholesterol was determined daily using an AU 680 biochemistry analyzer.

Results
Patients in group R showed a marked decrease in cholesterol level on the 4th day after surgery, to 158.7 mg/dL; in patients with a fatal outcome the level was 116.1 (100.6; 127.7) mg/dL (Mann-Whitney U test, p = 0.034). In surviving patients, the reduced cholesterol level normalized 14 days after the operation. We performed ROC analysis to determine the diagnostic significance of cholesterol as a marker of MODS. The area under the curve was 0.726 (p < 0.001; 95% confidence interval 0.684 to 0.769), with sensitivity 73.6% and specificity 63.8%. The optimal threshold level of cholesterol as a predictor of MODS was determined to be 130.0 mg/dL.
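An ROC analysis of this kind can be sketched in a few lines; the example below is an illustrative reimplementation with simulated cholesterol values (not the study data), computing the empirical AUC via the Mann-Whitney rank formulation and the Youden-optimal cut-off for a marker whose low values predict the event:

```python
import numpy as np

def roc_auc_and_cutoff(values, is_event, event_low=True):
    """Empirical ROC AUC plus the Youden-optimal cut-off.

    values    : marker measurements (e.g. cholesterol in mg/dL)
    is_event  : boolean array, True where the outcome (here, MODS) occurred
    event_low : True when LOW marker values indicate the event, as with the
                postoperative cholesterol drop described above.
    The returned cut-off is on the original scale; with event_low=True the
    decision rule is "predict the event when value <= cut-off".
    """
    v = np.asarray(values, float)
    y = np.asarray(is_event, bool)
    if event_low:
        v = -v  # flip so larger transformed values indicate the event
    best_j, best_cut = -np.inf, None
    for t in np.unique(v):
        pred = v >= t
        tpr = pred[y].mean()       # sensitivity
        fpr = pred[~y].mean()      # 1 - specificity
        if tpr - fpr > best_j:     # Youden's J = sensitivity + specificity - 1
            best_j, best_cut = tpr - fpr, t
    # AUC via the Mann-Whitney rank formulation
    ranks = np.empty(len(v))
    ranks[np.argsort(v)] = np.arange(1, len(v) + 1)
    n1, n0 = y.sum(), (~y).sum()
    auc = (ranks[y].sum() - n1 * (n1 + 1) / 2) / (n1 * n0)
    return auc, (-best_cut if event_low else best_cut)
```

Scanning all observed values as candidate thresholds and maximizing Youden's J is the standard way such "optimal cut-offs" (here 130.0 mg/dL) are derived.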

Conclusions
Determination of cholesterol levels, together with assessment of clinical symptoms and the blood count, may allow early diagnosis of MODS after abdominal surgery.

Introduction
Antibiotic therapy is a very important treatment for critically ill patients, but long-term administration can lead to antimicrobial resistance. Procalcitonin (PCT) has been used as a biomarker to monitor the effectiveness of antibiotic therapy with the aim of shortening the administration period. This study aimed to clarify the relationship of WBC counts to sepsis-related biomarkers (procalcitonin [PCT], endotoxin activity assay [EAA], interleukin-6 [IL-6] and presepsin) and the 28-day mortality rate in critically ill patients.

Methods
We studied 422 patients (L-group with WBC counts <4000: 79 patients; H-group with WBC counts >12,000: 343 patients). Blood biochemistry and PCT, EAA, IL-6, and presepsin were measured immediately after ICU admission. Results were expressed as the mean ± SD (median). The Mann-Whitney U-test and chi-square test or Fisher's exact test were used for statistical analysis. A p-value of <0.05 was considered to indicate a statistically significant difference.

Results
Regarding background factors, APACHE II scores in the L-group and H-group were 24.9 ± 9.2 (23.5) and 22.4 ± 9.1 (23.0), and SOFA scores were 9.3 ± 3.9 (9) and 12.9 ± 3.4 (8), respectively. These values showed no significant differences between groups. PCT levels in the L-group and H-group were 65 ± 96 (25) and 20 ± 51 (3.2), respectively; EAA levels were 0.59 ± 0.26 (0.62) and 0.36 ± 0.22 (0.32), respectively; and IL-6 levels were 61,728 ± 108,417 (15,150) and 2440 ± 3422 (217), respectively. All of these values tended to be higher in the L-group than in the H-group. On the other hand, presepsin levels in the L-group and H-group were 1277 ± 943 (882) and 2440 ± 422 (1210), respectively. The 28-day mortality rate was 34% in the L-group and 14.7% in the H-group, a significant difference between groups (p < 0.05).

Conclusions
In critically ill patients admitted to the ICU, most sepsis-related biomarkers (PCT, EAA, IL-6) tended to be higher in the low leukocyte count group (WBC < 4000), whereas only presepsin tended to be higher in the high WBC group (>12,000). Further study is necessary to explain this difference.

Introduction
Immunologic alterations are common during critical illness and may determine outcome. We evaluated whether lymphopenia (lymphocyte count <1000 cells/mm³) or a higher neutrophil-to-lymphocyte ratio (NLR, as a marker of inflammation) was associated with mortality in critically ill patients.

Methods
Prospective observational study of 221 consecutive adult patients admitted to our 14-bed intensive care unit (ICU). Neutrophil and lymphocyte counts were recorded every day. Receiver operating characteristic (ROC) curves and binary logistic regression analysis were used to test the association between lymphopenia/NLR and ICU mortality.

Introduction
In sepsis, chemotactic factors induce swelling and activation of leukocytes. We have invented a novel portable bedside device based on the principles of magnetic levitation (two opposing magnets with a capillary tube in between that suspend cells) to image and quantify morphological properties of circulating leukocytes using whole blood. The device separates blood cells based on their mass and magnetic properties. Our aim was to determine whether the magnetic levitation technique, by measuring leukocyte size and morphology parameters, can accurately identify Emergency Department (ED) patients with sepsis.

Methods
Single-center, prospective, observational cohort study of a convenience sample of adult (>17 y) patients from a 56,000-visit ED or affiliated outpatient lab between 3/2016 and 11/2016. Inclusion criteria: sepsis: patients admitted to the hospital with suspected or confirmed infection; non-infected controls: ED or outpatient clinic patients without infection or acute illness. Procedures: half a microliter of whole blood collected in an EDTA tube was mixed with a paramagnetic gadolinium solution, transferred to a capillary tube, and placed between magnets for imaging and data analysis. Primary analysis: comparison of sepsis patients vs. non-septic controls. Covariates of interest: leukocyte area, length, width, roundness, and standard deviation (SD) of levitation height (a measure of mass/charge dispersion). Means are reported with t-test comparisons and calculation of the area under the curve for assessment of diagnostic accuracy.

Introduction
Bedside tracheostomies are standardized, repeated procedures performed on relatively stable patients. The effects of tracheostomies at the biochemical level have not been studied before.

Methods
Five blood samples were obtained from patients undergoing bedside tracheostomies, at t = 0, 4, 8, 12 and 24 hrs. Vital signs and clinical lab measures were also recorded. The blood samples were assayed for the analytes shown in Fig. 10.

Results
We report results for 23 patients from this ongoing study. Parametric and nonparametric tests showed few statistically significant changes between the time points. Clearly discernible patterns of response were observed, which differed between patients. Using a self-organizing map clustering algorithm, we were able to cluster the responses over time for every analyte; e.g. GCSF showed 3 possible patterns of response: peak at t = 2 and then a steady decrease (4 pts); peak at t = 2 and then a plateau starting at t = 3 (12 pts); and peak at t = 2 with an increase at t = 4 (5 pts; 2 pts were excluded by the algorithm). The other cytokines each showed between 3 and 5 clusters of response patterns.
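The clustering step can be sketched with a minimal one-dimensional self-organizing map. This is illustrative only: the curve shapes below are invented to mimic the three GCSF patterns, and the implementation is a generic SOM, not the authors' software.

```python
import numpy as np

def som_1d(data, n_nodes=3, epochs=200, lr0=0.5, sigma0=1.0):
    """Minimal one-dimensional self-organizing map.

    data : (n_samples, n_timepoints) array of per-patient response curves.
    Returns the node weight vectors and the best-matching node per row.
    """
    rng = np.random.default_rng(42)
    # deterministic initialization from evenly spaced samples keeps the
    # sketch reproducible; SOM libraries usually initialize randomly
    w = data[np.linspace(0, len(data) - 1, n_nodes).astype(int)].astype(float)
    grid = np.arange(n_nodes)
    for epoch in range(epochs):
        frac = 1.0 - epoch / epochs
        lr = lr0 * frac                    # decaying learning rate
        sigma = max(sigma0 * frac, 1e-3)   # shrinking neighborhood width
        for x in data[rng.permutation(len(data))]:
            bmu = int(np.argmin(((w - x) ** 2).sum(axis=1)))  # best-matching unit
            h = np.exp(-((grid - bmu) ** 2) / (2.0 * sigma ** 2))
            w += lr * h[:, None] * (x - w)  # pull BMU and neighbors toward x
    clusters = np.array([int(np.argmin(((w - x) ** 2).sum(axis=1))) for x in data])
    return w, clusters
```

Patients whose analyte trajectories share a best-matching node form one response-pattern cluster, which is essentially the grouping described for GCSF above.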

Conclusions
The results were not amenable to parametric or non-parametric approaches, but we could classify the patients according to response patterns. This highlights the need for the development of individual-level measures and the personalization of clinical care.

Introduction
The endothelial protein C receptor (EPCR) is a protein that regulates the protein C anticoagulant and anti-inflammatory pathways. A soluble form of EPCR (sEPCR) circulates in plasma and inhibits activated protein C (APC) activities. The clinical impact of sEPCR and its involvement in the septic process is under investigation. This study investigated possible association of EPCR haplotypes with sEPCR levels in critically-ill patients with suspected infection.

Methods
Two polymorphisms in the EPCR gene were previously genotyped in 239 Caucasian critically ill patients hospitalized in the intensive care unit (ICU) of "Evangelismos" Hospital, Athens, Greece. The old [1][2] and new [3] sepsis definitions were used to divide patients into groups. Patients were further divided according to their genotype. Plasma sEPCR levels were measured using a dedicated ELISA assay in all patients at the time of ICU admission.

Results
Patients were categorized using both the old and new sepsis definitions. With the old definitions, patients were divided into two groups: severe sepsis/septic shock-positive (SS/SS+ve) and severe sepsis/septic shock-negative (SS/SS-ve). sEPCR levels were slightly higher in the SS/SS-ve group (p < 0.05). Patients were also divided according to their qSOFA score; in the three resulting groups, sEPCR levels were comparable (p > 0.05). However, when patients were divided strictly according to their genotype, plasma sEPCR levels differed between genotypes (p < 0.0001) and between H3 and non-H3 carriers (p < 0.0001), with higher sEPCR levels in H3 carriers.

Conclusions
Frequencies of the SNPs determining EPCR haplotypes were in concordance with Caucasian frequencies. Critically ill patients carrying at least one H3 allele had significantly higher levels of sEPCR than patients with no H3 allele. Using both classification systems, sEPCR levels were not associated with sepsis severity.

Conclusions
Based on our findings, copeptin serves as a strong prognostic marker on ED admission and might help to improve risk stratification in unselected medical ED patients.

P377
The association of admission procalcitonin levels and adverse clinical outcome across different medical emergency patient populations: results from the multi-national, prospective, observational TRIAGE study

Introduction
Although procalcitonin (PCT) has been extensively studied in infectious conditions, the clinical relevance of PCT in unselected emergency department (ED) patients remains incompletely understood. We investigated the association between admission serum PCT levels and adverse clinical outcomes in a large ED cohort from a previous multicenter study.

Methods
We prospectively enrolled 7132 adult medical patients seeking ED care in three tertiary care hospitals in Switzerland, France and the United States. We used adjusted multivariable logistic regression models to examine the association between admission PCT levels and 30-day mortality, stratified by principal medical diagnosis and concomitant comorbidities. We calculated regression models across different clinically established PCT cut-offs (0.05, 0.1, 0.25 and 0.5 ng/ml).
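The stratification into cut-off groups can be illustrated with a short sketch (the PCT values and outcomes below are invented; only the four cut-offs are those named above):

```python
import numpy as np

# hypothetical admission PCT values (ng/ml) for five patients
pct = np.array([0.03, 0.08, 0.20, 0.40, 2.50])

# clinically established cut-offs used for stratification
cutoffs = np.array([0.05, 0.1, 0.25, 0.5])

# stratum 0: <0.05, stratum 1: 0.05 to <0.1, ..., stratum 4: >=0.5
strata = np.digitize(pct, cutoffs)

def mortality_by_stratum(strata, died, n_strata=5):
    """30-day mortality rate within each cut-off stratum."""
    died = np.asarray(died, dtype=float)
    return [float(died[strata == s].mean()) if np.any(strata == s) else float("nan")
            for s in range(n_strata)]
```

In the study itself, each stratum then enters the adjusted logistic regression as a categorical predictor of 30-day mortality.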

Results
During the 30-day follow-up, 328 (4.9%) participants died. Mortality rates in the cut-off-stratified groups were 1%, 3%, 7%, 14% and 17%, respectively. PCT had a high prognostic power for 30-day mortality, with an area under the receiver operating characteristic curve of 0.74 (95% CI 0.72-0.77; SE 0.0133). After adjustment for age, gender and main diagnoses, the PCT cut-off levels were associated with a step-wise increase in the risk of 30-day mortality, with adjusted odds ratios of 1.9, 4.5, 8.6 and 11.0 for pulmonary disease; 1.9, 4.7, 8.7 and 11.0 for cardiovascular disease; and 1.9, 4.6, 8.8 and 11.4 for infectious disease, respectively. These associations were similar among different types of patients with regard to main diagnoses, comorbidities and age.

Conclusions
In this large medical ED patient cohort, admission PCT was a strong and independent predictor of 30-day mortality across different medical diagnoses. PCT may help to improve risk assessment in undifferentiated medical ED patients.

Results
Regarding demographic data, no significant differences were found except for ICU LOS, which was longer in the KPC group (KPCG). The main site of isolation of KPC was the respiratory tract. According to sepsis severity at diagnosis, we found the following PCT levels (ng/ml): sepsis: KPCG 0.9 ± 1.9, CG 6.6 ± 11.1 (p < 0.01); severe sepsis: KPCG 3.4 ± 6, CG 11.6 ± 19.8 (p < 0.01); septic shock: KPCG 8.9 ± 15.9, CG 30.5 ± 35.7 (p < 0.01). In the KPC group, the mean PCT level was 7 ng/ml in non-survivors and 2.7 ng/ml in survivors (p < 0.01). Mean PCT values increased with increasing SOFA score.

Conclusions
Our data demonstrated that in septic patients with KPC infection, PCT levels were significantly lower than those in septic patients with infections due to other Gram-negative species. Regarding outcome, PCT levels were confirmed to be related to disease severity, with mean levels significantly higher in non-survivors than in survivors.
Mortality rate was higher in the KPC group than in the control group, but the difference was not significant.

Introduction
Several prognostic scores and biomarkers have been assessed for risk stratification of septic patients in intensive care units (ICUs). Serum lactate is a routinely used biomarker for the management of patients with sepsis and correlates with hypoperfusion and fluid resuscitation. Procalcitonin (PCT) is a bacterial infection marker, and its kinetics indicate the response to antimicrobial management. There are insufficient data comparing these two markers regarding outcome prediction. Herein, we compared the prognostic accuracy of lactate and PCT kinetics, and their combination, in a large, well-defined US sepsis patient population.

Methods
This is an observational cohort study of adult patients with confirmed severe sepsis or septic shock included in a sepsis database from 14 different BayCare hospitals (Florida). All patients had PCT and lactate measured on admission and during follow-up based on the treatment protocol. We used logistic regression and the area under the curve (AUC) as measures of the discrimination of lactate and PCT for in-hospital mortality.

Results
The in-hospital mortality rate of the 1075 included patients (mean age 66.9 years) was 18.8%. Concerning prognosis, the initial lactate level was a better mortality predictor (AUC 0.64) than PCT (AUC 0.55). For follow-up measurements, PCT (AUC 0.75) showed better discrimination than lactate (AUC 0.73). When looking at biomarker kinetics, PCT increase was more strongly associated with fatal outcomes than initial levels alone (AUC 0.74) and was a better predictor than lactate kinetics (AUC 0.63). A joint logistic regression model combining follow-up measurements of lactate and PCT kinetics showed superior prognostic accuracy (AUC 0.82) compared to either marker alone.

Conclusions
Both biomarkers, PCT and lactate, provide prognostic information in ICU patients with sepsis, particularly when looking at kinetics.

Introduction
Sepsis is known to be the leading cause of morbidity and mortality in ICU patients. The APACHE II, SAPS II and SOFA scores are most frequently used for predicting mortality. Herein, we aimed to investigate the effect of C-reactive protein (CRP), procalcitonin (PCT) and mean platelet volume (MPV) on mortality of sepsis in the ICU.

Methods
Retrospectively, 25 sepsis patients admitted to our ICU in May 2016 with at least 5 days of ICU stay were evaluated for APACHE II, SAPS II and SOFA scores. The effect on mortality of CRP, procalcitonin and MPV values on the 1st, 5th and 10th days was analyzed with SPSS version 15 through receiver operating characteristic (ROC) curve analysis. Sensitivity, specificity, and positive and negative predictive values were calculated.

Results
Patients were between 18 and 79 years of age, with a median of 53 and a female-to-male ratio of 56:44%. ROC analysis revealed that last-day MPV values were highly predictive of mortality (AUC 0.99, p < 0.01). The cut-off for this value was calculated as 7.73, with a sensitivity of 90%, specificity of 100%, positive predictive value of 100% and negative predictive value of 53%. CRP values on the last day of ICU stay were predictive of mortality as well (AUC 0.85, p < 0.01), with a cut-off value of 7.55, sensitivity of 92.3% and specificity of 87.5% (positive predictive value 92.3%, negative predictive value 63.6%). Procalcitonin values on neither the first nor the last day of ICU stay were predictive of mortality. APACHE II, SAPS II and SOFA scores correlated well with MPV values and last-day CRP values, but not with any of the procalcitonin values.

Conclusions
Together with APACHE II, SAPS II and SOFA scores, CRP and MPV values are good predictors of mortality of sepsis in the ICU. We are considering a further study with a larger sample size to better evaluate the effect of procalcitonin on this issue.

Introduction
Bacterial infection often leads to decompensation in patients with liver cirrhosis. Its diagnosis should be swift; otherwise, it is associated with a poor prognosis. Our aim was to quantitate various inflammatory markers in cirrhotics with and without obvious infectious disease.

Methods
A total of 80 patients admitted for decompensated liver cirrhosis were included in this study. We explored the key epidemiologic, clinical and biological features of these patients.

Results
54 patients (67.5%) were assigned to the infection group, of whom nine (16.6%) had positive blood cultures. In this group, the following diagnoses were made: 34 lung infections, 11 spontaneous bacterial peritonitis, nine urinary tract infections, three skin infections and two gastroenteritis. Three patients had an infection without an identified source. The mean age was 62.8 years for infected patients (group 1) and 63.4 years for uninfected patients (group 2). Levels of C-reactive protein (CRP) and procalcitonin (PCT) were higher in group 1 than in group 2 (means of 52.7 mg/L and 1.3 ng/mL versus 13.3 mg/L and 0.1 ng/mL, respectively). The median CRP for group 1 was 43.15 mg/L. Infection at admission was associated with significantly higher levels of these proteins (p < 0.001). Leucocyte count was also higher in group 1 (p = 0.04). 49 patients had CRP levels below 43.15 mg/L, of whom 25 (51%) had infections. In this particular group, CRP was significantly higher in infected patients (p < 0.001), unlike PCT (p = 0.06) and leucocyte count (p = 0.23).

Conclusions
Elevated PCT appears only in inflammation of infectious etiology with systemic signs. Therefore, in cirrhotic patients, procalcitonin determination seems appropriate for the diagnosis of severe infections but does not seem more reliable than CRP in other situations.

Introduction
Development of new microbial load biomarkers proceeds, but still there is no perfect one. Procalcitonin (PCT) is commonly used, and its level correlates with the extent of microbial invasion. It was found earlier that aromatic microbial metabolites (AMM) are associated with severity and mortality in ICU patients [1,2]. The effectiveness of treatment can also be evaluated by the AMM level [1]. Experimental studies have shown their biological activity [3,4]. The aim of this work was to analyze different criteria of bacterial load in ICU patients with nosocomial pneumonia.

Results
In serum samples of 46 patients, the total concentration of 9 AMM was 3.4 (2.2-17.4) μM, which was higher (p < 0.05) than in healthy donors.

Introduction
According to in vitro data, levels of pro- and anti-inflammatory mediators can be markedly decreased in septic shock by hemoperfusion using a novel cytokine adsorbent therapy (CytoSorb). However, the capacity and performance of the adsorber over time have not yet been investigated. Our aim was to determine the elimination of procalcitonin (PCT) by the adsorbent in vivo, in patients with septic shock treated with CytoSorb for 24 hours.

Methods
The current study is a spin-off of the ongoing "Adsorption Cytokines Early in Septic Shock" (ACESS) trial. CytoSorb therapy was commenced early (<48 h) after the onset of septic shock and performed for 24 hours. Blood samples were taken from the systemic circulation every 6 hours from the beginning (T0) to the end of CytoSorb therapy (T6, T12, T18, T24). Serum PCT, CRP, IL-1, IL-1ra, IL-6, IL-8, IL-10 and TNF-α levels were measured. PCT levels were determined from blood taken simultaneously from the pre- and post-adsorbent samples; PCT elimination showed an exponential decline from 90% to a negligible degree after 12 hours. This phenomenon, on the one hand, shows the early efficacy of CytoSorb therapy, while on the other hand it raises the question of changing the adsorbent earlier than the current practice of 24 hours.
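A per-pass elimination of the kind described is usually quantified as the extraction ratio across the cartridge; the sketch below uses invented paired PCT values shaped to mimic the reported decline from ~90% to a negligible degree (not trial data):

```python
import numpy as np

def extraction_ratio(pre, post):
    """Per-pass removal fraction across the adsorbent cartridge,
    (pre - post) / pre, expressed as a percentage."""
    pre, post = np.asarray(pre, float), np.asarray(post, float)
    return 100.0 * (pre - post) / pre

# invented paired PCT samples (ng/ml) at T0, T6, T12, T18, T24
pre_pct = np.array([40.0, 30.0, 22.0, 18.0, 15.0])
post_pct = np.array([4.0, 18.0, 20.9, 17.6, 14.9])
ratios = extraction_ratio(pre_pct, post_pct)  # ~90% at T0, <1% at T24
```

A monotonically falling extraction ratio of this shape is what would motivate exchanging the adsorbent before the 24-hour mark.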

Introduction
The SOFA score is associated with an increased probability of mortality in sepsis, but its assessment at admission in the ED requires several laboratory variables. The Third International Consensus Definitions for Sepsis and Septic Shock defined the qSOFA, which does not require laboratory tests and can be assessed at patient admission. Objective: to compare sepsis biomarkers with SOFA and qSOFA to discriminate sepsis, severe sepsis or septic shock and to evaluate the association with increased risk of mortality.

Methods
66 patients admitted to the ED with signs of sepsis and ≥2 SIRS criteria were included. Severe sepsis and septic shock were defined according to current guidelines. The qSOFA score was calculated using the recommended thresholds: respiratory rate ≥22/min, GCS score <15, systolic blood pressure ≤100 mmHg. Presepsin and procalcitonin were determined using the POC assay PATHFAST Presepsin (PSEP; LSI Medience, Japan) and the BRAHMS luminescence immune assay (PCT).
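The qSOFA thresholds translate directly into a scoring function; a minimal sketch (`qsofa` is our own name, not part of any assay software):

```python
def qsofa(resp_rate, systolic_bp, gcs):
    """qSOFA per the Sepsis-3 thresholds cited above: one point each for
    respiratory rate >= 22/min, systolic BP <= 100 mmHg and GCS < 15."""
    return int(resp_rate >= 22) + int(systolic_bp <= 100) + int(gcs < 15)
```

A score of 2 or more is the conventional flag for patients at higher risk of poor outcome, which is the cohort-stratification role qSOFA plays in this study.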

Conclusions
The results demonstrated that the qSOFA score is not a stand-alone criterion for risk stratification in sepsis at admission. Simultaneous assessment combining qSOFA and PSEP improved the validity significantly. The POC assay PATHFAST Presepsin showed superior performance compared to lactate and PCT.

Introduction
Early identification of sepsis and its differentiation from non-infective SIRS are important for sepsis outcome. We aimed to evaluate the use of presepsin in differentiating sepsis from non-infectious SIRS and its prognostic value compared to CRP.

Methods
We included 31 patients (median age 60 years, 16 males) admitted with SIRS to El-Sahel Teaching Hospital, Egypt, after excluding 21 patients with preadmission corticosteroid therapy, blood transfusion, immunosuppressive illness, or an ICU length of stay (ICU-LOS) of less than 24 hours. Patients were classified into a non-infective SIRS group (13 patients) and a sepsis group (18 patients). Presepsin, CRP and the SOFA score were measured on admission and on days 2 and 4 of admission. The outcome parameters studied were ICU-LOS and in-hospital survival.

Results
Apart from temperature and AST, which were significantly higher in the sepsis group, the two groups were comparable. All presepsin levels, and CRP on days 2 and 4, were significantly higher in the sepsis group than in the SIRS group. The ICU-LOS was positively correlated with all presepsin levels and with CRP levels on days 2 and 4. All presepsin values were significantly higher in non-survivors, while none of the CRP levels differed significantly between survivors and non-survivors. A decrease of presepsin over time was significantly associated with better survival; it was 70% sensitive and 91% specific for predicting survival in SIRS patients. This relation was not found for CRP levels.

Conclusions
We conclude that presepsin may be used for early differentiation between sepsis and non-infectious SIRS and to predict higher mortality.

Introduction
Several cumbersome scoring systems have been developed for prognosis and outcome prediction in sepsis. In this study, we aimed to evaluate the urinary albumin/creatinine ratio (ACR) as a prognostic predictor in sepsis.

Methods
We included 40 adult septic patients in a prospective observational study, excluding patients with preexisting chronic kidney disease or diabetes mellitus. After clinical evaluation, spot urine samples were collected on admission and 24 h later for ACR1 and ACR2. The admission APACHE IV score and the highest SOFA score recorded on daily estimation were considered. We also evaluated the need for mechanical ventilation, inotropic and/or vasoactive support, and renal replacement therapy (RRT), as well as in-hospital mortality.

Results
In a population aged 63 (55-71) years with 29 (72.5%) males, we found that ACR2 correlated with the SOFA score (r = 0.4, p = 0.03).

Introduction
Sepsis is characterized by endothelial dysfunction. Crucial for endothelial function is the glycocalyx (GLY), a carbohydrate-rich layer on the endothelial surface, which regulates vascular permeability, microcirculation and mechanotransduction. The GLY is anchored to the endothelium by glycoproteins and proteoglycans including syndecan-1 (SDC1). Shedding of SDC1 has been observed in situations where the endothelium is damaged. Another key regulator of vascular permeability is sphingosine-1-phosphate (S1P), which upon binding to endothelial S1P receptor 1 strengthens barrier function. S1P has been observed to protect the GLY by inhibiting matrix metalloproteinase (MMP) activity. We have recently observed decreased S1P levels in patients with sepsis, and here we addressed the question of whether decreased S1P levels are associated with increased serum concentrations of SDC1 in the cohort of septic patients we previously studied for S1P levels [1].

Methods
The local ethics committee approved this study by acceptance of an amendment to our previous protocol. 59 ICU patients with sepsis or septic shock were enrolled. Blood samples were drawn on day 1 after admission and serum was prepared. Soluble SDC1 was measured by ELISA and S1P by mass spectrometry. The SOFA score was used to assess disease severity. Twenty healthy controls were included.

Results
SDC1 levels were increased and S1P levels significantly decreased in sepsis patients compared to controls. The highest SDC1 levels and lowest serum-S1P levels were found in patients with septic shock. The mean serum SDC1 concentration was 33 ± 24 ng/mL in controls, 196 ± 162 ng/mL in patients with sepsis and 260 ± 164 ng/mL in patients with septic shock. Using linear regression analyses, S1P and SDC1 were found to be inversely associated (P < 0.001). The SDC1/S1P ratio was positively associated with the SOFA score (P < 0.01). Generating ROC curves for the SDC1/S1P ratio and the SOFA score to indicate septic shock yielded similar areas under the curve (AUCs): 0.77 for the SDC1/S1P ratio and 0.80 for the SOFA score.
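The area under the ROC curve reported for the SDC1/S1P ratio can be computed via the Mann-Whitney interpretation of the AUC: the probability that a randomly chosen shock patient has a higher score than a randomly chosen non-shock patient. The sketch below uses invented ratio values purely to show the computation; it is not the study's data.

```python
# Illustrative sketch: ROC-AUC via the Mann-Whitney U statistic.
# The ratios and labels below are invented, not the study's patients.

def auc(scores, labels):
    """Fraction of (positive, negative) pairs where the positive case
    has the higher score; ties count as half a win."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

ratio = [2.1, 5.5, 8.0, 6.0, 6.3, 0.7]  # hypothetical SDC1/S1P ratios
shock = [0, 1, 1, 0, 1, 0]              # 1 = septic shock
print(round(auc(ratio, shock), 2))
```

This rank-based formulation is equivalent to integrating the empirical ROC curve, which is how the abstract's AUCs of 0.77 and 0.80 would be obtained from patient-level scores.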

Conclusions
We found increased SDC1 levels, which were associated with decreased serum-S1P levels, in septic patients. SDC1 and S1P are inversely correlated, and the SDC1/S1P ratio emerges as a valuable prognostic marker for septic shock.

Introduction
Postoperative inflammation is associated with a sepsis-like syndrome including endothelial barrier disruption, volume depletion and hypotension. It is especially associated with surgical procedures that require extracorporeal circulation. Sphingosine-1-phosphate (S1P) is a signaling lipid regulating permeability and vascular tone. In animal models, S1P and S1PR1-specific agonists were able to limit trauma-induced barrier disruption. Moreover, in septic humans decreased serum-S1P levels have been identified as a marker of sepsis severity, with low levels predictive of septic shock. Here we investigated whether S1P levels are altered by the surgical trauma of cardiac bypass surgery.

Methods
26 patients with coronary heart disease scheduled for coronary bypass surgery were prospectively enrolled in this study. Serum samples were drawn pre-procedure, post-procedure and on day 1 and day 4 after surgery. S1P concentration in μM was quantified by mass spectrometry. In addition, we analyzed potential S1P sources: red blood cells (RBC) and platelets. We further quantified levels of other inflammatory markers: C-reactive protein, procalcitonin, interleukin-6 and von Willebrand antigen.

Results
Mean serum-S1P levels before the procedure were 0.74 ± 0.2 μM. S1P levels were lowest directly after surgery, having dropped by 50% (0.37 ± 0.11 μM). During recovery, serum-S1P levels increased to 0.60 ± 0.13 μM on day 1 and exceeded preoperative levels on day 4 (0.82 ± 0.18 μM). No difference was observed between patients operated with or without extracorporeal circulation: in both groups the serum-S1P kinetics were similar, with significantly decreasing levels during surgery and increasing levels during recovery. In a correlation analysis, serum-S1P levels correlated with the counts of their sources, red blood cells (RBC) and platelets. We did not observe a significant correlation of S1P levels with the dosage of vasopressors in the intraoperative phase or with other inflammatory markers.

Conclusions
Cardiac bypass surgery induces a rapid decrease in serum-S1P levels. S1P might be an early marker of postoperative systemic inflammation that is independent of other commonly used inflammatory markers.

Introduction
Early diagnosis of acute posttraumatic osteomyelitis (POM) is of vital importance to avoid devastating complications. The lack of a specific and sensitive test, such as exists for myocardial infarction, makes POM difficult to diagnose. Serum inflammatory markers are not able to differentiate the response to infection from the host response to a non-infectious insult.

Methods
This prospective nonrandomised cohort study included 115 patients after high-energy injury to the crus requiring primary surgical treatment. Values of the biochemical and immunoinflammatory profile were measured on admission (ADD), the first postoperative day (POD1) and the fourth postoperative day (POD4). Data collected from laboratory measurements and inpatient and outpatient medical records were analysed by descriptive statistics, univariate logistic regression and multivariate logistic regression with Firth correction and penalized regression.

Results
The best predictors of POM on admission were T regulatory cells CD4+CD25+, CRP and WBC counts; on POD1: WBC counts, CRP and PCT; and on POD4: CRP, albumin, PCT and WBC counts. Performing logistic regression with Firth correction and multivariate penalized regression, we found that development of POM is best predicted by measuring CRP on ADD, POD1 and POD4, PCT on ADD and POD4, and albumin on POD4, together with assessment of the extent of soft tissue damage, transfusion rate, need for conversion of primary external fixation to intramedullary (IM) nailing or locking plate fixation, and the patient's physical status ASA 3-4 according to the American Society of Anesthesiologists. Our LASSO predictive model with cross-validated parameters showed good performance (AUC = 0.75).
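The LASSO-style variable selection used above can be sketched with a toy L1-penalised logistic regression fitted by proximal gradient descent. Everything here (the data, the penalty weight, the learning rate) is invented for illustration and is not the study's model; the point is only to show how the L1 penalty zeroes out an uninformative predictor while keeping an informative one.

```python
# Toy sketch of L1-penalised (LASSO-style) logistic regression by
# proximal gradient descent. Feature 0 drives the outcome; feature 1
# is noise that the penalty should shrink to zero.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def soft_threshold(w, t):
    # Proximal operator of the L1 penalty: shrink toward zero by t.
    return math.copysign(max(abs(w) - t, 0.0), w)

def lasso_logistic(X, y, lam=0.1, lr=0.3, iters=3000):
    n, d = len(X), len(X[0])
    w, b = [0.0] * d, 0.0
    for _ in range(iters):
        preds = [sigmoid(sum(wj * xj for wj, xj in zip(w, x)) + b) for x in X]
        for j in range(d):
            g = sum((p - yi) * x[j] for p, yi, x in zip(preds, y, X)) / n
            w[j] = soft_threshold(w[j] - lr * g, lr * lam)  # L1 proximal step
        b -= lr * sum(p - yi for p, yi in zip(preds, y)) / n  # bias unpenalised
    return w, b

X = [[0.0, 0.5], [1.0, -0.3], [2.0, 0.2], [3.0, -0.5], [4.0, 0.1], [5.0, -0.1]]
y = [0, 0, 0, 1, 1, 1]
w, b = lasso_logistic(X, y)
print(round(w[0], 2), round(w[1], 2))  # informative weight grows, noise weight ~0
```

In practice such models are fitted with dedicated solvers and the penalty weight is chosen by cross-validation, as the abstract describes.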

Conclusions
We can improve prediction of POM development by using the perioperative inflammatory biomarkers PCT and CRP in combination with postoperative albumin and by considering independent risk factors.

Introduction
The goal of this investigation was to determine whether acute influenza infection is associated with depletion of Coenzyme Q10 (CoQ10) and to determine any associations between CoQ10 levels, severity of illness and inflammatory biomarkers. CoQ10 is a key cofactor in the mitochondrial respiratory chain and may be depleted in acute critical illness. Prior investigations have found depleted CoQ10 levels, correlated with the inflammatory cascade, in patients with both septic shock and post-cardiac arrest. Statin medications have also been associated with decreased CoQ10 levels.

Methods
We analyzed serum CoQ10 concentrations of patients with acute influenza infection who were enrolled in a randomized clinical trial administering atorvastatin or placebo. Patients were enrolled at a single urban tertiary care center over 3 influenza seasons (12/2013 to 5/2016). Blood samples were obtained at enrollment prior to administration of any study drug. Healthy adult controls were used for comparison. We used the Wilcoxon rank sum test to compare CoQ10 levels between influenza patients and controls. Correlations between CoQ10 levels and inflammatory biomarkers and patient-reported severity of illness were assessed using the Spearman correlation coefficient.

Results
We analyzed CoQ10 from 50 patients with influenza and 29 healthy controls. Overall, patients with acute influenza infection had lower levels of CoQ10 compared to controls (0.61 ± 0.31 vs 0.77 ± 0.23 μg/mL, p = 0.004). There were significantly more patients in the influenza group with low CoQ10 levels compared to controls (48% vs 7%, p = 0.0001). There were significant correlations between CoQ10 levels in influenza patients and several inflammatory biomarkers: interleukin-2 (IL-2) (r = -0.297, p = 0.038), tumor necrosis factor alpha (TNF-α) (r = -0.352, p = 0.013) and vascular endothelial growth factor (VEGF) (r = 0.381, p = 0.007). There was no correlation between influenza severity of illness score at the time of enrollment and CoQ10 levels.
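The Spearman coefficients above are Pearson correlations computed on ranks. A minimal sketch of that computation, using made-up CoQ10 and TNF-α values rather than the study's data:

```python
# Hedged sketch of the Spearman rank correlation: rank both variables
# (average ranks for ties), then compute the Pearson correlation of ranks.
# Values below are invented for demonstration.

def rank(xs):
    """Average 1-based ranks, handling ties."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(xs):
        j = i
        while j + 1 < len(xs) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    rx, ry = rank(x), rank(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

coq10 = [0.3, 0.5, 0.6, 0.8, 0.9]   # hypothetical CoQ10 levels
tnf = [9.0, 7.0, 6.0, 4.0, 2.0]     # hypothetical TNF-α, inversely ranked
print(round(spearman(coq10, tnf), 2))  # -1.0
```

A negative coefficient, as in the abstract's r = -0.352 for TNF-α, indicates that higher inflammatory marker levels tend to accompany lower CoQ10.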

Conclusions
We found that CoQ10 levels were significantly lower in patients with acute influenza infection as compared to controls and that CoQ10 levels had a significant correlation with several inflammatory biomarkers. Further study is warranted to define these relationships and identify potential targets for therapy in acute influenza.

Introduction
Sepsis is the body's systemic inflammatory response to infection and can progress to severe sepsis, septic shock and ultimately multiple organ failure. Rapid detection of severe sepsis and septic shock is crucial for a good outcome. Heparin-binding protein (HBP) is a multifunctional inflammatory mediator that has been demonstrated in various infectious diseases caused by a wide array of bacteria. Procalcitonin and heparin-binding protein values may be useful for early diagnosis in sepsis, severe sepsis and septic shock patients in the intensive care unit. The present study was conducted to determine heparin-binding protein and procalcitonin levels for early diagnosis in patients with sepsis, in comparison with C-reactive protein, IL-6 and TNF-alpha.

Methods
This was a prospective, observational study of ICU patients with suspected infection, conducted in the ICU of Health Sciences University, Kocaeli Derince Training Hospital. The study was conducted over a 7-month period between January and July 2016. Blood samples were taken on the first and second day of hospitalization, on the seventh day, and on the day of discharge or the day of death.

Results
Heparin-binding protein, PCT, TNF-alpha, IL-6 and C-reactive protein levels increased in parallel with the severity of the patient's clinical condition. Heparin-binding protein exhibited the greatest sensitivity and specificity in differentiating patients with sepsis. When comparing all the data, HBP was the best predictor for early diagnosis of sepsis and organ dysfunction (our study continues).

Conclusions
In the present study heparin-binding protein was found to be a more accurate diagnostic parameter for differentiating sepsis and organ dysfunction. Heparin-binding protein may be helpful in the follow-up of sepsis patients, and HBP measurement in the ICU may help monitor treatment in patients with sepsis, severe sepsis and septic shock.

Introduction
Ventilator-associated pneumonia (VAP) is the most common nosocomial infection in critically ill patients and is associated with increased mortality and morbidity, prolonged mechanical ventilatory support and longer length of stay in the intensive care unit. The Clinical Pulmonary Infection Score (CPIS), based on the chest X-ray, has been developed to facilitate clinical diagnosis; however, this scoring system has low diagnostic performance. In this study, the authors developed a new scoring system for early diagnosis of VAP, the "Lung Ultrasound and Pentraxin-3 Pulmonary Infection Score (LUPPIS)", based on pentraxin-3 (PTX-3) levels and lung ultrasonography, and evaluated the performance of this new scoring system.

Methods
This single-center, observational, prospective study included 78 patients who received inpatient therapy in the Intensive Care Unit between January 2015 and April 2016 and were suspected of having VAP. On the day of study, an endotracheal aspirate was obtained for Gram staining and culture. PTX-3, procalcitonin (PCT), and other biomarkers were recorded. The diagnosis of VAP was confirmed according to either culture positivity or, in culture-negative patients, all clinical criteria together with an antibiotic regimen initiated or modified within 48 hours.

Results
No significant differences were found between groups with respect to age, mechanical ventilation time, APACHE II score, and SOFA score (p > 0.05). PCT and PTX-3 levels were significantly higher in the VAP (+) group (p < 0.001 for both). The threshold for LUPPIS in differentiating VAP (+) from VAP (-) patients was >7 according to the highest Youden index. In predicting VAP, LUPPIS > 7 (sensitivity of 84%, specificity of 87.7%) was superior to CPIS > 6 (sensitivity of 40.1%, specificity of 84.5%).
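The Youden-index threshold selection used to fix the LUPPIS cut-off at >7 can be sketched as follows. The scores and labels are invented; the point is the procedure of maximising J = sensitivity + specificity - 1 over candidate cut-offs.

```python
# Illustrative sketch: choosing a score threshold by the Youden index
# (J = sensitivity + specificity - 1). Data are synthetic, not the study's.

def youden_threshold(scores, labels):
    """Return the cut-off (using score > cut) maximising J, and J itself."""
    best_j, best_cut = -1.0, None
    for cut in sorted(set(scores)):
        tp = sum(1 for s, y in zip(scores, labels) if s > cut and y == 1)
        fn = sum(1 for s, y in zip(scores, labels) if s <= cut and y == 1)
        tn = sum(1 for s, y in zip(scores, labels) if s <= cut and y == 0)
        fp = sum(1 for s, y in zip(scores, labels) if s > cut and y == 0)
        sens = tp / (tp + fn) if tp + fn else 0.0
        spec = tn / (tn + fp) if tn + fp else 0.0
        j = sens + spec - 1.0
        if j > best_j:
            best_j, best_cut = j, cut
    return best_cut, best_j

scores = [3, 5, 6, 7, 8, 9, 10, 11, 4, 8]   # hypothetical LUPPIS values
labels = [0, 0, 0, 0, 1, 1, 1, 1, 0, 1]     # 1 = VAP (+)
cut, j = youden_threshold(scores, labels)
print(cut, round(j, 2))  # 7 1.0
```

With real, overlapping score distributions J is below 1 and the chosen cut-off trades sensitivity against specificity, as in the reported 84% / 87.7%.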

Conclusions
Early diagnosis of VAP currently remains challenging due to the lack of a validated gold-standard diagnostic method. The diagnosis and follow-up of pulmonary infections rely on the chest X-ray, although this is often not feasible for critically ill patients in daily practice. Lung ultrasound (USG) is an important, simple, noninvasive, and cost-effective technique used in the diagnosis and differentiation of lung pathologies. Considering the overlap of the confidence intervals in the sensitivity analysis in the present study, LUPPIS appears to provide better results in the prediction of VAP compared to CPIS, emphasizing the importance of lung USG and PTX-3, which is a distinctive property of LUPPIS.

Introduction
The National Early Warning Score (NEWS) is a UK-wide system measuring physiological variables to standardise evaluation of acute illness. We hypothesise that the NEWS may provide novel prognostic information for patients admitted to ICU.

Methods
We retrospectively assessed the NEWS and outcomes of emergency admissions of ward-based patients to the critical care unit in a UK tertiary care hospital. Data were obtained via patient notes and electronic records over a 6-month period. We evaluated the NEWS trigger, the seniority of the ward doctor reviewing the patient prior to ICU admission, Medical Emergency Team (MET) involvement, organ support requirements and mortality outcomes in two distinct cohorts: NEWS ≥7 versus NEWS ≤6.

Results
50 sets of notes were analysed. In the NEWS ≥7 group, 65% of patients were seen by a ward registrar or above prior to critical care involvement. Only 42% of patients admitted to the ICU with NEWS ≥7 triggered a MET alert, below the national standard. Over 90% of patients in both groups required at least one form of organ support, with a preponderance of basic cardiac and renal support in the NEWS ≤6 group. One-week mortality was higher in the NEWS ≥7 group at 13% versus 4%, but the 1-year mortality difference was negligible.

Conclusions
The RCP guidance for triggering an emergency review is poorly adhered to in our hospital. The necessity for organ support and one-year mortality were similar in both groups, suggesting that patients with NEWS ≤6 are as vulnerable as the higher-scoring group and highlighting a need to identify such patients more accurately. We propose that the NEWS be combined with a nurse-led observational "level of concern" score, supporting earlier referral to a senior physician.

Introduction
Several scoring systems are used to risk stratify patients and alert staff to clinical deterioration. In the UK, the most widely used of these is the NEWS, a scoring tool based on physiological observations. There is increasing evidence that NEWS may be a useful diagnostic and prognostic tool in sepsis [1]. A novel biochemical extension to the NEWS, using parameters from a blood gas (the metabolic score) was found to predict organ failure and death within 48 hours of admission [2]. We compared the predictive ability of NEWS with metabolic score for morbidity and mortality outcomes in patients with sepsis.

Methods
This retrospective observational study involved adult patients presenting to the emergency department of University Hospital Lewisham (UK) from September 2015 to August 2016 with sepsis requiring inpatient care. Demographic data were gathered from the hospital electronic patient record system. NEWS was calculated from physical observations taken during the patients' triage. The presence of co-morbidities and the metabolic score were collected using a pro forma completed by the clinician managing each case. The primary outcomes were evidence of organ failure, escalation of care and death within 48 hours of admission; this information was gathered from the electronic patient record system.

Results
Data were gathered from 140 patient attendances (mean age = 65 years [SD = 19.98], male = 65). 61 patients were identified as having evidence of organ failure, 12 patients required escalation of care and 22 patients died.
After controlling for demographic data, NEWS score at triage was predictive for the development of organ failure (p = 0.002) and death (p = 0.003); however, it held no predictive value for escalation of care (p = 0.045). Metabolic score was predictive for death (p = 0.013) and the need for escalation of care (p < 0.001); however, it was not predictive for the development of organ failure (p = 0.65).

Fig. (abstract P393). Difference in percentage mortality based on NEWS

Fig. 12 (abstract P393). Difference in organ support requirements during ICU stay based on NEWS

Conclusions
Neither the NEWS nor the metabolic score alone can accurately predict all of the development of organ failure, escalation of care and death. We suggest that a combination of these scores be used to aid risk stratification of septic patients presenting to the emergency department.

Introduction
We compared the new sepsis criteria (Sepsis-3) with the previous criteria (Sepsis-2) for predicting hospital mortality in oncological patients of a Brazilian cancer hospital. Any improvement in sepsis treatment must begin with early detection, especially in high-risk populations such as oncological patients in developing countries. Diagnostic criteria that delay diagnosis will impact patient outcomes, especially mortality.

Methods
Retrospective study conducted between 2009 and 2014 in a 55-bed ICU of a tertiary teaching cancer hospital. We included all oncological patients admitted to the ICU who received a diagnosis of sepsis, severe sepsis or septic shock at ICU admission by the attending physician. Mortality was compared across categories of both the Sepsis-2 (sepsis, severe sepsis, septic shock) and Sepsis-3 (infection, sepsis, septic shock) definitions. A p value < 0.05 was considered statistically significant.

Conclusions
In oncological patients of a Cancer Hospital located in a developing country, Sepsis 3 criteria were accurate in stratifying hospital mortality and in some aspects superior to the definitions previously used.

Introduction
Patients in the early stage after hematopoietic stem cell transplantation (HSCT) are at great risk of infectious complications, and sepsis contributes to morbidity and mortality in these patients. It is therefore crucial to detect sepsis early and to institute proper management rapidly. Our hypothesis was that the national early warning score (NEWS) and a rapid response team (RRT) are useful for early detection and resuscitation of sepsis in HSCT patients.

Methods
All post-HSCT adult patients in the sterile room unit of our university hospital who developed persistent fever for 3 days from January to December 2014 were included in this retrospective observational study. Patients were divided into two groups: 1) the control group and 2) the intervention group. Patients in the control group were treated with a local routine protocol. Patients in the intervention group received the local routine protocol treatment and were further evaluated with NEWS, with simultaneous RRT activation. The RRT was activated when the total NEWS exceeded 7 or any individual item score exceeded 3. The 90-day mortality, the number of ICU admissions, SOFA score at ICU admission, the number of patients requiring vasopressors, the number of patients requiring mechanical ventilation, and days in the post-HSCT sterile room unit were compared between the two groups. Values are expressed as mean ± SD. Data were analyzed by Mann-Whitney U tests or the χ² test. A p < 0.05 was considered statistically significant.

Results
There were 66 patients (35 men, 31 women; median age 37 (18-62)) in this study. The number of patients admitted to the ICU (6 vs. 3), SOFA score at ICU admission (7.6 ± 1.3 vs. 5.2 ± 1.2), the number of patients requiring vasopressors (8 vs. 4) and the number of patients requiring mechanical ventilation (6 vs. 3) were statistically higher in the control group than in the intervention group (p < 0.05). There were no significant differences in sex, 90-day mortality (17.6% vs. 15.6%) or days in the post-HSCT sterile room unit (17.4 ± 2.4 vs. 16.7 ± 2.8).

Conclusions
NEWS monitoring and the RRT did not statistically improve 90-day mortality in this study. However, the strategy using NEWS and RRT could contribute to early detection of deterioration and to timely ICU care in post-HSCT patients admitted to a special sterile room.

Introduction
Following publication of the Sepsis-3 review [1], we aimed to evaluate the awareness and use of the new qSOFA criteria for assessing patients with suspected sepsis and the use of SOFA scoring for subsequent evaluation. We also aimed to assess the correlation between MEWS and qSOFA, and the strength of the 24hr/48hr/mean/maximum SOFA and APACHE II scores as predictors of mortality when compared to patient outcome.

Methods
This is a retrospective cross-sectional study over a period of 3 months. The patients' electronic records were reviewed for the use and documentation of qSOFA, SOFA, or any other clinical parameters used to justify sepsis requiring ICU input. If not present, the SIRS, qSOFA, and 24hr, 48hr, mean and maximum SOFA scores were calculated retrospectively. Outcome (discharge vs death) was tested against the mortality risk scores. Inter-test agreement independent of outcome was also analysed. Pearson correlation and binary logistic regression analysis were used to assess statistical significance.

Results
Of the 214 patients admitted over the three-month period, 81% were admitted with a SIRS score ≥2 regardless of diagnosis. 33/214 patients (15.42%) were referred and admitted due to sepsis. None had SIRS or qSOFA documented either prior to or during their ICU admission. All had MEWS, APACHE II and SOFA scores done only on ICU admission. All had a retrospectively calculated admission qSOFA of 2, and their 24hr SOFA scores ranged between 5 and 16. Two patients admitted with sepsis had SOFA scores of 10 and 16 respectively, a qSOFA score of 2, but only 1 SIRS criterion. Three patients had a MEWS of 2, nine between 4-5 and twenty-two between 6-20; 94% of patients with sepsis admitted to ICU had a MEWS ≥4. The top 5 clinical parameters used to justify ICU input due to sepsis were: reduced GCS, hypotension, tachycardia, pyrexia and type 1 respiratory failure. The strongest correlation with patient outcome was for the 48hr SOFA (p ≤ 0.01), followed by the mean SOFA (p ≤ 0.01) and APACHE II (p ≤ 0.05). The best agreement between the mortality prediction scores was between APACHE II and the mean SOFA score (p ≤ 0.01), independent of outcome.

Conclusions
Despite the dichotomy of approach to sepsis, clinical judgement has remained constant. We observed that the 48hr SOFA had the best correlation with patient outcome. All septic patients had a qSOFA of 2, correlating well with the MEWS ≥4 seen in the majority of septic patients prior to their ICU admission; however, we agree with Vincent et al. [2] that until qSOFA is validated further, it should be used as a guide and not a replacement for SIRS.
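The qSOFA discussed throughout this abstract is a simple three-item count from Sepsis-3 (one point each for respiratory rate ≥22/min, systolic BP ≤100 mmHg, and altered mentation, i.e. GCS <15). A minimal sketch:

```python
# Minimal sketch of the Sepsis-3 qSOFA count (0-3); a score >= 2 flags
# patients at higher risk. Example vitals below are hypothetical.

def qsofa(resp_rate, systolic_bp, gcs):
    return (resp_rate >= 22) + (systolic_bp <= 100) + (gcs < 15)

print(qsofa(24, 95, 13))   # 3
print(qsofa(18, 120, 15))  # 0
```

The retrospectively calculated admission qSOFA of 2 reported above corresponds to any two of these three criteria being met at triage.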

Introduction
Antibiotic resistance is increasing in frequency due to higher rates of inappropriate antimicrobial therapy and empiric use of broad-spectrum antibiotics.

Methods
We employed a real-time multiplexed automated microscopy system (ID/AST; Accelerate Diagnostics, Tucson, Arizona) capable of evaluating antibiotic susceptibility and resistance directly from positive blood culture broth in septic patients using automated phenotypic growth pattern analysis.

Conclusions
The ID/AST system provided accurate pathogen identification and susceptibility more than 1 day sooner compared to standard blood culture processing.

Introduction
Catheter-related bloodstream infection (CRBSI) requires a positive blood culture and a positive result by the semiquantitative culture method (SQC) or the quantitative sonication method (SON). Our aim was to compare the yields of both techniques in detecting colonization and infection in short- and long-term catheters in ICU patients.

Methods
We prospectively studied central venous antimicrobial catheters. Both methods were performed on tips cut into 2 equal segments each, to avoid loss of microbial colonies during serial examination and contamination. The SQC method was performed by rolling the tip on a Columbia agar plate. Sonication was applied by placing the tip in 5 ml of 0.9% NaCl, sonicating for 5 min, vortexing for 15 s and culturing 0.

Conclusions
Our findings, unlike previous studies, imply superiority of the SON method over SQC in the detection of catheter colonization and CRBSI. A hypothetical explanation is the prolonged sonication time of 5 min (instead of the usual 1 min), which probably enhances detachment of gram-negative bacteria that, as endoluminal bacteria, are more difficult to isolate by SQC. Especially for long-term catheters, where colonization by gram-negative bacteria is most frequent according to the literature, the sonication method may prove particularly useful for colonization detection and CRBSI diagnosis. Further investigation is envisaged to explore the potential of the method.

Introduction
Hospital-acquired (or nosocomial) infections are important causes of morbidity and mortality despite improved antimicrobial therapy, supportive care, and prevention. However, due to aging, comorbidities and repeated contact with healthcare facilities, many patients are already infected with multidrug-resistant (MDR) pathogens at the time of ICU admission.

Methods
Retrospective analysis of microbiologic exams taken at admission to an ICU over one year, using data obtained from the microbiology laboratory. Positive samples were characterized by pathogen identified and clinical setting (community- vs hospital-acquired).
Results
2788 samples were obtained from 510 patients admitted to the ICU during 2015. Regarding the type of sample, 23.6% were tracheal aspirate cultures, 57.6% blood cultures and 18.7% urine cultures. We collected 560 positive samples, obtained from 280 patients. Of the tracheal aspirate cultures, 38.3% were positive. The most frequent pathogen was S. aureus, found in 74 samples, 15 of which were oxacillin resistant. 8 of these patients were admitted from the ER; the others were transferred from wards (3 from medical and 4 from surgical wards), with an average length of stay (LOS) before ICU admission of 5 days. 9 samples were positive for A. baumannii: 1 admitted from the ER, 3 from surgical and 5 from medical wards. ESBL-positive pathogens were found in 7 samples: 5 from surgical wards, 1 from the ER and 1 from a medical ward. Of the blood cultures, 11.6% were positive. The most common pathogen was S. epidermidis, in 46 samples. Regarding MDRs, we found 9 MRSA, 5 ESBL-positive pathogens and 4 A. baumannii. The MRSA-positive samples were obtained from patients coming mostly from surgical wards (5), followed by medical wards (2) and the ER (2), with an LOS before ICU admission of 9.4 days. The ESBL-positive pathogens were found in 3 patients admitted from the ER and 2 patients from surgical wards, with an average LOS before admission of 4.6 days. A. baumannii was identified in 2 patients from medical wards, 1 from a surgical ward and 1 from the ER. In urine samples the most frequent pathogens were E. coli, found in 7 samples, and K. pneumoniae, found in 5. Of these, 3 were ESBL positive: 2 from surgical wards and 1 from the ER. The average LOS before ICU admission was 5.0 days. 2 samples were positive for A. baumannii, and MDR Pseudomonas aeruginosa was found in 2 samples.

Conclusions
MDRs were found at ICU admission in 54 samples (9.6% of all positive samples); 44.4% came from surgical wards and 33.3% from the ER. The number of MDRs found in patients admitted from the ER probably results from repeated contact with healthcare facilities. Surgical patients are more prone to MDR infections.

P401
Microbiological yield of bronchoalveolar lavage in the setting of veno-venous ECMO

Introduction
Whilst bronchoalveolar lavage (BAL) in the context of adult extracorporeal membrane oxygenation (ECMO) has been shown to be safe [1], there is a paucity of data regarding the microbiological yield and microbiota of BAL in this setting. In community acquired pneumonia (CAP), BAL may be of additive diagnostic value in around 50% of patients [2], with a similar yield in ventilator associated pneumonia (VAP) [3]. The aim of this study was to describe the positive microbiological BAL results and the yield of BAL in the context of adult veno-venous ECMO (VV-ECMO) at a large UK ECMO centre.

Methods
A retrospective analysis was conducted of all BAL microbiological results of patients supported with VV-ECMO at a UK specialist centre for severe acute respiratory failure. Data were collected for the period April 2014 to July 2016, restricted to the initial diagnostic BAL done on admission to this centre.

Results
Of the 85 patients treated with VV-ECMO, BAL was performed on admission in 71/85 cases (84%). The vast majority (83/85) were on antibiotics for suspected bacterial pneumonia. A total of 58 positive microbiological results were obtained from 45/71 patients, giving an overall BAL yield of 63%. Fungi were isolated from 23 patients (32%), respiratory viruses from 19 patients (27%), and bacteria from 13 patients (18%). The most commonly isolated fungi were species of Candida (predominantly C. albicans), found in 18 patients. Aspergillus was isolated from 4 patients (6%), but in no case was it the only organism identified. Influenza A, found in 11 patients (15%), accounted for more than half of all positive virology results. Staphylococcus aureus was the most common bacterium isolated (7%), closely followed by Pseudomonas aeruginosa (6%).

Conclusions
BAL on admission to a specialist centre provides an acceptable overall microbiological yield in the context of VV-ECMO (as compared with previous data for CAP and VAP). The bacterial yield is low. The finding of a positive BAL fungal result should be correlated with the clinical context.

Introduction
The aim was to estimate the rate of urinary tract infection in bacteriuria, the duration of urinary tract infection treatment and the length of stay, as well as factors associated with poor outcome, in Coronary Care Unit patients.

Conclusions
Comparing the 2 periods, we found no substantial differences in the prevalence of bacterial colonization by Gram-negative bacteria; on the contrary, there was an increase in bacterial colonization by Gram-positive bacteria (MRSA). Our study shows that tracheostomized patients ventilated 24 hours a day at home do not develop colonization by totally resistant organisms.

Introduction
Endotoxin tolerance leads to an attenuated inflammatory response to a second hit [1]. From this it was hypothesized that bacterial killing might be affected similarly. If so, studies of whether this might be of importance for the choice of antibiotics are warranted. A previous study has shown that bacterial elimination and killing in bacteraemic animals occur in the spleen and liver [2]. Therefore, the objective of this study was to compare the killing of bacteria in the spleen and liver of animals made endotoxin tolerant by a 24-h endotoxin infusion with that in animals not exposed to endotoxin.

Methods
The endotoxin-exposed group (Exp) (n = 18) was given 0.063 μg/kg/h endotoxin for 24 h prior to the bacterial exposure. Thereafter, an intravenous Escherichia coli infusion of 8.3 log10 Colony Forming Units (CFU)/h was administered for 3 h. The endotoxin-unexposed group (Unexp) (n = 18) was given an identical bacterial challenge but without the preceding endotoxin exposure. The animals were observed for signs of sepsis and inflammatory response and treated in accordance with an intensive care protocol [2]. Blood cultures were obtained hourly. After 6 h the animals were killed by a potassium chloride injection. Biopsies for quantitative culture were taken from the liver and spleen immediately post mortem.

Results
All animals developed signs of sepsis. Peak TNF-α levels 1 h after the start of the bacterial infusion were 2.27 ± 0.13 and 4.17 ± 0.13 log10 pg/ml (mean ± SE) in the Exp and Unexp groups, respectively. Clearance of bacteria from the blood was rapid, occurring within 1 h. The Exp group demonstrated significantly increased killing and attenuated growth of bacteria in the spleen, with a similar trend in the liver. Mean bacterial growth in the spleen of the Exp and Unexp groups was 3.16 ± 0.13 and 3.73 ± 0.17 log10 CFU/g (mean ± SE), respectively (p = 0.01). Mean bacterial growth in the liver of the Exp and Unexp groups was 2.01 ± 0.18 and 2.40 ± 0.23 log10 CFU/g (mean ± SE), respectively (p = 0.20).

Conclusions
Contrary to the hypothesis, the reduced cytokine response characteristic of endotoxin-tolerant animals was not associated with reduced bacterial killing in the spleen and liver. This indicates that these processes do not run in parallel in vivo. The effect on bacterial killing after more prolonged activation of inflammatory and anti-inflammatory responses needs further investigation.

Introduction
Klebsiella pneumoniae (KP) infection is associated with high morbidity and mortality in different clinical settings. Multi-drug resistance associated with extended spectrum beta-lactamase (ESBL) among KP is endemic worldwide. Our study aims to evaluate the clinical characteristics and outcomes of patients with KP bacteraemia in the critical care and general ward settings.

Methods
Adult patients admitted to a regional hospital in

Results
There were 126 patients: 80 in the tigecycline group and 46 in the non-tigecycline group. We selected 41 Acinetobacter baumannii VAP patients in each group with matched baseline characteristics. The mortality rate of the tigecycline group was significantly higher than that of the non-tigecycline group (73.2% vs 46.3%, P = 0.013). The median length of hospital stay was also significantly longer in the tigecycline group than in the non-tigecycline group (46.99 vs 30.24 days, P = 0.004).
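The mortality comparison can be sketched as a chi-square test on a 2×2 table; the counts below are back-calculated approximations from the reported percentages, not the raw study data:

```python
from scipy.stats import chi2_contingency

# Reconstructed 2x2 table (illustrative): 73.2% of 41 ~ 30 deaths in the
# tigecycline group, 46.3% of 41 ~ 19 deaths in the non-tigecycline group
table = [[30, 41 - 30],   # tigecycline: died, survived
         [19, 41 - 19]]   # non-tigecycline: died, survived

chi2, p, dof, expected = chi2_contingency(table, correction=False)
print(round(p, 3))
```

Without continuity correction this reproduces a p-value close to the reported P = 0.013.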

Conclusions
The use of tigecycline in Acinetobacter baumannii VAP patients was associated with a higher mortality rate and longer hospital stay compared with other antibacterial agents. However, a prospective cohort study with a larger sample size is needed to confirm these findings.

Introduction
The incidence of carbapenem-resistant Enterobacteriaceae (CRE) has been steadily rising [1]. The morbidity, mortality and financial implications in these patients are significant. We recently had a few reports of colistin resistance in our centre. This prompted us to analyse the risk factors for the emergence of pan-drug-resistant (PDR) organisms.

Methods
We performed a retrospective analysis of the case records of patients who had any culture report positive for pan-drug-resistant (PDR) organisms. Cultures were performed in our centre using VITEK 2 Compact for all specimens. VITEK 2 reports were interpreted against EUCAST breakpoints (S ≤ 2 mg/L, R > 2 mg/L) for Enterobacteriaceae. Pseudomonas and Acinetobacter isolates were considered colistin resistant if the MIC was ≥ 8 and ≥ 4 mg/L, respectively. APACHE II score on admission, all culture reports, antibiotics received, length of stay in the ICU and hospital, and outcome were scrutinized.

Results
There were in total 10 isolates of PDR organisms in 8 patients. Five patients had received prior colistin therapy for carbapenem-resistant bacterial isolates, while 3 had no prior colistin treatment. Two of these 5 had received colistin monotherapy. Among the 10 isolates there were Klebsiella pneumoniae in 7, Pseudomonas aeruginosa in 2 and Acinetobacter baumannii in one case. Seven of these isolates were considered colonisers and three were infective pathogens. The PDR isolates were associated with CAUTI (5), tracheitis (2), bacteraemia (1), meningitis (1) and soft tissue infection (1). The average APACHE II score was 24, indicating sicker patients with multiple comorbidities and organ dysfunction. The mean age of the patients was 50.4 years. The average length of hospital stay was 42.8 days. Four of these patients died while the other 4 were discharged home. Six of the 8 patients belonged to neurosciences (stroke and traumatic brain injury) and had overall poor status, with a Glasgow Coma Scale (GCS) < 8 and long hospital stays.

Conclusions
Critically ill patients with longer hospital stays are more likely to be affected by PDR organisms. Patients requiring long-term rehabilitation should be cared for in dedicated centres. Culture reports should be judiciously interpreted to differentiate colonisers from infective pathogens before treatment. Widespread indiscriminate use of colistin to treat CRE and other Gram-negative organisms can lead to the emergence of PDR organisms. Strict implementation of antibiotic stewardship programmes is essential to limit use and prevent abuse of colistin.

Introduction
The aim of this study was to analyze multidrug-resistant (MDR) and extended-spectrum beta-lactamase (ESBL)-producing E. coli monobacteraemia, as well as factors associated with a length of stay in the Intensive Care Unit (ICU) ≥ 7 days and with mortality.

Methods
The retrospective data analysis of patients

Introduction
Over the last decade, the prevalence of Klebsiella pneumoniae carbapenemase (KPC)-producing K. pneumoniae (Kp) strains has dramatically increased worldwide and has become a significant public health problem, especially in some countries.

Methods
In this retrospective observational study conducted in the Intensive Care Units (ICU) of the Italian teaching hospital of Pisa, we recruited critically ill patients with a diagnosis of bloodstream infection (BSI) caused by KPC-Kp. Thirty-day mortality from the first positive blood culture, septic shock diagnosis, steroid therapy, SOFA, SAPS II, Charlson score, antibiotic therapy and previous hospitalization were compared between Survivor and Non-Survivor groups.

Results
We enrolled 42 patients admitted to the ICU between January 2012 and December 2015. The overall 30-day mortality rate was 52.4%. A significantly higher rate was observed among patients with a diagnosis of septic shock at BSI onset, steroid therapy, and higher SOFA and SAPS II scores. Gentamicin used in combination therapy was associated with lower mortality, regardless of meropenem use. Furthermore, in colonized patients digestive decontamination with oral gentamicin decreased BSI mortality. Previous use of meropenem was associated with increased mortality. Additionally, intravenous administration of fosfomycin was associated with a lower mortality rate. Charlson score and optimal empirical therapy did not appear to have a significant influence on survival.

Conclusions
KPC-Kp BSI was associated with high mortality. We found that a therapy regimen including gentamicin was associated with lower mortality. Further prospective studies should be done to confirm our results.

Introduction
Neonates and children under 5 years are at major risk for sepsis, but knowledge on the global epidemiology of childhood sepsis is scarce. The aim of this study was to estimate the global burden of and mortality from sepsis in children and neonates based on available evidence from observational epidemiological studies.

Methods
Systematic review and meta-analysis. We searched 13 international databases for published and grey literature from 1979 to 2016, supplemented by hand search and expert consultation. Studies reporting the incidence of sepsis in children (< 20 y), defined according to the International Consensus Conference on Pediatric Sepsis Definitions [1], ACCP/SCCM consensus criteria [2, 3] or related sepsis-relevant ICD-9/ICD-10 codes, at the population level per 100 000 person-years or live births were included.

Results
Of 1270 studies, 23 from 16 countries met the inclusion criteria. Sixteen were from high-income countries and seven from middle-income countries. Fifteen studies reported complete data and were included in the meta-analysis.
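Pooling incidence estimates across heterogeneous studies of this kind is typically done with a random-effects model. As a hedged sketch (the effect sizes and variances below are hypothetical, not the study's data), a minimal DerSimonian-Laird pooling might look like:

```python
import math

def dersimonian_laird(effects, variances):
    """Random-effects pooling (DerSimonian-Laird): estimate between-study
    variance tau^2 from Cochran's Q, then reweight and pool."""
    w = [1.0 / v for v in variances]
    sw = sum(w)
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sw  # fixed-effect mean
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))  # Cochran's Q
    k = len(effects)
    tau2 = max(0.0, (q - (k - 1)) / (sw - sum(wi ** 2 for wi in w) / sw))
    w_star = [1.0 / (v + tau2) for v in variances]  # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, se

# Hypothetical log-incidence estimates and within-study variances
pooled, se = dersimonian_laird([4.1, 3.8, 4.5, 4.0], [0.04, 0.09, 0.05, 0.06])
```

When the studies agree exactly, tau² collapses to zero and the estimate reduces to the fixed-effect mean.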

Conclusions
Sepsis is a major contributor to neonatal and childhood mortality. Further comprehensive studies on sepsis epidemiology, especially in low- and middle-income countries, are needed, as well as effective measures to reduce the burden of sepsis in children globally.

Introduction
The prevalence and mortality of sepsis in Turkey are largely unknown. Turkey is among the countries with a high level of antibiotic resistance. The Turkish Society of Intensive Care Medicine Sepsis Study Group conducted a national, multicenter, point-prevalence study to determine the prevalence, causative microorganisms, and outcome of sepsis in Turkish ICUs.

Methods
A total of 132 ICUs from 94 hospitals participated in the study. All patients (> 18 yr old) present on a participating ICU or admitted for any length of time during a 24 hr period between 08:00 on Wednesday 27 January 2016 and 08:00 on Thursday 28 January 2016 were included in the study. The International Sepsis Forum definitions for common sites of infection were used [1]. The presence of SIRS, severe sepsis and septic shock was assessed and documented based on the consensus criteria of the ACCP/SCCM (SEPSIS-I) [2]. Data were also used to define septic shock according to the SEPSIS-III definitions [3]. Demographics, severity of illness, comorbidities, microbiological data, therapies used, length of stay and outcome (dead or alive through 30 days) were recorded.

Introduction
We tested the hypothesis that differences in generic and sepsis-specific patient characteristics explain the observed differences in sepsis outcomes between countries [1].

Methods
We studied the first ICU episode of adult medical patients with sepsis admitted during 2013 using national data sources from England (ICNARC Case Mix Programme) and Brazil (ORCHESTRA study). After harmonizing relevant variables, the datasets were merged. Sepsis was defined as infection plus ≥ 1 organ dysfunction (OD) (≥ 2 points) using a modified SOFA score to align with Sepsis-3 [2]. The primary outcome was acute hospital mortality. We used multilevel logistic regression models to evaluate the impact of country (Brazil vs England) on hospital mortality, after adjustment for generic (age, sex, comorbidities, admission source, time to ICU admission) and sepsis-specific (infection site, OD type and first-order interactions) characteristics. We report risk-adjusted mortality stratified by admission source, time in hospital prior to ICU admission, infection site, and decile of predicted risk of death (from our regression model).
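The decile-of-predicted-risk stratification described above can be sketched as follows; the patients are simulated and the grouping logic is a simplified illustration, not the study's actual regression model:

```python
import random

# Simulate patients with a model-predicted risk of death, and an outcome
# drawn at that risk (illustrative only)
random.seed(1)
patients = [{"risk": random.random()} for _ in range(1000)]
for p in patients:
    p["died"] = random.random() < p["risk"]

# Sort by predicted risk and split into deciles, then compare observed
# mortality with the model-expected mortality in each decile
patients.sort(key=lambda p: p["risk"])
decile_size = len(patients) // 10
for d in range(10):
    group = patients[d * decile_size:(d + 1) * decile_size]
    observed = sum(p["died"] for p in group) / len(group)
    expected = sum(p["risk"] for p in group) / len(group)
    print(f"decile {d + 1}: observed {observed:.2f}, expected {expected:.2f}")
```

A well-calibrated model shows observed mortality tracking expected mortality across all deciles; systematic divergence in one country would indicate residual, unexplained differences.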

Conclusions
We show for the first time that generic and sepsis-specific patient characteristics explain the observed differences in sepsis outcomes between countries.

Introduction
The esophageal heat transfer device (Advanced Cooling Therapy, Chicago, IL, USA) is a new temperature management device which has mostly been used for temperature management in survivors of cardiac arrest [1]. Our aim was to evaluate the effectiveness of the device for long-term temperature management in non-cardiac-arrest patients.

Methods
A retrospective study from January to November 2016 with the inclusion criterion: use of the device for > 120 h in non-cardiac-arrest patients.

Results
We included 3 males (aged 67, 71 and 74 years) and 1 female (36 years). The duration of treatment with the device was 220, 192, 452 and 168 h, respectively. The indication for temperature management was hyperthermia associated with sepsis (1 patient with meningitis, 2 with pneumonia and 1 with necrotizing fasciitis). Target temperature was determined by the attending physician (36-37 °C in the patient with meningitis and one patient with pneumonia, 36.5-37.5 °C for one patient with pneumonia, and 37-38 °C in the patient with fasciitis). Temperature was in the target range for 81% of the time, within ±0.5 °C outside the target range for 11.6% of the time, and ≥ 0.6 °C outside the target range for 7.4%. The greatest temperature fluctuations (> 2 °C outside the target temperature) were observed in the patient with fasciitis after return from the operating theatre (the device was disconnected from the chiller during surgery) (Fig. 14).
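The time-in-range percentages above reduce to simple binning of temperature samples. A minimal sketch, with made-up readings and a hypothetical 36.5-37.5 °C target band:

```python
def classify_readings(temps, low, high):
    """Fraction of samples inside the target band, within 0.5 degC of it,
    and >= 0.6 degC outside it."""
    in_range = near = out = 0
    for t in temps:
        if low <= t <= high:
            in_range += 1
        elif low - 0.5 <= t <= high + 0.5:
            near += 1
        else:
            out += 1
    n = len(temps)
    return in_range / n, near / n, out / n

# Illustrative hourly readings, not study data
temps = [36.8, 37.0, 37.2, 37.6, 36.4, 38.1, 36.9, 37.1, 37.0, 36.2]
frac_in, frac_near, frac_out = classify_readings(temps, 36.5, 37.5)
```

In practice the denominator would be monitoring time rather than sample count, but with evenly spaced samples the two coincide.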

Conclusions
Esophageal heat transfer device can be used for effective long-term temperature control in non-cardiac arrest patients.

Introduction
A dysregulated systemic inflammatory response in septic shock often results in an overwhelming "cytokine storm", evolving into fulminant sepsis with multiple organ dysfunction and early death. Attenuating the cytokine storm by adsorbing cytokines via hemoperfusion (CytoSorb) has a pathophysiological rationale. Our aim was to investigate the effects of CytoSorb therapy on organ dysfunction and the inflammatory response when started early (< 48 h) in septic shock.

Methods
Patients fulfilling septic shock criteria were randomized into CytoSorb and Control groups. CytoSorb therapy lasted for 24 hours. Blood samples were taken to determine IL-1, IL-1ra, IL-6, IL-8, IL-10, TNF-α, PCT and CRP levels. At this stage of the study only PCT and CRP levels were analyzed. Organ dysfunction was evaluated by the Sequential Organ Failure Assessment (SOFA) score. Data were recorded on enrollment (T0), then at T12, T24, and T48 hours.

Conclusions
According to our interim results, CytoSorb therapy resulted in better SOFA scores and proved to be safe, without significantly affecting PCT and CRP changes within the first 48 hours of septic shock. Based on the results of this pilot study we are planning a prospective randomized multicenter trial.

Introduction
Unfractionated heparin (UFH), antithrombin (AT), and recombinant thrombomodulin (RT) represent distinct anticoagulant/antithrombotic agents with different targets in the hemostatic process through which they produce their therapeutic effects. These agents are widely used in various hematologic indications in mono- and polytherapeutic approaches. Currently, a recombinant version of thrombomodulin, ART-123 (Recomodulin), is undergoing clinical trials to validate its efficacy in sepsis-associated coagulopathy. Recomodulin is a novel, recombinant and soluble thrombomodulin, a human protein with both thrombin-inhibiting and protein C-stimulating activities. In comparison to both UFH and AT, this agent has relatively weak anticoagulant activities related to bleeding risk at therapeutic concentrations of ≤ 1.25 μg/mL. Supratherapeutic concentrations of this agent may occur in some patients with renal dysfunction. The purpose of this study is to compare the anticoagulant and platelet modulatory effects of ART-123, UFH, and AT.

Fig. 14 (abstract P419). Temperature changes compared to target temperature over time

Methods
UFH of porcine origin was obtained from Medefil Inc. (Glendale Heights, IL) in powdered form with a specific activity of 175 U/mg. A working concentration of this agent was made at 100 μg/mL in sterile saline. AT was commercially obtained from Baxter Healthcare Corporation (Deerfield, IL). A working concentration of AT was prepared at 100 U/mL in sterile saline. Recomodulin was commercially obtained and was manufactured by Asahi Kasei Pharma (Japan). A working concentration of Recomodulin at 100 μg/mL was prepared in sterile saline. The effect of Recomodulin, AT, and UFH on the glass-activated clotting time and thromboelastographic (TEG) profile was measured at concentrations of 0-5 μg/mL. Global anticoagulant assays including PT, APTT, and TT were also measured in citrated whole blood and retrieved plasma. The effect of these drugs on agonist-induced platelet aggregation (arachidonic acid, ADP, collagen, thrombin, and epinephrine) was measured in platelet-rich plasma collected from healthy donors.

Results
In comparison to both AT and UFH, Recomodulin did not produce any anticoagulant effect in either the TEG or the ACT test at a concentration of 1.25 μg/mL. At higher concentrations of 2.5 and 5.0 μg/mL, the relative anticoagulant effects of Recomodulin were much weaker than those of both AT and UFH. In the TEG profile at 5.0 μg/mL, both AT and UFH produced complete anticoagulation, whereas Recomodulin did not at 2.5 or 5.0 μg/mL. In the whole blood global clotting assays, all agents produced a concentration-dependent anticoagulant effect following the order UFH > AT > Recomodulin. In the platelet aggregation studies, while heparin produced a mild increase in the aggregation profile with some of the agonists, AT and Recomodulin did not produce any effects at concentrations of up to 10 μg/mL and 5 U/mL for all of the agonists except thrombin.

Conclusions
The circulating levels of Recomodulin for the management of sepsis-associated coagulopathy range from 0.5 to 1.5 μg/mL. The therapeutic levels of UFH for similar indications range from 1.5 to 5.0 μg/mL (0.25-1.0 U/mL), whereas circulating AT levels may range from 1 to 2.5 U/mL. The results from this study suggest that Recomodulin is a much weaker anticoagulant than both UFH and AT and that, at therapeutic concentrations, it does not produce any measurable anticoagulant effects. At supratherapeutic concentrations of > 2.5 μg/mL, Recomodulin exhibits weak anticoagulant effects which are unlikely to contribute to any hemostatic deficit resulting in bleeding complications.

P422
Safety of thrombomodulin (ART-123) in surgical patients with sepsis and suspected disseminated intravascular coagulation (S + DIC)

Introduction
We hypothesized that ART-123 is safe for the therapy of S + DIC, particularly regarding bleeding risk. S + DIC occurs in 20%-35% of hospitalized patients with sepsis. Limited safety data exist for surgical patients receiving ART-123 for S + DIC despite substantial clinical use in Japan (> 160,000 patients treated).

Methods
Retrospective review of a large Phase 2b, randomized, placebo-controlled trial of therapy with ART-123 to assess safety [1] in 741 treated patients. Data were analyzed for the presence and type (

Introduction
Recent data of the Hellenic Sepsis Group showed that treatment of severe infections with the IgM-enriched preparation Pentaglobin delayed the advent of breakthrough bacteremia, suggesting a probable trained immunity effect [1]. We investigated whether Pentaglobin can elicit such trained immune responses.

Methods
Interleukin (IL)-10 was selectively measured in serial serum samples from five patients with ventilator-associated pneumonia from a prospective cohort [2] who received Pentaglobin at the discretion of the attending physicians. Fresh peripheral blood mononuclear cells (PBMCs) were isolated from six healthy volunteers and pretreated for 24 hours with medium or Pentaglobin at a final IgM concentration of 50 mg/dl. After washing, PBMCs were stimulated for 24 hours with heat-killed 5 log10-growth isolates of carbapenemase-producing Klebsiella pneumoniae (KPC) and multidrug-resistant Pseudomonas aeruginosa (MDRPA). Isolates were selected from those infecting previously described Greek patients; isolates were kept frozen and thawed [1]. Tumour necrosis factor (TNF) alpha was measured in PBMC supernatants by enzyme immunosorbent assay.

Conclusions
Pentaglobin pre-treatment leads to enhanced TNF-α responses after re-stimulation with resistant Gram-negative hospital-acquired pathogens, simulating enhanced host responses. This corroborates the decrease in circulating IL-10 shown in patients and explains the protection from breakthrough bacteremia seen in our recent clinical study [1].

Introduction
In a recent publication, 100 patients with severe infections by MDR Gram-negative bacteria were treated for five days with an IgM-enriched immunoglobulin preparation (IgGAM, Pentaglobin); they had favorable 28-day outcomes compared to 100 comparators well matched for all infection variables [1]. We re-analyzed these data with regard to long-term outcomes.

Methods
Patients were sub-grouped by SOFA score; survival on day 90 was compared by the log-rank test. Serial procalcitonin (PCT) measurements were available for 22 IgGAM-treated patients and 35 comparators and compared by the Mann-Whitney U test.
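The PCT comparison uses the Mann-Whitney U test, which suits skewed biomarker distributions. A hedged sketch with synthetic values (not the study's measurements):

```python
from scipy.stats import mannwhitneyu

# Hypothetical PCT values (ng/mL) for illustration only
igam_pct = [1.2, 0.8, 2.1, 1.5, 0.9]        # IgGAM-treated (made up)
comparator_pct = [4.8, 6.2, 3.9, 5.5, 7.1]  # comparators (made up)

# Two-sided rank-based test; no normality assumption is needed
u_stat, p_value = mannwhitneyu(igam_pct, comparator_pct, alternative="two-sided")
```

With completely separated samples like these, U for the first group is 0 and the exact two-sided p-value falls below 0.05.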

Results
In the subgroup with SOFA more than 10, survival until day 90 was prolonged with IgGAM (median 27 vs 8 days, Fig. 16).

P426
Medical application content quality metrics: a cross-sectional survey of current applications used to aid the management of sepsis
P Allan1, R Oehmen2, J Luo1, C Ellis1, P Latham3, J Newman4

Introduction
The use of mobile applications (apps) by healthcare practitioners is increasing globally [1]. While this has the potential to improve clinical knowledge translation, it also represents a potential source of medical risk due to a lack of both regulation and validation. In this study, we attempt to assess the quality of currently available apps relating to the management of sepsis.

Methods
Between the 15th and 19th of April 2016, a detailed search was completed of all apps relating to sepsis available on the two largest global app stores. The search terms 'sepsis', 'septic' and 'septic shock' were used. Matching apps were assessed against set inclusion criteria such as 'English language only' and 'not a game'. Apps meeting these criteria were then purchased and reviewed for further metrics including intended audience and content design. Apps not intended for medical practitioners and those not relating to the management of sepsis were excluded. The remaining apps underwent a detailed review of content including the representation of international sepsis guideline information, author metrics, references, legal disclaimers, as well as update and review processes.

Results
The search terms returned 236 unique apps. 19 apps passed all barrier conditions for final review. Of these, 15 were information references only and did not modify management information based on user-entered patient data. Of the 4 apps that did, only 2 offered a commitment to maintaining privacy. Reference apps were typically not specific to sepsis management but covered a broad range of diseases. 5 apps were electronic versions of published medical textbooks. Instances of information conflicting with guidelines were common. Only 12 apps included references. Periodic updates were evident in the majority of surveyed apps, but only 8 apps had been updated within 12 months. Legal disclaimers outlining liability were observed in only 8 of 19 applications. No application explicitly mentioned the presence or absence of any conflicts of interest.

Conclusions
The quality of surveyed medical apps concerning sepsis management varied greatly. The potential for clinical risk is high due to the lack of regulation governing currently available medical apps. The development of designer, consumer and market-based processes to regulate medical app content is important as this technology is increasingly integrated into modern clinical practice.

Introduction
In this study we assessed the effectiveness of our sepsis simulation-based education (SBE) course using the Kirkpatrick learning evaluation model [1].

Conclusions
Our course has demonstrated that participants enjoy and gain knowledge of sepsis and human factors in the context of SBE. Importantly, it demonstrated a statistically significant improvement in Sepsis Six delivery within 1 hour.

Introduction
It is unclear whether qSOFA also has prognostic value for organ failure in patients with suspected infection. The aim of this study was to determine the prognostic value of qSOFA compared to the systemic inflammatory response syndrome (SIRS) criteria for predicting organ failure in patients with suspected infection in the emergency department (ED).

Methods
We retrospectively reviewed the medical records of 3234 patients with suspected infection in the ED of a university-affiliated hospital during a 10-year period. We analyzed the ability of qSOFA compared to SIRS to predict organ failure development (defined as an increase in the SOFA score of 2 points or more) in patients admitted via the ED, using the area under the receiver operating characteristic (AUROC) curve. When the qSOFA score was ≥ 1, its sensitivity and specificity for predicting organ failure were 75% and 82%, respectively.
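The cutoff evaluation above reduces to sensitivity and specificity computed from a confusion matrix. The counts below are hypothetical, chosen only to reproduce the reported 75%/82% at the qSOFA ≥ 1 cutoff:

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Illustrative counts: 300 of 400 organ-failure patients flagged by
# qSOFA >= 1, and 820 of 1000 patients without organ failure correctly negative
sens, spec = sens_spec(tp=300, fn=100, tn=820, fp=180)
```

Repeating this at each possible cutoff and plotting sensitivity against 1 - specificity yields the ROC curve from which the AUROC in the study would be computed.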
Conclusions
qSOFA can predict the occurrence of organ failure in patients with suspected infection, with superior predictive ability compared to SIRS. A further prospective study on the optimal cutoff value of qSOFA for predicting organ failure is warranted.

Introduction
Influenza is associated with high morbidity and mortality worldwide, and the severity of influenza infections has increased over recent years. In this study we aimed to identify risk factors for ICU admission and mortality among influenza patients during the 2015-2016 epidemic in the Netherlands.

Methods
We performed a retrospective cohort study of influenza patients hospitalized in two university hospitals during the 2015-2016 influenza epidemic. Cases were identified using the databases of the microbiology departments at both medical centers. Patients admitted to the hospital with clinical symptoms due to an acute infection with influenza A or B were included. Virus samples were obtained from nose/throat swab, sputum collection or bronchoalveolar lavage. A laboratory-confirmed influenza infection was defined as a positive polymerase chain reaction, immunofluorescence assay or culture for either influenza A or B. In patients in whom a rapid influenza test was initially used, one of the above laboratory techniques subsequently had to be positive before the patient could be included in the study as a case.

Results
We identified 200 cases with influenza type A or B admitted to one of the index hospitals. The data of 1 patient were of poor quality and excluded from further analysis. Overall mortality was 9%. 45 (23%) of the patients were admitted to the ICU, either primarily or later during the disease. Risk factors for ICU admission were smoking (p = 0.01), a history of OSAS/CSAS (p = 0.03) and myocardial infarction (p = 0.007). The development of renal failure and secondary (pulmonary) infections were also associated with ICU admission (p < 0.001). 17 of the 45 (38%) patients admitted to the ICU died during admission. Risk factors for ICU mortality included diabetes (p = 0.04) and renal failure before and during ICU admission (p = 0.01). In 25 of 45 patients (55.6%) a secondary pulmonary infection developed. The most common bacterial pathogens were Staphylococcus aureus (11.1%) and Streptococcus pneumoniae (6.7%). Among the secondary fungal infections, Aspergillus fumigatus was most common (17.8%), followed by Pneumocystis jirovecii (6.7%). Patients with a secondary infection had received oseltamivir more often (100% vs. 65.0%, p = 0.002).
No association was observed between other immunosuppressive drugs and the development of secondary pulmonary infections.

Conclusions
Development of secondary pulmonary infections is a frequent complication in patients with influenza and associated with an increased risk of ICU admission. Early identification and treatment of these patients may prevent ICU admission.

Introduction
Several studies have shown neuroprotective and immunomodulatory mechanisms of therapeutic hypothermia (TH). TH negatively affects leucocyte migration and cytokine synthesis and is therefore thought to increase infection rates due to impaired immunosurveillance. However, data on TH in children are limited. We therefore analyzed the incidence of infections and the outcome of children treated with TH (TH-group) and compared them to normothermic patients (NT-control group).

Methods
This study was performed as a retrospective case-control study (from 2000 to 2012). All medical records and laboratory files of patients (newborns and children up to 18 years of age) receiving TH for 24 to 72 hours were screened and compared to an NT-group (historic control group with conventional treatment). Both groups were matched according to diagnosis and age. Data were analyzed from the day of admission to the ICU until day 6. Indications for TH initiation were cardiac arrest, peripartum asphyxia, traumatic brain injury, ischemic stroke, cerebral haemorrhage, oedema or seizures, as well as acute liver failure. Patients with evident infection, a medical history of an immunodeficiency disorder and/or immunosuppressive therapy before the onset of TH were excluded. TH (32-34 °C) was induced using a non-invasive cooling device. After 72 hours rewarming was started (0.2-0.3 °C/hour).

Results
108 patients were included (TH-group n = 27, NT-group n = 81). Survival rate showed no significant difference (81% of TH and 78% of NT, p = 0.996). CRP elevation (> 1.2 mg/dL) occurred earlier in the TH-group and showed a significant difference on day 6 (4.93 mg/dL) compared to the NT-group (2.36 mg/dL, p = 0.007). 22.2% of patients in the TH-group had culture-proven infections compared to 4% in the NT-group. Five patients in the NT-group showed only clinical signs of infection. Pneumonia was the most common culture-proven infection in both groups (TH 14.8% and NT 6.2%).

Conclusions
We report that therapeutic hypothermia did not significantly alter overall survival and length of hospital stay in our pediatric center. Similar to previous observations [1], our results showed a significant increase in CRP levels in TH patients compared to NT controls, and TH patients had more culture-proven infections. These data underline the necessity of continuous monitoring for possible infectious complications when TH is used in the pediatric intensive care setting.

Introduction
Nosocomial acquisition of multidrug-resistant Acinetobacter baumannii in the ICU is common. Despite a high fatality rate, over 50% of ICU Acinetobacter carriers are discharged alive from the ICU to hospital wards, where they continue to represent a source for cross-transmission. We describe an intervention to terminate an ICU Acinetobacter outbreak and investigate its effect on hospital-wide Acinetobacter prevalence.

Methods
ICU Acinetobacter incidence and prevalence were detected from surveillance and clinical cultures. Hospital prevalence was determined from clinical cultures. The Acinetobacter control intervention included unit closure with intense environmental cleaning, a hand hygiene intervention, improved cleaning protocols and the use of virtual walls. Data for the year preceding and following the intervention were compared for (1) ICU admission prevalence and ICU acquisition of Acinetobacter, (2) hospital prevalence of Acinetobacter and (3) colistin use.

Conclusions
The ICU intervention decreased ICU acquisition of Acinetobacter and ICU discharge of Acinetobacter-positive patients to the wards, and was associated with a decrease in the hospital prevalence of Acinetobacter. Numerically, the decrease in hospital prevalence (from 185 to 107 patients) exceeded the decrease in ICU discharge of Acinetobacter patients (from 48 to 1). This could suggest that decreased ICU discharge of Acinetobacter-positive patients leads to a lower Acinetobacter load in the wards and thus decreased transmission there. The decreased load of Acinetobacter also facilitated a decrease in colistin use. In conclusion, control of an ICU Acinetobacter outbreak had positive ramifications throughout the hospital.

P432
Observational study of central venous catheter care and reasons for removal
A Faux1, R Sherazi1, A Sethi2, S Saha1

Introduction
Vascular catheters are a ubiquitous tool in the critical care setting; however, their use is not without risk. Potentially one of the most serious and preventable complications is catheter-related bloodstream infection (CR-BSI). The aim of this study was to examine why central venous catheters (CVC) are removed and whether there is clinical improvement following removal, and to see how this relates to rates of CR-BSI.

Methods
We retrospectively studied the insertion, care, and reason for removal of central venous catheters across a single NHS trust. These data were then matched with the corresponding biochemical and microbiological results for each patient. Suspected CR-BSI was defined as clinical evidence of sepsis with no apparent source of septicaemia except catheter tip colonization. Confirmed CR-BSI was subsequently defined by colonisation of a catheter tip with the same microorganism grown from a peripheral blood culture. A routine change of a CVC or arterial catheter was defined as replacement without evidence of sepsis or problems with the functioning of the catheter.

Results
A total of 87 CVCs were studied, with an average duration of insertion of 5.5 days. The most common reason for removal was absence of indication (28.7%). 17 of 87 (19.5%) CVCs were removed due to suspicion of CR-BSI, and of these only 1 was confirmed (Table 6). We further examined the cultured catheters for evidence of biochemical improvement following removal (Table 7). Of those who showed improvement of inflammatory markers following CVC removal, we found no difference in mean duration of insertion, and they were no more likely to have colonised CVCs than patients who did not show improvement of their inflammatory markers (25% vs 29%).

Conclusions
In the NHS trust studied we found that 16.1% of central venous catheters were routinely changed after an average of 7.1 days without obvious indication other than to prevent catheter-related infection. Although some of these lines were colonised when cultured, we did not find any biochemical benefit of removing them routinely. This is consistent with evidence from randomised trials showing that routine replacement does not reduce rates of catheter-related bloodstream infection compared with changing catheters when clinically indicated [1] [2].

Results
The LPS level in blood was significantly reduced after the hemoperfusion procedure (twofold or more) in 50% of the patients with an initially high level (Fig. 17). Significant reduction of the serum LPS level required from 2 to 6 hemoperfusion procedures. We did not detect any changes in patients with an initially low level of LPS. IL-6 and IL-8 showed the most prominent concentration changes among the cytokines studied. Their levels dropped 2- to 10-fold after the procedure in many of the patients with initially high cytokine levels. The number of functionally active leucocytes with high phagocytic activity and enhanced expression of CD11c adhesion molecules decreased in the patients after the hemoperfusion procedure. A substantial proportion of the leucocytes adhered to the Alteco hemofilter plates (Fig. 18).

Conclusions
The hemoperfusion procedure eliminates from the blood not only LPS but also excess inflammatory mediators.
Adhesion of activated neutrophils to the Alteco hemofilter removes from the bloodstream of septic patients the most reactogenic phagocytes, which release cytotoxic free radicals.

Introduction
Sepsis is the most common cause of death by infection. In particular, abdominal sepsis is characterized by the highest death rate and complicated management, because of the high variability of microorganisms involved and the growing problem of antibiotic resistance. Therefore, the aim of this study was to identify potentially modifiable risk factors for death in patients with abdominal sepsis.

Methods
This was a prospective observational study conducted in the ICU at Sant'Anna Hospital, Ferrara. All patients >18 years admitted with a diagnosis of abdominal sepsis, according to the 2016 sepsis criteria, were enrolled. We collected clinical and demographic data for each patient. Moreover, we recorded laboratory and blood gas analysis data, microbiological investigations, anti-infective treatment (antimicrobial therapy/source control) and fluid balance. The primary outcome was 90-day mortality.

Results
Thirty patients were enrolled. Clinical and demographic variables are shown in Fig. 19. Of the isolated microorganisms, 61% were multidrug-resistant (MDR). Mortality in patients affected by MDR bacteria was not increased (36% vs 35%; p = 0.91). However, appropriate empirical antibiotic therapy in the first 12 hours was associated with a decrease in mortality from 70% to 12% (p = 0.01). Moreover, positive fluid balances for more than 72 hours were more frequent in non-survivors (47% vs 18%; p = 0.04).

Conclusions
In our population, MDR bacteria were highly represented. However, not the resistance itself but the failure of appropriate early empirical therapy was strongly associated with increased mortality; therefore, knowledge of the local epidemiology of microorganisms and of their drug resistance is crucial. Moreover, the inability to achieve a null or negative fluid balance after the first 72 hours in the ICU is a risk factor for 90-day mortality.

Introduction
The ability of the health care-associated pneumonia (HCAP) classification to identify pneumonia caused by multidrug-resistant pathogens is controversial [1]. We sought to provide a new scoring system to guide empirical antimicrobial therapy based on the correct identification of resistant pathogens in patients with pneumonia.

Methods
In this prospective observational study, we considered 93 adult patients with microbiological confirmation of pneumonia. We looked for the prevalence of Community-Acquired Pneumonia Drug-Resistant Pathogens (CAP-DRP) among patients classified and treated as having CAP or HCAP. The primary goal was to assess whether the appropriateness of initial empiric therapy affected short-term mortality and whether we could improve it with a new scoring system.

Conclusions
HCAP-addressed empirical therapy leads to overtreatment and undertreatment of a considerable percentage of patients with pneumonia. Among them, the scoring system we propose may help to choose the most appropriate empiric antimicrobial therapy, aiming at reducing inappropriate initial therapy and its associated mortality.

Introduction
Antimicrobials account for 51% of the drugs used in intensive care units (ICUs) [1], and antimicrobial resistance has increased dramatically in recent years. Antimicrobial stewardship (AMS) programs aim to establish strategies that improve the use of antimicrobials and reduce resistance, adverse effects, Clostridium difficile infections and cost [2]. International guidelines propose restrictive, persuasive and structural strategies. In Chile, the feasibility of these programs has not been fully studied. We describe the process of implementing an AMS program in our ICU.

Methods
We conducted a 3-step implementation process for the AMS program. We reviewed the literature to identify AMS strategies published from 2010 to 2016; inclusion criteria were studies in adult patients, in high-complexity hospitals or ICUs, with a description of the strategies in their results. A local survey was administered to 8 health care professionals to evaluate the feasibility of implementing the strategies found; the survey allowed positive, neutral and negative answers, and a strategy receiving at least half positive answers was considered feasible to implement. A team of clinical pharmacists, intensivists and infectious diseases physicians then worked on this set to develop the optimal way to implement these strategies in our ICU.

Results
We found 362 studies on AMS programs, of which 49 described 21 strategies in ICUs. Strategies were classified as restrictive, persuasive or structural. In our unit, 9 of these were already implemented. According to the local survey, 7 further strategies were feasible to implement (Fig. 21). Implementation was carried out through two processes: medical record optimization and a local guideline for antimicrobial treatment.

Conclusions
Starting from a baseline of 9 strategies, we implemented 7 additional ones. This study is the first step in establishing an AMS policy in our ICU. The next step is to assess the clinical impact of our AMS policy.

Introduction
To better inform infection control and antibiotic stewardship programs, we investigated carbapenem resistance trends and assessed the molecular characteristics of β-lactamases (ESBLs, AmpC β-lactamases and carbapenemases) among Enterobacteriaceae isolates from IAI patients treated in South African hospitals participating in the Study for Monitoring Antimicrobial Resistance Trends (SMART) program between 2010 and 2015.

Methods
The Cochran-Armitage test was used to examine trends in susceptibility. Classification and regression trees (CART) were used to identify minimum inhibitory concentration (MIC) thresholds of various drugs as well as other factors predictive of phenotypic CRE. EUCAST version 6 MIC interpretive criteria were used to identify non-susceptible isolates.
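For illustration, the Cochran-Armitage trend statistic used here can be sketched in a few lines. This is a minimal stdlib-only sketch, not the study's analysis code, and the yearly counts in the example are hypothetical:

```python
import math

def cochran_armitage_z(successes, totals, scores=None):
    """Cochran-Armitage test for trend in proportions across ordered groups.

    successes[i] -- e.g. number of non-susceptible isolates in year i
    totals[i]    -- total isolates tested in year i
    scores[i]    -- ordinal score for group i (defaults to 0, 1, 2, ...)
    Returns the z statistic; |z| > 1.96 suggests a trend at p < 0.05.
    """
    k = len(successes)
    if scores is None:
        scores = list(range(k))
    N = sum(totals)
    p_bar = sum(successes) / N  # pooled proportion under the null
    # Trend statistic: score-weighted deviation from the pooled proportion
    T = sum(s * (x - n * p_bar) for s, x, n in zip(scores, successes, totals))
    # Variance of T under the null hypothesis of no trend
    var = p_bar * (1 - p_bar) * (
        sum(n * s * s for s, n in zip(scores, totals))
        - sum(n * s for s, n in zip(scores, totals)) ** 2 / N
    )
    return T / math.sqrt(var)

# Hypothetical yearly non-susceptibility counts, 2010-2015 (40 isolates/year)
z = cochran_armitage_z([2, 3, 4, 6, 9, 12], [40] * 6)
```

A flat series of proportions yields z = 0, while a monotone increase in non-susceptibility drives z positive, which is how a declining susceptibility trend would register.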

Results
Of the 124 isolates, 109 (88%) were phenotypically ESBL, 122 (98%) had one or more β-lactamase or ampC genes identified, and 98 (79%) were fully susceptible to the carbapenems (Fig. 22). Figure 22A shows that K. pneumoniae (68/124) and E. coli (42/124) were the predominant isolates and contributed most of the drug resistance genes identified. Carbapenem susceptibility significantly declined by an average of 35% (2-80), driven in part by an increase in resistance among K. pneumoniae isolates (Fig. 22B). On the contrary, susceptibility to amikacin significantly increased, while that of piperacillin/tazobactam decreased to 59%. Of the 26 isolates non-susceptible to carbapenems, 6 (23%) were CAI and 12 (46%) were HAI; p = 0.468 (Fig. 22C). The CTX-M-15 lactamase was the most frequent gene identified, in 85/124 (65%) of isolates, but was found in combination with other genes 67/81 (83%) of the time. In fact, the odds of non-susceptibility to carbapenems rose 26-fold (3-245) when 4 genes were present compared to isolates in which only one drug resistance gene was present.

Conclusions
CTX-M-15-carrying K. pneumoniae are emerging as causes of CRE-associated IAI in South Africa. Piperacillin/tazobactam is an unlikely substitute antibiotic for this patient cohort given declining susceptibility trends and overall poor activity.

Introduction
Intraosseous (IO) access may be lifesaving in medical emergencies, when conventional vascular access is difficult to achieve. In life-threatening infections, early administration of antibiotics is crucial. In septic shock, the circulation is markedly compromised. We wanted to elucidate whether sufficient blood levels are reached when antibiotics are administered IO [1].

Methods
A model of endotoxemic shock was used, in which 8 anaesthetized, extensively monitored pigs were given a continuous infusion of E. coli endotoxin over a 6-hour period, after which the animals were sacrificed. The Animal Ethics Committee of Uppsala University, Sweden, approved the experiment. IO access was achieved in the tibial bone (EZ-IO®, Teleflex Medical, Morrisville, NC, USA). Cefotaxime at 75 mg/kg and gentamicin at 7 mg/kg, respectively, were administered IO or intravenously (IV). Central concentrations of these antibiotics were measured at 5, 15, 30, 60, 120, and 180 min.

Results
After starting the endotoxin infusion, most of the animals showed signs of hemodynamic instability with reduced mean arterial blood pressure (MAP) and cardiac index, and markedly elevated mean pulmonary arterial pressure. All animals except one needed norepinephrine to keep MAP >60 mmHg. The plasma concentrations of cefotaxime and gentamicin, whether injected IO or IV, were nearly identical, with similar slopes showing decreasing plasma concentrations of each antibiotic over time. There were no significant differences for either antibiotic, regardless of injection site, based on analysis of variance. The 95% confidence intervals for the differences of means for each antibiotic were well within the relevant intervals.

Conclusions
This study strongly suggests that IO access may be an appropriate alternative to IV administration of antibiotics in some severe infectious conditions (e.g. septic shock) when venous access is difficult to achieve. Relevant concentrations of these antibiotics were achieved even in severely circulatory-compromised animals. Thus, we deduce that IO antibiotic administration may also be considered in the prehospital setting, as early systemic administration of antibiotics may be lifesaving [2].

Introduction
Our study was carried out to audit aminoglycoside resistance patterns and assess the need to cycle aminoglycosides. The current trust antimicrobial policy at Lewisham (UHL) and Greenwich (QEH) NHS Trust advises the use of gentamicin as a first-line agent against gram-negative organisms, except in severe sepsis where it recommends amikacin. Antimicrobial cycling has a role in critical care due to the increasingly rapid spread of resistant bacteria, which influences healthcare costs and mortality [1] [2]. Of note, rotation of gentamicin and amikacin has been found to reduce gentamicin resistance in some studies [3].

Methods
This was a 3-month retrospective, cross-site study from 1/8/16 to 31/10/16. Data were obtained from UHL and QEH critical care units. Resistance patterns of samples positive for gram-negative organisms were analysed using WinPath. The total number of resistant organisms was found and, within this cohort, the number of gram-negative organisms resistant to both gentamicin and amikacin. The proportions of extended-spectrum beta-lactamase (ESBL) and AmpC positive organisms were also analysed.

Results
4398 samples were obtained: 2822 (64%) from UHL and 1576 (36%) from QEH. 68% of gram-negative organisms were sensitive to all antimicrobials. Within the 3-month period, 1462 (33.2%) of samples displayed antibiotic resistance, of which 4% of organisms were found to be resistant to gentamicin.

Conclusions
This audit has demonstrated a significant number of gentamicin-resistant organisms. However, this does not reach the national threshold for cycling. We have noted the need to reinforce the use of amikacin as the first-line aminoglycoside in all critically ill patients.

Introduction
Linezolid (LZ) is an antibiotic with time-dependent activity. An optimal antibacterial effect is achieved when plasma drug concentrations are above the MIC (T > MIC) for the entire length of treatment. Critically ill patients receiving standard doses of LZ, particularly those with sepsis and conserved renal function (CRF), are at risk of not attaining PK/PD targets. LZ clearance (LzCl) is usually increased in these patients, often due to augmented renal clearance (ARC). Continuous infusion (CI) could offer advantages in this context [1].

Introduction
Tigecycline (TGC) is a key antibiotic in the therapy of intra-abdominal infections in critically ill patients. Liver failure frequently occurs in this patient collective and may lead to altered pharmacokinetics (PK) of TGC. We aimed to (i) determine the PK of TGC in patients with and without liver impairment and (ii) compare the value of the novel maximum liver function capacity (LiMAx) test and

The most common source was tracheal aspirate, followed by bloodstream. In addition to C/T, amikacin (AMK), cefepime (FEP), ceftazidime (CAZ), colistin (COL), levofloxacin (LVX), meropenem (MER), and piperacillin/tazobactam (TZP) were tested. Antibiotic-resistant phenotypes were identified using EUCAST (2016)

Results
Fifty-one patients met inclusion criteria. There were no differences in clinical cure rates between CTX (81.8%) and SOCT (72.5%) (p = 0.7063). There were no differences in treatment failure between CTX (9%) and SOCT (7.5%) (p = 0.99). Median time to negative blood culture was 40 hours for CTX vs. 36.5 hours for SOCT (p = 0.45). Median time to defervescence was 13 hours for CTX and 22 hours for SOCT (p = 0.99). AKI occurred in 0 patients treated with CTX and 4 patients treated with SOCT (p = 0.5726).

Conclusions
In this comparison of CTX vs. SOCT for the treatment of MSSA bacteremia we found no difference in clinical cure. We also found no difference in treatment failure, time to negative blood culture, time to defervescence, and occurrence of AKI.

Introduction
Ventilator-associated pneumonia (VAP) is a common and serious problem in the intensive care unit (ICU). Several studies have demonstrated that Gram staining of endotracheal aspirates may be useful for accurate diagnosis of VAP. However, its effectiveness in predicting causative microorganisms has not been elucidated. The purpose of this study was to evaluate whether Gram staining of endotracheal aspirates can be used as a guide for initial antibiotic therapy in VAP.

Methods
Data on consecutive episodes of microbiologically confirmed VAP were collected from February 2013 to February 2016 in our ICU. We constructed 2 hypothetical empirical antibiotic treatment algorithms for VAP. The first was a guideline-based algorithm (GLBA), based on the recommendations of the American Thoracic Society / Infectious Diseases Society of America guidelines. The second was a Gram stain based algorithm (GSBA), which limited the spectrum of initial antibiotic therapy according to the results of bedside Gram staining. The GLBA and the GSBA were then retrospectively applied to the same VAP episodes. Initial coverage rates and the recommendation of broad-spectrum antibiotics were compared between the two algorithms.
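The narrowing logic of a Gram stain based algorithm can be illustrated as follows. This is a hypothetical sketch: the function name, morphology codes and drug-class choices are illustrative assumptions, not the study's actual GSBA:

```python
def gsba_empiric_therapy(gram_stain):
    """Illustrative Gram stain based narrowing of empiric VAP therapy.

    gram_stain: set of morphologies seen on bedside Gram stain of the
    endotracheal aspirate, e.g. {"GPC"} (Gram-positive cocci) and/or
    {"GNR"} (Gram-negative rods).  Returns the drug classes to start.
    The mapping below is a hypothetical sketch, not the study protocol.
    """
    regimen = []
    if "GPC" in gram_stain:
        regimen.append("anti-MRSA agent")           # e.g. vancomycin/linezolid
    if "GNR" in gram_stain:
        regimen.append("antipseudomonal beta-lactam")
    if not regimen:                                 # nothing seen: fall back
        regimen = ["anti-MRSA agent",               # to broad guideline-style
                   "antipseudomonal beta-lactam"]   # coverage
    return regimen
```

The point of such a scheme is that a guideline-based algorithm would start both broad-spectrum classes in most episodes, whereas stain-guided narrowing starts each class only when the corresponding morphology is actually seen.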

Results
During the study period, 219 suspected VAP episodes were observed and 131 episodes were assessed for analysis. Appropriate antibiotic coverage rates were equivalent between the two algorithms (GLBA 94.7% vs GSBA 92.4%, p = 0.221). The number of episodes in which anti-methicillin-resistant Staphylococcus aureus agents were recommended as initial treatment was larger for the GLBA than the GSBA (70.2% vs 31.3%, p < 0.001). Furthermore, the number of episodes in which antipseudomonal agents were recommended as initial treatment was also larger for the GLBA than the GSBA (70.2% vs 51.9%, p < 0.001).

Conclusions
The GSBA may restrict the administration of broad-spectrum antibiotics without increasing the risk of treatment failure.

P445
Efficacy of murepavadin co-administered with standard-of-care in a phase 2 study in patients with ventilator-associated pneumonia due to Pseudomonas aeruginosa infection A Armaganidis 1 , A Torres 2 , S Zakynthinos 3 , C Mandragos 4 , E Giamarellos-Bourboulis 5 , P Ramirez 6 , M De la Torre-Prados 7 , A Rodriguez 8 , G Dale 9 , A Wach 9 , L Beni 9 , L Hooftman 9 , C Zwingelstein 9

Introduction
Murepavadin (POL7080) represents the first member of a novel class of outer membrane protein targeting antibiotics, being developed by Polyphor for the treatment of serious infections by Pseudomonas aeruginosa (PA). This study investigated the PK, safety and tolerability of POL7080 (2.5 mg/kg, 2-hour IV infusion, TID for 10-14 days) co-administered with standard-of-care (SoC) in ventilator-associated pneumonia (VAP) patients with suspected or confirmed PA infection. Efficacy was a secondary parameter.

Methods
This was a multicenter, open-label, non-comparative, phase 2 study of the PK, safety, and efficacy of POL7080 co-administered with SoC. Patients whose pneumonia was subsequently shown by culture results to be caused by PA sensitive to SoC antibiotics were continued on treatment with SoC plus POL7080. The main efficacy endpoints included all-cause mortality (ACM) at day 28 and clinical cure at the test-of-cure (TOC) visit (7 ± 2 days after end-of-treatment).

Results
Twelve (48%) of the 25 enrolled patients were part of the microbiological intention-to-treat (mITT) population: at inclusion, 11 had ≥10⁵ CFU/mL PA retrieved from their endotracheal aspirate (ETA) and 1 patient had a positive blood culture. Survival at day 28 was 92% (11 of 12 patients). Overall, 10 of 12 (83.3%) patients in the mITT analysis set had a clinical outcome of cure at the TOC visit, as assessed by the investigator (based on clinical signs and symptoms, chest x-ray, CPIS and SOFA scores). Only 2 of 12 (16.7%) patients had a clinical outcome of failure at TOC. One of those two patients died due to shock and multi-organ dysfunction syndrome, 6 days after completing the full course of POL7080 treatment. There was no emergence of resistance to POL7080 during this study.

Conclusions
The observed efficacy of POL7080 treatment on 28-day ACM and clinical response in this severely ill patient population underscore the potential therapeutic value of POL7080 in the treatment of VAP patients with PA infection.

Introduction
Bacterial toxins are responsible for serious complications of infections and are associated with high morbidity and mortality rates despite best antibiotic treatment. Novel empty liposomes, named CAL02, specifically mimicking domains targeted by bacterial toxins, have shown synergistic effects with antibiotics and the ability to fully rescue mice from deadly acute infections by trapping toxins. CAL02 is active against a broad panel of toxins from both Gram-positive and Gram-negative pathogens, regardless of their resistance profile. Initial results of the first-in-man trial of CAL02 in patients with severe community-acquired pneumococcal pneumonia (CAPP) are presented herein.

Methods
The first-in-man clinical study of CAL02 is a randomised, multicentre, double-blind, placebo-controlled trial carried out in ICU patients with severe CAPP. All patients received standard-of-care antibiotic therapy and treatment according to sepsis recommendations. The study comprises three arms: CAL02 low-dose (4 mg/kg), CAL02 high-dose (16 mg/kg), and placebo. CAL02 is administered twice, on two consecutive days, starting within 12 hours of the diagnosis of severe CAPP and within 24 hours of intravenous antibiotic treatment. Endpoints: safety, tolerability, efficacy and pharmacodynamic characteristics.

Results
The CAL02 low-dose/placebo cohort has been completed and the CAL02 high-dose/placebo cohort is ongoing. To date, patient characteristics at baseline (mean (min-max)) are: age 58 (38-79), CURB-65 score 3.4 (3-4), APACHE II 20.7 (14-30), SOFA 8.0 (6-12). Two cases presented with septic shock at screening. Five out of 7 patients needed mechanical ventilation and 5 were treated with catecholamines due to septic shock. Six patients achieved cured pneumonia at test of cure and 1 patient died on day 2. ICU stay was 11.6 days (2-22) and hospital stay 20 days (2-55). CAL02 was safe and well tolerated. Safety and efficacy data from all patients having participated in this trial will be gathered and analysed by March 2017.

Conclusions
This clinical study is the first trial assessing the potential of a broad-spectrum anti-virulence agent, CAL02, and its ability to protect against clinical deterioration in combination with antibiotics in a severely infected population. CAL02 is a first-in-class non-antibiotic drug which appears safe, and its potential lies far beyond CAPP, as it could also be a treatment for most infections caused by ESKAPE pathogens.

Introduction
Echinocandins such as anidulafungin are first choice for treatment of invasive candidiasis in critically ill patients. The elimination of anidulafungin is independent of renal and hepatic function. However, data on its target-site penetration are limited so far. Therefore, we assessed anidulafungin pharmacokinetics in ascites and pleural effusion of critically ill patients treated with standard doses of anidulafungin for proven or suspected invasive candidiasis.

Methods
Ascites and pleural effusion samples were drawn during routine paracentesis or from drains inserted for therapeutic purposes. When a drain was in place, samples were taken before as well as 1, 4, 8, 12, 18, and 24 hours after the start of the anidulafungin infusion. Anidulafungin quantification was performed with high-pressure liquid chromatography (HPLC) and UV detection after protein precipitation with acetonitrile. For gradient elution, ammonium acetate and acetonitrile were applied. The flow rate was 1.0 ml/min. Anidulafungin was detected at 306 nm. The method was validated according to European Medicines Agency (EMA) guidelines. Its lower limit of quantification is 0.05 mg/L.

Results
So far, seven critically ill patients with severe sepsis and multi-organ dysfunction have been enrolled into the study. Anidulafungin pharmacokinetics in ascites has been determined in four patients, and kinetics in pleural effusion in two patients. In addition, a single ascites sample obtained at paracentesis has been analysed. Anidulafungin peak concentrations in ascites amounted to 0.34-0.98 mg/L. Thus, they were lower than the simultaneous plasma levels (peak levels 3.82-7.70 mg/L) and displayed a slower increase and decline. The time to peak level amounted to one hour in plasma and 4-12 hours in ascites. We defined the penetration ratio as the ratio between the area under the concentration-time curve (AUC) in ascites and the AUC in plasma; it amounted to 0.07-0.37. In pleural effusion, anidulafungin concentrations were slightly higher than those in ascites: the peak concentrations in the two patients were 1.02 and 2.02 mg/L.
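The penetration ratio defined above (ascites AUC over plasma AUC) can be computed from paired concentration-time points with a linear trapezoidal AUC. The sketch below uses hypothetical concentrations, chosen only to fall within the reported ranges, not actual patient data:

```python
def auc_trapezoid(times, concs):
    """Area under the concentration-time curve by the linear trapezoidal rule."""
    return sum((t2 - t1) * (c1 + c2) / 2
               for t1, t2, c1, c2 in zip(times, times[1:], concs, concs[1:]))

# Hypothetical paired sampling times (h) and concentrations (mg/L),
# following the study's sampling schedule (pre-dose, then 1-24 h)
times   = [0, 1, 4, 8, 12, 18, 24]
plasma  = [0.0, 5.5, 3.9, 2.6, 1.8, 1.2, 0.8]   # early peak, steady decline
ascites = [0.0, 0.2, 0.6, 0.7, 0.6, 0.5, 0.4]   # slower rise and fall

penetration = auc_trapezoid(times, ascites) / auc_trapezoid(times, plasma)
```

With these illustrative numbers the ratio lands near 0.23, inside the 0.07-0.37 range reported above; the delayed ascites peak also reproduces the slower rise and decline described in the Results.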

Conclusions
Anidulafungin was detectable in ascites and pleural effusion after administration of standard doses. Ascites and pleural effusion concentrations exceeded the minimal inhibitory concentrations (MICs) reported for several Candida strains. However, isolates with MICs above the target-site concentrations have also been described. Investigation of target-site pharmacodynamics of anidulafungin in ascites and in pleural effusion is required.

Introduction
Anidulafungin can be used first-line for invasive candidiasis, and testing for B-D-glucan (BDG) has been shown both to identify those with early invasive fungal infection and to guide discontinuation of antifungal therapy when invasive candidiasis is unlikely. Despite the cost, anidulafungin may be started even in those in whom invasive candidiasis is unlikely or for whom microbiological evidence is scant. We investigated the appropriateness of anidulafungin prescription and the cost implications of its use in this manner in our department.

Methods
From April 1st to August 31st 2016, patients in whom anidulafungin was prescribed were identified and their notes analysed. Likelihood of fungal infection was assessed using the 'Candida score' (score <3 suggesting invasive candidiasis unlikely). Electronic patient records were used to determine whether serum BDG was used to confirm fungal infection and, where negative, whether this was used to discontinue anidulafungin.
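For reference, the 'Candida score' combines four bedside items; the weights below follow the commonly cited León formulation (total parenteral nutrition 1, surgery 1, multifocal colonization 1, severe sepsis 2), and the function names are illustrative, not part of any published tool:

```python
def candida_score(tpn, surgery, multifocal_colonization, severe_sepsis):
    """'Candida score': each argument is 1 if the item is present, else 0.

    Weights as commonly cited (Leon et al.): TPN 1, surgery on ICU
    admission 1, multifocal Candida colonization 1, severe sepsis 2.
    """
    return (1 * tpn + 1 * surgery + 1 * multifocal_colonization
            + 2 * severe_sepsis)

def candidiasis_unlikely(score):
    # Cut-off used in this audit: a score <3 suggests invasive
    # candidiasis is unlikely.
    return score < 3
```

For example, a post-surgical patient on TPN without colonization or severe sepsis scores 2 and falls below the cut-off, whereas adding severe sepsis pushes the score to 4.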

Results
Anidulafungin was started with a Candida score <3 on 25 occasions and with a score ≥3 on 13 occasions. Only 2 patients had definitive evidence of invasive candidiasis on blood cultures, and both had a Candida score ≥3 (illustrating that low scores identify a low risk of candidaemia). All patients with a score ≥3 in whom BDG was tested had a positive result, whilst with a score <3, BDG was negative in 8 of 11 patients where tested.

Conclusions
Our pilot study corroborates prior data in which BDG testing identifies high risk of invasive candidiasis, and a combined Candida score / BDG regimen could both stratify the risk of invasive candidiasis and aid discontinuation of treatment in low-risk patients [1,2,3]. The potential cost saving from rationalisation of anidulafungin use is significant but requires further work.

Introduction
The aim of this study is to describe and analyse thyroid function in critically ill patients, and to determine which patterns are associated with higher mortality.

Methods
We enrolled all patients admitted to the ICU of a tertiary hospital from January 2015 to August 2016 with an ICU length of stay of seven days or more.

Results
In the subgroup analysis by cause of admission, we found no significant differences in mean hormone values among the groups. However, we found a statistically significant difference in mean T3 values between survivors and non-survivors in all subgroups.

Conclusions
Comparing TSH, T3 and T4 between the survival and mortality groups, we found a statistically significant difference only for T3. We can hypothesize that the decrease of this thyroid hormone is associated with worse outcome, as are the severity scores. In the subgroup analysis by cause of admission, we found no differences in mean hormone values among the groups.

Introduction
Hyperglycemia, increased lipolysis, hypoaminoacidemia and profound muscle wasting are hallmarks of critical illness. Guidelines recommend amino acid administration to prevent muscle wasting. Glucagon, a catabolic hormone that affects these metabolic pathways and can induce muscle wasting, is increased during critical illness. Altered nutritional modulation of glucagon is known to contribute to the catabolic phenotype of diabetes. We assessed whether glucagon can be modulated by nutrition and investigated its metabolic role during critical illness.

Methods
In 174 critically ill patients and 20 matched healthy controls, plasma glucagon was quantified. In patients, the effect of glucose and insulin infusion and of parenteral nutrition containing amino acids (PN) on glucagon was documented. In critically ill mice with CLP-induced sepsis, the effect of amino acid infusion on plasma glucagon was evaluated and the impact of glucagon immunoneutralization on glucose, lipid and amino acid metabolism and muscle wasting was studied in the early (10 or 30 hrs) and/or prolonged (3 days) phase of critical illness.

Results
In patients, plasma glucagon concentrations were elevated from ICU day 1 through day 7 and correlated with severity of illness. Infusing glucose with insulin did not significantly lower glucagon, whereas PN containing amino acids increased glucagon. In critically ill mice, infusion of amino acids increased plasma glucagon and up-regulated markers of amino acid catabolism in the liver without affecting markers of muscle wasting. Glucagon immunoneutralization in ill mice only transiently affected glucose and lipid metabolism, with a decrease in illness-induced hyperglycemia and signs of stimulated rather than decreased lipolysis at 10 hrs. Glucagon immunoneutralization did not affect muscle wasting, but drastically suppressed markers of hepatic amino acid catabolism and reversed the illness-induced hypoaminoacidemia.

Conclusions
These data suggest that elevated glucagon availability during critical illness increases amino acid catabolism in the liver, explaining the illness-induced hypoaminoacidemia, without affecting muscle wasting and without a sustained impact on blood glucose. The data also suggest that infusion of amino acids, which is done during illness with the intention to spare muscle, is not effective and instead -via

Introduction
The objective of this study was to investigate the effects of in vitro administration of ubiquinol and thiamine on cellular oxygen consumption in peripheral blood mononuclear cells (PBMCs) from patients with diabetic ketoacidosis (DKA). We hypothesized that DKA patients would have depressed cellular oxygen consumption compared to controls. DKA often requires significant hospital resources, is characterized by depletion of electrolytes, and may be associated with subclinical cellular injury with prolonged acidosis. Therefore, evaluating metabolic components (apart from insulin) that can more rapidly reverse DKA and protect cells may be beneficial. Thiamine and ubiquinol are essential for adequate aerobic metabolism.

Methods
We performed a prospective study of DKA patients and healthy controls presenting to the emergency department of an urban tertiary care center from November 2015 to June 2016. A single blood draw was performed and PBMCs were isolated from the blood samples. Cells were randomly assigned to in vitro administration of 0.5 μg/mL thiamine, 1 μg/mL ubiquinol, or placebo treatment. Complete mitochondrial respiration profiles were measured using the XF Cell Mito Stress Test Kit (Seahorse Bioscience) to reveal the key parameters of cellular oxygen consumption. One-way ANOVA was used to analyze differences in oxygen consumption rate between groups.

Results
A total of 10 DKA patients and 9 controls were included. Basal (7.0 ± 2.1 pmol/min/μg protein vs. 10.2 ± 2.4, p = 0.005) and maximal oxygen consumption (16.6 ± 4.2 vs. 28.3 ± 9.0, p = 0.05) were significantly lower in PBMCs from DKA patients compared to controls. We found a significant increase in basal (10.3 ± 2.4 pmol/min/μg protein vs. 7.0 ± 2.1, p = 0.05) and maximal (27.6 ± 5.2 vs. 16.6 ± 4.2, p = 0.04) oxygen consumption between the ubiquinol and placebo groups. Additionally, we found a significant increase in basal (9.3 ± 2.4 pmol/min/μg protein vs. 7.0 ± 2.1, p = 0.05) and maximal (25.4 ± 5.0 vs. 16.6 ± 4.2, p = 0.05) oxygen consumption between the thiamine and placebo groups. Neither ubiquinol nor thiamine had a significant effect on basal or maximal oxygen consumption in controls.

Conclusions
DKA patients had overall lower oxygen consumption compared to healthy controls. In vitro administration of thiamine and ubiquinol independently increased oxygen consumption in DKA patients, but not in controls. These findings suggest thiamine and ubiquinol may have potential as mitochondrial resuscitators in DKA, and potentially in states with similar metabolic stress.
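The one-way ANOVA used to compare oxygen consumption between treatment arms reduces to a ratio of between-group to within-group variance. A minimal stdlib sketch follows; the measurements in the example are hypothetical, not study data:

```python
from statistics import mean

def one_way_anova_f(*groups):
    """One-way ANOVA F statistic: between-group over within-group variance.

    groups -- one sequence of measurements per treatment arm
    (e.g. placebo, thiamine and ubiquinol oxygen consumption rates).
    """
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = mean(x for g in groups for x in g)
    # Between-group sum of squares (k - 1 degrees of freedom)
    ssb = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    # Within-group sum of squares (n - k degrees of freedom)
    ssw = sum((x - mean(g)) ** 2 for g in groups for x in g)
    return (ssb / (k - 1)) / (ssw / (n - k))

# Hypothetical basal oxygen consumption rates (pmol/min/ug protein)
# for placebo, thiamine and ubiquinol arms
f = one_way_anova_f([6.8, 7.2, 7.0], [9.1, 9.5, 9.3], [10.1, 10.5, 10.3])
```

A large F (compared against the F distribution with k-1 and n-k degrees of freedom) corresponds to the small p-values reported above; identical group means give F = 0.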

Introduction
This study aims to evaluate the adherence to and outcome of the blood glucose monitoring protocol amongst adult patients in ICU.

Methods
Patients above 18 years of age, who were admitted to EICU between March 2012 and March 2015 for longer than 24 hours and put on blood glucose monitoring protocol, were enrolled in this study.
Patients' demographic data, diagnosis on admission, presence of a diabetes mellitus diagnosis, APACHE II score, length of stay in the ICU and subsequently in the wards, and health state on discharge from the ICU and from the wards were recorded retrospectively. Using patients' blood glucose recordings, we calculated the rates of adherence to the timing of measurements and to the treatment protocol, the success rate of glycemic control, and the effect of adherence to the monitoring protocol on mortality rates and length of hospital stay. We assessed whether disparity in protocol adherence was associated with a diabetes mellitus diagnosis, surgery, a severe sepsis/septic shock diagnosis, or different APACHE II scores. In total, 453 patients with 4,447 ICU stay days and 22,220 measurements were recorded.

Results
24% of measurement timings and 57.8% of treatments were according to the protocol. Better adherence achieved more effective hyperglycemia control and lessened fluctuations in blood glucose levels; however, it was accompanied by significantly increased rates of hypoglycemia and no change in mortality or length of hospital stay. The adherence rate was lower for patients with a diabetes mellitus diagnosis, but sepsis or surgery had no effect.

Conclusions
We need to update our protocol according to the current guidelines in order to improve its efficacy and safety, redesign the parts that may hinder our adherence, and provide the necessary training to the staff for improving the adherence to the protocol.

Introduction
Two insulin receptor (IR) isoforms are responsible for insulin's physiological actions. Alternative splicing of the IR primary mRNA generates the IR-B and IR-A isoforms [1][2][3]. They activate different signaling pathways leading to different physiological effects; IR-B is responsible for the metabolic effects. In metabolic syndrome, insulin resistance is associated with changes in the IR isoform expression ratio [4]. A similar situation could exist in sepsis, but there are no data yet. The aim of this study was to assess the effects of septic shock serum on IR-B mRNA expression.

Methods
RAW cells, a murine cell line, were cultured in conditioned media enriched with 10% human septic serum (HSS) or in usual culture media (control). HSS was obtained from critical care patients within 24 hours of septic shock diagnosis. Blood samples were centrifuged and the serum was obtained. Cells were cultured for 16 h, mRNA was extracted and quantified, cDNA was obtained by reverse transcription and, using specific primers, IR-B mRNA was amplified [5]. The cellular mRNA content was obtained by densitometric analysis. The same method was used to assess whether LPS (25 ng/mL) could have a similar effect to HSS. For statistical analysis we used nonparametric tests, and p < 0.05 was considered significant.

Results
Both HSS and LPS significantly decreased IR-B mRNA content compared to control. LPS decreased IR-B mRNA content significantly more than HSS did (Fig. 25).

Conclusions
We demonstrated that HSS induces a significant reduction of IR-B expression in macrophages; the infective agent could be mediating this effect. The LPS concentration used here was greater than that reported clinically, which could explain its greater effect. If the lower mRNA content implies decreased IR-B protein expression, then this is a previously unreported pathogenic mechanism for hypoglycemia in sepsis.

Introduction
Retrospective studies showed that high blood glucose variability is associated with an increased onset of infections and mortality in critically ill patients. However, a causal link has not been established.

Conclusions
These preliminary data confirmed the association between glycemic variability and mortality in critically ill patients. A higher blood glucose variability may be associated with a higher susceptibility to nosocomial infections in the ICU population but data from a larger sample of patients are needed.

Introduction
Meticulous euglycemia is known to improve outcomes in critically ill patients [1]. However, attempting to maintain a tight glucose control regime can lead to dangerous, life-threatening hypoglycemic episodes [2]. Continuous blood glucose monitoring (CGM) can potentially improve euglycemic control by providing immediate feedback on insulin's effects, and may reduce nursing workload.

Methods
After obtaining R&D approvals and patient consent, patients undergoing elective cardiac surgery in 2 centres were enrolled. A nonrandomised, non-treatment, open-label prospective study was performed in which blood glucose results from the GlySure CGM were compared with those of the conventional blood gas analyser (Siemens RAPIDLab 1200 Systems).

Introduction
Measurement of hemoglobin A1c (HbA1c) remains the gold standard for chronic glycemia monitoring in diabetics [1]. High HbA1c following coronary artery bypass grafting may signal higher short- and long-term mortality rates [2]. We aimed to find a relation between preoperative HbA1c and glucose control in intensive care, expressed through time in range of blood glucose (TIR).

Methods
Two hundred twenty-seven patients were involved in this single-center prospective observational study. All patients were assessed by HbA1c before surgery; the glucose target was set at 6.0 to 8.1 mmol/L. Glucose control was expressed through TIR, and on this basis patients were grouped into group I, who remained within the set target > 80% of the time, and group II, who remained within the target < 80% of the time. Both groups were matched in terms of age, gender, race, EuroSCORE, cardiopulmonary bypass time (CPB) and aortic cross-clamp time (ACC). Outcome variables were compared in diabetics and non-diabetics.
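As a rough illustration of the grouping rule above, a minimal sketch of the TIR computation (function names are our assumptions; this weights each measurement equally, whereas the study may have weighted by time between samples):

```python
def time_in_range(glucose_mmol, lo=6.0, hi=8.1):
    """Fraction of blood glucose measurements within the target range
    (6.0-8.1 mmol/L, as set in the study)."""
    in_range = [g for g in glucose_mmol if lo <= g <= hi]
    return len(in_range) / len(glucose_mmol)

def tir_group(glucose_mmol, threshold=0.80):
    """Group I: within target > 80% of the time; otherwise group II."""
    return "I" if time_in_range(glucose_mmol) > threshold else "II"

# Example: 4 of 5 readings in range -> TIR 0.8, not > 0.8, so group II
readings = [6.5, 7.9, 8.4, 7.0, 6.2]
```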

Results
Failure to maintain target glycemia was significantly more frequent in diabetics (p = 0.001) and in patients with glycated hemoglobin (HbA1c) > 8%. Group II had significantly higher HbA1c (8 ± 2.2 vs 6.6 ± 1.7, p = 0.001), and this group was significantly more liable to complications in terms of wound infection and postoperative atrial fibrillation (p = 0.05 and 0.04, respectively). Length of ventilation, ICU stay and hospital stay were higher in group II (p = 0.03, 0.04 and 0.3, respectively).

Conclusions
High HbA1C is likely a good predictor of poor glycemic control and post-operative adverse events.

Introduction
Shorter leukocyte telomeres have been associated with chronic diseases. We investigated whether telomere lengths upon admission to the paediatric intensive care unit (PICU) predict outcome and whether telomere length is altered during PICU stay in a different manner for children who receive or do not receive early parenteral nutrition (PN).

Methods
Telomere lengths were quantified in leukocyte DNA harvested from 342 healthy children and from 1148 critically ill children who were randomised to early or late parenteral nutrition, from admission to the last day in PICU. Independent associations of telomere lengths with outcomes, with PICU mortality as primary outcome, and the impact of early PN on the change in telomere length during PICU stay were studied via multivariable logistic regression and Cox proportional hazards analyses.

Results
Children admitted to PICU revealed shorter leukocyte telomeres than matched healthy children. A shorter, not longer, admission leukocyte telomere length appeared independently associated with better PICU survival (P = 0.01), a finding that was statistically fully explained by a more activated innate immune response (P = 0.04) with a higher fraction of neutrophils that have shorter telomeres. After adjustment for confounders, including neutrophil count, early PN shortened leukocyte telomere length as compared with late PN (estimate early versus late -0.02, 95% CI [-0.04;-0.004], P = 0.01), an effect that was only partially explained by its impact on the duration of PICU stay (estimate per day in PICU -0.14, 95% CI [-0.28;-0.004], P = 0.04).

Conclusions
Shorter leukocyte telomeres in PICU patients than in healthy children reflected the degree of innate immune response activation which independently predicted better survival. Leukocyte telomeres were shortened by early PN during paediatric critical illness. Whether shortened telomere lengths predispose PICU patients to the longterm legacy of critical illness remains to be investigated.

Introduction
Muscle wasting is very common in critically ill patients and can have a profound effect on long term recovery. We aimed to quantify the extent of muscle wasting in trauma patients in the ICU. CT imaging has emerged as a method to assess body composition, with good correlation to total skeletal muscle volume [1].

Methods
Of 858 trauma patients admitted to ICU over a 30-month period (Jan 2012-14), 81 were identified as having had 2 abdominal CT scans. The initial trauma CT scan was compared to the subsequent scan. Two methods were used to measure muscle cross-sectional area (CSA). In the first, abdominal muscle CSA at the L3 vertebra was measured using ImageJ software (Fig. 28) by tracing the abdominal musculature and applying predefined Hounsfield units, -29 to +150, for skeletal muscle. An observer traced each psoas muscle at the L4 basivertebral vein for the second method.

Conclusions
Skeletal muscle wasting is common and time-dependent in critically ill trauma patients.

Introduction
Sarcopenia (skeletal muscle depletion) has been suggested to predict morbidity, especially infection, and mortality in various diseases. However, it is not well known whether sarcopenia has prognostic value for predicting poor outcome in sepsis. We hypothesized that sarcopenia, assessed by abdominopelvic computed tomography (CT), can predict outcome in sepsis patients in the emergency department (ED).

Methods
We retrospectively reviewed the medical records of 627 patients with sepsis in the ED of a university-affiliated hospital during a 10-year period. We divided the patients into two groups according to 28-day survival and compared demographic characteristics, clinical features and presence of sarcopenia, assessed by the cross-sectional area of the psoas muscle at the level of the third lumbar vertebra on abdominal CT scans. Multivariate logistic regression analysis was conducted to examine the independent prognostic value of sarcopenia on the outcome of sepsis.

Conclusions
Sarcopenia evaluated by CT scan can predict poor outcome of sepsis patients. Further large prospective studies on sarcopenia in sepsis and the efficacy of nutrition therapy are warranted.

Introduction
Malnutrition at ICU admission is associated with increased morbidity and mortality. The prevalence of malnutrition in ICU patients varies from 17 to 78 percent in different patient groups. Malnutrition can be assessed by questionnaires but also by bioimpedance analysis. We compared the Short Nutritional Assessment Questionnaire (SNAQ) with phase angle at ICU admission.

Methods
During 15 weeks consecutive patients were included. Exclusion criteria were age < 18 years, abnormalities of the limbs and ICU stay < 6 hours. Phase angle was measured shortly after admission; the SNAQ was obtained from the patient or legal representative. Malnutrition was diagnosed by SNAQ ≥ 2 or a phase angle < 5° for men and < 4.6° for women. Demographic and clinical variables were collected. The study was approved by our local ethical committee (MCL, nWMO 77, 2015).

Conclusions
Malnutrition was present in 16% according to SNAQ and 36% according to phase angle. There was a fair agreement between SNAQ and phase angle. With the use of phase angle an objective and accurate assessment of nutritional status can be made in ICU patients.

Introduction
Malnutrition is an important determinant of ICU outcome. An objective way to measure body composition and malnutrition is by bioimpedance analysis and calculation of phase angle (PhA). We questioned whether malnutrition, according to PhA, is associated with morbidity and mortality after ICU admission.

Methods
During 15 weeks consecutive patients were included. Exclusion criteria were age < 18 years, abnormalities of the limbs and ICU stay < 6 hours. PhA was measured shortly after admission; a PhA < 5° for men and < 4.6° for women was considered malnutrition. Demographic and clinical variables were collected. The study was approved by our ethical committee (nWMO 77, 2015).

Results
299 patients were included. Malnutrition was established in 36% (n = 106). Patients with malnutrition were more often female and older, with lower BMI and higher APACHE II score (Table 11), and were more often diagnosed with a malignancy (30 vs 14%, p < 0.001). Hospital LOS, ICU mortality and hospital mortality were higher in patients with a low PhA. Logistic regression analysis showed a significant relation between PhA and age, sex, BMI, malignant disease, hospital LOS and hospital mortality.
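As an illustration, the phase angle is derived from the resistance and reactance measured by bioimpedance analysis; the sketch below applies the study's sex-specific cutoffs (the impedance values in the example are invented, and the function names are ours):

```python
import math

def phase_angle_deg(resistance_ohm, reactance_ohm):
    """Phase angle = arctan(reactance / resistance), expressed in degrees."""
    return math.degrees(math.atan(reactance_ohm / resistance_ohm))

def malnourished(pha_deg, sex):
    """Apply the study's cutoffs: PhA < 5 deg (men) or < 4.6 deg (women)."""
    cutoff = 5.0 if sex == "male" else 4.6
    return pha_deg < cutoff

# Example with illustrative impedance values: R = 500 ohm, Xc = 50 ohm
pha = phase_angle_deg(500.0, 50.0)   # about 5.7 degrees
```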

Conclusions
According to PhA, malnutrition was present in 36% of ICU patients. The same independent predictors of PhA (age, sex and BMI) were found as in a general hospital population. Even in this small population of mixed ICU patients, a low phase angle was found to independently predict hospital mortality. The value of phase angle for ICU patients warrants more research.

Introduction
Ultrasound is a noninvasive method that can measure quadriceps muscle layer thickness (QMLT), and subsequently lean body mass (LBM), at the bedside; its validity and reliability have been checked and confirmed in previous studies. We correlated the sarcopenia associated with heart failure in ischemic cardiomyopathy patients, detected by a decreased cross-sectional area of the quadriceps femoris, with the clinical and laboratory parameters of sarcopenia.

Methods
This retrospective observational study included 26 patients with ischemic cardiomyopathy admitted to our cardiac ICU between January 2014 and September 2016. Serial measurements of quadriceps muscle cross-sectional area were obtained at admission and every 3 days for the initial 10 days. Measurements were recorded to assess the trend and the correlation between the presence of muscle loss and the clinical and laboratory parameters indicating sarcopenia despite an effective feeding protocol. All enrolled patients were hemodynamically monitored and treated under the same ICU feeding and physiotherapy protocols.

Results
Complete data sets were obtained from 26 patients, 20 males and 6 females, with a mean age of 57 years (range 43 to 90). The following parameters were assessed daily: body weight, serum albumin, haemoglobin and CRP. A total of 260 quadriceps femoris scans were analyzed and serial measurements compared, showing a statistically significant difference between recorded measures (P < 0.05) and a positive correlation with clinical assessment.

Conclusions
In our clinical series, assessment of sarcopenia by serial ultrasound measurements of quadriceps femoris cross-sectional area in patients with ischemic cardiomyopathy correlated with clinical and laboratory assessment of sarcopenia. It is an easy and reliable technique for assessing nutritional status and for evaluating our feeding protocol. Further studies are needed to confirm these findings.

Introduction
Our study aims mainly to identify the prevalence of malnutrition and to compare elderly patients according to their nutritional status in a Moroccan AMU, and then to analyze the prognostic impact of malnutrition at 540 days of follow-up.

Methods
This was a prospective cohort study conducted in the AMU of Ibn Sina University Hospital, Rabat, Morocco, from June to September 2014, including patients aged ≥ 65 years. Demographic, anthropometric, comorbid disease, clinical characteristic and in-hospital evolution data were included. Survival status was evaluated at two time points: in hospital and at 540 days of follow-up. Malnutrition was evaluated using the MNA, and health and functional status using the EuroQol-5D-3L. Cox proportional hazards univariate analysis identified the factors associated with mortality at 540 days. A multivariable model incorporating 4 conventional risk variables was then calculated with Cox proportional hazards regression analysis, and we assessed the predictive performance of this model. Statistical analyses were carried out in SPSS Statistics and STATA.

Results
Ninety-five patients were included. Mean age was 75 ± 5.9 years; 64.2% were women. The mean ± SD length of stay in the AMU was 6.2 ± 5 days. In-hospital and 540-day post-discharge follow-up mortality were respectively 16.8%

Conclusions
Malnutrition has a negative prognostic impact on the long-term survival of elderly patients in an AMU. Therefore, its assessment has high clinical utility and should be used to guide nutrition interventions during the hospital stay and even after discharge, to reduce poor outcomes in elderly patients.

Introduction
There are limited data on the mortality benefit of nutritional management in ARDS patients. This study aimed to evaluate the impact of the amount of calorie intake during the first week of ARDS on 28-day mortality.

Methods
We retrospectively collected data from ARDS patients admitted to medical ICUs in our hospital between 2010 and 2014 and divided them into 3 groups: patients receiving less than 60% of targeted cumulative calories in the first week of ARDS, those receiving between 60% and 100% of the 7-day targeted calories, and those receiving at least 100% of the 7-day targeted calories. The 28-day mortalities among the 3 groups were analyzed and compared.

Conclusions
We recommend providing at least 60% of targeted cumulative calories in the first week of ARDS to improve 28-day mortality. However, due to the limited sample size, further studies are needed to confirm the results.

Introduction
The frequency of motility disorders in the upper or lower gastrointestinal (GI) tract is reported to be almost 80% in critically ill patients, in whom delayed passage and delayed gastric emptying are very common. Impaired gastric motility can lead to regurgitation or vomiting of gastric contents, which may cause aspiration and consequently pneumonia. Enteral nutrition is the preferred route of nutrient supply in critically ill patients who are not able to consume oral foods. Monitoring of upper GI function has an important role in the decision on continuation of enteral nutrition. Measurement of gastric residual volume (GRV) is recommended for monitoring of GI function and avoidance of aspiration and pneumonia. However, some studies have not shown any relationship between high GRV and the risk of aspiration and pneumonia in critically ill patients, so the validity of GRV as a marker for aspiration risk in mechanically ventilated patients is still unclear. The aim of this study was to assess the effect of GRV monitoring on the incidence of ventilator-associated pneumonia (VAP) in mechanically ventilated patients admitted to the intensive care unit.

Methods
This descriptive study was done on 150 mechanically ventilated adult patients admitted to the intensive care unit and receiving enteral nutrition. GRV was measured every 3 hours and gastric intolerance was defined as GRV > 250 cc. The incidence of vomiting and VAP, GRV, length of mechanical ventilation and ICU stay, APACHE and SOFA scores, and mortality rate were noted. Data were analyzed using SPSS 18.

Results
The mean age of the patients was 57.79 ± 18.84 years. The incidence of VAP was 24% (36 patients). The incidence of vomiting was 50% [24 patients in the VAP-positive group and 51 patients in the VAP-negative group (P = 0.022)]. Mean ICU length of stay (LOS) was 11.59 ± 4.91 days and mean duration of mechanical ventilation was 7.54 ± 3.67 days. Mean APACHE score was 25.14 ± 5.78 and mean SOFA score was 11.51 ± 2.14. Mean GRV during the ICU stay was 230.51 ± 52.63 cc. The mortality rate was 22.1%. There was not a

Introduction
Gastro-esophageal reflux (GER) is common in ventilated patients and is a cause of VAP. We recently described the development of a unique device called the peristaltic feeding tube (PFT) (Lunguard, Yavne, Israel) [1]. The PFT (Fig. 30) uses simulated peristalsis to seal the esophagus against fluid moving retrogradely, whilst allowing normal drainage of fluid and secretions moving antegradely. Here we describe the first trial of the PFT in ventilated patients.

Methods
There were 10 subjects in the treatment (PFT) group and 10 patients in the control group, who had all undergone elective cardiac surgery and were ventilated in the ICU afterwards. The PFT was placed on admission to the ICU. In the control group a standard nasogastric tube (NGT) was inserted. Specimens were collected by suctioning from the oropharynx and from above the tracheal tube balloon every hour and from the trachea twice per eight hour shift. Samples were analyzed by ELISA for Pepsin A as a marker for secretions of gastric origin.

Results
The two groups were comparable in demographics and duration of ventilation. Significantly more specimens were positive for Pepsin A in the control group in the oropharynx and above the ETT cuff, but not in the trachea (Table 13).

Conclusions
The PFT reduced the amount of GER in ventilated patients. A larger study is required to determine whether this translates to a reduction in VAP.

Introduction
Failure to detect misplacement and use of a nasogastric tube in the pleura or respiratory tract prior to commencement of feeding or administration of medication has been classified as a "Never Event" by NHS England. pH testing remains the first-line check that a nasogastric feeding tube is inserted correctly. Gastric pH is often falsely elevated due to administration of antacids. A chest X-ray serves as a second-line check only, but is often required to ensure correct positioning of feeding tubes in the stomach. This study aimed to 1. quantify the need for chest X-rays to confirm the position of nasogastric tubes, 2. determine the percentage of patients in whom a gastric aspirate with pH < 5.5 could be obtained, and 3. determine the time lag until enteral feed can be started after insertion of a nasogastric tube.

Methods
All data on insertion of new nasogastric tubes were collected retrospectively by research nurses on Intensive Care from August to November 2016. Data collected included indication for tube insertion, level of training of person inserting the tube, pH of gastric aspirate (if obtained), need for chest X-ray request, X-ray results, time lag until feeding was initiated, and treatment with antacids.
Results
96 events of nasogastric tube insertion were recorded. 90% of tubes were inserted by junior doctors, 5% by consultants and 2% by nursing staff. 82 patients (85.4%) received antacids (40.6% ranitidine, 44.8% proton pump inhibitors). A gastric aspirate could be obtained in 43 patients (44.8%). 14 of these patients (32.5%) had a gastric pH of 5.5 or above, thus precluding safe initiation of enteral feeding. A chest X-ray was requested for 76 patients (79.2%), which confirmed position within the stomach in 70 patients. In 7 patients a chest X-ray was requested although the pH of the gastric aspirate was below 5.5. In 17 patients (17.7%) it took more than 4 hours until feed was started.
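The first-line/second-line confirmation pathway described in this abstract can be expressed as simple decision logic. This is purely illustrative (the function name and return strings are placeholders, not a clinical tool):

```python
def ng_tube_next_step(aspirate_ph=None):
    """Sketch of the nasogastric tube position check: pH testing first,
    chest X-ray as the second-line check.

    aspirate_ph is None when no gastric aspirate could be obtained.
    """
    if aspirate_ph is not None and aspirate_ph < 5.5:
        return "start feeding"          # first-line pH check passed
    # No aspirate obtainable, or pH >= 5.5 (e.g. antacid-treated patients)
    return "request chest X-ray"
```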

Conclusions
A main causal factor for patient harm after initiation of enteral feeding through misplaced nasogastric tubes is misinterpretation of chest X-rays [1]. Although chest X-rays should only be used as a second-line check when a gastric aspirate cannot be obtained or the pH is inconclusive, in our patient cohort circa 80% of patients required a chest X-ray to confirm the position of nasogastric tubes. A nasogastric aspirate could only be obtained in about 45% of patients. Devices to facilitate gastric aspiration may be cost-efficient by reducing chest X-ray requests and may prevent patient harm.

Introduction
In previous studies, parenteral nutrition (PN) was associated with higher risk for adverse outcomes as compared to enteral nutrition (EN) [1]. We aimed to test the hypothesis that critically ill patients receiving PN have higher mortality and infection rates as compared to those not receiving PN.

Methods
Prospective observational study of 108 consecutive adult patients admitted to our Intensive Care Unit (ICU) from February 2016. Patients with an ICU length of stay < 72 hours were excluded. We recorded type of nutrition (PN, EN, none or oral feeding), daily caloric and protein intake, microbiology results and ICU mortality.

Results
Thirty-one patients (29%) received PN for at least one day during the ICU stay. Patients receiving PN showed higher ICU mortality than those not receiving PN (Fig. 31), and non-survivors received PN for a longer time period than survivors (71 ± 26 versus 42 ± 25 hours, p = 0.005). The association between PN and mortality was independent of the Simplified Acute Physiology Score (adjusted odds ratio: 3.389 [95% confidence interval 1.301-8.828]). PN was also associated with a higher infection rate (Fig. 32).
Conclusions
PN was associated with higher ICU mortality and infection rate in a general population of adult critically ill patients.

Introduction
We hypothesized that (1) septic patients with alcohol-use disorders (AUDs) do not consistently receive thiamine as part of their treatment regimen, and that (2) those not receiving thiamine have increased mortality compared to those receiving thiamine. Previous studies have shown AUDs to be associated with increased sepsis-related mortality, although the mechanism through which AUDs contribute to worse outcomes has not been fully elucidated. Thiamine (vitamin B1) deficiency is a common sequela of AUDs and may contribute to impaired mitochondrial aerobic respiration, leading to refractory acidosis and death if untreated.

Methods
We performed a retrospective analysis of all patients presenting with septic shock between 2008 and 2014 at a single tertiary care center. Patients were identified based on (1) receipt of antibiotics, (2) use of vasopressors, (3) lactate levels > 4 mmol/L, and (4) an AUD diagnosis based on ICD-9 codes. We excluded patients with seizures during the hospital stay and those with non-index events. Descriptive statistics were used to calculate the proportion of septic patients with AUDs receiving thiamine, and mortality between groups was compared using Fisher's exact test.

Conclusions
In this study, we found that a considerable proportion of patients with AUDs admitted for septic shock did not receive thiamine. Thiamine administration in patients with AUDs and septic shock was associated with decreased mortality. We suspect that failure to administer thiamine in this patient population is related to a shift in clinician focus towards the septic insult and away from the AUD history, although further studies are needed to better understand this phenomenon.

Introduction
Vitamin D deficiency has been repeatedly implicated in regulation of the innate and adaptive immune systems. Evidence from ER studies investigating the association between vitamin D deficiency and mortality is currently lacking.

Methods
We conducted an interim analysis of a prospective study enrolling patients presenting with sepsis/septic shock at our ER. Vitamin D was sampled from patients before any fluid or drug administration. Patients were considered vitamin D deficient if the level was < 30 μg/L. APACHE II and the Charlson index were calculated for each patient.

Results
By November 15th 2016, 96 patients had completed the 3-month follow-up. Vitamin D deficiency at ER admission was observed in 65.1% of patients. 42% were bacteraemic. 34.3% of patients were on vasopressor support during the ER stay and 62% had an increased lactate. APACHE II was not significantly different between groups, while the Charlson age-adjusted comorbidity index was significantly higher in the deficient group. Mortality was higher in vitamin D-deficient patients (34.2% vs 23.1%, p = 0.12), although not significantly. Multivariate regression analysis showed that APACHE II was independently associated with worse outcome (p < 0.001) while vitamin D status was not (p = 0.231). Mean ICU length of stay was longer for vitamin D-deficient patients without reaching statistical significance (6.8 days vs 6.01 days, p = 0.64).

Conclusions
Our interim analysis showed a trend towards an increased mortality for patients admitted to the ER for sepsis with a vitamin D deficiency.
A larger sample will eventually clarify the role of vitamin D deficiency at ER admission for sepsis.

P473
Vitamin D supplementation in the critically ill: systematic review and meta-analysis P L Langlois1, C Szwec2, F D'Aragon1, D K Heyland3, W Manzanares4

Introduction
Vitamin D insufficiency is reported in up to 50% of critically ill patients and is associated with adverse outcomes such as increased mortality, increased length of stay (LOS) in ICU and hospital, and development of respiratory disorders with prolonged ventilation [1]. Vitamin D supplementation is an interesting therapeutic strategy in the critically ill, but clinical evidence remains unclear. The aim of this systematic review was to evaluate the clinical efficacy of vitamin D administration in critically ill patients.

Methods
We searched databases for randomized controlled trials (RCTs) in critically ill patients comparing vitamin D administration to placebo. The studies had to report mortality, infectious complications, hospital/ICU LOS or length of mechanical ventilation. Two independent reviewers assessed eligibility and risk of bias and abstracted data onto a pretested form. When possible, we pooled data using a random-effects model to estimate the relative risk (RR) for dichotomous outcomes and the mean difference (MD) or weighted mean difference (WMD) for continuous outcomes. Pre-defined subgroup analyses included oral-enteral vs parenteral administration and high vs low dose, with a sensitivity analysis restricted to vitamin D-deficient patients.

Results
Six RCTs (695 patients) met the inclusion criteria. No overall analysis was significant. A trend towards a reduction in mortality was found when vitamin D was administered (RR = 0.84, 95% confidence interval [CI] 0.67-1.06; P = 0.13, see Fig. 34). Trends were also found towards reductions in ICU (P = 0.17) and hospital LOS (P = 0.08). No differences in infection rate or ventilation days existed. In the oral-enteral group, there was a significant reduction in hospital LOS (P = 0.008) and trends towards reductions in mortality (P = 0.12) and ICU LOS (P = 0.12). Finally, doses higher than 300,000 IU were associated with trends towards reduced mortality (P = 0.12) and ICU LOS (P = 0.12).
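The random-effects pooling of relative risks described in the Methods can be sketched as follows. This is a generic DerSimonian-Laird implementation on invented 2x2 counts, not the review's actual data or software:

```python
import math

def pooled_rr(trials):
    """DerSimonian-Laird random-effects pooled relative risk.

    trials: list of (events_tx, n_tx, events_ctl, n_ctl) tuples.
    """
    logs, variances = [], []
    for a, n1, c, n2 in trials:
        logs.append(math.log((a / n1) / (c / n2)))
        # Standard variance of log RR from 2x2 counts
        variances.append(1/a - 1/n1 + 1/c - 1/n2)
    w = [1/v for v in variances]                          # fixed-effect weights
    mu = sum(wi*yi for wi, yi in zip(w, logs)) / sum(w)
    q = sum(wi*(yi - mu)**2 for wi, yi in zip(w, logs))   # Cochran's Q
    df = len(trials) - 1
    c_term = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c_term)                    # between-trial variance
    w_star = [1/(v + tau2) for v in variances]            # random-effects weights
    mu_star = sum(wi*yi for wi, yi in zip(w_star, logs)) / sum(w_star)
    return math.exp(mu_star)

# Two invented trials, each with RR = 2/3, so the pooled RR is 2/3
example = [(10, 100, 15, 100), (8, 120, 12, 120)]
```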

Conclusions
In critically ill patients, vitamin D administration does not affect overall mortality, ICU/hospital LOS, infection rate or length of mechanical ventilation. However, many analyses showed trends and seemed underpowered to detect a clinical difference, prompting further RCTs.

Introduction
Over the last few years, several randomized controlled trials examining the role of pharmaconutrition in the critical care setting have shown conflicting results. The purpose of this study was to describe current pharmaconutrition practices in intensive care units (ICUs) worldwide, evaluating compliance with the evidence-based Canadian Critical Care Nutrition Clinical Practice Guidelines (CPG).

Methods
We conducted an international, prospective, observational cohort study between September 2014 and June 2015. Participating sites enrolled critically ill adult patients on mechanical ventilation within the first 48 hours of ICU admission who remained in the ICU for at least 72 hours. Data on nutrition and pharmaconutrient practices were collected from admission to ICU discharge or for a maximum of 12 days. Variables related to the CPG are reported as overall averages (or percentages) with the range of site averages.

Introduction
Our hypothesis is that the psoas muscle index (PMI) is related to outcomes in living donor liver transplantation (LDLT) patients. Liver transplantation (LT) patients usually have low physiologic reserves prior to LT [1]. In most LT centers, the MELD score is used for the prioritization of organ allocation. However, the MELD score does not include the nutritional and functional status of the patients. Sarcopenia is defined as a loss of skeletal muscle mass and function with a risk of adverse outcomes, and it has been found to be associated with elevated postoperative complication rates in LT patients [2]. PMI is calculated as psoas muscle area/height² (mm²/m²) and is used to detect sarcopenia [3]. Hence, it may be an outcome predictor in LDLT patients.

Methods
In this study, 261 patients who underwent LDLT between 2011 and 2014 were retrospectively evaluated. Patients who were <18 years old, cadaveric liver transplant recipients and whose CT scans were missing were excluded. Demographic data, body mass index (BMI), PMI, MELD score, etiology, length of ICU stay, length of hospital stay, postoperative 7th day acute kidney injury (AKI), requirement of renal replacement therapy (RRT), reoperation, readmission and 1st year mortality were recorded.

Results
The first quartiles of PMI for male and female patients were 397 mm²/m² and 298 mm²/m², respectively. In all patients, the rates of postoperative 7th day AKI, RRT requirement and 1st year mortality were 12.3%, 8.2% and 10.5%, respectively. There was no correlation among PMI, BMI and MELD score. PMI was correlated with the increase in postoperative 7th day creatinine level and with length of hospital stay (r² = 0.02, P = 0.049 and r² = 0.06, P < 0.001). In the 56 patients with sarcopenia, RRT requirement, reoperation, readmission and 1st year mortality were significantly higher than in non-sarcopenic patients (P = 0.001, P = 0.003 and P < 0.001 for the others). In a multivariate logistic regression model, sarcopenia was the only variable that increased 1st year mortality, by 29.5-fold (8.3-104.4; P < 0.001).
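For illustration, the PMI calculation from the Introduction and the sex-specific first-quartile cut-offs reported above can be sketched as follows. The helper names are hypothetical; only the formula and the cut-off values come from the abstract.

```python
def psoas_muscle_index(psoas_area_mm2: float, height_m: float) -> float:
    """PMI = psoas muscle area / height^2, in mm^2/m^2."""
    return psoas_area_mm2 / height_m ** 2

# Sex-specific first-quartile cut-offs from this cohort (mm^2/m^2)
PMI_CUTOFF = {"male": 397.0, "female": 298.0}

def is_sarcopenic(psoas_area_mm2: float, height_m: float, sex: str) -> bool:
    """Flag sarcopenia as PMI below the sex-specific first quartile."""
    return psoas_muscle_index(psoas_area_mm2, height_m) < PMI_CUTOFF[sex]
```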

Conclusions
BMI, MELD score or conventional evaluation of LT candidates may fail in many cases to predict prognosis after LT. Adding the PMI measurement may improve the evaluation of patients prior to LT, and PMI can be used as an outcome predictor in LDLT patients.

Introduction
The mortality rate in patients with severe liver dysfunction with no option of transplantation is unacceptably high [1]. The main aim of this study was to evaluate the usefulness of extracorporeal liver support (ECLS) techniques in this group of patients. Secondary aims were to identify independent risk factors and to assess the predictive values of the following scoring systems: Glasgow Coma Scale (GCS), Sequential Organ Failure Assessment (SOFA), Acute Physiology and Chronic Health Evaluation (APACHE) II, Simplified Acute Physiology Score (SAPS) II and Model of End-stage Liver Disease United Network for Organ Sharing (MELD UNOS) Modification in patients with severe liver dysfunction.

Methods
Data from hospital admissions of 101 patients with severe liver dysfunction (MELD UNOS >= 18) who were admitted to the department of anaesthesiology and intensive therapy between 2006 and 2015 were retrospectively analysed. The study group was divided into two subgroups. SMT was a subgroup of patients receiving standard medical therapy, and SMT + ECLS was a subgroup of patients receiving standard medical therapy complemented by at least one extracorporeal liver support procedure.

Results
Significantly lower ICU mortality and 30-day mortality rates were found in subgroup SMT + ECLS. In a multivariate model, independent risk factors for ICU mortality proved to be the SOFA score and prothrombin time. The highest discriminatory power for ICU mortality was demonstrated for the SOFA score, followed by APACHE II, SAPS II, MELD UNOS and GCS scores. For 30-day mortality, however, the best discriminatory power was shown for the SAPS II score, followed by SOFA, APACHE II, MELD UNOS and GCS scores.

Conclusions
Further studies are needed to assess the contribution of nonbiological ECLS procedures to a decrease in mortality rates in the population of patients with severe liver dysfunction. The independent risk factors for ICU mortality and 30-day mortality were demonstrated to include the SOFA score and prothrombin time. The SOFA, APACHE II, and SAPS II scores were better predictors of death than the MELD UNOS Modification score which is dedicated to the assessment of patients with liver disease.

Introduction
Circulatory dysfunction is known to occur in spontaneous bacterial peritonitis (SBP) patients. We aimed to determine whether the degree of hyperdynamic circulation is significantly correlated with the severity of liver disease and poor outcome in SBP patients.

Conclusions
The degree of hyperdynamic circulation is significantly correlated with the severity of liver disease and predicts poor outcome in SBP patients.

Introduction
There was an improvement in chest X-ray findings after therapy in one of the patients, who had acute liver failure due to intoxication (Fig. 35). Mean length of stay in the ICU was 67.1 ± 62.3 days. One patient with acute-on-chronic liver failure is still being treated in the ICU due to sepsis and acute kidney injury. The other 6 patients died while awaiting liver transplantation during follow-up.

Conclusions
Platelet count has an important effect on extrinsically activated coagulation by ExTEM. Determining the cut-off point for clot instability as measured by derived ROTEM parameters may aid in establishing new thresholds for platelet transfusion in patients with ESLD.

Introduction
We assessed the hypothesis that diarrhea during critical illness disrupts the gut-liver signaling axis through intestinal malabsorption of bile salts. Disrupted bile salt signaling could play a role in critical illness-associated cholestatic liver injury, which occurs in up to 20% of ICU patients. We previously identified diarrhea as an independent risk factor for critical illness-associated liver injury (OR 4.1, 95% CI 1.2-14.5, p = 0.028, unpublished data), suggesting intestinal involvement in liver disease in the critically ill. The bile salt-induced enterokine fibroblast growth factor 19 (FGF19) is important for normal gut-liver signaling, as it provides a negative feedback signal that limits bile salt synthesis in the liver.

Methods
We investigated plasma bile salt and FGF19 levels in critically ill patients. The study population consisted of enterally fed, mechanically ventilated patients admitted to our intensive care unit (ICU). Patients with a primary hepato-biliary diagnosis (i.e. biliary obstruction or cholangitis) or admitted after elective surgery were excluded. After inclusion, patients were allocated to a diarrhea group (N = 12) or a no-diarrhea group (N = 18) based on a 24-hour fecal production >= 350 ml. There were no significant differences in demographic characteristics or ICU length of stay between the groups with and without diarrhea. Data were tested for normality using the Shapiro-Wilk test and are presented as median [interquartile range] or mean ± standard deviation. Mann-Whitney or t-tests were used as appropriate.

Results
FGF19 levels were decreased in the diarrhea group (0.20 ± 0.12 vs. 0.29 ± 0.10 ng/mL, p = 0.03), indicating disturbed enterohepatic signaling. Plasma bile salt levels were significantly increased in patients with diarrhea compared to patients without diarrhea (9.8 [5.0-23.9] vs. 4.5 [2.9-7.4] μmol/L, p = 0.01). Bilirubin, alkaline phosphatase (ALP) and gamma-GT levels were not significantly different between the two groups.

Conclusions
Diarrhea in critical illness disturbs the normal enterohepatic circulation of bile salts, as evidenced by reduced plasma levels of FGF19. This in turn may lead to unopposed bile salt synthesis in the liver. In conclusion, diarrhea and malabsorption may cause liver injury during critical illness due to disturbed enterohepatic cycling and accumulation of bile salts in the hepatocyte.

Introduction
Although acute necrotizing pancreatitis (ANP) has a high mortality, continuous regional arterial infusion (CRAI) of protease inhibitors has been considered a promising method of reducing mortality.

Introduction
The complication rate after pancreas transplantation (PTx) is very high, with venous thrombosis being the most common complication leading to early graft loss. There is a lack of reliable methods to detect complications. In this study, we investigated whether microdialysis catheters could detect complications at a timepoint when the grafts were still salvageable.

Methods
34 consecutive patients underwent technically similar PTx; 17 were simultaneous pancreas-kidney transplantations (SPK) and 17 were pancreas transplantations alone (PTA). The arteries were anastomosed "end-to-side" to the right common iliac artery and an elongated portal vein was anastomosed "end-to-side" to the inferior caval vein. The enteric anastomosis was performed by a duodeno-duodenostomy. At the end of surgery, two microdialysis catheters were placed anterior and posterior to the pancreas. We measured glucose, lactate, pyruvate and glycerol at the bedside by sampling every 1-2 hours postoperatively.

Results
Of the 34 patients included, nine developed venous thrombi. The microdialysis catheters detected all thrombi by a marked increase in lactate, which preceded any rise in blood glucose levels and occurred in the absence of any clinical symptoms. Median anterior lactate was 2.5 mmol/L (interquartile range (IQR) 2.2-4.1) and median posterior lactate was 2.9 mmol/L (IQR 2.5-4.0) in the patients with thrombi, significantly higher than in the uneventful group, which had median anterior and posterior lactate values of 1.1 mmol/L (IQR 0.9-1.6) (p = 0.003) and 1.8 mmol/L (IQR 1.1-1.9) (p = 0.007), respectively. The presence of a thrombus was confirmed by CT. 4/9 grafts were rescued by angiographic intervention, one did not require intervention, and four grafts were explanted due to irreversible ischemic damage. All four patients with rescued grafts remain insulin-independent more than one year post transplantation. In addition, three patients had leakages from the duodeno-duodenal anastomosis. These were detected by an increase in glycerol that preceded any clinical signs by up to fourteen days. One patient required relaparotomy, another percutaneous drainage, and the third was managed with IV antibiotics alone. These three patients had higher median glycerol levels in the anterior catheter compared to the non-events: 908 μM (IQR 543-1355) vs. 63 μM (IQR 51-124), respectively (p = 0.002).

Conclusions
Monitoring lactate and glycerol with microdialysis catheters detects venous pancreas transplant thrombosis and enteric anastomosis leakage at an early stage, and may improve graft survival and reduce patient morbidity and mortality.

Introduction
A widely adopted, accurate and convenient scoring system to predict severity in acute pancreatitis has eluded surgeons over the years. Despite not being supported by the most recent International Association of Pancreatology (IAP) guideline, the modified Glasgow score remains very widely used, in part because a score of >= 3 is often claimed (by surgeons) as an indication for admission to the Critical Care Unit (CCU). The score requires measurement of lactate dehydrogenase (LDH); as this is often not included in admission blood tests, the patient must either be re-bled or the test "added on" to stored samples. Completing the score therefore incurs an additional expense. We sought to determine whether completing the score (by measurement of LDH) has any value in predicting CCU admission.

Methods
All patients admitted with a diagnosis of pancreatitis to our institution in 2015 were reviewed. The components of the Glasgow score on admission were recorded along with admission to CCU, length of stay and long-term outcome. The relative risk of admission to CCU was calculated in the presence or absence of LDH measurement and again for normal or abnormal levels. An LDH level of >600 U/l was regarded as abnormal. Repeated admissions of the same patient were excluded.

Results
There were 229 new patient episodes of acute pancreatitis in 2015; 16 (7%) were admitted to critical care. One-year crude mortality was 5.7%. Patients who had complete scoring (including LDH) had a relative risk of critical care admission of 1.27 (95% CI 0.42-3.79). Patients with an abnormal LDH had a relative risk of CCU admission of 2.57 (95% CI 0.87-7.61).

Conclusions
The 95% confidence intervals for the RR of CCU admission cross 1 for both elements. These data suggest not only that an abnormal LDH is not a determinant of CCU admission but also that measuring it at all is of no clinical use. These findings support the recommendations of the recent National Confidential Enquiry into Patient Outcome and Death (NCEPOD) publication on pancreatitis. This review encouraged the adoption of the IAP guidelines in the management of pancreatitis, which recommend a physiological score as sufficient in determining severity and prognosis. LDH measurement is an unnecessary additional expense in the management of pancreatitis.
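Relative risks with 95% confidence intervals of the kind quoted above are conventionally computed with the log-RR method. The sketch below is illustrative Python, not the audit's code, and any counts passed to it are placeholder inputs.

```python
import math

def relative_risk(events_exposed, n_exposed, events_unexposed, n_unexposed):
    """Relative risk with a 95% CI via the standard log-RR method.

    The CI is symmetric on the log scale: exp(ln RR +/- 1.96 * SE),
    with SE = sqrt(1/a - 1/n1 + 1/c - 1/n2).
    """
    rr = (events_exposed / n_exposed) / (events_unexposed / n_unexposed)
    se = math.sqrt(1 / events_exposed - 1 / n_exposed
                   + 1 / events_unexposed - 1 / n_unexposed)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi
```

A CI that crosses 1, as both intervals in this abstract do, indicates the risk difference is not statistically significant.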

Introduction
Emergency laparotomy is a common abdominal operation that carries a high mortality rate, recently reported in the UK at 3.6% to 41.7% [1]. With the drive to reduce associated morbidity and mortality, there is wide academic interest in examining every step in the laparotomy pathway [2]. We aimed to establish whether patients undergoing bowel resection at emergency laparotomy experience any difference in critical care or hospital outcome depending on whether their bowel is defunctioned or a primary anastomosis is performed.

Fig. 36 (abstract P483). See text for description.

Methods
Retrospective review of emergency or urgent laparotomies performed at a large UK district general hospital between 2014 and 2015 where post-operative critical care admission was required. Data were assimilated from the electronic database. We examined: baseline patient characteristics and predicted mortality, pathology, the presence of soiling, need for repeated operation and subsequent pathology found, organ support requirements, critical care and hospital outcome, and length of stay.

Results
We reviewed 106 patients and noted that the pathology seen at the initial operation varied widely between the two groups, with those defunctioned much more likely to have an inflammatory process or perforation found on surgical exploration. Patients with soiling at the initial operation were 4 times more likely to have a defunctioning stoma formed. Both the APACHE II and ICNARC risk prediction models identified the defunctioned group as being at greater mortality risk; the groups were otherwise well matched at baseline. In the subgroup of patients with soiling of the abdomen at the initial operation, hospital length of stay was shorter (13 vs 18 days) and more patients were discharged alive (81% vs 66%) if they had undergone defunctioning as opposed to primary anastomosis. No significant difference in organ support, need for repeated operation, or overall critical care or hospital mortality or length of stay was identified between the two groups.

Conclusions
This is the first study to examine the impact of the decision to either anastomose or defunction bowel at emergency laparotomy and to link this to post-operative morbidity and outcome. Our data suggest that patients with soiling at the initial operation have lower critical care and hospital mortality, and a shorter length of hospital stay, if a defunctioning stoma is performed. No other differences were noted in any of the indices examined. This evaluation has significant implications both locally and nationally for the management of patients undergoing emergency laparotomy.

Introduction
The purpose of this study is to compare the clinical efficacy of intravenous (IV) pantoprazole intermittent bolus (IMB) dosing versus the recommended bolus and continuous infusion (CI) in subjects with upper gastrointestinal bleeding (GIB). The American College of Gastroenterology treatment guidelines for GIB recommend an IV pantoprazole 80 mg bolus followed by an 8 mg/h infusion [1]. In 2015, our health system switched from CI pantoprazole to IMB dosing in response to an IV pantoprazole shortage. To our knowledge, there is currently a paucity of literature directly comparing the two dosing strategies at consistent and specific dosages. This study aims to contribute additional data to the current body of knowledge.

Methods
This retrospective chart review evaluated the treatment of GIB with either pantoprazole CI or IMB. Patients who received IV pantoprazole therapy for at least 24 hours and had an upper endoscopy (EGD) with high-risk stigmata of bleeding were included. The primary outcome was rebleeding within 3 days of starting pantoprazole.
Rebleeding was defined as overt bleeding with a drop in hemoglobin requiring packed red blood cell transfusion (PRBC) or additional endoscopic or surgical intervention. Secondary outcomes included rebleeding within 7 days, ICU length of stay (LOS), hospital LOS, and 72-hour PRBC requirements from start of pantoprazole. Statistical analyses were conducted using R statistical software.

Results
Fifty-one patients treated with IMB and 41 patients treated with CI were included. Baseline characteristics appeared to be similar between the two groups except for receipt of an initial 80 mg bolus, which occurred more frequently in the CI group (CI 78% vs. IMB 25.5%). There was no difference between the groups for the primary outcome, with 4 patients in the IMB group and 7 patients in the CI group meeting criteria for rebleeding within 3 days (p = 0.20). Rebleeding within 7 days was similar between groups (IMB 11 vs. CI 10, p = 0.94). The mean number of PRBC units received in 72 hours was 2.19 in the IMB group and 4.5 in the CI group (p = 0.99). There was a significant difference in ICU LOS (IMB 1 day vs. CI 2 days, p = 0.02) but not in hospital LOS (IMB 4 days vs. CI 5 days, p = 0.29).

Conclusions
IMB pantoprazole was comparable to the guideline-recommended CI pantoprazole for the treatment of GIB in patients with endoscopic findings of high-risk stigmata of bleeding. IMB pantoprazole could be considered an equivalent alternative to CI pantoprazole in the treatment of GIB and deserves further evaluation.

Introduction
Our aim was to audit the amount of sodium administered to our patients and the incidence of hypernatraemia. ITU patients receive large amounts of IV fluid, often containing high concentrations of sodium. Hypernatraemia is therefore more common and is associated with worse outcomes [1, 2]. Retrospective data collection from ITU patient records over a 1-month period indicated that 96% of patients received >2 mmol/kg/day of sodium; 6% became hypernatraemic. During this time, use of Hartmann's as maintenance IV fluid (MIF) was standard practice. NICE guideline CG174 specifies that maintenance fluid/sodium should be given by infusion of 0.18% sodium chloride/4% dextrose (dex/sal). Practice was changed and compliance audited. The audit standards used are shown in Table 14.

Methods
We retrospectively audited the records of all admissions to the ITU during May 2016. There were 65 patients; of which 36 were excluded (due to diagnoses that affect sodium balance), leaving 29 patients included in the audit. Data collection included patient demographics, length of stay, fluid type received, total sodium received from all sources and serial serum sodium.

Results
We achieved four standards and failed one (*) (Table 15). Total sodium received per patient reduced from a mean of 936 mmol to 314 mmol. Sodium received in MIF as a percentage of total sodium reduced from 59% to 35%. Maintenance sodium was received at a rate between 0.6 and 1.4 mmol/kg/day in 83% of patients. 67% of patients received >2 mmol/kg/day of total sodium. 96% of patients started on dex/sal had a lower serum sodium on repeat testing. 30% of the normonatraemic (135-145 mmol/l) patients became hyponatraemic (<135 mmol/l); on average these patients received 0.94 mmol/kg/day maintenance sodium and 2.86 mmol/kg/day total sodium. The 10% of patients who received <0.6 mmol/kg/day of maintenance sodium all remained normonatraemic. There was a weak correlation between percentage serum sodium change and sodium received (total and maintenance) but no correlation after correction for body weight.

Conclusions
We adhered to NICE standards well. We reduced total sodium administered and subsequent hypernatraemia. Hyponatraemia incidence was increased which also adversely affects patient outcome [2]. Recommendations were made to raise awareness of dex/sal induced hyponatraemia and to consider using MIF with 0.45% sodium concentration.

Introduction
The impact of dysnatraemia on survival from intensive care (ICU) can be related to the degree of hypo- or hypernatraemia [1, 2]. To exclude ICU-acquired dysnatraemia, we reviewed patients admitted to the ICU with normal and abnormal sodium levels.

Methods
A retrospective analysis of 10,311 consecutive patients admitted to our mixed medical and surgical ICU at East Surrey Hospital was completed. We recorded the lowest sodium level within the first 24 hours of admission. Patients were grouped as follows: sodium <114, 115-124, 125-135, 136-145, 146-155, >156 mmol/L. Statistical comparison between the groups was carried out using Chi-squared analysis.

Results
Of the 10,311 patients collated, 401 were excluded due to incomplete data. Survival to hospital discharge was the primary endpoint. We found that severe hyponatraemia (<114 and 115-124 mmol/L) did not confer a statistically significant impact on likelihood of survival when compared with the 125-135 and 136-145 mmol/L groups. However, mild hyponatraemia (125-135 mmol/L) carried a mortality of 30.4%, compared with 24.8% for normal sodium levels (136-145 mmol/L), p < 0.0001. Patients who were hypernatraemic in the first 24 hours of ICU admission appeared to have mortality rates that increased with the degree of hypernatraemia: a sodium level greater than 156 mmol/L carried a mortality rate of 57.6%, versus 43.3% for 146-155 mmol/L. The differences between these groups, and in comparison with the normal sodium range, were statistically significant, P < 0.0001.

Conclusions
We found that hyponatraemia, unless mild, carries less statistical weight as a prognostic tool than hypernatraemia when measured in the first 24 hours of admission to ICU.

Introduction
Peri-operative Acute Kidney Injury (AKI) is a common and serious complication after major surgery [1]. Whereas peri-operative risk factors associated with AKI after cardiac surgery have been well described [2], this is not the case for major non-cardiac surgery. Recently, the impact of a high chloride load on the pathogenesis of AKI has been suggested [3]. This study was performed in order to identify possible intra-operative risk factors linked to peri-operative AKI development in a group of non-cardiac surgery patients.

Methods
This single-centre, prospective observational study included adults undergoing elective major abdominal (including vascular) surgery. Patients with chronic kidney disease (CKD) stage IV or V were excluded. AKI was defined according to Acute Kidney Injury Network (AKIN) criteria within 48 hours after surgery [4]. Patients' pre-operative demographics (sex, age, hypertension, coronary artery disease, congestive heart failure, chronic obstructive pulmonary disease, diabetes mellitus, CKD stage) and intra-operative anaesthetic management (type of surgery, intravenous fluids, blood products, vasopressors, mean arterial blood pressure, urine output and blood loss) were evaluated as possible AKI predictors. Furthermore, the chloride ion content of intra-operatively administered crystalloids and colloids was estimated.

Results
Of the 61 patients (47 males) included in the study, 10 (16.4%) developed postoperative AKI (AKI group) and 51 did not (non-AKI group). On univariate analysis, four intra-operative variables were identified as predictors of AKI: intra-operative blood loss (p = 0.002), transfusion of fresh frozen plasma (p = 0.004) and red blood cells (p = 0.038), and a high intra-operative chloride load (p = 0.033, AUC = 0.715 ± 0.095, cut-off value >500 mEq). The remaining pre- and intra-operative variables did not differ significantly between the two groups.

Conclusions
Isotonic saline administration has recently been associated with post-operative AKI, possibly as a result of the excess chloride load during cardiac surgery [5]. Our study's preliminary results indicate that a high intra-operatively administered chloride load is strongly associated with an increased risk of post-operative AKI in patients undergoing elective major non-cardiac surgery.

Introduction
High chloride (Cl) load is one of the putative causes of hyperchloremia in critically ill patients. We aimed to assess and quantify the association between chloride load during the first two days of ICU admission and changes in chloride levels between admission and day 3.

Methods
We analyzed adult patients with at least two days of ICU stay included in the MIMIC-II database. Cl load was calculated as the sum of the Cl infused in the first 48 hours from common solutions (normal saline, half-normal saline, lactated Ringer's, HES). The association between the change in Cl levels (day 3 minus day 1) and Cl load was assessed by linear regression. We performed sensitivity analyses according to serum creatinine at admission (<1 mg/dL; 1-2 mg/dL; 2-3 mg/dL; >3 mg/dL), occurrence of any degree of acute kidney injury in the first 48 hours (KDIGO stage 1 or above), in quartiles of diuresis in the first 48 hours, and in oliguric (<0.5 mL/kg/h in the first 48 hours) patients.

Results
9,043 patients were included in the analysis. Median chloride load was 350 mEq (IQR 162-579 mEq). On linear regression, a higher chloride load was weakly associated with an increase in serum chloride (coefficient 0.001; SE 0.0002; p < 0.001; Fig. 37, red line), but the correlation and R-squared were low (0.08 and 0.007, respectively). These findings were consistent in all subgroups assessed.
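The chloride-load sum described in the Methods can be sketched as below. The chloride concentrations are standard textbook values for each fluid, not figures taken from the study, and the dictionary keys are illustrative; the abstract does not state which carrier solution its HES products used, so the 154 mEq/L entry assumes HES in 0.9% saline.

```python
# Approximate chloride content of common IV fluids (mEq/L) - assumed values
CL_MEQ_PER_L = {
    "normal_saline": 154,
    "half_normal_saline": 77,
    "lactated_ringers": 109,
    "hes": 154,  # assumes HES carried in 0.9% saline
}

def chloride_load(volumes_l: dict) -> float:
    """Total Cl (mEq) infused, given litres of each fluid over the first 48 h."""
    return sum(CL_MEQ_PER_L[fluid] * litres
               for fluid, litres in volumes_l.items())
```

For example, 2 L of normal saline plus 1 L of lactated Ringer's would contribute 2 × 154 + 109 = 417 mEq, near the cohort's median load of 350 mEq.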

Conclusions
Although a high chloride load is associated with an increase in serum chloride, the association was weak even in patients with low urinary output and high creatinine. This suggests that chloride load may not be the principal cause of hyperchloremia in critically ill patients.

Introduction
Hyperchloremia is common in ICUs [1]. Short-term and in vitro studies [2] show that crystalloid infusions may lead to hyperchloremia and metabolic acidosis due to a low strong ion difference (SID). It is not known whether changes in serum chloride (sCl) concentration in vivo lead to mid-term SID and acid-base (AB) imbalance. We performed a study to assess the effects of an sCl increase on SID and AB equilibrium in critically ill patients.

Methods
The

Introduction
Fig. 37 (abstract P491). Density hexagonal binning for the association between Cl load and change in Cl.

Our hypothesis is that there is an interaction between base excess chloride (BEcl) and standard base excess (SBE). We know that electrolytes are independently related to acid-base status (1). Even in acute respiratory acidosis, renal compensation acts on the urinary output of sodium and chloride (2). The SBE formula derived from the Van Slyke equation does not include any electrolyte effect (3). Yet BEcl is defined as the effect of chloride on SBE and is calculated as (Na - Cl - 32), in accordance with the partitioned base excess approach (4). An interaction between BEcl and SBE could provide a better understanding of the primary causes of, and compensations for, acid-base disorders.
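The BEcl term defined above is a one-line calculation; the sketch below is a minimal illustration of the formula from the partitioned base excess approach, with a hypothetical function name.

```python
def be_chloride(na_mmol_l: float, cl_mmol_l: float) -> float:
    """BEcl = Na - Cl - 32 (mEq/L), per the partitioned base excess approach."""
    return na_mmol_l - cl_mmol_l - 32.0
```

With typical values (Na 140, Cl 104 mmol/L) BEcl is +4; a rising chloride at constant sodium drives BEcl negative, mirroring the acidifying effect of hyperchloremia on SBE.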

Methods
In this study, 2736 patients treated between 2011 and 2016 were retrospectively evaluated. Patients who were <18 years old, were readmissions, or whose ICU scores, blood gases or outcomes were unknown were excluded. Demographic data, blood gas parameters and outcomes were recorded.

Conclusions
BEcl is positively correlated with SBE, HCO3 and PaCO2. Hence, it can be used to explain the effects of sodium and chloride on SBE. BEcl may indicate a primary cause of, or part of the compensation for, an acid-base disorder.

Introduction
Fluid therapy in critically ill patients, especially its timing and fluid choice, varies substantially between centres and countries. For many Asian countries, there are few data about common fluid therapy practice. We aimed to study current approaches in India, Taiwan and Malaysia concerning patient populations, treatments and outcomes within the RaFTA registry.

Methods
RaFTA is an observational study in Asian ICU patients focussing on fluid therapy and related outcomes. Multivariable Cox regression was performed to identify risk factors for increased 90-day mortality and renal failure.

Introduction
According to current guidelines [1], crystalloids and colloids are used as the first-line treatment in fluid resuscitation during massive bleeding. However, they may have a deleterious impact on haemostasis. We therefore aimed to investigate the effects of balanced crystalloid and colloid solutions on coagulation and fibrinolysis in vitro.

Results
Both succinylated gelatin and HES deranged INTEM (CFT, AA, A10 and MCF) and FIBTEM (A10, MCF) parameters (p < 0.01); however, the effect of HES was more apparent (Fig. 38). In EXTEM, only HES significantly affected coagulation (i.e. CT prolongation). There was no effect on fibrinolysis or platelet function, as evidenced by unchanged ML and TRAP, respectively. In the standard laboratory tests a dilutional effect was found, but all investigated parameters stayed within reference values. No correlation was found between functional and standard coagulation tests.

Conclusions
Despite marked haemodilution, the balanced crystalloid did not affect coagulation. Balanced colloids impair clot formation and firmness, with the effect of HES being more pronounced. None of the fluids significantly affected fibrinolysis or platelet function.

Introduction
Development of trauma-induced coagulopathy (TIC) is an important modifiable factor associated with excess morbidity and mortality following traumatic haemorrhagic shock (THS). Tissue hypoperfusion, possibly resulting from impairment of the microcirculation, is a prerequisite for the development of TIC. The early use of blood products may attenuate TIC but is logistically difficult, and identification of the patients at highest risk would allow more effective targeting of this resource. We therefore examined the sublingual microcirculation in a porcine model of THS to assess the impact of microcirculatory dysfunction on the development of TIC in animals initially treated with 0.9% saline or blood products.

Methods
This study was conducted in accordance with the Animals (Scientific Procedures) Act 1986. 19 terminally anaesthetized Large White pigs were subjected to a standardized hind-limb injury followed by a controlled haemorrhage of approximately 35% of blood volume. A 30-min period of shock was followed by 60 min of hypotensive resuscitation with either 0.9% saline or packed red cells:fresh frozen plasma (PRBC:FFP), after which all animals were treated with PRBC:FFP. Sublingual IDF videomicroscopy was performed and animals were divided into above- and below-average perfused vessel density (PVD) groups based on the lowest recorded measurement taken during the shock and initial resuscitation phases. Coagulation was assessed using thromboelastography (TEG), PT and APTT.

Results
8 animals had above-average perfusion (PVD 10.5 ± 2.5 mm/mm²) and 11 below-average perfusion (PVD 5.5 ± 4.1 mm/mm²). For animals initially treated with 0.9% saline (n = 10) there were significant differences between those with above- and below-average perfusion in terms of R

Methods
Patients were evaluated primarily and secondarily, with emphasis on the signs of dehydration; blood was sampled to determine K, Na, Cl and P ions, acid-base balance, glucose, and urinary ketones.

Results
Children were transported by the emergency service in 81% of cases. Children aged >9 years predominated; by gender, boys n = 60 and girls n = 56 (p < 0.001). Degree of dehydration: 42 children had mild dehydration (under 4%), 58 had moderate signs of dehydration (4-7%), and 16 had serious signs of dehydration. We determined the level of K ions and the acid-base balance (ABB) on hospitalization and over time. In lot I we registered K values under 3 mmol/l, in comparison with values of 3.5 mmol/l in the control lot (p = 0.001). Patients received crystalloid infusion and K supplements; the necessary volume and quantity of K were calculated individually according to the known formula. After the infusion, in children from lot I the initial values of pH 7.09 ± 0.07, pCO2 11 ± 1.5 and BE -18.7 ± 0.9 showed a tendency towards amelioration after 30 minutes in 6 cases, after 2 hours in 19 cases, after 12 hours in 16 cases, and after 24 hours in the remaining 7 cases. No child needed administration of Na bicarbonate for normalization of metabolic acidosis. The average time of treatment was 13 ± 0.9 days in lot I versus 7 ± 0.7 days in the control lot (p = 0.03).

Conclusions
1. Early administration of infusion therapy aimed at hydroelectrolytic and acid-base balancing leads to more rapid amelioration of haemodynamics in children with ketoacidosis and reduces time in the in-patient unit. 2. Correct administration of infusion therapy in ketoacidosis eliminates the necessity of sodium bicarbonate administration for correction of metabolic acidosis.

P499
Metformin-associated lactic acidosis requiring intensive care in a regional hospital in Hong Kong and predictive factors for mortality CT Lun 1 , HJ Yuen 2 , G Ng 2 , A Leung 2 , SO So 1 , HS Chan 1 , KY Lai 2

Introduction
Metformin-associated lactic acidosis (MALA) is a severe condition, with inconsistent mortality factors reported in previous studies. The relationship between the timing of commencement of renal replacement therapy (RRT) and survival has not been investigated.

Methods
This is a 5-year retrospective case series of patients with MALA in a regional hospital in Hong Kong. The primary outcome was to identify factors associated with intensive care unit (ICU) mortality, specifically the relationship between mortality and time from admission to RRT. The secondary outcomes were to investigate the characteristics and outcomes of patients with different precipitating factors and to identify factors for duration of RRT dependency.

Results
Fifty-nine patients were eligible for analysis. Compared with the 54 survivors, the 5 non-survivors had higher APACHE IV scores, APACHE IV predicted mortality risk, temperature, heart rate, PaCO2 levels and noradrenaline dosage, and lower first 24-hr urine volume and serum albumin. They were more likely to be on mechanical ventilation and to suffer from sepsis. They had a longer time from hospital admission to commencement of RRT; the ROC curve had an AUC of 0.776, with sensitivity and specificity of 80% at a cut-off of 765.5 min. Patients with sepsis were more severely ill and had higher mortality. Sepsis as a precipitating factor and a high baseline serum creatinine level were the two factors identified for longer RRT dependence.

Conclusions
Our study identified factors associated with mortality in metformin-associated lactic acidosis, including time from admission to commencement of RRT, suggesting a benefit of early RRT. Further prospective studies may be necessary to determine the optimal timing of initiation of RRT for MALA.

P500
The comparison between intravenous sodium bicarbonate plus NAC and intravenous sodium chloride plus NAC to prevent contrast induced nephropathy in ED P Sanguanwit 1 , W Charoensuk 2 , B Phakdeekitcharoen 3

Introduction
Contrast-induced nephropathy (CIN) is a common complication of radiologic procedures and increases morbidity, mortality and acceleration toward ESRD. The objective was to compare IV sodium bicarbonate plus NAC with IV sodium chloride plus NAC for prevention of CIN in the ED.

Introduction
Presepsin (PSEP) has shown powerful prognostic validity in inflammation-related conditions such as sepsis, and is associated with disease severity, multi-organ dysfunction syndrome and outcome. We sought to evaluate the prognostic value of PSEP for outcome prediction in patients undergoing elective cardiac surgery.

Methods
We included consecutive patients undergoing cardiac surgery and measured plasma concentrations of PSEP, NT-pro-BNP, PCT, leucocytes and cystatin C. PSEP was determined using the PATHFAST Presepsin assay (LSI Medience Corporation, Tokyo). Outcome measures were in-hospital mortality, 6-month mortality and occurrence of acute kidney injury (AKI) during hospitalization.

Conclusions
Preoperative PSEP is a predictor of postoperative mortality and AKI in cardiac surgery patients, and is a stronger predictor than commonly used factors. PSEP proved to be an independent risk indicator for outcome and may be used for risk stratification in patients scheduled for surgery.

Introduction
Exertional rhabdomyolysis in long-distance runners has been reported since the late 1970s as a condition precipitated by overheating, significant physical exertion and dehydration. This case report concerns a 35-year-old man admitted to the ICU 27 hours after completing a 24-hour ultramarathon race during which he ran approximately 205 km. He had first spent a few hours in the Cardiac Department, to which he was admitted for bradycardia (ventricular rhythm rate: 25 bpm), considerable fatigue and whole-body pain. During hospitalization in the Cardiac Department, the patient was diagnosed with left ventricular hypokinesis with an EF of 40%, marked hyperkalaemia (8.41 mmol/l) accompanied by hyponatraemia (123 mmol/l) and hypochloraemia (86 mmol/l), elevated levels of creatinine (6.18 mg/dl), urea (220 mg/dl), liver enzymes (AST 6,799 U/l, ALT 2,030 U/l), CK (332,900 U/l), CK-MB mass (>300 ng/ml) and troponin T (0.109 ng/ml), and anuria. The myoglobin level determined on day 2 of the patient's ICU stay was >30,000 ng/ml.

Methods
Aside from standard treatment, with a view to managing hyperkalaemia and protecting the kidneys from further damage, CVVHD with citrate-calcium (CiCa) anticoagulation was performed using an EMiC2 filter, leading to rapid stabilization of the patient's clinical condition, including sinus rhythm recovery. The EMiC2 filter and/or the CVVHD CiCa kit were initially replaced every 12-24 hours. At the same time, the haemodynamically monitored patient, despite anuria, received fluid therapy aiming at mild arterial hypertension due to hypervolaemia. The rate of ultrafiltration was adjusted to the patient's hydration level.

Results
The treatment administered to the patient led to establishment of a diuresis of over 400 ml/day on day 6 after admission. CVVHD was discontinued on day 11 of hospitalization, when the patient's spontaneous diuresis was 1,600 ml/day. Following CVVHD discontinuation, diuresis was aided by the intake of mannitol and continuous infusion of furosemide. The patient was discharged on day 17 of hospitalization, in the polyuric phase of up to 9,300 ml/day (eGFR 17.40 ml/min/1.73 m2).

Conclusions
In patients with acute renal failure caused by exertional rhabdomyolysis, combining intensive fluid therapy under haemodynamic monitoring with continuous renal replacement therapy using a large-pore filter that ensures myoglobin filtration is a safe method of patient management which may reduce kidney damage and speed up recovery of normal renal function after exercise-induced rhabdomyolysis.

P504
Impact on the development of acute renal injury of two doses of glucocorticoids during cardiopulmonary bypass in cardiac surgery A González Pérez, J Silva Hospital Universitario Central de Asturias, Oviedo, Spain Critical Care 2017, 21(Suppl 1):P504

Introduction
Cardiac surgery with cardiopulmonary bypass (CPB) triggers a systemic inflammatory response syndrome (SIRS) that is associated with postoperative morbidity and mortality. Steroids may attenuate this response, but their use remains controversial. This SIRS is associated with acute kidney injury (AKI) during the ICU stay. We aimed to assess the effects of two dosages of steroids given during CPB on AKI development during the ICU stay.

Introduction
Augmented renal clearance (ARC) has been described in some critical patient groups [1]. The aim of this study was to identify factors predicting supranormal glomerular filtration in critically ill patients with normal plasma creatinine and polyuria.

Methods
This is a prospective study in a surgical and medical intensive care unit (ICU). From October 2015 to October 2016, patients with normal plasma creatinine and polyuria (>3 liters of urine/24 hours) were included. We excluded individuals referred from another hospital. Creatinine clearance, sodium concentration and diuresis were calculated from a 24-hour urine collection. The 75th percentile was used to define supranormal glomerular filtration (199 ml/min/1.73 m2). A logistic regression model was used to identify independently associated variables.
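The measured creatinine clearance from a 24-hour collection, indexed to body surface area, can be sketched as below. This is a minimal illustration of the standard formula CrCl = (Ucr × V) / (Pcr × t); the function names and the example values are hypothetical, not taken from the study.

```python
def creatinine_clearance_ml_min(urine_cr_mg_dl, urine_volume_ml,
                                plasma_cr_mg_dl, collection_min=1440):
    """Measured creatinine clearance from a timed urine collection:
    CrCl = (Ucr x V) / (Pcr x t), here with a default 24-h (1440 min) collection."""
    return (urine_cr_mg_dl * urine_volume_ml) / (plasma_cr_mg_dl * collection_min)

def normalize_to_bsa(crcl_ml_min, bsa_m2):
    """Index the clearance to the conventional 1.73 m2 body surface area."""
    return crcl_ml_min * 1.73 / bsa_m2

# Illustrative values only: a polyuric patient with normal plasma creatinine.
crcl = creatinine_clearance_ml_min(100.0, 3500.0, 0.9)  # ~270 ml/min
indexed = normalize_to_bsa(crcl, 2.0)                   # ~234 ml/min/1.73 m2
supranormal = indexed > 199                             # the study's 75th-percentile cut-off
```

With these hypothetical inputs the indexed clearance exceeds the 199 ml/min/1.73 m2 threshold, so the patient would fall in the supranormal group.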

Results
Thirty-six patients were included in the study. Twenty-three patients had increased renal clearance (>130 ml/min/1.73 m2). Nine (25%) patients were above the 75th percentile (>199 ml/min/1.73 m2). Seven patients had a neurological cause of admission. In multivariate analysis, the 24-hour diuresis volume predicted ARC (Fig. 42).

Conclusions
A relationship between ARC, urinary sodium and diuresis volume should be studied in a larger cohort of patients.

Introduction
Several biomarkers have been suggested as early predictors of acute kidney injury (AKI) after orthotopic liver transplantation (OLT). Systemic and urinary neutrophil gelatinase-associated lipocalin-2 (NGAL) appear to be promising predictors of AKI after OLT, but their clinical benefit remains to be proven [1,2]. Recently, systemic macrophage migration inhibitory factor (MIF) has been proposed as an early indicator for requirement of renal replacement therapy after OLT [3]. We hypothesized that either systemic or urinary MIF can predict the development of AKI after OLT with comparable power as systemic and urinary NGAL.

Methods
This prospective, observational pilot study was performed at the Medical University of Vienna after local ethics committee approval and registration at clinicaltrials.gov (NCT02695979). Concentrations of MIF and NGAL were measured in serum and urine samples collected from patients undergoing OLT. Acute kidney injury was classified according to the KDIGO criteria, with stages 2 and 3 summarized as severe AKI. Areas under the receiver operating curves (AUC) were calculated to assess predictive values of MIF and NGAL for the development of severe AKI.

Results
Forty-five patients (mean age 55 ± 8 years) were included. Nineteen patients (38%) developed severe AKI within 48 hours after OLT. After OLT, MIF concentrations in serum and urine were greater in patients with AKI than in those without. In contrast, only urine NGAL concentrations, but not serum NGAL concentrations, were greater in patients with AKI than in those without.

Conclusions
In the setting of OLT, serum MIF predicted severe AKI earlier than serum NGAL and urinary NGAL.

Introduction
A significant portion of patients do not achieve full renal recovery by 90 days after an episode of acute kidney injury (AKI). This study aimed to elucidate the factors that predict lack of full renal recovery at day 90 (D-90) in critically ill patients with AKI.

Introduction
Peri-operative acute kidney injury (AKI) is a well-defined entity associated with significant morbidity and mortality [1]. While several serum and urine AKI biomarkers have been identified in the cardiac surgery population [2], the predictive value of these markers in non-cardiac surgical patients has not been extensively studied. The aim of this prospective observational study was the evaluation of different prognostic biomarkers in a cohort of patients undergoing elective major surgery.

Methods
Sixty-one patients, 67.1 ± 10 years old, with no chronic kidney disease (CKD) stage IV or V, undergoing elective major abdominal surgery were studied. AKI was defined according to Acute Kidney Injury Network (AKIN) criteria within 48 hours after surgery [3]. At pre-defined time points [Pre-op (pre-operatively), RR (recovery room), Post-op (24 & 48 hours post-operatively)] the following biomarkers were measured: serum creatinine (Scr), cystatin C (Scyst), retinol binding protein (SRBP), urine α1-microglobulin (Uα1-micr), β2-microglobulin (Uβ2-micr), transferrin (Utransf) and albumin (Ualb). Fractional excretion of sodium (FeNa) and urea (FeUr) were calculated as well.

Conclusions
Urine α1-microglobulin and serum RBP have been recognised as markers of early tubular damage in diabetic nephropathy [4]. Our preliminary results identified these two low-molecular-weight proteins as possible early markers of peri-operative AKI in a well-defined cohort of non-cardiac surgical patients. With the exception of diabetic nephropathy and acute-on-chronic renal injury [5][6][7], this is the first study recognising the potential of these two novel biomarkers for early prediction of peri-operative AKI.

Introduction
Bioelectrical impedance analysis (BIA) is a fast and non-invasive technique used to evaluate hydration status. As acute kidney injury (AKI) remains one of the most common complications after cardiac surgery, the aim of this study is to determine whether BIA can be used to predict AKI.

Methods
This was a prospective observational study conducted in the cardiac surgery centre of Vilnius University Hospital Santariskiu Clinics. Consecutive patients scheduled for elective cardiac surgery were enrolled over a period of one year. BIA was performed one day prior to surgery, and relevant demographic and clinical data were gathered. Patients were observed until discharge from hospital or for up to one month. Impaired renal function was defined by the Society of Thoracic Surgeons (STS) proposed criteria for renal failure (RF). The most accurate BIA markers were selected and further analysed.

Results
618 patients were enrolled. Most were men (67%, n = 414), had a CABG procedure (61.5%, n = 380) and were at low operative risk, with a median EuroSCORE II value of 1.

Conclusions
BIA parameters can be used to stratify the risk of RF for patients scheduled for cardiac surgery. However, further studies are needed to determine whether these factors are independent of other predictors.

Introduction
Extracellular histones are cytotoxic molecules associated with cell stress and cell death. The presence of extracellular histones has been linked to multiple inflammation-driven pathologies such as autoimmunity, tissue injury and sepsis [1]. We previously showed that the presence of extracellular histones correlates with mortality in sepsis patients [2], a disease state in which histones are considered important mediators of disease progression. A potential role for histones in organ donation, another setting in which tissue condition is of vital importance, and in graft function and survival is still unknown. The aim of this study was therefore to assess whether an association exists between the presence of extracellular histones in machine perfusates and deceased-donor kidney viability.

Methods
Extracellular histone H3 was determined in machine perfusates of 390 donation after circulatory death (DCD) kidneys by semiquantitative Western blotting. Corresponding graft function and survival were assessed to study a potential association with the presence of histone H3 in the perfusate fluid.

Conclusions
Extracellular histone concentrations in perfusates of kidneys with post-transplant graft dysfunction were significantly higher than in grafts showing immediate function. Follow-up studies are planned to further investigate the potential of cytotoxic extracellular histones in the assessment of post-transplant graft function and survival, as well as possible histone-based interventions to optimize graft preservation protocols.
velocity as the average of all measured values. The index was measured within the first 24 hours of ICU admission, after initial hemodynamic stabilization. AKI was defined according to the Kidney Disease Improving Global Outcomes (KDIGO) Clinical Practice Guidelines [2]. ROC curves were constructed in order to evaluate the AUC of RRI for AKI prediction.
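The AUC reported from such ROC curves has a direct probabilistic reading: it is the probability that a randomly chosen patient who developed AKI has a higher RRI than a randomly chosen patient who did not (the Mann-Whitney statistic). A minimal sketch, with hypothetical RRI values rather than the study data:

```python
def auc_mann_whitney(scores_positive, scores_negative):
    """AUC via the Mann-Whitney interpretation: the probability that a
    randomly drawn positive case scores higher than a negative one,
    with ties counted as one half."""
    wins = ties = 0
    for p in scores_positive:
        for n in scores_negative:
            if p > n:
                wins += 1
            elif p == n:
                ties += 1
    return (wins + 0.5 * ties) / (len(scores_positive) * len(scores_negative))

# Hypothetical renal resistive indices: AKI patients tend to score higher.
rri_aki = [0.80, 0.78, 0.72]
rri_no_aki = [0.70, 0.74, 0.60]
auc = auc_mann_whitney(rri_aki, rri_no_aki)  # 8 of 9 pairs ordered correctly, ~0.89
```

An AUC near 1.0, as reported for RRI on admission (0.94), would mean almost every AKI/non-AKI pair is ranked correctly by the index.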

Results
Sixty-three mechanically ventilated, critically ill patients were included in the study (median age 66 years; 55% male). Median APACHE II and SOFA scores were 18 and 9, respectively. AKI developed in 24 patients (38%) during the first 48 h post-ICU admission; 14 patients (22%) had AKI already on ICU admission. Continuous renal replacement therapy (CRRT) was required in 44% of patients with AKI, and the all-cause ICU mortality rate was 38%. The AUC of RRI for AKI diagnosis on admission was 0.94 (CI: 0.86-1.0), while for AKI prognosis within the first 48 h after admission (in the subset of patients who had no AKI on admission) it was 0.96 (CI: 0.90-1.0). The total predictive ability of RRI for AKI development (both diagnosis and prognosis) within the first 48 h after ICU admission was 0.95 (CI: 0.88-1.0). The AUC of RRI for prediction of CRRT initiation during the ICU stay was 0.81 (CI: 0.69-0.93).

Conclusions
These results suggest that an increased RRI on ICU admission may be used as a predictor of AKI development early in the course of ICU stay.

Introduction
Few studies have reported a deleterious association between acute respiratory distress syndrome (ARDS) and the development of acute kidney injury (AKI). We aimed to evaluate the association between AKI, defined by Acute Kidney Injury Network (AKIN) criteria, and poor outcome in ARDS patients.

Methods
Sixty-four patients admitted with ARDS and mechanically ventilated were enrolled. In addition to routine laboratory tests and the APACHE II score, lung injury severity scores (LISS) were calculated on admission and after one week. The degree of renal impairment was assessed using AKIN criteria (grades 1, 2, 3), and patients were stratified into 2 groups according to their degree of renal impairment. Patients were followed during their hospital stay, and all data were statistically analyzed.

Results
The mean age of the studied patients was 47.23 ± 10.12 years; 33 (51.6%) were males. Mean PO2/FiO2 and LISS on admission were 169.95 ± 31 and 3.06 ± 0.54, respectively. Twenty-seven (42.2%) patients had significant renal impairment; of these, 15 (23.3%) had AKIN grade 2 and 12 (18.8%) grade 3. In these patients, follow-up LISS and length of hospital stay were significantly higher compared to the other patients: 3.33 ± 0.74 points and 19.11 ± 6.37 days vs 2.84 ± 0.57 and 12.38 ± 4.21, with p = 0.004 and p < 0.001 respectively. They also had a greater need for vasoactive agents (21 (55.3%) patients vs 6 (23.1%)) and spent more days on mechanical ventilation (14.18 ± 4.59 days vs 8.51 ± 3.77), with p = 0.019 and p < 0.001 respectively. Inpatient mortality was significantly higher in patients with significant renal impairment compared to others: 18 (66.7%) vs 6 (23.1%), p = 0.019.

Conclusions
AKI defined by AKIN criteria is associated with poor outcome in ARDS patients.

Introduction
Regional citrate anticoagulation (RCA) is recommended as standard anticoagulation mode for continuous renal replacement therapy (CRRT). A major complication of RCA-CRRT is systemic citrate accumulation. We studied the predictive capability of lactate concentrations and time-dependent lactate clearance for citrate accumulation.

Methods
We performed a retrospective, observational study at a university hospital with 6 ICUs. Included patients treated with RCA-CRRT were screened for metabolic signs of citrate accumulation during the first 48 hours of treatment.

Conclusions
The risk of citrate accumulation during RCA-CRRT is low, even in cases with initially severe hyperlactatemia. Our study stresses the importance of lactate kinetics in assessing the risk of citrate accumulation.

Introduction
Acute kidney injury (AKI) is highly prevalent in critically ill patients and is associated with significant morbidity and mortality [1]. Continuous renal replacement therapy (CRRT) is a common modality in its treatment [2]. Besides the primary aim of substituting impaired renal function, a secondary, unintentional phenomenon is carbon dioxide clearance. The purpose of this pilot study is to quantify the clearance of carbon dioxide by CRRT.

Methods
In a pilot setting, a hemodynamically and metabolically stable, critically ill patient with AKI on continuous venovenous hemofiltration (CVVH) had blood sampled and analyzed at several points of the circuit (Fig. 45). This was done during a standard CRRT run with and without citrate anticoagulation, and without bicarbonate in the predilution fluid. A software calculator was designed to quantify carbon dioxide removal; the necessary parameters are the CVVH effluent flow and the predilution and postdilution flows.

Results
Differences in pCO2 alone are not contributive as a quantitative measurement, owing to dilution of the blood passing through the circuit. In this context, CO2 content was calculated using flow values. The overall removal of CO2 was 29 ml/min or 70 mmol/h, equivalent to 24% of the CO2 content, in a circuit with citrate. Without citrate, we found a clearance of 27 ml/min or 54 mmol/h, equal to 19% of CO2 content removal.
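The underlying arithmetic can be sketched as follows. This is a simplified illustration, not the authors' calculator: it assumes the effluent carries the whole removal (CO2-free replacement fluids), approximates CO2 content as dissolved CO2 plus bicarbonate, ignores carbamino CO2, and uses example numbers that are not the study's measurements.

```python
CO2_SOLUBILITY = 0.0307  # mmol/L per mmHg, plasma at 37 degrees C

def co2_content_mmol_l(pco2_mmHg, hco3_mmol_l):
    """Approximate total CO2 content of a fluid as dissolved CO2 plus
    bicarbonate (the two Henderson-Hasselbalch components)."""
    return CO2_SOLUBILITY * pco2_mmHg + hco3_mmol_l

def co2_removal_mmol_h(effluent_flow_l_h, effluent_pco2, effluent_hco3):
    """CO2 leaving the circuit with the effluent; predilution and
    postdilution fluids are assumed CO2-free."""
    return effluent_flow_l_h * co2_content_mmol_l(effluent_pco2, effluent_hco3)

# Illustrative numbers only: 2.5 L/h effluent at pCO2 45 mmHg, HCO3 24 mmol/L.
removal = co2_removal_mmol_h(2.5, 45.0, 24.0)  # ~63 mmol/h
removal_ml_min = removal * 22.26 / 60          # converted to ml/min STPD (22.26 ml/mmol)
```

The same content-times-flow logic, applied at each sampling point of the circuit, yields the percentage of circuit CO2 content extracted.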
Conclusions
CVVH as traditionally used in the ICU has a hidden component which can be calculated: the extraction of CO2. This hidden clearance can rise to 24% of CO2 content.

Introduction
Metabolic acidosis is a common event among patients with multiple organ failure. When it is caused by impaired carbohydrate metabolism due to hypoxia, lactic acidosis may occur, increasing blood lactate and reducing pH. Our group has integrated the treatment of acidosis into an Advanced Organ Support (ADVOS) system based on albumin dialysis. It consists of 3 circuits that allow elimination of water-soluble and protein-bound toxins, regeneration of the albumin used in the process, and stabilization of pH [1]. The aim of this work is to demonstrate these features of the ADVOS system in terms of lactate elimination, pH stabilization and bilirubin clearance in an ex vivo model of metabolic acidosis, comparing them with a conventional renal dialysis machine (NIKKISO DBB-03).
Methods
An ex vivo model of metabolic acidosis with liver involvement was designed by setting a continuous infusion of lactic acid into 5 liters of porcine blood, previously spiked with bilirubin (275 mg/dl). Blood was dialyzed through the ADVOS system for 2 hours at 200 and 400 ml/min blood flow (BF). A dialysate pH of 9 was set. To determine the maximum lactate addition and removal, the lactic acid infusion was progressively increased until blood pH was out of the physiological range. Once the maximum addition was determined, tests were repeated with the NIKKISO machine using a BF of 400 ml/min. Lactate, pH and bilirubin levels were analyzed pre- and post-dialyzer every 15 minutes. Blood was checked for hemolysis at the beginning and the end of the experiments.

Results
The ADVOS system was able to stabilize the blood pH in a physiological range up to a maximum lactic acid addition of 3.1 mmol/min (BF 400 ml/min). Moreover, with a BF of 200 ml/min, up to 1.9 mmol/min (75%) of lactate was eliminated, which would result in a proton elimination of 4,464 mmol/day. Additionally, the ADVOS system removed significantly more bilirubin from blood than the NIKKISO DBB-03 machine (66% vs. 21%). Although the NIKKISO DBB-03 machine achieved a higher lactate elimination rate of 80%, blood pH decreased to 6.90.

Conclusions
During a continuous infusion of up to 3.1 mmol/min of lactic acid in an ex vivo model for metabolic acidosis, blood pH decreased to 6.90 under conventional hemodialysis. With the ADVOS system, blood pH remained stable between 7.35 and 7.45 and additionally an efficient elimination of bilirubin was achieved. Hence, ADVOS has a potential advantage for the treatment of critically ill patients with multiple organ failure.

Introduction
Ultrasonography of the inferior vena cava (IVC) is a guide to estimating central venous pressure [1]. Lung ultrasound is an effective method of evaluating extravascular lung water (EVLW) [2]. The observation of B-line resolution and the IVC collapse index represent a valid guide to improving continuous renal replacement therapy (CRRT) management. The aim of this work was to demonstrate the role of chest and IVC ultrasound in assessing the variation of fluid status in patients undergoing CRRT.

Methods
We enrolled patients undergoing CRRT during their ICU stay. Patients were assessed by lung and IVC ultrasound immediately before CRRT (T0) and after 12 (T12) and 24 (T24) hours. B-lines were counted, and the reduction of IVC diameter was measured using the collapse index. We also recorded for each patient at T0 and T24: PaO2/FiO2, leukocytes (WBC), procalcitonin (PCT), and the volume of fluid removed. Written informed consent was obtained from the patients.
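The IVC collapse index used here is conventionally computed from the maximum and minimum IVC diameters over the respiratory cycle. A minimal sketch, with hypothetical diameters for illustration:

```python
def ivc_collapse_index(d_max_mm, d_min_mm):
    """IVC collapsibility index (%) = (Dmax - Dmin) / Dmax * 100, from the
    maximum and minimum IVC diameters over a respiratory cycle."""
    if d_max_mm <= 0:
        raise ValueError("maximum IVC diameter must be positive")
    return (d_max_mm - d_min_mm) / d_max_mm * 100.0

# Hypothetical measurements: a 20 mm IVC narrowing to 12 mm on inspiration.
index_pct = ivc_collapse_index(20.0, 12.0)  # 40.0
```

A higher index indicates a more collapsible IVC; tracking it at T0, T12 and T24 alongside B-line counts is the study's approach to monitoring fluid removal.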

Results
A total of 14 patients undergoing CRRT were recruited. PaO2/FiO2 ratio, PCT and WBC count improved in all patients from T0 to T24. B-line resolution was also reached in all patients, as demonstrated by the Comet Score (Fig. 46). Not all patients could be evaluated by IVC ultrasound because of open abdominal surgery; moreover, one patient suffered from severe tricuspid valve failure, making the measurement unreliable. We therefore measured the IVC collapsibility index (Δø IVC %) in 10 patients; the collapsibility index did not change significantly before and after CRRT.

Conclusions
We found that fluid removal during CRRT can be clearly monitored by chest ultrasound. Lung ultrasound is a non-invasive, simple method performed at the bedside that provides accurate information on EVLW variations and is consequently very useful for appropriate management during CRRT.

Introduction
Continuous renal replacement therapy (CRRT) is a common necessity in ICUs, whether for patients with end-stage renal disease (ESRD) or those with acute kidney injury (AKI). Mortality is high in both situations, with reported rates above 50% in centres of excellence [1,2]. Recovery of renal function among survivors of an acute insult is usually good [1], but some suffer long-term renal impairment as a consequence. Our aim was to examine short- and long-term mortality and recovery of renal function among patients requiring CRRT in the ICU, so that findings from this study would support evidence-based decision making among intensivists when faced with critical illness and the potential need for CRRT.

Methods
All patients admitted to our centre's ICU between January 1st 2012 and December 31st 2014 who required CRRT were included in the study. Information was gathered from patient records and public death records.

Results
Our cohort comprised 450 patients, with a median APACHE II score on admission of 27.8. Overall in-hospital mortality among patients requiring CRRT was 46% (n = 208); ICU mortality was 37%. The in-hospital mortality rate in medical patients was 52%, and the rate was higher in AKI patients (48.5%) than in ESRD patients (31.7%). At one year, mortality for the total population was 54%. Outcomes for ESRD patients at one year were poorer than in the short term, with 17% of ESRD patients who survived to discharge dying within the next year, compared to 14% of AKI patients. Regarding renal function, we found a dialysis dependency of 34% among survivors to ICU discharge; when those with known ESRD were excluded, the rate was 21%. Among those with AKI who survived to hospital discharge, the dialysis-dependency rate was 7%.

Conclusions
Our data show a varying pattern of concordance with previous work. Overall mortality is lower in both the short and long term than in other countries [1,3], but this may reflect differences in study populations and pre-admission medical status rather than truly different outcomes. In those with ESRD, our work shows that although they may be considered high-risk patients for the ICU, their outcomes are equal and often superior to those of AKI patients; this may encourage intensivists to more readily accept these patients for a higher level of care. Finally, regarding recovery of renal function, our analysis suggests that the risk of lasting renal failure in survivors of acute insults requiring CRRT is low.