IL-17 receptor signaling is required to control polymicrobial sepsis

Sepsis is a systemic inflammatory response resulting from the inability of the host to contain the infection locally. Previously, we demonstrated that during severe sepsis there is a marked failure of neutrophil migration to the infection site, which contributes to dissemination of the infection, resulting in high mortality. IL-17 plays an important role in neutrophil recruitment. Herein, we investigated the role of IL-17R signaling in polymicrobial sepsis induced by cecal ligation and puncture (CLP). We observed that IL-17R-deficient mice subjected to CLP-induced non-severe sepsis show reduced neutrophil recruitment into the peritoneal cavity, increased spread of infection, and an enhanced systemic inflammatory response as compared with C57BL/6 littermates. As a consequence, these mice showed an increased mortality rate. The ability of IL-17 to induce neutrophil migration was demonstrated in vivo and in vitro. Besides its role in neutrophil recruitment to the infection focus, IL-17 enhanced the microbicidal activity of the migrating neutrophils by a mechanism dependent on NO. Therefore, IL-17 plays a critical role in host protection during polymicrobial sepsis.

written down but not treated as a real shortcoming. The changes in medical prescription were made immediately. In cases where the patient was not in a favorable situation for the use of thromboembolic prophylaxis (bleeding, presurgical, among others), it was treated as a nonreal shortcoming. The same was done for glucose control. We observed that after 4 weeks of using this instrument there was a small reduction of shortcomings in glucose control (Figure 1) and a modest rise in thromboembolic prophylaxis (Figure 2). From this point we revised the checklist to provide a field for recording real shortcomings, so that they are given more relevance and treatment, since the patients' clinical situation deserves different treatments that do not interfere with the unit's quality of service. The inclusion of skin evaluation through the Braden Scale was an opportunity to follow patients' skin, by means of risk evaluation for developing wounds, providing data on the daily average Braden Scale score and the sites where these wounds were most frequent. An average Braden score of 13.65 (Figure 3) was verified, and the greatest incidence of pressure ulcer was in the sacral region (44.75%) (Figure 4). Conclusions It can be concluded that FAST HUG, in addition to being a tool to evaluate the quality of care and to assure patients that their needs will be fulfilled while they remain in the ICU, may be considered a boost to overcome new challenges. Along with the

Introduction The mechanisms associated with immunomodulation after red blood cell transfusion are not completely understood, possibly due to methodological biases in the clinical studies and the presence of comorbidities such as sepsis. Therefore, a controlled animal model of blood cell transfusion may be a more appropriate approach to minimize these issues.
We designed this pilot study in order to validate in vitro and in vivo the survival of swine erythrocytes stored for 13 days. Methods Blood was collected from one Agroceres® swine and stored as 2 units of red blood cells (RBC). The following measurements were performed at baseline and after 13 days of storage: volume, hemoglobin and hematocrit, hemolysis index, potassium, sodium, glucose and pH. In vivo validation and hemolysis evaluation were performed by labeling the cells with Na2^51CrO4 and recovering viable erythrocytes up to 24 hours after transfusion in one autologous and four homologous animals. A splenectomy was performed after death to evaluate splenic sequestration of RBC. Results In vitro validation of the samples is demonstrated in Table 1. The mean RBC recovery value 24 hours after injection of labeled RBC was 97.5 ± 19%, demonstrating good viability of the samples. The evaluation of splenic hemolysis was negative. Conclusions Erythrocytes from pigs stored under standardized human conditions for up to 13 days may be used for experimental transfusion studies. This controlled animal model may be useful to study pathogenetic mechanisms related to adverse effects of RBC transfusion.
Introduction The use of echocardiography performed by intensivists, as a modality of hemodynamic monitoring, is becoming increasingly frequent in ICUs worldwide. Several training programs and curricula have been proposed to avoid the misuse and misinterpretation of this tool. However, until now, there have been few reports validating this type of training. The aim of the present study is to compare hemodynamic measurements obtained by trained intensivists and by expert echocardiographers, in order to evaluate the efficacy of our institutional training program. Methods In our institution, we have performed 140 hours of bedside echocardiography training (plus 10 hours of theoretical classes) for seven intensivists. At the end of the training period, the intensivists and the teachers (experienced level III echocardiographers) registered echocardiography-derived hemodynamic variables of the same patient a few minutes apart. Results We obtained 46 paired measurements. The velocity-time integral of the ventricular outflow tract showed a Pearson correlation coefficient of 0.860 (P <0.01), a bias of 1.19 cm and a mean error of 29% between paired measurements. The systolic volume classification (between low or normal and high) resulted in a kappa coefficient of 0.696 (± 0.105). Myocardial contractility resulted in a kappa coefficient of 0.823 (± 0.121).
Conclusions Our study demonstrates that our training program was efficient. Hemodynamic-focused echocardiography can be accurately performed by intensivists after completing this training program.
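The agreement statistics reported above (Pearson correlation and bias between paired measurements) can be sketched in a few lines. This is an illustration, not the authors' analysis code; the function names and the VTI values below are hypothetical, not the study data:

```python
# Sketch of paired-measurement agreement: Pearson r and bias (mean difference).
# All values below are hypothetical VTI measurements in cm.
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def bias(x, y):
    """Mean of the paired differences (first reader minus second reader)."""
    return sum(a - b for a, b in zip(x, y)) / len(x)

intensivist = [18.0, 20.5, 15.0, 22.0]  # hypothetical trainee measurements
expert      = [17.0, 19.0, 14.5, 20.0]  # hypothetical expert measurements
print(round(pearson_r(intensivist, expert), 3), round(bias(intensivist, expert), 2))
```

A high r with a nonzero bias, as in the abstract, indicates that the two readers rank patients consistently but differ by a systematic offset.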
Objective To analyze the behavior of tissue partial pressure of oxygen (pO2) measured in the liver during sepsis and to correlate its reduction with lactate levels.
Methods Eleven Large White pigs, weighing 35 kg, under general anesthesia (isoflurane, fentanyl, pancuronium) and fully monitored (electrocardiography, etCO2, invasive pressure, pulmonary artery catheter, portal vein Doppler ultrasound flow, small bowel tonometry), were submitted to fecal peritonitis sepsis (1 g/kg feces plus 150 ml warm saline) after pO2 and laser Doppler flowmetry probes were placed inside the liver parenchyma. Laboratory and hemodynamic data were registered hourly. After the experiments, the pigs were sacrificed with a sedative overdose and KCl 19.1% injection.

Figure 3 (abstract P4). Time series plot of median SSC.
Results The model is well studied and very consistent. Hypotension occurs only in late phases (8th hour). In septic pigs, lactate generation seems to occur earlier (1st hour) than the reduction in tissue pO2 (4th hour). (See Figures 1 and 2.)
Conclusions Lactate generation seems to be related not only to tissue hypoxia in septic pigs. Inflammation and mitochondrial dysfunction probably also play a role in this pathological process. Further studies are needed to clarify these mechanisms. Interventions beyond oxygen uptake optimization may therefore be necessary for early reversal of the septic cascade.
Introduction Pulse pressure variation (DPP) is a very good method to predict the improvement in oxygen delivery after volume expansion in circulatory failure states. However, this method has been validated only in patients under sedation plus controlled positive pressure invasive mechanical ventilation (PCV). The behavior of this method in patients under spontaneous ventilation remains unclear.

Materials and methods
In 10 male domestic pigs the pulmonary arterial pressure, aortic arch pressure, femoral arterial pressure (PP) and cardiac output by the thermodilution technique were measured in four different stages: (I) basal, in spontaneous ventilation; (II) after controlled hemorrhage to simulate hypovolemic shock, in spontaneous ventilation; (III) in a hypovolemic shock state but now under PCV and respiratory muscle paralysis with pancuronium; (IV) after volume resuscitation under PCV (thiopental plus fentanyl plus pancuronium). The means and medians were compared by the ANOVA and Tukey tests, respectively; P <0.05 was considered statistically significant. DPP was calculated in all stages by the formula: DPP = 100 x (maximalPP - minimalPP) / ((maximalPP + minimalPP)/2), where maximalPP = (maximal systolic pressure - maximal diastolic pressure) and minimalPP = (minimal systolic pressure - minimal diastolic pressure).
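As a worked illustration of the DPP formula above (a sketch, not the authors' code): the function name and the beat-by-beat pressures are hypothetical, and pulse pressure is computed beat by beat here rather than from the separate maximal/minimal systolic and diastolic values given in the abstract:

```python
# Sketch: pulse pressure variation (DPP) over one respiratory cycle,
# following DPP = 100 * (PPmax - PPmin) / ((PPmax + PPmin) / 2).

def dpp_percent(systolic, diastolic):
    """DPP (%) from per-beat systolic/diastolic pressures (mmHg)."""
    pulse_pressures = [s - d for s, d in zip(systolic, diastolic)]
    pp_max, pp_min = max(pulse_pressures), min(pulse_pressures)
    return 100.0 * (pp_max - pp_min) / ((pp_max + pp_min) / 2.0)

# Hypothetical beat-by-beat pressures across one respiratory cycle
systolic = [120, 118, 112, 110, 115]
diastolic = [80, 79, 78, 78, 79]
print(round(dpp_percent(systolic, diastolic), 1))  # prints 22.2
```

A value in this range is comparable to the spontaneous-ventilation DPP means reported in the Results below 15% is conventionally read as a fluid-unresponsive state under PCV.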

Results
The mean DPP values in the four stages were, respectively, 22.3%, 42.27%, 21.8% and 10.48% (P = 0.039); DPP under spontaneous ventilation was statistically significantly higher than in the other stages of the experiment. After PCV was started, DPP returned to basal values even without volume resuscitation. The lowest values were achieved after volume expansion (P = 0.001 compared with stage II). Conclusions DPP in hypovolemic shock is higher under spontaneous ventilation than under PCV. It is important to find the cutoff value that best predicts the response to volume resuscitation.
Introduction Heart failure (HF) is associated with high morbidity and mortality rates, including frequent rehospitalization. The 30-day HF Rehospitalization Rate is an outcome quality indicator used to measure the quality of care.
Objective To identify changes in the 30-day HF Rehospitalization Rate after the implementation of a HF managed protocol. Method A cross-sectional prospective study of 671 patients hospitalized for heart failure in a tertiary private Brazilian hospital. Patients were divided into two groups: 189 patients admitted in the pre-protocol period (January 2005 to July 2006) and 452 patients admitted in the post-protocol period (August 2006 to May 2008). Mean age was 75.0 ± 12.0 years (range: 21 to 102 years). The HF protocol was implemented on 1 August 2006 and consisted of a written protocol, on-time data collection for quality indicators, and periodic performance feedback (reports) given to the clinical and administrative staff. Data collection before the protocol implementation was done retrospectively by a nurse case manager. Statistical analysis was performed using the chi-square test, Student's t test and Fisher's exact test. P <0.05 was considered statistically significant. Results There was a significant decrease in the 30-day HF Rehospitalization Rate after the protocol implementation, along with an increase in β-blocker, angiotensin-converting enzyme inhibitor (ACEI) or angiotensin receptor blocker (ARB) and smoking cessation counseling rates (Table 1). Conclusions In the present study, the implementation of a HF managed protocol led to a significant decrease in the 30-day HF Rehospitalization Rate along with an increase in the prescription rate of evidence-based therapies, even though the rate of patients admitted in cardiogenic shock was higher.

Introduction Sepsis is a systemic inflammatory response resulting from the inability of the host to contain the infection locally. We previously demonstrated that during severe sepsis there is a marked failure of neutrophil migration to the infection site, which contributes to dissemination of the infection, resulting in high mortality. IL-17 plays an important role in neutrophil recruitment.
Herein, we investigated the role of IL-17 receptor signaling in polymicrobial sepsis induced by cecal ligation and puncture (CLP).

Methods and results
Adult C57BL/6 WT and IL-17 receptor KO mice were subjected to nonsevere (NS-CLP) sepsis. Intraperitoneal neutrophil migration, bacteremia, cytokines and liver injury were evaluated 6 hours after surgery. The ability of IL-17 to mediate neutrophil microbicidal activity in vitro, as well as neutrophil migration in vivo and in vitro, was also evaluated. It was observed that IL-17R-deficient mice subjected to CLP-induced nonsevere sepsis show reduced neutrophil recruitment into the peritoneal cavity, increased spread of infection, and an enhanced systemic inflammatory response as compared with C57BL/6 littermates. As a consequence, the mice showed an increased mortality rate.

Chemokines play a central role in mediating neutrophil migration to the inflammatory focus. Neutrophils respond to CXC chemokines, such as CXCL8, but are usually unresponsive to CC chemokines, such as CCL2. It is known that chemokine responsiveness in neutrophils can be modulated by some inflammatory conditions, such as sepsis. Here, we investigated whether Toll-like receptors (TLRs) modulate the expression of CCR2 in neutrophils and the consequence of this modulation on sepsis onset. Purified neutrophils from septic patients or from WT and CCR2-/- mice subjected to sepsis by cecal ligation and puncture (CLP), and neutrophils from naïve mice or healthy humans stimulated with lipoteichoic acid (LTA) or lipopolysaccharide (LPS), were assayed for CCR2 expression by FACS or immunofluorescence and for the chemotaxis response to CCL2. Treatment of neutrophils from naïve mice or healthy humans with the TLR agonists LTA or LPS induced an upregulation of CCR2 expression, leading to CCL2 responsiveness such as chemotaxis and F-actin polymerization. CCR2 expression induced by TLR activation was blocked by NF-κB or protein synthesis inhibitors. Moreover, LTA-induced or LPS-induced CCL2 chemotaxis was not observed in TLR2-/- or TLR4-/- neutrophils, respectively.
Interestingly, neutrophils from septic patients or septic mice presented high CCR2 expression and CCL2 responsiveness when compared with neutrophils from healthy donors or naïve mice. In vivo, we found that CCR2-/- mice subjected to severe sepsis by CLP exhibited reduced neutrophil infiltration in the heart, lung and kidney and an enhanced survival rate when compared with WT mice subjected to severe sepsis. Finally, severity of illness of the septic patients, judged by their APACHE II score and PaO2/FiO2 ratio, correlated strongly with CCL2 responsiveness of neutrophils (r2 = 0.77, P = 0.001 and r2 = 0.59, P = 0.001, respectively). Our findings demonstrate that TLR activation induces CCR2 expression and CCL2 responsiveness in human and murine neutrophils, and that this expression profile in neutrophils is involved in the detrimental infiltration of these cells into distant tissues during severe sepsis. CCR2 blockade is therefore a potential strategy for human sepsis treatment.
Available online http://ccforum.com/supplements/13/S3

Reduction of neutrophil migration to infection sites correlates with a bad outcome in sepsis. Acute phase proteins (APPs) have been described to inhibit neutrophil functions, such as neutrophil migration. We recently showed that α1-acid glycoprotein (AGP) is a serum factor involved in the neutrophil migration failure in human severe sepsis. In mouse experimental sepsis, the serum AGP concentration was significantly increased only 6 hours after severe sepsis. However, 2 hours after severe sepsis induction in mice, essential steps for neutrophil migration are already disrupted, such as a decrease in rolling and adhesion of leukocytes to the endothelium and reduced expression of the chemokine receptor CXCR2 on the neutrophil membrane. Therefore, AGP should not be involved in the early steps of severe sepsis development. The identification of the other serum factors involved in the neutrophil migration failure could be helpful for appropriate management of severe sepsis. In this context, the objective of the present study was to identify soluble substances in the blood of septic mice that inhibit neutrophil migration in the early steps of sepsis. One pool of serum, obtained 2 hours after polymicrobial severe sepsis induction in mice, partially inhibited thioglycolate-induced neutrophil migration into the peritoneal cavity of naïve mice. Separation and identification, by Blue-Sepharose chromatography, HPLC, native electrophoresis and mass spectrometry, of the soluble substances in this serum with inhibitory activity on neutrophil migration identified the APP hemopexin (Hx). The purified Hx, as well as a commercial sample of Hx, inhibited thioglycolate-induced or sepsis-induced neutrophil migration to the peritoneal cavity of mice. In contrast to wild-type mice, Hx-null mice that underwent severe sepsis did not present failure of neutrophil migration to the infectious focus. As a consequence, these animals presented low bacteremia and a high survival rate.
Furthermore, Hx inhibited the neutrophil chemotaxis response evoked by C5a or MIP-2 and induced downmodulation of CXCR2 and L-selectin. These results show an inhibitory role of APPs on neutrophil migration in sepsis and suggest that species-specific and time-specific inhibition of APP activities may be a new strategy for sepsis treatment.

The rate of colonization was higher in the standard CVC. Patients who require a CVC for long periods have benefited from the use of impregnated CVCs, because these allow long-term use with lower rates of colonization, avoiding complications related to successive punctures and to the permanence of the catheters. In view of the clinical benefits already mentioned, the benefit gained by the use of antiseptic-impregnated catheters compensated for the 40% higher initial cost.

P12
Changes in plasma free fatty acid levels in septic patients are associated with cardiac damage and reduction in heart rate variability

Free fatty acids (FFAs) have been shown to produce alterations of heart rate variability (HRV) in healthy and diabetic individuals. Changes in HRV have been described in septic patients and in those with hyperglycemia and elevated plasma FFA levels. We studied whether sepsis-induced heart damage and HRV alteration are associated with plasma FFA levels in patients. Thirty-one patients with sepsis were included. The patients were divided into two groups: survivors (n = 12) and nonsurvivors (n = 19). The following associations were investigated: (a) troponin I elevation and HRV reduction; and (b) clinical evolution and the HRV index, plasma troponin, and plasma FFA levels. Initial measurements of C-reactive protein and Acute Physiology and Chronic Health Evaluation severity scores were similar in both groups. Overall, an increase in plasma troponin level was related to increased mortality risk.
From the first day of the study, the nonsurvivor group presented a reduced left ventricular stroke work systolic index and a reduced low-frequency (LF) component, one of the HRV indices. The correlation coefficient for LF values and troponin was r2 = 0.75 (P <0.05). All patients presented elevated plasma FFA levels on the first day of the study (5.11 ± 0.53 mg/ml), and this elevation was even greater in the nonsurvivor group compared with the survivors (6.88 ± 0.13 vs 3.85 ± 0.48 mg/ml, respectively; P <0.05). Cardiac damage was confirmed by measurement of plasma troponin I and by histological analysis. Heart dysfunction was determined by the left ventricular stroke work systolic index and the HRV index in nonsurvivor patients. A relationship was found between plasma FFA levels, the LFnu index, troponin levels, and histological changes. Plasma FFA levels emerged as a possible cause of heart damage in sepsis.

Introduction
Sepsis is a host response to infection characterized by certain clinical and laboratory signs. These are neither specific nor sensitive in some infection cases, mainly in immunosuppressed patients. The identification of laboratory variables for these patients could therefore allow faster diagnosis. This study evaluated the role of C-reactive protein (CRP) and procalcitonin (PCT) as diagnostic variables for sepsis in HIV/AIDS patients compared with non-HIV-infected patients.

Objective Myocardial function is severely compromised during sepsis. Several underlying mechanisms have been proposed to explain this fact. Evidence from our laboratory indicates that myocardial structural changes could be responsible for sepsis-induced myocardial dysfunction. Taking into account that the contractile machinery inside the myofibers must remain intimately connected with the membrane and extracellular matrix, an association provided by the dystrophin-glycoprotein complex (DGC), the present study investigated the hypothesis that loss of dystrophin and associated glycoproteins could be involved in the early increased sarcolemmal permeability in experimentally induced septic cardiomyopathy.

Methods and results
Male C57Bl/6 mice were subjected to sham operation, moderate septic injury or severe septic injury (SSI) induced by cecal ligation and puncture. SSI mice presented a large number of bacteria, high levels of TNFα and MIP-1α in both the peritoneal cavity and blood, marked hypotension, and a high mortality rate. Using immunofluorescence and western blot analysis, a downregulation of the expression of the structural proteins dystrophin and β-dystroglycan could be seen in septic hearts after both severe and moderate injury. In contrast, immunofluorescent analysis for laminin-α2 did not show a difference of expression in septic hearts as compared with sham-operated hearts. In addition, the evaluation of plasma membrane permeability by intracellular albumin staining provided evidence of severe injury of the sarcolemma in SSI hearts, which presented accumulation of albumin in a large number of cardiomyocytes depleted of dystrophin. Conclusions Our data provide important insight regarding the alterations in the DGC resulting from severe septic injury. In this study, a significant decrease of dystrophin and β-dystroglycan resulted in increased sarcolemmal permeability, which may be partly responsible for sepsis-induced cardiac depression. These abnormal parameters emerge as therapeutic targets, and their modulation may provide beneficial effects on future cardiovascular outcomes and mortality in sepsis.

Introduction
The rationale for the use of glucocorticoids in severe sepsis and septic shock can be attributed to well-defined anti-inflammatory and hemodynamic effects recognized for decades. However, with the introduction of corticosteroid therapy for a variety of conditions, it was reported that this treatment could induce a myopathy. Animal studies have confirmed that the administration of high doses of corticosteroids can produce myopathy affecting both ventilatory and peripheral skeletal muscles. Currently, it remains uncertain whether the corticosteroid doses typically used to manage patients with severe sepsis and septic shock do in fact cause peripheral and respiratory muscle weakness.
Objective To study the effect of low-dose corticosteroids on muscle force and submaximal exercise tolerance in septic patients. Design A prospective observational study of septic patients in a 14-bed medico-surgical ICU. Thirty-seven patients with severe sepsis or septic shock either received low-dose corticosteroids or did not.

Materials and methods
We collected data from septic patients during 2008. Muscle force and submaximal exercise tolerance were assessed at discharge from the hospital. Maximal inspiratory pressure (Pimax) was measured using pressure transducers; submaximal exercise tolerance was assessed by a 6-minute walk distance test; quadriceps and handgrip strength on the dominant side were evaluated using an isometric dynamometer. Results A total of 26 patients received low-dose corticosteroids, and 11 patients did not, during the study period. Age, SOFA score, and length of hospital stay were similar in the two groups. The APACHE score and length of ICU stay were significantly different between the corticosteroid and noncorticosteroid groups (P <0.05). The Pimax values were not different from those predicted for each group (60 ± 43% and 56 ± 34%, no corticosteroids vs corticosteroids), and the walking distance was not different. However, quadriceps strength was 46 ± 21.8% of predicted with corticosteroids versus 70 ± 40% without (P <0.05).
Conclusions Low-dose corticosteroids did not alter Pimax or submaximal exercise tolerance at discharge from hospital. However, corticosteroids produced a significant reduction in quadriceps strength. All patients presented a significant reduction of predicted quadriceps force, showing that corticosteroids can be responsible for the loss of peripheral muscle force.

Introduction
The major cause of mortality and morbidity in patients and experimental animals with diabetes mellitus is sepsis, due to their high susceptibility to microbial infections. However, the mechanisms involved in this increased susceptibility are unclear.
Objective In the present study we investigated the effect of mast cell degranulation on the neutrophil migration failure observed in diabetic mice after polymicrobial sepsis induced by cecal ligation and puncture (CLP). Methods On the fifth day after Balb/c mice became diabetic through intravenous administration of alloxan (40 mg/kg), they were pretreated for 4 days with the mast cell degranulator compound 48/80 (0.6 mg/kg on day 1; 1.0 mg/kg on day 2; 1.2 mg/kg on day 3; and 2.4 mg/kg on day 4, twice a day, i.p.). Mild sepsis (MS) was induced 24 hours after the last dose and the experiments were conducted 6 hours later. Results Nondiabetic mice subjected to MS showed 100% survival over 7 days, whereas all diabetic mice died within 24 hours of observation. The diabetic mice were highly susceptible to sepsis due to an incapacity to promote neutrophil migration to the peritoneal cavity, accompanied by bacteremia and overexpression of the inflammatory response, determined by high levels of circulating TNFα and MIP-2 and lung neutrophil sequestration. The reduction of neutrophil migration correlated with decreased CXCR2 receptor expression on the neutrophil membrane. However, diabetic mice submitted to MS and daily pretreatment with compound 48/80 did not display failure of neutrophil migration to the infectious focus. As a consequence, these animals exhibited low bacteremia and a high survival rate. In addition, the pretreatment of diabetic mice with compound 48/80 significantly blocked the increase of serum TNFα and MIP-2 levels after the septic stimulus. Accordingly, the reduced membrane expression of CXCR2 in neutrophils observed in diabetic mice after MS was significantly re-established in diabetic mice pretreated with compound 48/80.
Conclusions These results suggest that in diabetic mice undergoing polymicrobial infection, mast cells play a key role in the neutrophil migration failure due to reduction of CXCR2 expression, resulting in bacterial spreading and systemic release of mediators, and as a consequence augmented susceptibility to sepsis development.

Introduction Studies have demonstrated an impact of gender dimorphism on immune and organ responsiveness and on the susceptibility to and morbidity from shock, trauma, and sepsis. We performed a comparative analysis of mortality in two subgroups of patients with sepsis, differentiated by age and sex, admitted to the ICU of a teaching hospital. Methods From December 2005 to April 2008, 97 patients admitted to the ICU with a diagnosis of sepsis were separated into two subgroups based on age: (G1) the subgroup aged 14 to 40 years old, and (G2) the subgroup aged over 50 years old. The subgroups were characterized by demographic data, prognostic indicators (APACHE II score, organ dysfunction at admission and circulatory shock) and outcome (mortality).

Results
The G1 subgroup (n = 35) had 22 (62.9%) female patients and the G2 subgroup (n = 62) had 34 (54.8%) female patients. The mean APACHE II scores were not statistically different between female and male patients of the G1 (21.0 ± 9.3 vs 24.8 ± 6.1 points, P = 0.21) and G2 (23.8 ± 6.6 vs 24.4 ± 9.3 points, P = 0.8) subgroups. There was no statistically significant difference in the incidence of multiple organ dysfunction (P = 0.89) or progression to circulatory shock (P = 0.46) between females and males in the two subgroups. The general mortality rate was lower in female than in male patients from the G1 subgroup; a reverse trend was observed in subgroup G2 (Figure 1).
Conclusions Females under 40 years old, in the fertile period, had lower mortality than males in sepsis, and there was a trend to lower mortality in men over 50 years old, possibly due to dimorphism in immune and organ responsiveness related to sex and age.

Introduction Sepsis is a great concern in public health due to its high incidence and mortality. The objective of the present study is to estimate the incidence and mortality rate of sepsis in a tertiary public hospital in Londrina, Paraná, Brazil. To analyze dynamic changes in the SOFA scores, we stratified the patients into three groups: low (0 to 5), medium (6 to 9) and high (>10) SOFA upon ICU admission. The three groups of patients were evaluated after 48 hours in the ICU to detect whether the SOFA scores had decreased, increased or were unchanged. The discriminative power of SOFA was evaluated using ROC curves.
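The stratification and 48-hour reassessment described above can be sketched as follows. This is a hypothetical illustration, not the study's code; note that the high-group threshold is taken here as ≥10, since the abstract's ">10" would leave a score of exactly 10 unassigned:

```python
# Sketch: stratify patients by admission SOFA score, then classify the
# change in score after 48 hours in the ICU.

def sofa_group(score):
    """Map an admission SOFA score to the low/medium/high strata."""
    if score <= 5:
        return "low"       # SOFA 0-5
    elif score <= 9:
        return "medium"    # SOFA 6-9
    return "high"          # SOFA >= 10 (assumed; the abstract writes ">10")

def sofa_trend(admission, at_48h):
    """Decreased, increased or unchanged SOFA at the 48-hour reassessment."""
    if at_48h < admission:
        return "decreased"
    if at_48h > admission:
        return "increased"
    return "unchanged"

# Hypothetical patients: (admission SOFA, SOFA after 48 hours)
patients = [(3, 2), (7, 9), (12, 12)]
print([(sofa_group(a), sofa_trend(a, b)) for a, b in patients])
```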

Results
We analyzed 1,164 adult patients with a mean age of 56.7 ± 19.1 years and a hospital mortality rate of 47.9%. The mean SOFA score for all patients was 6.38 upon admission and was statistically higher in nonsurvivors (χ2 for trend = 272.08, P <0.001, increase rate = 0.13). The SOFA score on the third day in the ICU had the highest area under the curve for hospital mortality (AUC = 0.817 ± 0.0133, 95% CI = 0.792 to 0.840). Organ failure occurred in 699 patients. Respiratory failure was most frequent, but cardiovascular failure had the highest associated risk of death (RR = 4.10, 95% CI = 3.12 to 5.38, P <0.001). Conclusions Applying SOFA to critically ill patients admitted to the adult ICU effectively described the severity of organ dysfunctions, and higher SOFA scores had a positive association with mortality.

Objective To evaluate the association between delay in ICU admission and mortality in patients with septic shock, which is known to be higher in public hospitals. Methods A prospective cohort study of patients referred to the ICU of the University Hospital of Londrina State University, from January to December 2005, including 574 patients, following the access protocol to the ICU in chronological order. The main outcome analyzed was the hospital mortality rate. Delay in admission due to lack of an available bed for immediate admission was considered the exposure factor in bivariate analysis between the two groups of patients (delayed admission and immediate admission). We also evaluated APACHE II and SOFA scores.

Results
Among the 574 patients analyzed, 127 (22.1%) cases had septic shock as the admission diagnosis. Most of these patients with septic shock (66.9%) were in the delayed admission group, and they most frequently came from the emergency department (52.8%). There was no difference between the groups at bed solicitation with respect to age, sex, comorbidities, and SOFA and APACHE II scores. At admission, patients in the delayed admission group presented an increase in APACHE II and SOFA scores. They also had higher scores and nosocomial infection rates compared with the immediate admission group. Pneumonia was the most frequent site of infection in both groups. The hospital mortality rate was higher in the delayed admission group (82.4%) than in the immediate admission group (64.3%) (P = 0.042), relative risk 1.28 (95% CI = 1.01 to 1.64; P = 0.042). Kaplan-Meier survival curves showed a tendency to a lower survival rate for the delayed admission group (P = 0.05). Conclusions Delay in ICU admission results in an increased risk of death in patients with septic shock. The higher APACHE II and SOFA scores of patients in the late admission group probably reflect clinical deterioration during the time delay.

The establishment of a local inflammatory response, with production of cytokines/chemokines and the consequent neutrophil recruitment, is critical to control an infection. In this context, our group has demonstrated an impairment of neutrophil migration toward the infectious focus in severe sepsis. The failure of neutrophil migration is markedly associated with increased systemic cytokine levels and the high mortality of severe sepsis, both in experimental models and in patients. Recently, the participation of Toll-like receptors (TLRs) in the failure of neutrophil migration was described. Caspase-1 seems to be important in the activation of TLR signaling pathways. Moreover, it is also critical to the activation of the inflammatory cytokines IL-1β, IL-18 and IL-33.
The aim of the present study was to evaluate the participation of caspase-1 during severe sepsis. Caspase-1-deficient mice presented increased resistance to severe sepsis induced by CLP. However, IL-18-deficient and ST2 (IL-33 receptor)-deficient mice did not present reduced mortality after sepsis. Treatment with an IL-1 receptor antagonist was also unable to modify the survival rate of wild-type mice that underwent severe sepsis. These data indicate that reduction in the levels of these cytokines is not critical for the reduction of mortality observed in caspase-1-deficient mice. The reduced mortality of caspase-1-deficient mice was associated with decreased systemic levels of TNFα and IL-6. Despite unaltered local levels of cytokines and chemokines, caspase-1-deficient mice that underwent severe sepsis presented a marked increase in neutrophil migration to the peritoneal cavity, supported by increased rolling and adhesion of leukocytes in these mice. As a consequence, reduced bacterial growth in peritoneal exudates and blood was observed in these animals, although neutrophils from caspase-1-deficient and wild-type mice presented similar killing and cellular viability. Thus, in the absence of caspase-1, neutrophil migration to the peritoneal cavity is increased and culminates in reduced mortality because of efficient control of the infection. Acknowledgements Supported by CNPq, FAPESP and FAEPA. Background Sepsis is the main cause of mortality in ICUs. Primary sources of infection influence the risk of severe sepsis development, and pneumonia is a leading source of this disease. Neutrophils play a critical role in host defense against acute pulmonary infection, since neutrophil-depleted mice with pneumonia had delayed pulmonary bacterial clearance and high mortality. Heme oxygenase (HO) and soluble guanylate cyclase (sGC) activities are known to downregulate inflammatory events, such as neutrophil migration.
In the present study we evaluated the role of HO and sGC activities in neutrophil migration to the lung during severe sepsis induced by pneumonia. Methods C57BL/6 male mice (18 to 22 g) underwent severe sepsis (SS, 4 x 10^8 CFU/mouse) or mild sepsis (MS, 1 x 10^7 CFU/mouse) by intratracheal administration of Klebsiella pneumoniae. One SS group was pretreated with an HO-1-specific inhibitor (ZnPP IX) or a specific inhibitor of sGC (ODQ). Mice were killed 6 hours after bacterial administration, and alveolar neutrophil migration and pulmonary parenchyma leukocyte sequestration were evaluated. Results Mice subjected to SS presented a failure of neutrophil migration toward the alveoli and increased leukocyte sequestration in the pulmonary parenchyma when compared with mice subjected to MS. HO-1 or sGC inhibition in SS mice partially restored neutrophil migration to the pulmonary alveoli and reduced leukocyte sequestration in the pulmonary parenchyma. Conclusions These results suggest that HO-1 and sGC activities mediate the failure of neutrophil migration to the lung.
Introduction Ventilator-associated pneumonia (VAP) is frequent and has been associated with substantial morbidity, mortality and excess cost. We hypothesized that the three most important determinants of VAP rates are the process of diagnosis, the adoption of standards of care during the time spent on ventilation (adherence to the ventilator bundle formerly described by the Institute for Healthcare Improvement) and the reduction of time spent on ventilation (exposure to risk). Our aim was to reduce VAP rates by sequentially adopting a diagnostic algorithm, measuring adherence to the bundle, and applying spontaneous breathing trials to all awake patients on mechanical ventilation (MV) in a 12-bed general ICU.
Methods Traditionally, the diagnosis of VAP was made by the attending physician and there was no defined policy for sedation or adherence to the ventilator bundle. At the beginning of 2007, we adopted a diagnostic algorithm that included the CPIS and bronchoalveolar lavage cultures. In December 2007, we started auditing adherence to the items of the bundle and promoting educational discussions about the importance of preventing VAP. These audits were done once a day, during the afternoon, in the form of a checklist. Daily interruption of sedation was done by the staff nurse every day at 8 o'clock in the morning, unless paralytic drugs were in use. In August 2008, we formalized the spontaneous breathing trial as an approach for awake ventilated patients, with the intention of decreasing the time spent on MV. Results In 2006, the VAP rate was 32.8/1,000 ventilator days and the average time on MV was 13 days. In 2007, after the diagnostic algorithm, the VAP rate fell to 21.1/1,000 ventilator days and the average time on MV to 10.7 days. In the first 7 months of 2008, after adoption of the ventilator bundle (with adherence rates of 84% for head-of-bed elevation, 97% for daily interruption of sedation, and 99% for deep vein thrombosis and stress ulcer prophylaxis), the VAP rate fell to 10.5/1,000 ventilator days and the average time on MV to 8.9 days. Finally, from August 2008 to January 2009, after adoption of the spontaneous breathing trial, the VAP rate was 8.66/1,000 ventilator days and the average time on MV was 6.6 days. Conclusions Adoption of a diagnostic algorithm may decrease the overdiagnosis of VAP, and this approach, combined with a more aggressive strategy of discontinuing MV and adherence to standards of care for ventilated patients, can have a great impact on VAP rates.
Introduction Clopidogrel is recommended for patients with acute myocardial infarction (AMI); however, concern exists regarding patients submitted to coronary artery bypass surgery (CABG). We estimated the incidence of CABG during AMI hospitalization and evaluated whether early treatment with clopidogrel is harmful to this population.
Methods We studied 941 patients with AMI (71% male, age 68 ± 15 years) using prospective data recorded between 2003 and 2008. Variables are presented as the median and interquartile range, or relative frequencies. The effect of clopidogrel on hospital mortality and the hospitalization period was adjusted for prognostic markers (left ventricular ejection fraction, age and Killip class) using logistic or Cox regression, respectively. We also evaluated the effects of clopidogrel on the subgroup of patients submitted to CABG through the inclusion of interaction terms in a multivariate analysis.

Introduction
There is a paucity of information describing the new 3D echocardiographic systolic speckle-tracking parameters of a normal population. We sought to evaluate by 3D echo: rotation, twist, regional torsion, basal torsion, radial 3D strain, and 3D displacement in a population with normal ECG and normal 2D and 3D echocardiographic analyses. This information may be relevant for cardiac resynchronization therapy (CRT) evaluation in patients with advanced heart failure in ICUs.
Conclusions ADHF registries identify renal function at admission as an important mortality risk factor. Nevertheless, in this cohort, worsening renal function during hospitalization was an independent mortality predictor. In this context, preservation of renal function should be attempted as a mortality reduction strategy in ADHF patients.

P31
Risk stratification of acute decompensated heart failure in-hospital mortality using the ADHERE CART method: is there validation in a Brazilian cohort?  Lead aVR is an electrocardiographic lead that is frequently ignored [1,2]. Many clinicians consider lead aVR of little diagnostic value. In contrast, we report a patient who was resuscitated from ventricular fibrillation and presented with ST elevation in lead aVR (Figures 1 and 2) and right bundle branch block. The patient was immediately transferred to the catheterization laboratory, where occlusion of the left main coronary artery (LMCA) was visualized [3]. In this emergency scenario the patient had another cardiac arrest, with pulseless electrical activity, and we proceeded with percutaneous revascularization of the LMCA (Figure 3). The patient returned to spontaneous circulation and after 14 days was discharged from hospital without neurologic sequelae. The rapid diagnosis of such events is critical to guiding early intervention and appropriate disposition in many patients with acute coronary syndrome (ACS). Electrocardiography is an appropriate bedside tool used in the ED to make a rapid diagnosis of ACS, especially using lead aVR, allowing physicians to select appropriate therapy and to predict potential cardiovascular complications.
Introduction Aortic dissection has an ominous prognosis if it is not promptly surgically corrected. Despite surgical intervention, patients suffer high morbidity and mortality. Our aim was to compare the incidence of postoperative complications and 1-month and 6-month mortality of ascending aortic dissection surgical correction with paired matched controls undergoing elective aortic aneurysm correction and urgent coronary artery bypass graft surgery (CABG). Methods Ascending aortic dissection (AAD) and aneurysm correction surgeries were gathered from February 2005 to June 2008 in a private community ICU. Demographic data, comorbidities and cardiac ejection fraction were collected. Euroscore, Ontario and APACHE II scores were calculated to analyze patients' severity of illness. Surgical characteristics (elective or urgent indications) and perioperative data were also compared. AAD patients were compared against elective aortic aneurysm (ascending aorta control) and all CABG (standard cardiac procedure control) surgeries. In addition, aortic dissection and aneurysm surgeries were compared with paired matched CABG control patients, according to age (± 3 years), gender, elective/urgent procedure and surgical team. At first, simple comparisons were made between ascending aorta and CABG surgeries. The aortic dissection and aneurysm groups were then analyzed against each other; finally, AAD patients were compared with paired matched CABG controls for morbidity (postoperative complications and ICU and hospital lengths of stay) and 1-month and 6-month mortality. Results Twelve patients were operated on for AAD correction and 10 for ascending aortic aneurysm, while 246 patients were submitted to CABG surgery. Ascending aorta surgical patients were younger (mean ± SD, 60.8 ± 16.2 vs 66.1 ± 10.2 years, P = 0.03) when compared with CABG controls. APACHE II, Euroscore and Ontario scores were higher for aorta patients (P <0.01).
Rates of urgent procedures were similar in ascending aorta and CABG patients (54 vs 39%, P = NS). The incidence of postoperative complications was significantly higher in the aorta group (77 vs 36%, P <0.001), as was ICU length of stay (8.7 ± 16.1 vs 3.3 ± 4.5 days, P <0.001), but ICU mortality was similar (4.5 vs 3.2%, P = NS). When AAD patients were compared with the aneurysm group, the main differences were: more urgent procedures in ascending aortic dissection patients (91 vs 10%, P <0.001), longer duration of mechanical ventilation (45.1 ± 57.5 vs 7.3 ± 6.1, P = 0.05), and longer length of hospital stay (34.6 ± 35.8 vs 10.6 ± 4.7, P = 0.05). The incidence of postoperative complications and 1-month and 6-month mortality were similar in these groups. After matching paired CABG patients to the AAD group, significantly worse results were found for the AAD group: higher Euroscore (10.1 ± 3.3 vs 5.9 ± 4.1, P = 0.02) and Ontario (7.0 ± 1.2 vs 4.3 ± 3.0, P = 0.01) scores, higher incidence of postoperative complications (91 vs 45%, P = 0.03), and longer hospital length of stay (34.6 ± 35.8 vs 12.9 ± 8.5 days, P = 0.05). However, 1-month and 6-month mortality was very similar in both groups (8 and 16%, respectively; P = NS). Conclusions Although AAD surgical correction is associated with an increased incidence of postoperative complications and longer hospital length of stay, 1-month and 6-month mortality is very similar to that of elective aortic aneurysm repair and paired matched CABG controls.

P35
High risk for obstructive sleep apnea in patients with acute coronary syndrome: initial experience from a coronary unit the group submitted to CABG, two patients were submitted to thrombolytic therapy (TIMI score 3 and TIMI score 7) without mortality.
Conclusions Mortality was significantly higher in the group receiving clinical treatment (P = 0.0189). The comparison of mortality across TIMI score groups was not statistically significant: there was no correlation between TIMI score and mortality. The impact of this new approach was particularly significant when the renal outcome of the patients was evaluated. While some subsequent studies corroborated Van den Berghe and colleagues' results, others could not demonstrate any benefit of intensive insulin therapy on mortality or renal outcome. One of the difficulties in comparing the incidence of acute renal dysfunction is the lack of consensus about its definition. The RIFLE criteria, proposed in 2004 [2], aimed to standardize this definition.

Nephrology
Objective To compare the incidence and severity of acute kidney injury (AKI) in critically ill patients submitted to two different regimens of glycemic control, using the RIFLE criteria. Methods Analysis of 228 patients previously included in a prospective study and randomized to intensive insulin therapy (Group 1) or a carbohydrate-restrictive strategy (Group 2). The RIFLE criteria were established according to the creatinine values on the first and last days of the ICU stay, and the highest value obtained during this period. The renal outcome was evaluated by comparing the last RIFLE score obtained during the ICU stay with the RIFLE score at admission, and was classified as favorable, stable or unfavorable.
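The creatinine arm of the RIFLE classification used in these methods can be sketched as a small classifier. This follows the published creatinine thresholds (1.5x, 2x and 3x baseline, or creatinine ≥4 mg/dl with an acute rise ≥0.5 mg/dl for Failure) and deliberately omits the urine-output and GFR criteria:

```python
def rifle_stage(baseline_cr, current_cr):
    """Classify the creatinine component of the RIFLE criteria.

    Returns 'Failure', 'Injury', 'Risk' or None (no AKI by creatinine).
    Creatinine values in mg/dl.
    """
    ratio = current_cr / baseline_cr
    acute_rise = current_cr - baseline_cr
    if ratio >= 3.0 or (current_cr >= 4.0 and acute_rise >= 0.5):
        return "Failure"
    if ratio >= 2.0:
        return "Injury"
    if ratio >= 1.5:
        return "Risk"
    return None

print(rifle_stage(1.0, 1.6))  # a 1.6-fold rise from baseline classifies as Risk
```

The study's favorable/stable/unfavorable outcome then reduces to comparing the stage at admission with the last stage in the ICU stay.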

Results
The two groups were comparable regarding demographic data, APACHE III score and comorbidities. The median blood glucose level was 132.6 mg/dl in Group 1 and 142.0 mg/dl in Group 2 (P = 0.02). Hypoglycemia occurred in 20 (18.1%) patients in Group 1 and in five (4.2%) patients in Group 2 (P = 0.001). AKI developed in 52% of the patients and was associated with higher mortality (39.4%) compared with those who did not develop AKI (8.2%) (P <0.001). The renal function outcome was comparable between the two groups (P = 0.37) (Table 1). On the other hand, we observed a significant correlation between blood glucose levels and the incidence of AKI (P = 0.007) (Figure 1, acute kidney injury (AKI) according to glycemic levels). In the multivariate logistic regression analysis, only previous diabetes mellitus and age over 60 years were risk factors for AKI. Independent risk factors for mortality were hypoglycemia and APACHE III score >60. Conclusions Intensive insulin therapy does not reduce the incidence of acute kidney injury evaluated through the RIFLE criteria when compared with a carbohydrate-restrictive strategy. However, we observed that an increase in blood glucose levels beyond normal values is associated with an increased incidence of AKI. This, as well as the higher incidence of hypoglycemia, suggests that a carbohydrate-restrictive strategy is safer than and as efficient as intensive insulin therapy in preventing AKI in critically ill patients. The mean systolic pulmonary arterial pressure at the beginning of surgery was 37.25 mmHg (11 to 88 mmHg), and was 26.42 mmHg (10 to 44 mmHg) at the end of surgery. In the early postoperative period, the mean APACHE score was 17.43 (9 to 33) and the mean SAPS III was 30 (14 to 49). The mean ICU stay was 9.5 days (1 to 192 days), the mean time of intubation after surgery was 12.5 hours (1 to 432 hours), and 14% of patients needed reintubation.
The mean PaO2/FiO2 ratio was 203 in the immediate postoperative period and 266 in the first 24 postoperative hours. The mean systolic pulmonary arterial pressure was 26.1 mmHg (11 to 76 mmHg), and the mean pulmonary arterial wedge pressure was 9.8 mmHg (1 to 22 mmHg). The median duration of vasopressor use was 26 hours (1 to 336 hours). Thirty-four patients had ischemia/reperfusion injury (mild = 7.4%, moderate = 7.4% and severe = 15.7%) and 43.5% had acute rejection in the first 30 postoperative days (mild = 12%, moderate = 13% and severe = 10%). Forty-three patients had infectious complications in the early postoperative period; the respiratory system was the most compromised (40%). Only four patients had surgical complications that forced their return to the operating room (hemothorax, three patients; bilateral pneumothorax, one patient). The most common clinical complications were acute renal insufficiency (13%, of whom 8.3% needed dialysis), intestinal perforation (3.7%), delirium (2%), and stroke (0.9%). The mortality rate was 22% at 28 days and 25% at 90 days. The variables that correlated with mortality at 28 and 90 days were ischemia/reperfusion injury (P = 0.022 and P = 0.05, respectively), the PaO2/FiO2 ratio in the first 24 postoperative hours (P = 0.003 and P = 0.004, respectively), acute renal insufficiency requiring dialysis (P <0.001), need for reintubation (P = 0.03 and P = 0.012, respectively), and acute rejection in the first postoperative month (P = 0.029 and P = 0.012, respectively). Conclusions Lung transplantation is the treatment of choice for many end-stage lung diseases. However, a number of postoperative complications can affect the outcome. Improvement of early postoperative care to prevent complications is indispensable for a positive outcome. reduction of VT and hyperdistension.
We studied the long-term consequences of both strategies in a controlled experiment in which pigs were randomized to one of the two strategies for 48 hours. Lung injury was induced by saline lavage followed by 3 hours of injurious mechanical ventilation. In the OLA arm, PEEP was selected by electrical impedance tomography (EIT) after a recruitment maneuver (RM), trying to keep lung collapse to a minimum, while the ARDSnet group followed a PEEP x FiO2 table. We scrutinized lung function at the end of the 48-hour period of lung protection, after a standardized RM. The concept was to provide equivalent conditions for the lungs to perform, irrespective of lung history or treatment in previous days, in terms of gas exchange and alveolar stability during slow-deflation maneuvers. Before sacrifice and after a maximum RM, we collected blood gases and expiratory pressure versus ΔZ (impedance changes by EIT). Oxygenation and alveolar stability were equally impaired in both arms after injury. However, at the end of the 48 hours there was a significant improvement in oxygenation capacity in the OLA arm (mean = 494 mmHg; P = 0.043), but not in the ARDSnet arm (mean = 278 mmHg; P = 0.285). The capacity to maintain airspace volumes improved over the protective period in the OLA arm, but deteriorated in the ARDSnet arm (see Figure 1). Conclusions Even after full reversal of atelectasis, lung function seems to be severely impaired after 48 hours of the ARDSnet strategy, as compared with OLA. Objective There is some controversy in the literature on how patients with concomitant intra-abdominal hypertension (IAH) and acute lung injury (ALI) should be ventilated. It has been suggested that application of external positive end-expiratory pressure (PEEP) matching abdominal pressure could improve ventilatory performance during ALI and IAH [1].
Our purpose with the present study was to evaluate the cardiopulmonary effects of PEEP matching abdominal pressure (AP) in an experimental model of IAH and ALI.
Methods Eight anesthetized pigs were instrumented and then submitted to IAH of 20 mmHg for 30 minutes with a CO2 insufflator. Respiratory and hemodynamic parameters were measured during IAH and again after ALI was induced by lung lavage with saline (3 ml/kg) and Tween (2.5%). Pressure x volume curves of the respiratory system were obtained by a quasi-static low-flow method during IAH and ALI, and PEEP was then adjusted to 27 cmH2O for 30 minutes.
Results IAH decreased pulmonary and respiratory system static compliances and increased airway resistance, the alveolar-arterial oxygen gradient and respiratory dead space. Concomitant ALI exacerbated these findings. Thirty minutes of mechanical ventilation with PEEP identical to AP moderately improved oxygenation and respiratory mechanics. Even though cardiac output was maintained through an increased heart rate, this short course of increased PEEP caused an important decline in stroke index and right ventricular ejection fraction.
Conclusions Concomitant IAH and ALI produce important impairment of respiratory physiology. Equalizing PEEP to AP may improve respiratory performance, but at the cost of secondary hemodynamic derangement. Objective To study the prevalence and mortality of ARF and the characteristics of patients with ARF admitted to a general ICU. Methods Over 6 months, all adult patients admitted to a 17-bed ICU in Salvador, Bahia, Brazil who had ARF, defined as receiving mechanical ventilation for more than 24 hours, were prospectively studied.
Results From March to August 2008, there were 411 admissions to the ICU. Of these, 82 (20%) patients received mechanical ventilation for more than 24 hours, with a median duration of 4 days (range 1 to 69 days). The characteristics of patients with ARF are presented in Table 1. These patients were older than the total population admitted to the ICU (68 vs 65 years, P <0.05), had a higher APACHE II score (19 vs 13, P <0.0001), a longer ICU stay (10.5 vs 3.0 days, P <0.001) and higher mortality (51.2 vs 17.5%, P <0.0001). Mortality of ARF patients was associated with old age, with an odds ratio for death of 3.0 for patients older than 70 years (95% CI = 1.2 to 7.5), and with the development of complications in the ICU (OR = 8.4, 95% CI = 3.1 to 22.8). Conclusions ARF prevalence was 20% in the studied population. The main cause of ARF was sepsis; these patients were older and more severely ill, had a longer ICU stay and higher mortality than the total population admitted to the ICU, and were often complicated by multiple organ failure. Results Five hundred and four patients with a mean age of 53.9 ± 19.9 years and a female majority (56.3%) were included. APACHE II score (25.0 ± 7.9 vs 17.9 ± 8.1, P <0.001), length of stay (9 (5 to 14) vs 3 (1 to 5) days, P <0.001) and mortality rate (52.0% vs 22.1%, P <0.001) were higher in patients who developed ARF during the ICU stay than in those without ARF. The risk factors most frequently associated with ARF were pulmonary embolism, pneumonia and hypovolemic shock (Table 1).
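The odds ratios quoted above come from 2x2 tables that the abstracts do not report. A minimal sketch of the calculation with Woolf's logit-based 95% CI, using hypothetical cell counts chosen only to illustrate the arithmetic:

```python
import math

def odds_ratio(a, b, c, d):
    """Odds ratio for a 2x2 table with Woolf's 95% CI.

    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases.
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Hypothetical counts: 20/30 deaths among patients >70 years vs 10/25 among younger patients.
print(odds_ratio(20, 10, 10, 15))
```

With these invented counts the point estimate is 3.0; the width of the interval depends entirely on the cell sizes, which is why the reported CIs (1.2 to 7.5 and 3.1 to 22.8) cannot be reproduced without the raw tables.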
Conclusions Pulmonary embolism, pneumonia and hypovolemic shock were the risk factors most associated with ARF, and patients who developed ARF during the ICU stay had high mortality.  Conclusions ARM was effective in improving the PaO2/FiO2 ratio in all three groups. The three ARMs were similar in improving Cst,rs and PaCO2. In our study, ARM showed better results in terms of the PaO2/FiO2 ratio when performed initially with progressive levels of PC and later with progressive levels of PEEP, in comparison with the other two approaches. Objective To observe the effects of the expiratory trigger (ET) setting on the following respiratory parameters in non-chronic obstructive pulmonary disease (non-COPD) patients: respiratory rate (RR), frequency to tidal volume ratio (f/Vt), tidal volume (Vt), minute volume (Ve), oxygen saturation (SpO2), and duty cycle (Ti/Ttot).

Materials and methods
Twenty-four stable patients were evaluated. All patients were ventilated in pressure support ventilation, with pressure support between 8 and 12 cmH2O (to obtain a Vt between 7 and 8 ml/kg), positive end-expiratory pressure between 5 and 7 cmH2O, and fraction of inspired oxygen (FiO2) ≤40% (to obtain SpO2 ≥96%). The expiratory trigger was set at 1%, 25%, 50% and 70%, for a 5-minute period each. The RR, f/Vt ratio, Vt, Ve, SpO2, and Ti/Ttot were measured at each percentage of ET. Analysis of variance for repeated measures was used to analyze variations across the four ET values and to verify variations out of the comfort zone, defined as: RR >30 bpm, f/Vt >100 breaths/min/l, Vt <300 ml, Ve >10 l, SpO2 <90%, Ti/Ttot >0.45. The Bonferroni test was used to identify which values differed significantly among the multiple comparisons. A probability of less than 0.05 was considered significant. The ventilator used was the Bennett 840. Results All respiratory parameters presented significant variations when comparisons were made between 1% and 70% ET (P = 0.0001; P = 0.0003 for Ti/Ttot). The Vt, RR and f/Vt ratio showed a significant increase in the percentage of patients with these parameters out of the comfort zone (P = 0.0025, P = 0.0002 and P = 0.007, respectively). No respiratory parameter presented significant variations when comparisons were made between 1% and 25% ET. Conclusions In non-COPD patients, the use of an ET of 1% or 25% has no effect on the respiratory parameters. Increasing the ET to 50% or more can worsen respiratory parameters and lead to rapid shallow breathing, suggesting that these values should be avoided in non-COPD patients. Introduction Pulmonary embolism (PE) is a clinical syndrome resulting from occlusion of the pulmonary arterial circulation by one or more emboli. In the United States, its incidence is estimated at 1/1,000 hospital admissions per year [1].
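Because the comfort zone in this study is a set of simple thresholds, it can be expressed as a small check. A sketch, with the f/Vt ratio (rapid shallow breathing index) derived from RR and Vt; the threshold values are taken directly from the abstract:

```python
def out_of_comfort_zone(rr, vt_ml, ve_l, spo2, ti_ttot):
    """Return the set of parameters outside the study's comfort zone.

    Thresholds: RR >30 bpm, f/Vt >100 breaths/min/l, Vt <300 ml,
    Ve >10 l, SpO2 <90%, Ti/Ttot >0.45.
    """
    f_vt = rr / (vt_ml / 1000.0)  # rapid shallow breathing index, breaths/min/l
    flags = {
        "RR": rr > 30,
        "f/Vt": f_vt > 100,
        "Vt": vt_ml < 300,
        "Ve": ve_l > 10,
        "SpO2": spo2 < 90,
        "Ti/Ttot": ti_ttot > 0.45,
    }
    return {name for name, out in flags.items() if out}

print(out_of_comfort_zone(20, 450, 8, 97, 0.40))  # prints set() -- all within the zone
```

A patient breathing at 35/min with a Vt of 250 ml, for example, would be flagged on RR, Vt and f/Vt (35 / 0.25 l = 140 breaths/min/l), the rapid shallow breathing pattern the conclusions warn about at high ET settings.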
Objective To describe the clinical and epidemiological data of patients admitted to an ICU due to PE.  = 14), and the most prevalent complication was respiratory failure, with 10 cases (13.69%), followed by pneumonia (5.47%), hemorrhage (4.1%), and pulmonary edema and sepsis, each with two patients (2.73%). Low-molecular-weight heparin was used in 35 patients (47.94%), unfractionated heparin in 28 patients (38.35%), thrombolytics in eight patients (10.95%) and an inferior vena cava filter in two patients (2.73%). Hospital mortality was 8.21%. The mean ICU length of stay was 5.45 ± 11.61 days (0 to 83). Patients who had dyspnea on admission (84.93%) and those who needed orotracheal intubation had a significantly higher mortality rate (P = 0.0405 and P = 0.0305, respectively). Conclusions The most prevalent risk factors for PE were recent surgery, previous PE or DVT, restricted mobility, venous insufficiency and obesity [1,2]. Dyspnea and chest pain were the most prevalent clinical manifestations [1,2]. The D-dimer test was performed in a small percentage of patients. Computed tomography was performed in 94.52% of patients, as it is an important diagnostic test given its high sensitivity and specificity [1]. The presence of dyspnea and the need for orotracheal intubation were risk factors for mortality.

Introduction Ventilator-associated pneumonia (VAP) develops after 48 hours of orotracheal intubation (OTI) and mechanical ventilation (MV). It develops because of an imbalance between the patient's defense mechanisms and the microbial agent. The patient with an OTI tube loses the natural barrier, eliminating the cough reflex and promoting the accumulation of secretions above the cuff, which can facilitate colonization and the aspiration of contaminated secretions. The incidence of VAP is high, varying between 6% and 52% depending on the population studied, the type of ICU and the diagnostic technique used; therefore, despite being an extremely important infection, it is one of the most difficult diagnoses in critically ill patients. Compared with other nosocomial infections, such as those of the urinary tract and skin, where mortality is between 1% and 4%, VAP is an important mortality predictor, since its mortality varies between 24% and 50%, and can exceed 70% when caused by multiresistant microorganisms. Critically ill patients with diagnoses such as cerebral trauma and cerebrovascular accident are at particularly high risk for VAP, with an estimated incidence of 40 to 50%. Basic education programs have shown that the occurrence of VAP can be reduced by 50% or more using several interventions to prevent colonization and the aspiration of secretions and gastric content. The increasing frequency of resistant microorganisms represents a serious health problem, and the ICU is a major source of resistant microorganisms. Therefore, prevention should be part of the strategies for managing VAP. Mortality from this condition can be reduced by identification of risk factors and by prevention.

P50
Serum glucose variability and brain-serum glucose ratio predict metabolic distress and mortality after severe brain injury Objective Cerebral glucose metabolism and energy production are affected by serum glucose levels. The objective of the present study was to assess whether serum glucose variability and the ratio between cerebral and serum glucose are associated with cerebral metabolic distress and outcome after severe brain injury. Methods We studied 46 consecutive patients who underwent multimodality monitoring with intracranial pressure, PbtO2 and microdialysis in the neurological ICU of a university hospital. The relationship between the brain-serum glucose ratio and cerebral metabolic distress, as measured by a microdialysis lactate/pyruvate ratio (LPR) ≥40, was analyzed for every hour of measurement. The relationship between daily serum glucose variability, as measured by the standard deviation (SD) and the mean amplitude of glycemic excursions (MAGE), and metabolic distress was analyzed for every day of monitoring. Mortality was analyzed at hospital discharge. All analyses used general linear models with a logistic function for dichotomized outcomes, utilizing generalized estimating equations to account for within-subject and between-subject effects.

Results
The median age was 55 years (IQR 42 to 64), 27 (59%) patients were female and the median admission Glasgow Coma Scale (GCS) score was 7 (IQR 5 to 9). Diagnoses included subarachnoid hemorrhage (61%), intracerebral hemorrhage (22%), traumatic brain injury (13%) and cardiac arrest (4%); 28% of patients had died at discharge. A total of 5,264 neuromonitoring hours and 300 days were analyzed. In a multivariable model, brain/serum glucose ratios below the median (12%) were independently associated with an increased risk of metabolic distress (adjusted OR = 6.1 (4.5 to 8.2), P <0.0001). In a similar multivariable model analyzing daily averaged data, increased serum glucose variability was also independently associated with a higher risk of cerebral metabolic distress (adjusted OR = 1.8 (1.3 to 2.5), P <0.0001 for SD; and adjusted OR = 1.2 (1.02 to 1.4), P = 0.03 for MAGE). Both analyses were adjusted for significant covariates such as GCS, cerebral perfusion pressure and serum glucose levels. An averaged brain/serum glucose ratio below the median and increased serum glucose variability were independently associated with mortality at hospital discharge after adjusting for age and APACHE II score (adjusted OR = 6.9 (1.6 to 36.7), P = 0.01; and adjusted OR = 7.2 (1.3 to 41.7), P = 0.03, respectively). Conclusions The ratio between brain and serum glucose levels, as well as serum glucose variability, is associated with cerebral metabolic distress and increased mortality at hospital discharge in patients with severe brain injury. Available online http://ccforum.com/supplements/13/S3
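The two variability metrics in this abstract are the daily SD and MAGE. A sketch of both, assuming a simplified MAGE that averages the excursions between successive turning points whose amplitude exceeds one SD; the original Service et al. algorithm has additional rules, so this is illustrative only:

```python
import statistics

def glucose_variability(readings):
    """Return (SD, simplified MAGE) for one day's serum glucose readings (mg/dl)."""
    sd = statistics.stdev(readings)
    # turning points (local extrema), keeping the endpoints
    extrema = [readings[0]]
    for prev, cur, nxt in zip(readings, readings[1:], readings[2:]):
        if (cur - prev) * (nxt - cur) < 0:
            extrema.append(cur)
    extrema.append(readings[-1])
    # average only those excursions larger than one SD
    excursions = [abs(b - a) for a, b in zip(extrema, extrema[1:]) if abs(b - a) > sd]
    mage = sum(excursions) / len(excursions) if excursions else 0.0
    return sd, mage

sd, mage = glucose_variability([100, 180, 90, 160, 110, 200, 95])
print(f"SD = {sd:.1f} mg/dl, MAGE = {mage:.1f} mg/dl")
```

A patient oscillating between roughly 90 and 200 mg/dl, as in this made-up series, scores high on both metrics even if the daily mean looks acceptable, which is the point of using variability rather than mean glucose as the exposure.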

Figure 1 (abstract P52)
Introduction In order to implement a tight glycemic control protocol in the ICU it is essential to obtain active nurse involvement. Our objective was to evaluate nurses' perception of three different blood glucose control protocols for critically ill patients. Methods As part of a randomized controlled trial (RCT) comparing three blood glucose control protocols in ICU patients, we issued a questionnaire to all nurses who participated in the study to evaluate their perception of protocol efficacy, benefits, safety, risks and feasibility, with a final question asking which protocol they would like their ICU to adopt. The RCT arms were: computer-assisted insulin protocol (CAIP), with continuous insulin infusion to maintain blood glucose between 100 and 130 mg/dl; the Leuven protocol, with insulin infusion to maintain blood glucose between 80 and 110 mg/dl; and conventional treatment, with subcutaneous insulin if glucose >150 mg/dl. Results Sixty nurses answered the questionnaires. CAIP was considered the most efficient by 57% of the nurses. About 58% of the nurses rated its performance as good or very good, compared with 22% for the Leuven protocol (P <0.001) and 40% for conventional treatment (P = 0.08). CAIP was considered easier to use than the Leuven protocol (P <0.001) and as easy as conventional treatment (P = 0.78). Fifty-six percent of the nurses chose CAIP as the protocol they would like their institution to adopt.
Conclusions CAIP was more efficacious, safer and easier to use than the Leuven protocol. Compared with conventional treatment, the feasibility and safety of CAIP were considered similar. Most nurses chose CAIP as the protocol they would like to be adopted in their ICU.

Introduction
In recent years several interventions to increase the quality of care and patient safety in the ICU have been proposed. Daily checklists have been introduced and are feasible and straightforward instruments to measure adherence to quality of care. Critically ill cancer patients are especially susceptible to ICU-acquired complications such as thrombosis, infection and delirium, among several other morbid conditions. We developed and implemented a checklist to evaluate process-of-care measures in this specific group of patients. Methods We prospectively performed a daily checklist for consecutive patients admitted to the ICU between 1 January and 15 February 2009. The checklist included: feeding, analgesia, DVT prophylaxis, elevated head of the bed (HOB), stress ulcer (SU) prophylaxis and delirium. Data were inserted into an electronic database and automatic real-time reports were generated. Results One hundred and four patients were evaluated: 52 (50%) scheduled surgery, 43 (41.3%) medical patients and 9 (8.6%) emergency surgery. The SAPS3 score was 49.3 ± 23.2. Global ICU mortality was 22.1%. The main reasons for ICU admission were postoperative monitoring (n = 52; 50%) and sepsis (n = 28; 26.9%). The frequency of adherence to the measured processes of care was: nutrition, 85.1%; analgesia, 92.7%; DVT prophylaxis, 72.3%; HOB, 55.3%; and SU prophylaxis, 78.2%; delirium was detected in 12.8%.

Introduction Increasing interest has been focused on survival beyond the ICU stay. Functional capacity, as described by simple everyday activities, can be severely compromised after an ICU and hospital stay. Our aim was to identify patients' risk factors for reduced functional capacity at hospital discharge using data from the first 2 days after ICU admission.
Activity                  01    02    03    04    05    06
Feeding                   10    10    10    10    10    05
Cleanliness               05    05    05    05    05    00
Evacuation                10    10    10    10    10    10
Diuresis                  10    10    05    10    10    10
Dressing                  10    05    10    10    10    00
Transference chair/bed    15    15    15    10    15    00
Toilet                    10    10    10    10    10    00
Mobility                  15    15    15    15    15    00
Stairs                    10    10    10    10    10    00
Bath                      05    05    05    05    15    00
Total                    100    95    95    95   100    25
0 is the worst value and 100 the best result.

[...] from 31 to 40 years (6.6%), from 41 to 50 years (10.3%), from 51 to 60 years (14.47%), and from 61 to 70 years (17.01%). The mean APACHE II score was 10 (± 7.46) and the mean SAPS II score was 30.87 (± 13.31). In this period, 178 (20.60%) patients underwent mechanical ventilation. We observed three (1.68%) cases of accidental extubation. None of the cases required reintubation. Noninvasive mechanical ventilation was necessary in two patients; only in one case was the patient in the weaning process. In two cases the failure occurred during exchange of the orotracheal tube fixation, and in the other the patient could not be adequately restrained, owing to psychomotor agitation during nursing hygiene care. Only one case occurred during the night shift, when the ICU was at 100% occupancy and the number of nursing assistants was reduced because of health-related leave. There were no cases of death. Conclusions The monitoring conducted by the physiotherapy team and the existence of routines for exchanging the endotracheal tube fixation favor a low incidence of accidental extubation. Although there were no cases of death, the search for preventive mechanisms confirms the quality of care.

Conclusions The external jugular venous access is a good alternative to internal jugular venous catheterization and is associated with minor and infrequent complications. Our results show a lower, but not significantly lower, probability of success with the EJA.
However, considering that the procedure was performed by physicians not familiar with the technique, we did not find definite evidence to indicate that IJA is superior to EJA.

According to the Briceno score, 75% of hepatic grafts were retrieved from extended criteria donors. Donors were 40 years old or more in 54% of the cases (mean age of 41 years). The 1-year actuarial survival rate of patients transplanted for FHF was 57.1% and the 3-year actuarial survival rate was 54.1%. Forty-two percent of the patients died early after the transplant. The main cause of early mortality was sepsis/multisystem organ failure, and late mortality was related to immunosuppressive therapy noncompliance. The mean patient follow-up was 16 months (range, 0 to 57). Another 36 patients with FHF referred to us died while awaiting LT. Only one patient survived without LT. In conclusion, despite the high mortality rate, urgent LT is still the best therapy for patients with FHF.