
P001 Sepsis impairs the capillary response within hypoxic capillaries and decreases erythrocyte oxygen-dependent ATP efflux
Critical Care 2016, 20(Suppl 2):P001

Introduction: A hallmark of sepsis is early onset microvascular dysfunction. However, the mechanism responsible for maldistribution of capillary blood flow is not understood. Evidence suggests red blood cells (RBC) can sense local oxygen (O2) conditions and signal the vasculature, via adenosine triphosphate (ATP), to increase capillary flow. We hypothesized that sepsis impaired microvascular autoregulation over the entire capillary network, within a capillary and within the RBC. Study objectives were to: 1) measure capillary response time within hypoxic capillaries (capillaries with RBC hemoglobin oxygen saturation (SO2) < 20 %), 2) test the null hypothesis that sepsis had no effect on RBC O2-dependent ATP efflux, and 3) develop a pathophysiological model.
Methods: Hypotensive sepsis was studied in male Sprague-Dawley rats using cecal ligation and perforation, with a 6-hour end point. Rat hindlimb skeletal muscle microcirculation was imaged using a dual-wavelength spectrophotometric intravital microscopy system, and capillary RBC supply rate (SR = RBC/s), RBC SO2 and oxygen supply rate (qO2 = pL O2/s) were quantified. Arterial NOx (nitrite + nitrate) and RBC O2-dependent ATP efflux were measured using a nitric oxide (NO) analyzer and gas exchanger, respectively.
Results: Compared to control, sepsis increased capillary stopped flow and plasma lactate (p < 0.05). Increased plasma NOx (p < 0.001) was related to increased capillary RBC SR (p = 0.027). Analysis of 30-second capillary SR-SO2-qO2 profiles revealed a shift towards decreased (p < 0.05) oxygen supply rates in some capillaries. Moreover, capillary response time within hypoxic capillaries (time to restore capillary RBC SO2 > 20 %) increased 3-4-fold (p < 0.05). Consistent with impaired microvascular autoregulation, the RBC response to a hypoxic environment, measured as RBC O2-dependent ATP efflux, decreased by 62.5 % (p < 0.001).
Conclusions: Sepsis impaired microvascular autoregulation at the capillary and erythrocyte level. Impaired autoregulation was manifested by increased capillary stopped flow, increased capillary response time within hypoxic capillaries, decreased capillary oxygen supply rate and decreased RBC O2-dependent ATP efflux. This loss of local microvascular control was partially offset by increased capillary RBC supply rate, which correlated with increased plasma NOx.

P002 Lower serum immunoglobulin G2 level does not predispose to severe flu

Introduction: IgG2 deficiency has been suggested to predispose to severe H1N1 infection (1). However, contradictory findings have been reported (2) and, therefore, replication of these findings in different cohorts needs to be performed. The purpose of this study was to assess immunoglobulin and IgG subclass levels in a cohort of patients with influenza.
Methods: We studied 137 Spanish patients who developed flu (80.3 % by H1N1pdm virus). Diagnosis was confirmed by detection of influenza virus in nasopharyngeal swabs using real-time polymerase chain reaction. Immunoglobulins and IgG subclasses (Binding Site Human IgG Subclass kits) were measured in both the acute and convalescent phase, and their correlation with severity was examined. We analyzed genetic variants at IGHG2 in order to evaluate their role in IgG2 levels. Clinical and immunological characteristics were compared using the Chi-squared test or Fisher's exact test when needed.
Results: Ninety-three patients were hospitalized and 49 required admission to an intensive care unit. Sixty-four patients developed viral pneumonia and 26 acute respiratory distress syndrome. No differences in the serum levels of IgG, IgA, IgM or IgG subclasses were observed between the subgroups defined by severity of disease. The G2m n-/n- genotype was associated with lower serum IgG2 levels in the convalescent phase. Patients homozygous for the G2m(n-) allele had significantly lower serum IgG2 levels (256.7 ± 121.3 mg/dL, N = 16) than individuals carrying the G2m(n+) allele (n+/n+ and n+/n- genotypes, 334.5 ± 120.4 mg/dL, N = 31) (p = 0.042). However, no association between G2m genotypes and severity of flu was observed.
Conclusions: In our study we were not able to replicate previously reported observations. IgG2 deficiency does not appear to be a significant risk factor for severity of flu.
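As an aside on the statistics used in the abstract above: the chi-squared or Fisher's exact comparison of clinical characteristics, and the comparison of mean IgG2 levels between genotype groups, are standard two-group tests. The following Python sketch is illustrative only, not the authors' code; the contingency counts are placeholders, the IgG2 samples are synthetic draws matching the reported means and SDs, and the use of Welch's t-test is an assumption.

```python
# Illustrative sketch (not the authors' code): chi-squared / Fisher's exact
# for a categorical comparison, and Welch's t-test for IgG2 levels by
# genotype. Counts and samples below are synthetic, not study data.
import numpy as np
from scipy import stats

# Hypothetical 2x2 table: rows = outcome yes/no, cols = genotype group
table = np.array([[12, 4],
                  [19, 27]])
chi2, p_chi2, dof, expected = stats.chi2_contingency(table)
# Fisher's exact test is preferred when expected cell counts are small
if (expected < 5).any():
    odds_ratio, p_value = stats.fisher_exact(table)
else:
    p_value = p_chi2

# Serum IgG2 (mg/dL) by genotype, synthetic draws, Welch's t-test
igg2_nn = np.random.default_rng(0).normal(256.7, 121.3, 16)  # n-/n- homozygotes
igg2_np = np.random.default_rng(1).normal(334.5, 120.4, 31)  # n+ carriers
t_stat, p_ttest = stats.ttest_ind(igg2_nn, igg2_np, equal_var=False)
print(f"contingency p = {p_value:.3f}, IgG2 t-test p = {p_ttest:.3f}")
```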
Introduction: Neutrophils play an important role as the first line of innate immune defense. Activated neutrophils become segmented and their cytoplasm, including granules, grows larger. However, it is sometimes difficult to differentiate whether segmented neutrophils are in the process of activation or not, and hence to decide whether to stop antibiotics. There are reports that hyperspectral imaging (HSI) can be used for analyses such as diagnosis of malaria-infected cells and identification of antibody-producing cells. In this study, we evaluated serial changes of the HSI spectrum in patients with infection, and examined whether we could distinguish activated neutrophils.
Methods: Sputum and urine samples were collected from three clinically ill patients with pulmonary infection (n = 2) and urinary tract infection (UTI; n = 1). These samples were smeared on glass slides and Gram stained. We observed the slides with HSI, acquiring hyperspectral data 10 times from the cytoplasm of each neutrophil, and analyzed the data with the MultiSpec® software system.
Results: We found that neutrophils have a different spectrum at each infectious stage, especially in the 400 to 700 nm wavelength range. A representative image from a patient with UTI is shown in Fig. 1. When the neutrophils became segmented and phagocytic, the cytoplasm enlarged and showed strong intensity (black lines) compared to when the microorganism had disappeared and the size of the cytoplasm had decreased (blue and red lines).
Conclusions: We found serial changes in the spectrum of neutrophils across infectious stages. HSI may be useful for deciding when to start or stop antibiotics. Further study is required to establish a cut-off intensity level for activated neutrophils.

Introduction: Recent studies have demonstrated that endotoxin activity levels, analyzed using the Endotoxin Activity Assay (EAA), correlate with the severity of sepsis in patients admitted to the ICU. On the other hand, several reports dispute the clinical utility of the EAA. We focused on the chemiluminescent intensity (CI) in response to lipopolysaccharide (LPS) by EAA and evaluated its predictive value for the incidence of postoperative infectious complications following elective gastroenterological surgery.
Methods: Forty-eight patients who underwent elective surgery for gastrointestinal cancer were enrolled in this study.
Blood samples were taken on the previous day (before surgery), and on the first (POD1) and third (POD3) postoperative days. All blood samples were analyzed using the EAA system. We focused on the maximal CI, obtained under maximal stimulation with LPS, and the minimum CI, obtained with stimulation by opsonized zymosan alone.
Results: Postoperative infectious complications occurred in 23 of the 48 patients. There were significant differences in EA levels on POD3 between patients who developed postoperative infectious complications and patients who did not. The minimum CI was significantly higher at all time points in the patients who developed postoperative infectious complications than in those who did not. In addition, the ratio of maximal CI to minimum CI, which reflects the neutrophil response to maximal LPS stimulation, was significantly lower before surgery and on POD1 in patients who developed postoperative infectious complications. Although a significant positive correlation was observed between neutrophil counts and maximal CI or minimum CI, no correlation was observed between neutrophil counts and the ratio of maximal CI to minimum CI.
Conclusions: Chemiluminescent intensities, especially the ratio of maximal CI to minimum CI, could be an early predictive marker for the development of postoperative infectious complications.

Introduction: C1 inhibitor (C1INH), belonging to the superfamily of serine protease inhibitors, regulates not only the complement system, but also the plasma kallikrein-kinin, fibrinolytic and coagulation systems. The biologic activities of C1INH can be divided into the regulation of vascular permeability and anti-inflammatory functions. The objective of this study was to clarify the serial change in C1INH in patients with sepsis and evaluate the impact of C1INH on their clinical course.
Methods: This was a single-center prospective observational study. We serially examined C1INH activity values (normal range 70-130 %) in patients with sepsis admitted to the intensive care unit of the Trauma and Acute Critical Care Center at Osaka University Hospital (Osaka, Japan) between January 2014 and August 2015. We defined refractory shock as septic shock unresponsive to conventional therapy, such as adequate fluid resuscitation and vasopressor therapy, to maintain hemodynamics.
Results: The serial change of C1INH was evaluated in 40 patients with sepsis (30 male and 10 female; 30 survivors and 10 non-survivors; mean age 70 ± 13.5 years). We divided patients into three groups: (i) non-shock group (n = 14), (ii) non-refractory shock group (n = 13), and (iii) refractory shock group (n = 13; survivors, n = 3; non-survivors, n = 10). In the non-shock group, C1INH was 107.3 ± 26.5 % at admission and 104.2 ± 22.3 % on day 1, and increased after day 1 (128.1 ± 26.4 % on day 3, 138.3 ± 21.2 % on day 7, 140.3 ± 12.5 % on day 14) (p = 0.0040). In the non-refractory shock group, C1INH was 113.9 ± 19.2 % at admission and increased after admission (120.2 ± 23.0 % on day 1, 135.7 ± 19.9 % on day 3, 138.8 ± 17.2 % on day 7, 137.7 ± 10.7 % on day 14) (p = 0.0029). In the refractory shock group, C1INH was 96.7 ± 15.9 % at admission and 88.9 ± 22.3 % on day 1, and increased after day 1 (119.8 ± 39.6 % on day 3, 144.4 ± 21.1 % on day 7, 140.5 ± 24.5 % on day 14) (p < 0.0001). The difference between these three groups was statistically significant (p = 0.0039). C1INH in non-survivors did not increase significantly during their clinical course (p = 0.0773).
Conclusions: In refractory shock patients with sepsis, C1INH values were low (especially in non-survivors) at admission and on day 1. Evaluation of C1INH replacement therapy in patients with septic shock may lead to a new strategy for the management of sepsis.

Introduction: There is a lack of studies comparing the utility of C-reactive protein (CRP) with procalcitonin (PCT) for the management of patients with acute respiratory tract infections (ARI) in primary care. Our aim was first to study the correlation between these markers, and second to compare their predictive accuracy with regard to clinical outcome.
Methods: This is a secondary analysis using clinical and biomarker data of 458 primary care patients with pneumonic and non-pneumonic ARI. We used correlation statistics (Spearman's rank test) and multivariable regression models to assess the association of the markers with adverse outcome, namely days with restricted activities and ongoing discomfort at day 14.
Results: At baseline, CRP and PCT did not correlate well in the overall population (r2 = 0.16 and r2 = 0.04), and particularly poorly in the subgroup of patients with non-pneumonic ARI. Low correlations were also found comparing cut-off ranges, day-seven levels and biomarker changes from baseline to day seven. High admission levels of CRP (>100 mg/dL, regression coefficient 1.7, 95 % CI 0.6 to 2.8) as well as PCT (>0.5 ug/L, regression coefficient 2.3, 95 % CI 0.3 to 4.3) were significantly associated with days with restricted activities. Neither marker was associated with ongoing discomfort at day 14.
Conclusions: CRP and PCT levels do not correlate well, and both have moderate prognostic accuracy for clinical outcomes in primary care patients with ARI. The low correlation between the two markers calls for interventional research comparing them head to head with regard to their ability to guide antibiotic decisions.

Introduction: Procalcitonin (PCT) has been proposed as a helpful tool to guide treatment with antibiotics and to improve antibiotic stewardship [1, 2, 3]. However, the cut-off for the PCT level has not been agreed, as different studies have suggested different thresholds to predict the need for antibiotics; these cut-off points ranged from 0.25 ng/mL to 1 ng/mL. In our hospital the reference range is set at 0.5 ng/mL. This study aimed at identifying the best PCT level to rule out sepsis.
Methods: We retrospectively reviewed 53 intensive care unit patients who had serial PCT tests as part of their daily blood tests. Three serial PCT readings were collected, in addition to microbiology culture results and whether the patient met the definition criteria for sepsis as set out by the 2001 International Sepsis Definitions Conference.
Results: Of the 53 patients, 26 (49.05 %) had negative microbiological cultures with no evidence of sepsis. A PCT level of 0.13 ng/mL had 100 % sensitivity, as no patient below this cut-off had evidence of sepsis or a positive microbiology culture. The area under the ROC curve was 0.702 with a 95 % confidence interval of 0.56-0.84 and a p value of 0.012 (Fig. 3).
Conclusions: Using a lower PCT cut-off would further enhance the ability of PCT to rule out sepsis in the critically ill patient.
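The correlation and ROC analyses in the two abstracts above can be reproduced in outline with standard tools. Below is a minimal illustrative sketch in Python with synthetic data, not the study datasets: Spearman's rank correlation for the CRP-PCT comparison, then the ROC AUC and the sensitivity/specificity at a candidate cut-off such as 0.13 ng/mL.

```python
# Illustrative sketch, synthetic data only: Spearman correlation between two
# biomarkers, then ROC analysis of PCT against a sepsis label, with the
# operating point at a chosen cut-off.
import numpy as np
from scipy.stats import spearmanr
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
n_sepsis, n_no = 27, 26
pct = np.concatenate([rng.lognormal(0.5, 1.0, n_sepsis),
                      rng.lognormal(-1.5, 1.0, n_no)])
crp = pct * rng.lognormal(0.0, 1.5, n_sepsis + n_no)   # weakly related marker
sepsis = np.concatenate([np.ones(n_sepsis), np.zeros(n_no)])

rho, p_rho = spearmanr(crp, pct)            # rank correlation of the two markers
auc = roc_auc_score(sepsis, pct)            # discrimination of PCT for sepsis

cutoff = 0.13                               # ng/mL, as in the abstract above
sensitivity = (pct[sepsis == 1] >= cutoff).mean()
specificity = (pct[sepsis == 0] < cutoff).mean()
print(f"rho = {rho:.2f} (p = {p_rho:.3f}), AUC = {auc:.2f}, "
      f"sens = {sensitivity:.2f}, spec = {specificity:.2f}")
```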
Introduction: Central nervous system (CNS) infections with extensively drug-resistant (XDR) bacteria after neurosurgical procedures are a life-threatening complication in intensive care unit (ICU) patients, requiring early identification and immediate action. An increased white blood cell count (WBC), high temperature and circulating acute phase proteins are common in these patients, making timely diagnosis challenging. We sought to investigate the reliability of serum C-reactive protein (CRP) and procalcitonin (PCT) for early identification and monitoring of CNS infections.
Methods: From January 2013 to November 2015 all cases with CNS infections were recorded. Inclusion criteria were the presence of fever > 38.5 °C and compatible lumbar puncture findings (increased number of polymorphonuclear leukocytes, increased protein and low glucose compared to serum levels). WBC, CRP and PCT levels before the infection (when the inclusion criteria were met) and after it (at the last intrathecal administration of colistin 300,000 IU, amikacin 25 mg and vancomycin 25 mg) were compared.
Results: Twelve patients (mean age 45.7 ± 18.1) were studied. WBC remained high before and after infection. Conversely, CRP values were statistically different before and after the infection, whereas PCT values were not (Table 1).
Conclusions: Based on our findings, CRP carries a better predictive value than PCT as a biomarker for early diagnosis and monitoring of CNS infections in the ICU. The small number of patients is a limitation of these results.

Introduction: Adrenomedullin (ADM) is a 52-amino-acid peptide with strong vasodilator activity. Elevated plasma ADM levels have been detected in a wide variety of physiological and pathological conditions, with the highest elevations observed in septic shock. Marino et al. demonstrated a strong association of admission ADM levels with the severity of sepsis, supporting an earlier report describing elevated ADM levels in those with severe sepsis and those with septic shock [1]. Using a novel assay specific for bioactive ADM (bio-ADM), as in Marino et al., the aim of this study was to assess the relation between ADM measured at admission and in-ICU mortality in consecutive patients regardless of the cause of admission.
Methods: The French and euRopean Outcome reGistry in Intensive Care Units (FROG-ICU) study was a multicenter observational study including 2087 consecutive patients, followed up to one year for those who survived the ICU stay. The protocol has previously been described [2]. Plasma was collected at admission for all patients and at discharge for in-ICU survivors. ADM was measured in all plasma samples using a sandwich assay specific for bio-ADM. The association between in-ICU mortality and the level of ADM was assessed by univariate analysis and by analysis adjusted for severity at admission measured by the SAPS-II. Improvement in area under the ROC curve and reclassification indices were assessed.
Results: 2087 patients were included, 65 % male, with a median age of 63 (51-74) and a median Charlson score of 3 (1-5).

Impact of disease severity assessment on performance of heparin-binding protein for the prediction of septic shock
R. Arnold, M. Capan, A. Linder, P. Akesson

Introduction: Prognostic biomarkers for sepsis are described irrespective of patient severity. Our objective was to describe the ability of heparin-binding protein (HBP) to predict the development of septic shock relative to the PIRO score.
Methods: Design: a secondary analysis of a multi-centered observational study. Inclusion: 1) adults > 17 years with acute infection, 2) hospital admission, and 3) two measurements of HBP. Exclusion: hypotension within the first 12 hours. Initial PIRO scores were calculated on arrival.
Outcome: the development of delayed hypotension (dShock), defined as sBP < 90 mmHg after initial assessment. Analysis: subjects were grouped according to PIRO score range using previously defined cut-offs, and outcomes for each group were summarized. Average values and change in HBP were described across each group.
Results: 367 of the original 759 subjects met all study criteria. Frequency by disease severity: PIRO 0-4, 109 (30 %); PIRO 5-9, 191 (52 %); PIRO 10-14, 67 (18 %). There was a progressive increase in the frequency of dShock with increasing severity (37 %, 51 %, and 61 %). HBP was significantly elevated in dShock subjects across all subgroups (0-4: 66 vs 25; 5-9: 75 vs 39; 10-14: 87 vs 36; p < 0.05). The change in HBP was significant in the high-severity subgroup (Fig. 4).
Conclusions: The predictive performance of HBP increases with worsening sepsis severity. Assessment of serial HBP provided additional benefit in the high-severity subgroup.

Introduction: sCD14 represents a key receptor for lipopolysaccharide-lipopolysaccharide binding protein complexes and activation of the TLR-4 cascade. Our aim was to investigate the potential prognostic role of this new marker in a cohort of patients with severe sepsis and septic shock.
Methods: We prospectively included 148 patients admitted to the ICU with a suspected diagnosis of sepsis. Of these, 28 patients met the inclusion criteria and were included in the final analysis. sCD14, procalcitonin (PCT) and C-reactive protein (CRP) were recorded at admission, ICU day 1 and ICU day 3. Organ dysfunction, sepsis severity scores (SOFA score, APACHE II score), vital parameters and management were recorded during the ICU stay.
Results: sCD14 at admission was associated with cardiovascular dysfunction: decreased mean arterial pressure (correlation coefficient = -0.552, p = 0.048) and increased vasopressor support (correlation coefficient = 0.761, p = 0.035). Patients with increased sCD14 had more severe hepatocytolysis, as demonstrated by increased ALT (p = 0.019) and AST levels (p = 0.040). ICU length of stay correlated with sCD14 levels at admission (p = 0.011) and on ICU day 1 (p = 0.012). Patients with sCD14 levels < 2000 at admission had the lowest mortality rates and patients with presepsin levels > 6000 had the highest mortality. Statistically significant differences in sCD14 between survivors and non-survivors were observed at admission (p = 0.042), ICU day 1 (p = 0.036) and ICU day 3 (p = 0.042).
Conclusions: Although sCD14 is associated with cardiovascular and hepatic dysfunction, the exact dynamics of sCD14 are still unknown. We observed a good correlation between presepsin levels at the time of admission and ICU survival, demonstrating a good prognostic role for this new inflammatory marker.

Introduction: Current biomarkers for sepsis diagnosis are neither sufficiently specific nor sensitive. Moreover, comprehensive diagnosis is today delayed by the long time-to-result required for the quantitative measurement of blood proteins in centralized laboratories. Therefore, combining a rapid, near-patient diagnostic platform with accurate measurement of proteins in complex matrices is key to improving sepsis patient outcome. We developed nanofluidic biosensors that accelerate molecular interaction and thereby reduce incubation time from hours to minutes. Biosensors are analyzed in the abioSCOPE, a miniaturized automated fluorescence microscope.
Fluorescent antibodies specific for the tested protein are mixed with 50 μl of blood collected from the patient's fingertip. Complexes of detecting antibodies and analytes are captured on the sensing area of the biosensors and, upon excitation, emit a signal proportional to the concentration of the analyte. As a proof of concept, we prepared biosensors for the quantification of pancreatic stone protein (PSP/reg), a promising biomarker that has shown superior ability to predict the outcome of patients affected by sepsis in several clinical studies [1].
Methods: PSP/reg biosensors were analyzed in the abioSCOPE to determine the analytical performance of the test. PSP/reg levels in a panel of serum samples from patients who underwent surgery were measured in the abioSCOPE and compared to the concentrations measured with a microtiter plate ELISA.
Results: High analytical specificity was shown in a competitive inhibition study. The analytical sensitivity of the test is below the mean PSP/reg value of a cohort of healthy individuals, and the test is linear up to 1000 ng/ml of PSP/reg, thereby meeting clinical requirements. In a small-size comparison study, good agreement was observed between PSP/reg values measured by microtiter ELISA and in the abioSCOPE.
Conclusions: The analytical performance of the PSP/reg test in nanofluidic biosensors highlights that rapid and accurate quantitative measurement of low-abundance proteins can be achieved without loss of quality. More data are however needed to better evaluate the diagnostic performance of PSP/reg in various settings. Quantification of PSP/reg in 5 minutes in the abioSCOPE, together with the evaluation of other specific clinical signs at the bedside, will hopefully improve the decision-making process for patients with suspected sepsis and improve patient management in various clinical scenarios.

Introduction: Sepsis remains a major cause of mortality in intensive care units, and its incidence is increasing along with the antibiotic resistance of the causative microorganisms. The management of sepsis is technically demanding and costly, with only few therapeutic options available. Prompt diagnosis, and hence early treatment, has a major impact on patient survival. The advent of nanotechnology has brought with it an astonishing number of novel tools enabling new technological solutions [1].
Methods: Here I will discuss how nanotechnology-enabled approaches could contribute to the prevention, diagnosis and treatment of bacterial infections.
Results: I will present nanoparticle-based approaches for ultrasensitive detection of analytes in body fluids and the rapid removal of pathogens from whole blood using magnetic nanoparticles [2]. Additionally, potential hurdles encountered when translating nanomaterial-based approaches into clinical settings will be illustrated.
Conclusions: This presentation will critically discuss the opportunities and challenges of particle-enabled approaches for the diagnosis and treatment of bacterial infections.

Introduction: Despite improvements in therapy, septic shock mortality is still around 30 %. Better understanding of immunosuppression mechanisms has led to new therapeutic perspectives such as recombinant IL-7 [1]. However, patients eligible for such immunotherapy remain to be better identified. The soluble form of the IL-7 receptor seems to be a promising candidate biomarker for this purpose, since an association between its plasma concentration and mortality has been reported [2].
As we have no data on the transcriptional regulation of IL7R gene expression in septic shock, the aim of this study was to describe, in this pathology, the expression of several mRNA splicing variants of IL7R and to assess their association with mortality.
Methods: This retrospective study involved 30 ICU patients with septic shock. Whole blood samples were collected on the first (D1) and third day (D3) after septic shock diagnosis. Expression levels of IL7R variants were measured using appropriate RT-qPCR designs: one specific to the variant encoding the membrane form of IL-7R, one specific to a variant corresponding to a potential soluble form, and one covering all known splicing variants, including the two mentioned previously. Expression levels of these variants were described over time and compared to 19 healthy volunteers. The association between our candidate biomarkers and day-28 status was assessed by logistic regression analyses. ROC curves were constructed to evaluate performance.
Results: We observed a decrease in the expression levels of all IL7R splicing variants in patients compared to controls. An association between mortality and decreased IL7R expression was found for measurements at D3 and for the D3/D1 expression ratio. This association was observed whatever the transcription variant used, in particular at D3 for the membrane form and for the potential soluble form of IL7R (OR = 0.
Conclusions: This work highlights a markedly decreased expression level of all IL7R splicing variants studied during septic shock. Persistence of lower expression on D3 was associated with higher mortality. Quantification of IL7R expression could provide an interesting biomarker to identify the most seriously ill patients, who could benefit from new immunotherapies.
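For readers unfamiliar with the pipeline sketched in the abstract above, the following Python snippet illustrates the general shape of such an analysis: relative transcript expression (here via the common 2^-ΔΔCt transform, which is an assumption; the authors do not state their normalization), logistic regression of day-28 mortality on expression, and an AUC. All data are synthetic.

```python
# Illustrative sketch (assumed analysis, synthetic data): relative IL7R
# transcript expression, logistic regression of day-28 mortality on
# expression, and ROC performance of the resulting score.
import numpy as np
import statsmodels.api as sm
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(7)
n = 30
# Assumed 2^-ddCt relative expression of an IL7R variant at D3 (synthetic)
ddct = rng.normal(2.0, 1.0, n)
rel_expr = 2.0 ** (-ddct)
# Synthetic day-28 mortality, made more likely at low expression
died = (rng.random(n) < 1 / (1 + np.exp(4 * (rel_expr - 0.25)))).astype(int)

X = sm.add_constant(np.log2(rel_expr))
model = sm.Logit(died, X).fit(disp=0)
print(model.params)                            # intercept, log2-expression coef
print("OR per doubling:", np.exp(model.params[1]))
print("AUC:", roc_auc_score(died, -rel_expr))  # low expression predicts death
```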
Introduction: The aim was to assess, in different ICUs, the relationship of a disbalance of phenylcarboxylic acids (PhCAs) with severity and mortality.
Methods: Blood samples (n = 147) were collected in two medical centers from patients on the day of admission to the ICU. Patients from a mixed ICU (M-ICU; n = 89, APACHE II 12 (8-16)) had documented infection of various localizations, average age 58 (47-65) years. Patients from a surgical ICU (S-ICU; n = 58, APACHE II 8 (5-16)) had intestinal perforation (n = 35) or intestinal obstruction (n = 23) after emergency surgery, average age 64 (52-78) years. Serum levels of PhCAs were measured using gas chromatography. Healthy adult donors (n = 72) were used as controls [4]. Data were compared by the Mann-Whitney U test, the Chi-square test and Pearson's correlation coefficient (IBM SPSS Statistics 22).
Results: Phenylpropionic acid (PPA) disappeared in critically ill patients, while it was always present in healthy donors. Levels of PLA, p-HPLA and p-HPAA were significantly higher in critically ill patients of both ICUs versus donors (p < 0.05). A positive correlation between the total serum level of the 3 PhCAs and APACHE II was found (r = 0.7 in M-ICU and r = 0.8 in S-ICU, p < 0.001). PLA and p-HPLA were significantly higher in patients with arterial hypotension (MAP < 70 mmHg). Mortality was 31 % in the ICUs. The total serum level of the 3 PhCAs was 6 times higher in patients who died (n = 45) than in survivors (n = 102), p < 0.001. The area under the prognostic ROC curve was 0.9 for the APACHE II scale (p < 0.001) and 0.9 for the sum of the 3 PhCAs (p < 0.001), confirming the high predictive value of the latter.
Conclusions: We observed a disbalance of aromatic acid metabolites in critically ill patients. Absence of serum PPA and high levels of PLA, p-HPLA and p-HPAA are associated with severity and mortality. Further study of the molecular mechanisms involving PhCAs in critical illness is warranted. Supported by Russian Science Foundation Grant 15-15-00110.

Introduction: Sepsis-induced immunosuppression is an important risk factor for unfavourable outcome in severe sepsis. Monocyte HLA-DR expression (mHLA-DR) has been suggested as a useful biomarker of immunosuppression. In this prospective study of bacteremic sepsis we aimed to: 1) assess the predictive value of mHLA-DR for secondary sepsis, and 2) compare mHLA-DR levels and dynamics in patients with different bacteremic aetiologies of sepsis.
Methods: Septic patients with positive blood cultures (n = 111) were included 1-2 days after admission. Sampling was additionally performed on days 3, 7, 14 and 28. mHLA-DR was analysed by flow cytometry using a standardized protocol [1]. Data on events of secondary bacteraemic sepsis were collected retrospectively.
Results: Secondary sepsis occurred in 7 cases. mHLA-DR levels on day 1-2 were significantly lower in cases who developed secondary sepsis; median 8990 vs. 18200 AB/C (p = 0.009), AUC 0.798. The negative predictive value (NPV) was 98 %, and the positive predictive value (PPV) was 22 %. The three most prevalent pathogens demonstrated differences in linear association over time, as shown in Fig. 6.

Introduction: Upregulation of the expression of negative co-stimulatory molecules on T-cells is one of the mechanisms behind immunoparalysis in sepsis. We wanted to evaluate the prognostic capacity of the soluble isoform of the negative co-stimulatory receptor B- and T-lymphocyte attenuator (sBTLA) in severe sepsis and septic shock.
Methods: A prospective observational study was conducted in the mixed intensive care unit of Karolinska University Hospital, Huddinge, Sweden. 101 patients with severe sepsis or septic shock (sepsis cohort) and 28 patients with non-infectious critical illness (ICU controls) were included. 31 blood donors served as healthy controls. Blood samples taken on enrollment and at 24 and 48 hours were included in the analysis. The plasma concentration of sBTLA was measured by enzyme-linked immunosorbent assay (ELISA). Patients were followed from inclusion until day 28 or the day of death. The prognostic capacity of sBTLA was evaluated on a categorical scale (patients were divided into three equally sized categories based on sBTLA concentration) with Cox regression adjusted for age. Plasma concentrations of survivors and non-survivors were compared with the Mann-Whitney U test.
Results: sBTLA levels were statistically significantly increased in the sepsis cohort compared to ICU controls and blood donors. Overall 28-day mortality was 18 %. sBTLA levels were statistically significantly higher from study inclusion until 48 hours in 28-day sepsis non-survivors than in survivors, and did not change over time. 28-day mortality was 5-fold higher in patients with a baseline plasma sBTLA > 21 ng/mL compared to those with a level lower than 9 ng/mL (HR 5.0, 95 % CI 1.3-18, p = 0.02), but there was no increased risk of death for patients with intermediate (10-21 ng/mL) compared to low baseline sBTLA (HR 1.2, 95 % CI 0.2-5.8, p = 0.8).
Conclusions: Plasma concentrations of soluble BTLA were increased in severe sepsis/septic shock compared to patients with non-infectious critical illness and healthy controls, and did not change significantly over the first 48 hours after study inclusion. A baseline sBTLA concentration > 21 ng/mL was a negative prognostic indicator.
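The age-adjusted Cox model on sBTLA tertiles described above can be set up as follows. This is an illustrative sketch with synthetic data, not the study analysis; the lifelines library and the simulated effect sizes are choices made here for demonstration.

```python
# Illustrative sketch (assumed model, synthetic data): patients grouped into
# low / mid / high baseline sBTLA tertiles, followed to day 28, hazard ratios
# taken against the low tertile with age as a covariate.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 101
sbtla = rng.lognormal(np.log(14), 0.6, n)            # baseline sBTLA, ng/mL
tertile = pd.qcut(sbtla, 3, labels=["low", "mid", "high"])
age = rng.normal(65, 12, n)
# Synthetic survival times: higher hazard simulated in the top tertile
hazard = 0.004 * np.where(tertile == "high", 5.0, 1.0)
time = np.minimum(rng.exponential(1 / hazard), 28.0)  # censor at day 28
event = (time < 28.0).astype(int)

df = pd.DataFrame({"time": time, "event": event, "age": age})
dummies = pd.get_dummies(tertile, prefix="sbtla", drop_first=True).astype(int)
df = pd.concat([df, dummies], axis=1)

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
print(cph.summary[["exp(coef)", "exp(coef) lower 95%",
                   "exp(coef) upper 95%", "p"]])
```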
Introduction: It is well recognised that coagulation is altered across the sepsis spectrum (sepsis, severe sepsis and septic shock). This can range from an increased thrombotic risk to disseminated intravascular coagulation (DIC), contributing to increased morbidity and mortality. Previous studies have attempted to investigate these changes in coagulation using standard and global markers of coagulation [1]; however, clot microstructure has not been investigated. Recent research has led to the development of a new biomarker, Df, which quantifies the fibrin network microstructure of the incipient clot [2]. This study aims to investigate the role of clot microstructure in sepsis using this new quantitative biomarker.
Methods: This study had full ethical approval from the South West Wales Research Ethics Committee. Patients with a diagnosis of sepsis were recruited from the Emergency Department (ED) and Intensive Care Unit (ICU) of a large teaching hospital in South Wales. Blood samples were taken to determine Df, full blood count and a standard coagulation screen. A healthy control group matched for age and gender was also recruited.
Results: 95 patients were included in the study: 49 with sepsis, 19 with severe sepsis and 27 with septic shock. 44 healthy volunteers were recruited as matched controls. Mean Df in the healthy control group was 1.74 ± 0.03. Mean Df in patients with sepsis and severe sepsis was significantly higher (1.78 ± 0.07 and 1.80 ± 0.05 respectively; p < 0.05, one-way ANOVA with post-hoc Bonferroni correction). Mean Df in patients with septic shock was significantly lower compared to all other groups (1.66 ± 0.10; p < 0.001, one-way ANOVA with post-hoc Bonferroni correction). Df was also significantly lower in non-survivors than in survivors at 28 days (1.66 ± 0.13 vs 1.76 ± 0.08, p = 0.006, Student's t-test). Our results indicate that patients with sepsis and severe sepsis form tight, highly branched fibrin clots (as indicated by high Df). As the disease progresses to septic shock, much weaker clots are formed (as indicated by low Df). This may help to explain the dichotomy of thrombogenicity and bleeding diathesis in patients across the sepsis spectrum. The new functional biomarker, fractal dimension (Df), can therefore be used to quantify clot microstructure across the sepsis spectrum. It can also be used as an outcome measure, although this study was not powered for outcome.

Introduction: It is well recognised that coagulation is altered across the sepsis spectrum (sepsis, severe sepsis and septic shock). Sepsis is recognised to be a prothrombotic state, due to increased procoagulant activity and impairment of the natural anticoagulant and fibrinolytic proteins [1]. Altered fibrinolysis could have a key role in the accumulation of microthrombi leading to disseminated intravascular coagulation (DIC). The aim of this study was to assess the changes in fibrinolysis across the sepsis spectrum using the rotational thromboelastometry (ROTEM) lysis index (LI60) and D-dimer concentration.
Conclusions: Fibrinolytic activity was increased in patients with septic shock. However, there was an impairment of fibrinolytic function as measured by ROTEM. The exact mechanisms leading to this are not fully understood; however, consumption of fibrinolytic factors could contribute, as evidenced by the elevated D-dimer concentration. The ROTEM lysis index (LI60) can potentially be used as a biomarker to identify septic shock and as an outcome measure, although this study was not powered for outcome.
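The group comparison used for Df above (one-way ANOVA with Bonferroni-corrected post-hoc tests) follows a standard pattern. Below is a minimal illustrative sketch with synthetic draws matching the reported group means and SDs, not the study data; the use of Welch's t-test for the pairwise step is an assumption.

```python
# Illustrative sketch (synthetic data): one-way ANOVA across the four Df
# groups, followed by Bonferroni-corrected pairwise t-tests.
import numpy as np
from scipy import stats
from itertools import combinations

rng = np.random.default_rng(11)
groups = {
    "healthy":       rng.normal(1.74, 0.03, 44),
    "sepsis":        rng.normal(1.78, 0.07, 49),
    "severe sepsis": rng.normal(1.80, 0.05, 19),
    "septic shock":  rng.normal(1.66, 0.10, 27),
}

f_stat, p_anova = stats.f_oneway(*groups.values())
print(f"ANOVA: F = {f_stat:.2f}, p = {p_anova:.2g}")

pairs = list(combinations(groups, 2))
alpha_bonf = 0.05 / len(pairs)            # Bonferroni-adjusted threshold
for a, b in pairs:
    t, p = stats.ttest_ind(groups[a], groups[b], equal_var=False)
    print(f"{a} vs {b}: p = {p:.3g} {'*' if p < alpha_bonf else ''}")
```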
Methods: A prospective, observational study was conducted at an academic medical center and a community hospital in Utah. Adults admitted to the ICUs who fulfilled SIRS criteria were enrolled. The attending physician and a study investigator classified patients at admission as sepsis, SIRS, or indeterminate. The reference diagnosis was determined through independent review of all clinical data at discharge by two study physicians, with adjudication by a third for discordant classifications. Inter-physician classifications at admission and at discharge were compared.
Results: Of the 210 enrolled subjects, 58 (28 %) cases had at least one physician classify the patient as indeterminate, and in 9 (4 %) there was major disagreement between the classifications of SIRS and sepsis. At discharge, in 46 (22 %) cases at least one classification was indeterminate, and in 4 (2 %) there was major disagreement between classifications of SIRS and sepsis. When compared to the discharge classification (Fig. 7), 142 (68 %) cases initially classed as SIRS or sepsis by the attending physician remained unchanged. Of 89 patients initially diagnosed with sepsis on admission, 66 (74 %) were deemed to be correct, 8 (9 %) were deemed SIRS and 15 (17 %) indeterminate when compared to the discharge classification. Of the 40 (19 %) subjects indeterminate on admission, 12 (30 %) remained so at discharge. 19 patients initially diagnosed with sepsis or SIRS became indeterminate at discharge. One patient initially classed as SIRS and 11 classed as indeterminate were ultimately discharged with a diagnosis of sepsis.
Conclusions: Considerable uncertainty exists in the use of clinical criteria for sepsis in ICU patients. Accurate tests that improve early diagnostic accuracy are needed in the management of critically ill patients.

Introduction: Temperature measurement is an essential component of decision making in intensive care, both in the context of identifying and treating sepsis and in accurate calibration of blood gas analysis. SpotOn™, a zero-heat-flux cutaneous thermometer, has been validated as a non-invasive, continuous and accurate method of measuring temperature in patients undergoing cardiac surgery in the operative and peri-operative period [1]. We compared SpotOn™ with axilla thermometers, our current method of non-invasive temperature monitoring, to establish how inaccurate our current practice was.
Methods: 25 acute patients were selected on admission to intensive care at Lewisham Hospital: 17 male, 18 female, 16 medical and 9 surgical. Temperatures were recorded for an average of 47 hours on each patient using Covidien Filac™ axilla probes and SpotOn™ forehead probes, as and when our current clinical practice indicated.
Results: A total of 435 comparative values were recorded over a range of 34.4 °C to 38.5 °C. On average the SpotOn™ thermometer recorded temperatures 0.14 °C above the axilla (95 % CI ± 0.04; Wilcoxon matched-pairs signed-rank test, p < 0.0001). The average difference between SpotOn™ and axilla over the normal range (36-37.4 °C) was 0.07 °C (95 % CI ± 0.04, p = 0.0028). The average difference between SpotOn™ and axilla at hypothermic temperatures (< 36 °C) was -0.1 °C (95 % CI ± 0.11, p = 0.13). The average difference between SpotOn™ and axilla at hyperthermic temperatures (≥ 37.5 °C) was 0.53 °C (95 % CI ± 0.11, p < 0.0001). There were 12 occasions when the SpotOn™ thermometer identified a SIRS-defining temperature (< 36 or ≥ 38.3 °C) that the axilla temperature did not, affecting 10 of the 25 patients.
Conclusions: Zero-heat-flux cutaneous thermometers have separately been shown to accurately record core temperatures. We have highlighted that axilla probes are inaccurate in comparison to SpotOn™. The axilla probe appears to under-read at hyperthermic temperatures and over-read at hypothermic temperatures. The difference between the two methods appears most marked at hyperthermic temperatures. As a consequence, it is likely that axilla probes could fail to identify sepsis in patients on intensive care.
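The paired comparison above (mean bias with a 95 % CI plus a Wilcoxon matched-pairs signed-rank test, repeated within temperature bands) can be expressed compactly. The sketch below uses synthetic paired readings, not the audit data; the normal-approximation CI of the mean difference is an assumption.

```python
# Illustrative sketch (synthetic paired data): bias between two thermometers
# with a 95 % CI, Wilcoxon matched-pairs test, and a within-band repeat.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
n = 435
axilla = rng.normal(37.0, 0.8, n)
spoton = axilla + rng.normal(0.14, 0.4, n)   # synthetic bias of +0.14 degC

diff = spoton - axilla
bias = diff.mean()
ci95 = 1.96 * diff.std(ddof=1) / np.sqrt(n)  # normal-approximation CI of mean
w_stat, p_wilcoxon = stats.wilcoxon(spoton, axilla)
print(f"bias = {bias:+.2f} degC (95 % CI +/- {ci95:.2f}), "
      f"Wilcoxon p = {p_wilcoxon:.2g}")

# Repeat within the hyperthermic band (>= 37.5 degC on the reference device)
hot = axilla >= 37.5
print("hyperthermic bias:", diff[hot].mean())
```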
Introduction: AQ is based in the North West of England (population 9 M). The programme aims to improve sepsis care & clinical coding within the region. In the first 12 months the care of over 9500 patients with infection was examined.
Methods: A panel of clinical experts developed a sepsis measure set with the aid of BMJ evidence & identified a study population using 25 ICD-10 sepsis codes. Patients with a sepsis code on discharge had their initial care examined against a series of evidence-based interventions. An Appropriate Care Score (ACS) measured compliance with the sepsis care measure set. The project is promoted by an ongoing series of collaborative events, where performance is published & best practice is shared. A healthy sense of competition drives improvement in performance & commissioners are able to monitor performance in their locality.
Results: Figure 8 shows the increase in ICD-10 coding in AQ participating trusts in the region vs non-AQ participating trusts. Figure 9 shows that at the start of the programme 23.6 % of septic patients received antibiotics within 1 hour & that this proportion had risen by the last month. There is a downward trend in mortality when patients have achieved all the measures within the measure set.
Conclusions: Use of the measure set and collaboration in the NW region of England appears to have improved early recognition & treatment of patients with sepsis. Results have demonstrated improvements in outcome measures, whilst recognising that improved coding & recording of sepsis may change the studied population.

Introduction: Fever is a sign commonly associated with syndromes seen in intensive care. Hyperthermia is initiated by the action of IL-1β at the organum vasculosum laminae terminalis, where prostaglandin E2 is released. Ortogel is a gel containing plant extracts (Capsicum, Arnica montana, Symphytum officinale, Juglans regia, Tamus communis, Sambucus, Armoracia rusticana, Lavandula angustifolia) with anti-inflammatory, analgesic and anti-pyretic effects, producing cutaneous vasodilation. The purpose of this study was to assess the anti-pyretic effects of Ortogel compared to acetaminophen administration in intensive care patients.
Methods: A randomized controlled trial was conducted in the Intensive Care Unit of the Clinical Emergency Hospital in Bucharest from January to June 2015. It included a total of 70 patients, divided into two equal groups, suffering from fever of infectious cause (cutaneous temperature of 38-39 degrees Celsius, measured using a Nihon Kohden BSM 6000 monitor, at controlled room temperature).
Hemodynamically unstable patients were excluded. The study group received 30 ml of gel (Ortogel) administered cutaneously, while the control group received 1 g of intravenous acetaminophen. The primary outcome was the difference in cutaneous temperature at 0.5, 1, 2 and 3 h from the baseline reading.
Results: Initially, the average temperature in the Ortogel group was 39.5 ± 0.11 degrees Celsius, with similar values in the control group (39.0 ± 0.14). Ortogel significantly decreased hyperthermia, with a ΔT-max of 2.7 °C (p < 0.001), similar to the effect of paracetamol (ΔT-max of 3.2 °C, p < 0.001). The gel showed significant suppression of cutaneous temperature by 2.6 °C, from 39.5 ± 0.11 °C to 36.2 ± 0.32 °C (p < 0.001), and the percentage inhibition of fever was 6.8 %. The antipyretic effect started 45 min after administration (p < 0.01) and the reduction in rectal temperature was maintained for 3 h (p < 0.001).
Conclusions: The present study shows similar antipyretic effects of Ortogel compared with intravenous acetaminophen. The antipyretic effect set in within a similar time frame in the two groups. Ortogel may represent an alternative treatment for patients with contraindications to acetaminophen.

Introduction: Hypothermia is associated with adverse outcome in patients with sepsis. Knowledge of the pathophysiology is highly limited. The objective of this study was to establish whether hypothermia was associated with differences in the systemic host response to sepsis, as reflected by cytokine profile, endothelial activation markers and cellular responsiveness towards lipopolysaccharide (LPS), during the first days of intensive care unit (ICU) admission.
Methods: A prospective observational study in patients with sepsis in the ICUs of 2 tertiary hospitals in the Netherlands. Hypothermia was defined as a lowest body temperature measurement below 36 °C in the first 24 hours of ICU admission. Exclusion criteria were immunodeficiency, active cooling, readmission or admission from the operating theatre or another ICU. Logistic regression was used to investigate the independent association of hypothermia with 90-day mortality. Plasma levels of endothelial markers and cytokines were measured upon ICU admission and on days 2 and 4 thereafter in all hypothermic and non-hypothermic patients with sepsis. Analyses were performed in the entire cohort as well as in a cohort matched for disease severity. LPS ex vivo whole blood stimulation was performed on day 1 of admission in a subset of patients.
Results: Hypothermia was identified in 186 of 525 patients and was independently associated with increased mortality in multivariate analysis. At baseline, patients with hypothermic sepsis were significantly older, had a lower body mass index and had an increased incidence of cardiovascular disease. Levels of pro- and anti-inflammatory cytokines were not different between groups at any time point. Hypothermia was also not associated with an altered response to ex vivo LPS stimulation in a subset of patients. Hypothermia was associated with sustained elevated levels of the endothelial activation marker fractalkine during the first 4 days of ICU stay compared to non-hypothermic patients, in both the entire cohort and the cohort matched for disease severity.
Conclusions: Hypothermic sepsis is associated with increased mortality. Patients with hypothermia showed increased levels of fractalkine, irrespective of disease severity, during the first 4 days of ICU stay. This research was performed within the framework of CTMM, the Center for Translational Molecular Medicine (http://www.ctmm.nl), project MARS (grant 04I-201).
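A minimal sketch of the kind of multivariate model described above: logistic regression of 90-day mortality on first-24-hour hypothermia, adjusted for baseline covariates. The covariates, effect sizes and data below are assumptions for illustration, not the MARS dataset.

```python
# Illustrative sketch (assumed model, synthetic data): adjusted logistic
# regression of 90-day mortality on hypothermia (< 36 degC in first 24 h).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(9)
n = 525
df = pd.DataFrame({
    "hypothermia": (rng.random(n) < 186 / 525).astype(int),
    "age": rng.normal(63, 14, n),
    "bmi": rng.normal(26, 4, n),
})
# Synthetic outcome with an assumed hypothermia effect and an age gradient
lin = -1.5 + 0.7 * df.hypothermia + 0.02 * (df.age - 63)
df["died90"] = (rng.random(n) < 1 / (1 + np.exp(-lin))).astype(int)

model = smf.logit("died90 ~ hypothermia + age + bmi", data=df).fit(disp=0)
print(np.exp(model.params))        # odds ratios
print(np.exp(model.conf_int()))    # 95 % CIs
```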
Septic shock alert over SIRS criteria has an impact on outcome but needs to be revised

Introduction: Clinical information systems (CIS) with protocol-based alerts have long been used in intensive care units (ICU). They can serve for early recognition of, and intervention in, sepsis and septic shock. We aimed to determine whether an early response to a "septic shock alert" has any impact on the true diagnosis and on patient outcomes in septic shock.
Methods: 1384 admitted patients were reviewed in the CIS (MetaVision, iMDsoft) at Bakirkoy Dr. Sadi Konuk. An early warning was protocolized in the system in line with international guidelines. The septic shock alert was activated when the condition "≥ 2 systemic inflammatory response syndrome (SIRS) criteria together with SBP < 90 mmHg" was met. Patients were categorized into 3 groups based on the interval between first alert activation and the initial response, as reflected by orders for diagnostics and interventions: Group I, "early acknowledgement and response within 10 minutes after the alert"; Group II, "response within 10-60 minutes"; and Group III, "late response after > 60 minutes". First acknowledgement of the alert, SOFA and APACHE II scores, and patient outcomes were analysed.
Results: 165 patients received an alert; however, only 33 were diagnosed with sepsis/septic shock on admission. Later in the course of disease, 112 patients were confirmed to have sepsis. Group I had the lowest mortality compared to the others (Table 4).
Conclusions: CIS helps in early recognition of and intervention in critical disease, and thus has an impact in decreasing mortality. In order to further decrease mortality, training is needed to increase compliance with the alert systems. However, a septic shock alert protocolized over SIRS criteria seems to lead to a false positive diagnosis in one third of patients. This reveals the need for revising the sepsis and septic shock definitions.

Association between previous prescription of β-blockers and mortality rate among septic patients: a retrospective cohort study

Introduction: Preclinical and clinical studies have investigated the role of β-blockers in sepsis, especially in early sepsis, which is characterized by an elevated sympathetic response. The purpose of this study was to examine the association between a previous prescription for β-blockers and the mortality rate among septic patients.
Methods: We conducted a retrospective cohort study from January 1, 2003, to December 31, 2013, in a tertiary care academic medical center. We included all patients admitted to the intensive care unit (ICU) with severe sepsis or septic shock who were ≥ 14 years of age. Patients were defined as having a previous prescription of β-blockers if the prescription was active for the 3 months prior to hospital admission.
Results: We identified 4,629 patients with severe sepsis and septic shock, 623 of whom had a previous prescription for β-blockers before hospital admission and 4,006 of whom were not previously taking β-blockers. Common medications used by patients were metoprolol (77 %) and carvedilol (13 %). The overall mortality rate was 30 %. Among patients who were previously taking β-blockers, 181 (29.1 %) died in the ICU and 442 (70.9 %) were discharged alive. Of the patients who did not have a previous prescription for β-blockers, 1,231 (30.7 %) died in the ICU and 2,775 (69.3 %) were discharged alive. There was no statistically significant association between a previous prescription for β-blockers and ICU mortality (risk ratio 0.94; 95 % confidence interval [CI] 0.82 to 1.08; p = 0.39). We further stratified patients on the basis of the highest heart rate in the first 24 hours of ICU admission into those with a heart rate > 95 beats per minute and those with a heart rate ≤ 95 beats per minute. There was no statistically significant association between a previous prescription for β-blockers and ICU mortality in either group (risk ratio 0.91; 95 % CI 0.77 to 1.06; p = 0.22, and risk ratio 1.095; 95 % CI 0.78 to 1.54; p = 0.60, respectively).
Conclusions: Our study demonstrated no significant association between a previous prescription for β-blockers and ICU mortality. This held true even after further stratification of patients on the basis of the highest heart rate in the first 24 hours of ICU admission. This result may be due to a lack of effect of β-blockers or the short-term action of the medications used.
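The risk ratio reported above can be checked directly from the published counts. The short sketch below recomputes it with a Wald (normal-approximation) confidence interval on the log risk ratio, which reproduces RR 0.94 (95 % CI 0.82 to 1.08); the CI method is an assumption, since the abstract does not state which was used.

```python
# Recomputing the risk ratio from the counts published in the abstract above,
# with a Wald CI on the log risk ratio.
import numpy as np

deaths_bb, n_bb = 181, 623        # previous beta-blocker prescription
deaths_no, n_no = 1231, 4006      # no previous prescription

risk_bb, risk_no = deaths_bb / n_bb, deaths_no / n_no
rr = risk_bb / risk_no
se_log_rr = np.sqrt(1/deaths_bb - 1/n_bb + 1/deaths_no - 1/n_no)
lo, hi = np.exp(np.log(rr) + np.array([-1.96, 1.96]) * se_log_rr)
print(f"RR = {rr:.2f} (95 % CI {lo:.2f} to {hi:.2f})")   # RR = 0.94 (0.82-1.08)
```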
Introduction: The UK Sepsis Trust aims to raise awareness of sepsis and recommends an evidence-based treatment bundle for severe sepsis known as 'The Sepsis Six' [1, 2]. An audit was completed on the obstetric units of Lewisham and Greenwich NHS Trust, UK. We wanted to assess the impact of teaching and posters on improving knowledge.
Methods: We gave an anonymous questionnaire to maternity staff to assess baseline knowledge. This tested respondents on the diagnostic criteria for systemic inflammatory response syndrome (SIRS), sepsis and severe sepsis, and their knowledge of the 'Sepsis Six' treatment bundle for severe sepsis [1]. Our interventions consisted of 1:1 teaching and the provision of an information poster. A re-audit was performed in the same units a week later.
Results: We initially audited 33 maternity staff and re-audited 27. Of those re-audited, all staff had received 1:1 teaching and 96 % had seen the leaflet or poster. SIRS: initially, 82 % had heard of SIRS, staff could name on average only 1.5 of the 6 criteria, and only 58 % knew 2 criteria were needed for the diagnosis. These figures improved to 100 %, 5.2/6 and 81 % respectively. SEPSIS: 52 % could correctly define sepsis, only 3 % could define severe sepsis, and staff were able to name on average only 0.8 of the severe sepsis criteria. Their knowledge improved to 70 %, 63 % and 2.4 respectively. SEPSIS 6 MANAGEMENT: staff knew 4.8/6 of the Sepsis Six steps and 88 % knew treatment was required within an hour. There was a slight improvement to 5.5/6 for the management steps, and a surprising decrease, to 85 %, in those knowing treatment must be initiated within an hour.
Conclusions: Our maternity staff were initially aware of SIRS and sepsis and of the simple interventions required to diagnose and treat patients; however, their baseline knowledge of the details of diagnosis and management was poor. We demonstrated that simple interventions such as 1:1 teaching, a poster and a readily available leaflet can dramatically improve knowledge.

Introduction: There is a paucity of data concerning the group of patients with a clinical syndrome of sepsis, suspected clinical infection and negative cultures, regarding disease severity, end organ failure and outcomes. It is also not clear whether patients with only negative blood cultures (BCs) but other positive cultures have a different disease entity from patients who have no growth in any of the cultures taken.
Our study attempts to further define true culture negativity and its effect on disease severity and outcomes.
Methods: All blood culture results from the Detroit Medical Center ICUs from December 2012 to March 2013 were collected. Any patient who had a negative BC was reviewed, and all cultures taken within 10 days of the negative BCs were further reviewed. Patients were then divided into 3 groups: 1) BC positive group: BCs within 10 days of the negative BC were positive; 2) non-BC positive group: other cultures within 10 days of the negative BC were positive; 3) all cultures negative. The 3 groups were compared.
Results: During the study period, 300 patients had at least one negative blood culture. 132 had all cultures negative, 76 had positive BCs, and 92 had other positive cultures. The BC positive group resembled the negative group in all baseline characteristics. There was a significant difference in admission scores, with higher SOFA and APACHE II scores in the BC positive group. The SOFA score was higher in the positive group throughout the ICU stay. There was no difference in SIRS between the groups. However, there was more shock, need for vasopressors, renal failure, respiratory failure and neurological alteration in the BC positive group. The BC positive group was more frequently started on antibiotics and had more antibiotic days than the negative group. They also had worse outcomes, with higher mortality in the ICU (40 % in the positive vs. 6.8 % in the negative group, p < 0.0001) and in the hospital (47.4 % vs. 9.9 % respectively, p < 0.0001). In a multivariate analysis of predictors of ICU death, the only independent predictors were APACHE II > 25 and membership of the positive BC group. Disease severity was also worse in the non-blood positive culture group when compared with the negative group: scores were higher, there was more end organ dysfunction, antibiotic treatment was longer and outcomes were worse.
Conclusions: We have shown that culture negative patients have lower disease severity, less end organ failure and better outcomes, and should therefore be considered for shorter antimicrobial treatment and early de-escalation. Culture positivity indicates a worse prognosis even for non-blood cultures.

Introduction: In administrative data, severe sepsis cases can be identified by different ICD code abstraction strategies. Comparing these strategies, there is substantial variability in the incidence and mortality of severe sepsis depending on the codes used. To understand the mechanisms on which the precision of case identification depends, we aimed to investigate the coding of organ dysfunction in patients with severe sepsis hospitalized in Germany between 2007-2013, comparing administrative coding with prospective data from a national cohort study.
Methods: Severe sepsis patients (> 18 y) were identified in a nationwide database of hospital discharge data (DRG statistics) using ICD-10 codes for I) sepsis + organ dysfunction (explicit coding strategy) and II) infection + organ dysfunction (implicit coding strategy). Explicit sepsis codes included 26 ICD codes. Infection codes were adapted from Angus et al. (2001, Crit Care Med). Organ dysfunctions were identified by 27 organ failure codes. Septic shock was defined by code R57.2, introduced in 2010. Comparative organ dysfunction data were extracted from a German ICU cohort study (1).
Results: Between 2007-2013, we identified I) 941,957 severe sepsis patients using explicit and II) 4,785,511 severe sepsis patients using implicit coding strategies, including 18.2 % and 3.5 % of patients with septic shock, respectively (112,787 patients, 2010-2013). Respiratory failure was the leading organ dysfunction coded (56.4 % of explicitly vs. 59.6 % of implicitly identified cases). Renal failure was identified more often when using explicit coding strategies (44.7 % vs. 26.5 %). This was also true for coagulopathy (23.5 % vs. 12.5 %) and metabolic alterations (13.1 % vs. 6.1 %). Hypotension was coded in 18.7 % (explicit) and 5.1 % (implicit) of cases. Compared to a prospective cohort of ICU patients with severe sepsis, the distribution of organ dysfunctions was similar to that identified by explicit coding strategies (respiratory failure: 52 %, renal failure: 42.2 %, encephalopathy: 27.7 %, coagulopathy: 22.2 %, metabolic acidosis: 17.8 %, septic shock: 50.8 %).
Conclusions: Patterns of organ dysfunction in severe sepsis patients differ depending on the coding strategy used. Hypotension and the direct code for septic shock are coded in a similar percentage of patients, but in far fewer cases than in a prospective cohort study of ICU patients. The quality of organ dysfunction coding has a substantial influence on the accuracy of coding strategies for severe sepsis in administrative data and therefore requires further evaluation.

Introduction: There have been improvements in knowledge about sepsis care, as well as well-established guidelines. However, the morbidity and mortality of septic patients remain unacceptably high. Our objective was to evaluate the knowledge of residents in different departments regarding the Surviving Sepsis Campaign (SSC) 2012.
Methods: A cross-sectional descriptive study via a 15-question questionnaire distributed to residents in Songklanagarind Hospital, a tertiary referral university teaching hospital in southern Thailand. Interns as well as training residents in the Departments of Internal Medicine, Surgery and Emergency Medicine were included in our study.
Results: The response rate was 136 of 153 residents (89 %). The residents included 46 (33 %) interns, 42 (30 %) internal medicine residents, 41 (30 %) surgical residents and 7 (5 %) emergency residents. Regarding the definitions of sepsis, severe sepsis and septic shock, only 44 (32.3 %) residents were able to differentiate the severity of sepsis. Internal medicine residents had a significantly higher rate of correct answers than non-medicine and surgical residents (45.2 % vs. 26.6 %, P = 0.03 and 45.2 % vs. 12.2 %, P = 0.001). Only 77 (51 %) residents would measure blood lactate in sepsis patients, and there was no difference in overall knowledge about lactate measurement and interpretation between internal medicine residents and other residents. For fluid resuscitation, most residents (95.6 %) chose normal saline solution as their first choice. However, with respect to the dose of fluid resuscitation, only 28 (20.5 %) residents gave the recommended fluid bolus (30 mL/kg), and internal medicine residents had a higher percentage of correct answers than surgical residents (28.6 % vs. 7.32 %, P = 0.01). One hundred and fifteen (85 %) residents and 123 (90 %) residents used an appropriate target mean arterial pressure and vasopressors, respectively. Most residents could give antimicrobial agents (73.5 %) and steroids (93.4 %) appropriately in patients with sepsis and septic shock.
However, only half of the residents knew the target range for blood sugar control in sepsis patients, and internal medicine residents had better knowledge than the other residents. Conclusions: Our residents' knowledge of SSC 2012 is not satisfactory. Teaching coupled with practical learning in sepsis management should be provided, as improved knowledge may help reduce morbidity and mortality in sepsis patients.

Effectiveness of a septic shock bundle to improve outcomes in the ICU. F. Breckenridge, A. Puxty. Glasgow Royal Infirmary, Glasgow, UK. Critical Care 2016, 20(Suppl 2):P047

Introduction: The formation of the Surviving Sepsis Campaign in 2002 [1] led to the introduction of various sepsis management bundles, with evidence of improved outcome [2]. In 2014 our ICU introduced a quality improvement project to implement our septic shock bundle, encompassing early central line insertion, dynamic fluid boluses, mean arterial pressure >60 mmHg, measurement of central venous oxygen saturations, blood cultures and antibiotics. We aimed to investigate whether introduction of the bundle was associated with improved patient outcomes. Methods: A retrospective search of the WardWatcher™ database identified patients admitted to our ICU with septic shock from 2009-2011 (Group 1) and, after introduction of the quality improvement bundle, from 2014-2015 (Group 2). Patients with a significant underlying renal or vascular diagnosis were excluded. Acute Physiology and Chronic Health Evaluation II (APACHE II) scores and outcome measures, including length of ICU stay, duration of cardiovascular support, need for renal replacement therapy (RRT) and ICU survival, were noted. Results: A total of 171 patients were included: 88 patients in Group 1 and 83 in Group 2. Median compliance with all aspects of the bundle was 60 % for Group 2. Mortality was 43.6 % in Group 1 compared to 38.6 % in Group 2 (p = 0.64). Median APACHE II scores were similar, and duration of cardiovascular support was not significantly different between the two groups. Length of ICU stay (median [interquartile range]) was longer in Group 2 at 6 days [3-12] compared to 3 days [1-7] (p < 0.001); however, this significance was lost when non-survivors were excluded. There was a significantly lower requirement for RRT amongst ICU survivors in Group 2 at 15.7 % compared to 34 % (p = 0.0078). Conclusions: We have shown a significant improvement in the rate of RRT associated with implementation of our sepsis resuscitation bundle. Whilst implementation of care bundles has been associated with improved patient outcome, the basis of this relationship remains unclear [2]. Our work fails to demonstrate a significant reduction in mortality; however, we acknowledge the limitations of a small sample size. Median bundle compliance was higher than in other published work [2], and ongoing quality improvement work may increase this further. We identified a longer length of stay in Group 2. This may represent either a longer period of treatment before deciding to change the focus of care to palliation or a faster decline in patients within Group 1.

Introduction: Vasopressor therapy is required to sustain life and maintain perfusion in the face of life-threatening hypotension occurring with severe sepsis and septic shock, even when hypovolemia has not yet been resolved [1].
We hypothesized that the dose of norepinephrine administered in the first 24 hours correlates with treatment outcome in patients with severe sepsis and septic shock. Methods: We analysed a total of 632 consecutive patients with septic shock (sepsis-induced hypotension persisting despite adequate fluid resuscitation) from the EPOSS database (Data-based Evaluation and Prediction of Outcome in Severe Sepsis), which was developed to monitor and assess treatment efficacy in patients with severe sepsis and septic shock. Patients were admitted to participating intensive care units (twelve hospitals, seventeen high-volume care units) in the Czech Republic from 1 January 2011 to 5 November 2013. The patients were divided into two groups: survivors (n = 316) and nonsurvivors (n = 316).

Introduction: Low-dose vasopressin (VP) recently emerged as a promising therapy for septic shock [1]. The rationale for its use is the relative VP deficiency in patients with septic shock and the ability of VP to restore vascular tone and blood pressure, reducing the need for catecholamines [2]; however, the outcome effects of VP in septic patients remain unclear [3, 4]. Methods: We retrospectively analyzed patients admitted to our general ICU for septic shock in the last 23 months (between 1/2014 and 11/2015) and treated with norepinephrine (NE) or with the combination NE + VP. Patients were treated with NE after adequate fluid expansion. VP was added (0.02-0.03 U/min) in case of MAP < 60 mmHg with an NE dosage >= 0.4 mcg/kg/min. We analyzed severity scores and plasma lactate at ICU admission, ICU mortality, urinary output during the first 24 hours of vasopressor therapy and need for RRT during the ICU stay. Mann-Whitney and chi-square tests were used for statistical analysis. Results: 39 patients were enrolled; 15 patients received NE + VP (NV group) and 24 received NE (NE group). The overall mortality rate was 46.1 %: 53.3 % in the NV and 42 % in the NE group respectively (p = 0.47). The need for RRT was greater in the NV than in the NE group (40 % vs. 20 %, p = 0.19). Urinary output in the first 24 hours of vasopressor therapy was lower in the NV group (0.7 vs. 1 ml/kg/h, p = 0.47). NV group patients had more severe haemodynamic impairment and also a worse severity score (SOFA 11.8 vs. 9.9; p = 0.03), worse renal function (AKIN 3 vs. 2) at ICU admission and higher plasma lactate levels (3.9 vs. 3.6; p = 0.39). The NV group had a greater incidence of thrombocytopenia (105 vs. 207, p = 0.03). Conclusions: We did not find any statistically significant difference between the NV and NE groups in ICU mortality, despite the NV group having a significantly higher predicted mortality according to SOFA. Renal function impairment was not significantly different between the two groups. The greater incidence of thrombocytopenia (p = 0.03) observed in the NV group is in line with other studies.

Introduction: Generalized vasodilation with unresponsive hypotension is present in half of the deaths due to septicaemia. Methylene blue could be a valuable adjunct in the treatment of refractory hypotension. The aim of this study was to determine the effectiveness of methylene blue as adjunctive treatment in patients with septic shock. Methods: A controlled, randomized, double-blinded clinical trial was performed. 60 patients were divided into two groups: Group A received a single dose of methylene blue of 2 mg/kg body weight diluted in 100 cc of 5 % dextrose infused over 60 min, and Group C (control) received 100 cc of 5 % dextrose infused over 60 min.
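Several of the studies in this section report two-group comparisons using Mann-Whitney U for continuous variables and chi-square for proportions, as in the vasopressin study above. A minimal sketch using scipy follows; all numbers are invented placeholders, not study data.

```python
# Minimal sketch of the two-group comparisons used above: Mann-Whitney U
# for a continuous variable (SOFA) and chi-square for a 2x2 outcome table.
# All values below are hypothetical placeholders.
from scipy.stats import mannwhitneyu, chi2_contingency

sofa_nv = [12, 11, 13, 10, 14]   # hypothetical SOFA scores, NE + VP group
sofa_ne = [9, 10, 8, 11, 10]     # hypothetical SOFA scores, NE-only group
u_stat, p_sofa = mannwhitneyu(sofa_nv, sofa_ne, alternative="two-sided")

# 2x2 table of ICU deaths vs. survivors per group (hypothetical counts)
table = [[8, 7],    # NV: died, survived
         [10, 14]]  # NE: died, survived
chi2, p_mort, dof, expected = chi2_contingency(table)

print(f"SOFA: U={u_stat:.1f}, p={p_sofa:.3f}; mortality: p={p_mort:.3f}")
```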
Basal measurements of the study variables (MBP, lactate, base deficit, central venous saturation and CO2 delta) were taken prior to methylene blue administration and every hour afterwards, until MBP > 65 mmHg without vasopressor or until 72 hours had passed after shock onset. Data on total noradrenaline dose in mg, length of stay, duration of mechanical ventilation and mortality were recorded. Results: MBP increased progressively during the first 6 hours after methylene blue infusion, by 22 % in Group A and 9.2 % in Group C (p < 0.05), and remained higher through the 72-hour follow-up. The noradrenaline dose decreased in the first 6 hours by 86 % in Group A versus 56 % in Group C (p < 0.05). Lactate clearance in the first 6 hours was 62 % in Group A, in contrast with 33 % in Group C (p < 0.05). Mortality at ICU discharge was 20.0 % in Group A and 36.6 % in Group C (p < 0.05), without variation at 21 days. Conclusions: Methylene blue is effective as an adjunct in the treatment of septic shock.

Introduction: Coagulation disorders are common in septic patients and constitute a considerable diagnostic and therapeutic challenge in the ICU. The excessive activation of coagulation involves consumption of coagulation factors and platelets, which may even lead to the development of disseminated intravascular coagulation (DIC). In the present study we investigated the usefulness of thromboelastometry for bedside monitoring of coagulation abnormalities in patients with DIC treated in the ICU. Methods: In this observational study, patients with a diagnosis of severe sepsis and overt DIC on admission to the ICU were included. DIC was recognized using a scoring system based on the criteria proposed by the International Society on Thrombosis and Haemostasis [1]. The results of thromboelastometry tests (ROTEM) performed on the day of admission (day 1) and then daily for the next 3 days were analysed. Results: Among 51 patients admitted with severe sepsis, 16 (31 %) had a diagnosis of overt DIC. Of the 16 DIC patients, 5 died (NS; DIC scores 5 ± 0.4, 6 ± 1.0, 5 ± 0.0 and 6 ± 0.0 points on days 1, 2, 3 and 4) and 11 survived (S; DIC scores 5 ± 0.4, 4 ± 1.2, 4 ± 2.1 and 3 ± 1.8 points). On admission, major coagulation abnormalities identified by thromboelastometry, indicating a hypocoagulable pattern, were recorded for patients who did not survive: increased CT (p = 0.005 in EXTEM; p = 0.51 in INTEM) and CFT (p = 0.02 in EXTEM; p = 0.05 in INTEM), a lower alpha angle (p = 0.008 in EXTEM; p = 0.02 in INTEM), and decreased MCF (p = 0.03 in EXTEM; p = 0.31 in INTEM), in comparison with the results recorded for survivors. Differences in thromboelastometry parameters between the NS and S groups were also observed on the 2nd, 3rd and 4th days. Fibrinolysis was inhibited in nonsurvivors (LI 60 = 100 % and ML = 0 % in both EXTEM and INTEM) during the entire observation period. Conclusions: Thromboelastometry used as a point-of-care assay made it possible to accurately monitor patients with DIC. The presence of coagulation disorders indicated by thromboelastometry identifies a high-risk subpopulation of critically ill patients.

Introduction: The frequency of early (within the first three days of onset) sepsis-associated coagulopathy (SAC) and its association with clinical outcomes varies depending on the definition(s) of coagulopathy. Furthermore, the frequency of SAC may have decreased with more effective use of sepsis bundles (early antibiotics, fluids, and vasopressors). Accordingly, we sought to determine the frequency and outcome of early SAC.
Methods: We reviewed all patients admitted to the medical-surgical ICU of St. Paul's Hospital, a tertiary care hospital in Vancouver, Canada, from January 2011 to July 2013. We included patients who met the SAC inclusion criteria: sepsis plus a platelet count of less than 150,000 (or a decrease of at least 30 %) and an INR greater than 1.2 within the first three days of onset of sepsis. We assessed the presence and severity of SAC and the association of SAC with hospital mortality and need for vasopressors, ventilation and renal replacement therapy (RRT) in both unadjusted and adjusted analyses. Results: Increasing severity of the combination of abnormal INR and platelets was associated with significantly higher mortality and greater need for vasopressors and RRT, but not ventilation. Increasingly abnormal INR was associated with increasing mortality in a monotonic fashion, whereas only severe thrombocytopenia (platelets < 80,000) was associated with significantly increased mortality. Conclusions: SAC is common (20-35 % of septic patients) and is associated with increased mortality and need for vasopressors, ventilation and renal replacement therapy. Accordingly, there is a need for therapies that decrease the severity of SAC in an attempt to decrease mortality and organ dysfunction in sepsis.

Introduction: Hypercoagulability has been described both in cancer patients and in sepsis, leading to microcirculatory failure and organ dysfunction in the latter. However, this is often missed by standard coagulation tests (SCTs). The aim of this pilot study was to compare hemostasis in cancer patients in the early stage of severe sepsis/septic shock with that of cancer patients without sepsis. Methods: Adult patients operated on for solid tumors (in the first 30 days after surgery) and admitted to the ICU with severe sepsis/septic shock were included in the study group (SG). Patients scheduled for elective surgery for intra-abdominal or pelvic malignancies were included in the control group (CG). Exclusion criteria were: liver disease, chronic kidney failure, hematologic disease, pregnancy, chronic anticoagulant/antiplatelet therapy, and blood derivatives or procoagulant treatments in the last 7 days. In both groups, SCTs, plasma levels of coagulation factors and rotational thromboelastometry (ROTEM®, Germany) were determined in the first 24-36 hours after ICU admission (SG) or just before surgery (CG). The following indices were calculated from the first derivative of the clot firmness curve: maximum velocity (MaxVel), time to maximum velocity of clot formation (t-MaxVel) and area under the curve (AUC) [1]. Results: After Ethics Committee approval, 32 patients were included in the SG and 26 in the CG. Patients in the SG had a lower platelet count (p = 0.001), prolonged SCTs (p < 0.001 for PT and aPTT) and lower factor II, V, VII and X, protein C and protein S (p < 0.001) and antithrombin III (p = 0.015) levels. In the SG, ROTEM showed delayed activation of hemostasis, with prolonged clotting times (p < 0.001) and increased t-MaxVel (p = 0.001). However, clot formation was similar in both groups, with non-significant differences in maximum clot firmness (MCF), MaxVel and AUC. In the SG, patients with APACHE II >= 25 had higher MCF (p = 0.026) and AUC (p = 0.003) than patients with APACHE II < 25.
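The derivative-based ROTEM indices defined in the Methods above (MaxVel, t-MaxVel, AUC) can be reproduced numerically from any clot firmness curve. A minimal sketch follows, using a synthetic curve as a stand-in for ROTEM output; the curve shape and units are assumptions.

```python
# Minimal sketch of the derivative-based indices: the clot firmness curve
# A(t) is differentiated numerically to obtain the maximum velocity of
# clot formation (MaxVel), the time to MaxVel (t-MaxVel) and the area
# under the velocity curve (AUC). The curve below is a synthetic toy.
import numpy as np

t = np.arange(0, 1800, 1.0)                               # time in seconds
firmness = 60 * (1 - np.exp(-(t - 120).clip(0) / 300))    # toy firmness curve, mm

velocity = np.gradient(firmness, t)        # first derivative, mm/s
max_vel = velocity.max()                   # MaxVel
t_max_vel = t[velocity.argmax()]           # t-MaxVel
auc = np.trapz(velocity, t)                # AUC; equals the final firmness here

print(f"MaxVel={max_vel:.3f} mm/s, t-MaxVel={t_max_vel:.0f} s, AUC={auc:.1f} mm")
```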
Conclusions: Our study showed that cancer patients with severe sepsis/septic shock had hypercoagulability similar to that of cancer patients without sepsis after a delayed clot initiation, despite thrombocytopenia and prolonged SCTs. These abnormalities were detected only by dynamic hemostasis measurements and not by SCTs. Firm conclusions will require completion of this pilot study.

Introduction: DIC has a high prevalence among the critically ill, but a specific diagnostic test is lacking. We aimed to assess whether ROTEM is of additional value in discriminating patients with and without DIC and whether it correlates with conventional and experimental markers/tests of DIC. Methods: In ICU patients with a coagulopathy (INR 1.5-3.0), DIC was defined as a score of >= 5 points on the ISTH DIC scale. Concomitantly with conventional coagulation tests (INR, aPTT, fibrinogen, D-dimer and platelet count), ROTEM was performed. In addition, levels of coagulation factors II, V and VII, antithrombin, protein C activity and protein S were determined. Statistics were by Mann-Whitney and Spearman's rho. Results: 23 patients were included, of whom 13 had overt DIC; the majority were admitted to the ICU due to sepsis. Patients with DIC had a lower platelet count and lower levels of fibrinogen and factors II, VII and VIII compared with patients without DIC. The endogenous anticoagulants antithrombin, protein C and protein S were also reduced in DIC patients. Thromboelastometry profiles were more hypocoagulable in DIC patients than in those without DIC: EXTEM CFT: 183 (Table 5). Conclusions: In ICU patients with DIC, ROTEM showed hypocoagulable profiles and correlated with the DIC score and low levels of endogenous anticoagulants. ROTEM may thereby be a useful tool in diagnosing DIC in the critically ill.

Cessation of a preexisting chronic antiplatelet therapy is associated with increased mortality rates in severe sepsis and septic shock

Introduction: Prior use of antiplatelet agents has been associated with a survival benefit in sepsis [1, 2], but it remains unclear whether chronic antiplatelet therapy should be continued during severe sepsis and septic shock. In addition, the subsequent need for red blood cell (RBC) transfusion in patients with continued chronic antiplatelet therapy remains to be determined. We hypothesized that continuation of a preexisting antiplatelet medication (aspirin, clopidogrel, dipyridamole) during sepsis therapy reduces mortality rates in severe sepsis and septic shock without increasing RBC transfusions. Methods: We performed a retrospective, single-center cohort study of intensive care (ICU) patients with severe sepsis or septic shock at the University Hospital of Greifswald, Germany, from January 2010 to December 2013. The local ethics committee approved the study (identifier: BB 133/10) and waived written informed consent because of the anonymous data collection and the quality-assurance and observational character of the study. The administration of antiplatelet agents before and during the ICU stay, the number of transfused RBC concentrates and mortality rates up to 90 days were registered. Categorical data are presented as percentages and counts. Results: Patients whose preexisting antiplatelet therapy was discontinued showed increased mortality rates in contrast to patients whose medication was continued.
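Both thromboelastometry studies above define overt DIC by the ISTH score (>= 5 points). A minimal sketch of that score follows; the platelet, PT and fibrinogen bands follow the published ISTH template, while the D-dimer cut-offs are illustrative assumptions, since the ISTH leaves the fibrin-marker thresholds to the local laboratory.

```python
# Minimal sketch of the ISTH overt-DIC score (>= 5 points = overt DIC).
# D-dimer cut-offs are illustrative assumptions; the other bands follow
# the published ISTH template.

def isth_dic_score(platelets_e9_l, d_dimer_mg_l, pt_prolongation_s, fibrinogen_g_l):
    score = 0
    # platelets (x10^9/L): >100 = 0, 50-100 = 1, <50 = 2
    score += 2 if platelets_e9_l < 50 else 1 if platelets_e9_l <= 100 else 0
    # fibrin-related marker (here D-dimer): none = 0, moderate = 2, strong = 3
    score += 3 if d_dimer_mg_l > 4.0 else 2 if d_dimer_mg_l > 0.5 else 0
    # PT prolongation (s): <3 = 0, 3-6 = 1, >6 = 2
    score += 2 if pt_prolongation_s > 6 else 1 if pt_prolongation_s >= 3 else 0
    # fibrinogen (g/L): >=1.0 = 0, <1.0 = 1
    score += 1 if fibrinogen_g_l < 1.0 else 0
    return score

s = isth_dic_score(45, 5.2, 4, 0.8)
print(s, "-> overt DIC" if s >= 5 else "-> not overt DIC")  # 7 -> overt DIC
```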
Introduction: Neutrophil extracellular traps (NETs) are an immune mechanism that suppresses dissemination of infection via netted chromatin decorated with antibacterial molecules. We previously reported that NETs were not observed within abscesses, but that numerous NETs appeared after abscess drainage. Since closed wound areas are known to be hypoxic, we hypothesized that NET production is affected by oxygen concentration. Methods: Blood samples were obtained from healthy volunteers and neutrophils were purified and stimulated with PMA. Incubation was performed for 4 hours in a CO2 incubator or in a box with an oxygen scavenger. The specimens were stained with SYTOX Orange and examined by immunohistochemistry. We counted the numbers of NETs and analyzed the rates with the Wilcoxon signed-rank test. Results: Five volunteers participated in this study. NETs were produced under both conditions. The NET amount decreased under hypoxia in four of five cases and the average reduction rate was 57.8 % (Fig. 11), although this was not statistically significant (p = 0.13). Conclusions: These results indicate that NET production might decrease at low oxygen concentrations. Wound drainage may promote NET production by providing fresh oxygen to the infected area. Further analysis is needed to clarify the relationship between NET production and oxygen.

Introduction: Administrative data are increasingly used in epidemiological research, but need to be validated for this purpose. We therefore wanted to compare the reported focus of infection in severe sepsis between administrative data and prospective cohort studies with chart review from Germany. Methods: We identified hospital cases with severe sepsis between 2007-2013 from the "DRG-Statistik", a database of nearly all German inpatient cases, by the specific ICD-10 code R65.1 (sepsis with organ dysfunction). The focus of infection was abstracted from the documented diagnoses for each case. For comparison, data on the focus of infection were abstracted from a point prevalence study [1] and a prospective cohort study [2] of severe sepsis in German ICUs. Patients could have more than one focus in all three groups. Results: The relative frequency of the most important foci of infection is reported in Table 6. Conclusions: There are striking differences in the reported foci of infection depending on sampling techniques and the method of data abstraction. To improve the usefulness of administrative data for epidemiological research, further evaluation is needed. The surprisingly high incidence of device-related severe sepsis warrants a closer look.

Table 1. All low-income countries provided fewer than 10 completed responses. Conclusions: Adherence to international guidelines needs to be reinforced at the ICU level. Priorities in middle-income countries should focus on improving 1) full barrier precautions and 2) use of chlorhexidine >0.5 % for skin preparation. Reduction of device exposure through daily assessment of CL need remains a priority in both settings. Almost all respondents consider measurement of CLABSI key to quality improvement; however, less than a quarter actually report their CLABSI rate. Our study was limited by a non-random sample of ICU doctors and nurses.

Introduction: Urinary tract infections are a common problem in ICU patients, with increased mortality, costs and length of hospitalization. The best way to treat urinary tract infections is to prevent them.
The mechanism by which noble metal alloy-coated urinary catheters prevent bacterial adhesion is the generation of a galvanic effect. The aim of this study was to evaluate the benefits of noble metal alloy-coated catheters compared with silicone Foley catheters in patients admitted to our Toxicology Intensive Care Unit for drug poisoning, with short-term catheterization. Methods: We enrolled 120 patients who were randomly assigned to one of two groups: one group received a noble metal alloy-coated catheter (Group 1) and the other received a silicone Foley catheter (Group 2). We excluded all patients with urinary tract precontamination. Full urine examination and urine culture were performed at admission and on day 3 of catheterization. Results: The incidence of bacteriuria was 2 % with the noble metal alloy-coated catheter and 6.6 % with the silicone catheter (p < 0.05) after a mean catheterization time of 3 days. Age over 65 years (odds ratio 6.08) was a significant risk factor for bacteriuria. The Gram-negative bacteria Escherichia coli and Klebsiella pneumoniae were the most common uropathogens. We observed a significant association between urinary tract infection caused by Escherichia coli and female gender (p < 0.05). Conclusions: Noble metal alloy-coated catheters may decrease the incidence of urinary tract infections compared with silicone catheters and may in turn lower the need for antibiotics. We also noticed that the incidence of bacteriuria increased with age in both groups, but remained lower in the noble metal alloy-coated catheter group.

Methods: A retrospective analysis of patients treated in surgical and medical intensive care units with a positive blood culture for Gram-negative rods during 2005-2012 was carried out. Results: 430 cases of Gram-negative rod monobacteremia were found, 77 (17.9 %) of them caused by Acinetobacter spp. There was no difference in gender, age, comorbidities, source of bacteremia or length of stay in the intensive care unit for Acinetobacter spp. bacteremia (P > 0.05). Primary bacteremia was associated with surgical procedures (OR = 4.677, P = 0.03) and previous treatment in other hospital departments (OR = 1.02, P = 0.04). Acinetobacter spp. strains were associated with high resistance to antibiotics (P > 0.05): cefotaxime (n = 75, 97.4 %), ampicillin (n = 74, 96.1 %), cefuroxime and ceftazidime (n = 66, 85.7 %), gentamicin (n = 65, 84.4 %), piperacillin (n = 65, 84.4 %), piperacillin with tazobactam (n = 62, 80.5 %), ciprofloxacin (n = 59, 76.6 %), amikacin (n = 41, 53.2 %), and ampicillin with sulbactam (n = 33, 42.9 %). Low resistance to carbapenems (n = 6, 7.8 %, P = 0.03) was found. Acinetobacter spp. strains were mostly multi-drug-resistant, i.e. resistant to >= 3 classes of antibiotics (n = 74, 96.1 %, P = 0.02), especially in postoperative patients (n = 37, 92.5 %, OR = 4.042, P = 0.04) and those previously treated in other hospital departments (n = 41, 100 %, OR = 4.701, P = 0.03). The overall mortality rate of Acinetobacter spp. bacteremia was 84.4 %. Lethal outcome was associated with mechanical ventilation (n = 65, 100 %, OR = 4.105, P = 0.05), a multi-drug-resistant strain (n = 64, 98.5 %, OR = 1.182, P = 0.013), elderly age (n = 36, 90 %, OR = 1.662, P = 0.016), resistance to cephalosporins (n = 57, 86.4 %, OR = 2.367, P = 0.04), and alcohol abuse (n = 9, 81.8 %, OR = 3.926, P = 0.05).
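The risk-factor results above are expressed as odds ratios. A minimal sketch of an odds ratio with a Woolf (log) 95 % confidence interval from a 2x2 table follows; the counts are invented placeholders, not study data.

```python
# Minimal sketch of an odds ratio with a 95 % CI from a 2x2
# exposure/outcome table. Counts are hypothetical placeholders.
import math

a, b = 37, 3   # exposed: event / no event (e.g. postoperative, MDR strain)
c, d = 23, 14  # unexposed: event / no event

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)   # Woolf standard error on log scale
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR={odds_ratio:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```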
Conclusions: Acinetobacter spp. monobacteremia is usually caused by multi-drug-resistant strains (resistant to >= 3 classes of antibiotics) with low resistance to carbapenems, especially in postoperative patients and those previously treated in other hospital departments. It carries an extremely high overall mortality rate. Mortality was associated with elderly age, alcohol abuse, a multi-drug-resistant strain, resistance to cephalosporins and mechanical ventilation.

Conclusions: Both Group A and Group B streptococcal infections were associated with significant morbidity and mortality. The APACHE II score appeared to predict mortality reliably in patients with Group A streptococcal infections. In patients with Group B streptococcal infections, however, observed mortality was significantly higher than that predicted by the APACHE II score. ICU and hospital lengths of stay were broadly comparable to those of the general ICU population over that period (means 216 and 648 hours respectively).

Introduction: The diagnosis of spontaneous bacterial peritonitis (SBP) is usually investigated by cytobacteriologic analysis. We aimed to evaluate the effectiveness of URITOP+™ urinary strips for the rapid diagnosis of SBP and to emphasize the utility of inoculating blood culture bottles with ascitic fluid for accurate identification of bacteria. Methods: Our study included 44 patients with cirrhosis and tense ascites. Immediately after paracentesis, a urine test strip was applied to the ascitic fluid, visually assessing the presence of leukocytes and nitrites. The reaction between the fluid and the strip induced a color change; the color of the leukocyte reagent area was compared with the color chart on the bottle and the result was scored according to the number of crosses. In addition, biochemical and bacteriological analysis of the ascitic fluid was performed, including bedside inoculation of five milliliters of the fluid into aerobic and anaerobic blood culture bottles. Blood cultures were also obtained from all patients. Results: SBP was diagnosed in 11 of the 44 samples. The URITOP+™ test was positive in 8 cases. Reading of the leukocyte strips revealed one cross, two crosses and three crosses in 15.4 %, 30.8 % and 15.4 % respectively. Nitrites were present in 69.5 % of cases. Blood and ascitic inoculation were positive in 46.2 % and 53.8 % respectively. The major bacteria isolated were Gram-negative bacilli. We found a significant correlation between SBP and visual detection of white cells and nitrites (p < 0.001 for both parameters). A significant correlation was also found between SBP and both ascitic (p = 0.02) and blood (p = 0.006) inoculation. Both visual detection of leukocytes and ascitic inoculation into blood culture bottles had a sensitivity and specificity of 100 %. Conclusions: Our study showed that URITOP+™ is a rapid and effective method for the early diagnosis of SBP. In addition, inoculation of blood culture bottles with ascitic fluid improves the detection of SBP and may consequently improve prognosis.

Introduction: Although it is well known that limb lymphedema is associated with a higher likelihood of developing cellulitis, the

Introduction: Botulism is a rare paralytic illness caused by the action of a potent neurotropic exotoxin produced by Clostridium botulinum. C. botulinum species produce seven serologically distinct toxins, A-G. Human botulism is primarily caused by toxin types A, B or E. In Europe, most wound botulism is now associated with injectable drug use.
Symptoms comprise descending paralysis, bulbar palsies and respiratory failure. Methods: This is a review of the clinical, microbiological and public health aspects of an outbreak of botulism in Scotland from December 2014 to June 2015. Data were collected via prospective patient interview and clinical data collation. Results: 47 cases of wound botulism were reported: 23 probable, 17 confirmed, 5 discounted and 2 remaining possible. The median age was 42 years (range 24-55) and 65 % were male. 98 % of cases presented with bulbar palsy but very few with descending limb weakness. 52 % of patients required surgical drainage of wound-related abscesses and 65 % required mechanical ventilation. All patients received 3 doses of trivalent anti-botulinum toxin therapy within 24 hours of clinical diagnosis. The diagnosis was microbiologically confirmed in 17 cases through the presence of C. botulinum type B neurotoxin detected by molecular assay in tissues and/or by serum bioassay. The C. botulinum strains from 10 cases showed a common profile on fAFLP typing, thereby confirming a link to a common source. All had a recent history of injecting heroin obtained in, or sourced via, Glasgow. The source of infection remains unconfirmed but is thought to be contaminated heroin or a cutting agent. There were four deaths, with botulism contributing to two. Police Scotland was closely involved in risk management through increased drug seizures throughout the region, reducing the supply of potentially 'contaminated' heroin. Public health measures included risk communication by distributing postcards widely to PWIDs via drug agencies and needle exchange centres to increase awareness of signs and symptoms, advising smoking drugs as an alternative form of administration, and advising avoidance of muscle popping. Conclusions: This is the largest outbreak of botulism among PWIDs in Europe. Prompt identification of cases and appropriate clinical management resulted in low mortality among the cases. Interdisciplinary cooperation was integral to management of the outbreak.

Introduction: Extended-spectrum-beta-lactamase-producing Enterobacteriaceae (ESBL-E) strains are considered important, since few antibiotics currently remain active against these bacteria. However, the implications of surveillance of fecal carriers of ESBL-E among ICU patients are unclear. The aim of this study was to determine the efficacy of stool screening for ESBL-E. Methods: We conducted a retrospective cohort study in the ICU of Fukuoka University Hospital from April 2013 to September 2014. The occurrence of infection or colonization with ESBL-E, the antimicrobial use density (AUD) values, and the clinical outcome were investigated and compared between April-December 2013 (phase 1) and January-September 2014 (phase 2). In addition, we routinely performed stool cultures for ESBL-E in phase 2. Results: Of the 1,245 patients admitted to our ICU during the study period, we identified 670 patients in phase 1 and 575 patients in phase 2. The rates of patients either infected or colonized with ESBL-E were significantly higher in phase 2 than in phase 1.

Introduction: ESBL- and carbapenemase-producing Gram-negative pathogens pose a major therapeutic challenge to healthcare providers, both in hospital and in the community. Morbidity, mortality and the cost of healthcare delivery are all increased by the emergence of resistant pathogens.
Therefore, it is important to have good knowledge of the local microbial spectrum and sensitivity profile, so that appropriate prophylactic antibiotics can be administered. This retrospective review was conducted to assess the prevalence of both ESBL and carbapenemase producers among uropathogens in a newly opened tertiary care hospital in Chennai, India. Methods: We retrospectively analysed the urine culture reports in our hospital over a one-year period, from Sep 2014 to Oct 2015. Data on the total requests and samples received, the number of samples testing positive, the organisms isolated and their sensitivity patterns were collected. Results: During this period, the microbiology lab received a total of 1991 urine samples, 77 % (1533) from outpatients and 23 % (458) from inpatients. 18 % of the total samples received (358/1991) were culture positive (19 % of outpatient and 13 % of inpatient samples). 21 % of the outpatient and 63 % of the inpatient isolates were ESBL producers. E. coli and Klebsiella were the commonest organisms isolated (290/358; 81 %). 92 % of the in-hospital and 73 % of the outpatient ESBL isolates were E. coli. 5 % of the in-hospital isolates were carbapenemase producers. Conclusions: The incidence of ESBL and carbapenemase production among in-hospital uropathogens is very high, at 63 % and 5 % respectively. This confirms the global increase in the incidence of ESBLs and carbapenemases. Although carbapenemase production in the community is only 0.6 %, sepsis caused by these isolates is very difficult to treat, with high mortality rates. Appropriate antibiotic use, following a de-escalation strategy, and good antimicrobial stewardship are very important in improving outcomes and preventing the spread of further resistance. Instead of following general antibiotic guidelines, developing robust local guidelines based on microbiological data on the antibiotic sensitivity of the organisms will be helpful in guiding prophylaxis and achieving high rates of clinical cure.

Introduction: Here, we determined whether the Acinetobacter baumannii (A. baumannii) class 2 integron is present in the soil around our hospital, and whether the soil is the cause of increasing numbers of A. baumannii infections in our intensive care unit (ICU) patients. A. baumannii has emerged globally as a significant pathogen in hospitals. It is also present in soil and water. In a previous study, we found that the A. baumannii class 2 integron occurred most frequently. Methods: This cross-sectional prospective study was conducted in two ICUs at Loghman-Hakim Hospital, Tehran, Iran, from November 2012 to March 2013. Patient, soil, and hospital environment samples were collected. All isolates were identified using standard bacteriologic and biochemical methods. The phenotypes and genotypes were characterized. The standard disc diffusion method was used to test antimicrobial susceptibility. Integron identification was performed by multiplex polymerase chain reaction. Results: No A. baumannii bacteria were isolated from the hospital environment or soil, although the bacterium was detected around the lip of one patient. A total of 42 A. baumannii clinical strains were isolated from the lower respiratory tract (n = 20), blood (n = 6), urine (n = 8), and wound catheters and nasal swabs (n = 8). 65 % of the isolates carried class 2 integrons. The strains were 100 % resistant to piperacillin, piperacillin-tazobactam, ceftazidime, ceftriaxone, cotrimoxazole, cefepime, meropenem, and cefotaxime.
However, all of the strains were sensitive to polymyxin B. Conclusions: Further research is necessary to establish the relationship between A. baumannii and soil (especially with regard to its bioremediation), as well as to determine its importance in nosocomial infections and outbreaks in the ICU.

Introduction: Multi-drug-resistant (MDR) bacteria are a worldwide threat, especially for intensive care unit (ICU) patients. Among Gram-negative bacilli, the emergence and spread of extended-spectrum beta-lactamase (ESBL)- and carbapenemase-producing bacteria (CPB) is one of the common causes of morbidity and mortality associated with ICU-acquired infections (ICU AI). The aim of this study was to determine the epidemiology of and risk factors for ESBL and CPB infections, as well as the resistance patterns of these bacteria, in a Tunisian multidisciplinary intensive care unit. Methods: We conducted a retrospective, case-control study including all patients admitted between January and October 2015. ICU AI were defined as those acquired no less than 48 h after ICU admission. We did not include patients without bacteriological evidence of infection. Differences in ICU mortality, length of stay and duration of mechanical ventilation (MV) were tested across patients with and without ICU-acquired MDR infections. We also assessed infected sites and the most frequent bacteria for each group (ESBL and CPB), and looked for risk factors. Results: In the study period, 184 patients were admitted to the ICU, of whom 67 (34.41 %) had ICU AI. Of these 67 patients, 35 had an ESBL infection in 53 isolates, and 21 patients were infected with CPB isolated from 30 cultures. Global mortality in the study period was 44.59 %; the mortality associated with MDR infection was 51.16 %. Mortality with ESBL infection was not much higher than the global rate (48.57 %), whereas CPB infection raised mortality to 71.43 %. Length of stay was significantly longer in MDR-infected patients (18.37 ± 11.6 days) than in non-MDR-infected ones (8.66 ± 4.60 days, p < 0.001). MV duration was 15.85 days (± 12.51 SD) in the MDR group and 6.62 days (± 5.10 SD, p < 0.001) in the non-MDR-infected group. Length of stay and duration of MV were found to be risk factors for acquiring MDR infection in the ICU. Ventilator-associated pneumonia was the most frequent acquired infection in non-MDR-infected patients as well as in both the ESBL and CPB groups. Acinetobacter baumannii was the leading isolate from CPB infections (60 %), followed by Klebsiella pneumoniae (27 %). Among ESBL-producing bacilli, Klebsiella pneumoniae was the most frequently isolated (54.72 %), followed by Escherichia coli and Enterobacter cloacae (15.09 % each). Conclusions: The prevalence of ESBL and CPB is increasing steadily in centers across many countries and is responsible for a large number of hospital-acquired and nosocomial infections, with very few, if any, therapeutic options. Necessary steps to prevent the spread and emergence of resistance should be taken.

The mortality rate was 13 % (n = 4). We found a statistically significant association between the SAPS II score and mortality: higher scores were linked to death (Mann-Whitney U test). We also found an association of higher leukocyte count and prolonged activated partial thromboplastin time (aPTT) at admission with death (Mann-Whitney U test). We found no association between any other laboratory abnormality at admission and death.
We also did not find any association of poor outcome with patients' symptoms, physical examination abnormalities, or renal or ventilatory support (Pearson's chi-squared test). Conclusions: Although MSF is generally considered a benign disease, we had a high ICU admission rate and a considerable mortality rate. We found that the SAPS II score, leukocyte count and aPTT are predictors of death in patients with MSF admitted to the ICU.

Introduction: The Middle East Respiratory Syndrome coronavirus (MERS-CoV) has caused several hospital outbreaks and frequently leads to severe critical illness. To learn from our experience, we describe the response of our intensive care unit (ICU) to a MERS-CoV hospital outbreak. Methods: This observational study was conducted at a 1000-bed tertiary-care hospital in Riyadh, Saudi Arabia, which had a MERS-CoV outbreak in Aug-Sep 2015. Our Intensive Care Department covered 5 ICUs with 60 single-bedded rooms. We described qualitatively and, as applicable, quantitatively the response of intensive care services to the outbreak. The clinical course and outcomes of hospital workers who had MERS were noted. Results: A total of 62 critically ill MERS patients were cohorted in 3 ICUs during the outbreak, with a peak census of 27 MERS patients on Aug 25, 2015 and the last new case on Sep 13, 2015. Most patients had multiorgan failure requiring support. Eight hospital employees had MERS requiring ICU admission, for a median stay of 28 days: 7 developed acute respiratory distress syndrome, 4 were treated with proning, 4 needed continuous renal replacement therapy and one had extracorporeal membrane oxygenation. The hospital mortality of ICU MERS patients was 53 % (0 % for the hospital employees). All MERS patients were admitted to single negative-pressure rooms, which were promptly increased from 14 to 38 rooms. The nurse-to-patient ratio was ~1.2:1. Infection prevention practices were intensified; for example, the consumption of surface disinfectants and hand hygiene gel increased by ~30 % and an average of 17 N95 masks were used per patient per day. Family visits were restricted to 2 hours/day and the ICU physicians communicated with the next of kin by phone daily. During the outbreak, 2 ICU nurses and 1 physician tested positive for MERS-CoV; all had mild disease and recovered fully. Although most ICU staff expressed concerns about acquiring MERS, none refused to report to work. However, 27.0 % of nurses and 18.4 % of physicians working in the MERS ICUs had upper respiratory symptoms but tested negative for MERS-CoV. The total sick leave duration was 138 days for nurses and 30 days for physicians. Conclusions: Our hospital outbreak of MERS-CoV resulted in many patients requiring organ support and prolonged ICU stay. Their mortality rate was high, but lower than previously reported. The response to the outbreak required facility and staff management and proper implementation of infection control and prevention practices.

Methods: We included all cases containing data on age and gender (at minimum) and excluded cases reported as clusters. Data on age, gender, comorbidities, healthcare worker status, contact with camels, contact with a laboratory-confirmed MERS-CoV case, date of symptom onset and date of laboratory confirmation were retrieved and analyzed.
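The surveillance analysis below summarizes onset-to-confirmation delays as a mean (SD) with a 95 % confidence interval. A minimal sketch of that calculation follows, using a normal approximation for the CI; the delay values are invented placeholders, not the WHO data.

```python
# Minimal sketch of the summary statistics reported below: mean, SD and a
# normal-approximation 95 % confidence interval for the mean delay from
# symptom onset to laboratory confirmation. Values are hypothetical.
import math
import statistics

delays_days = [2, 4, 5, 7, 3, 9, 6, 4, 5, 8]   # hypothetical per-case delays
n = len(delays_days)
mean = statistics.mean(delays_days)
sd = statistics.stdev(delays_days)
half_width = 1.96 * sd / math.sqrt(n)          # normal approximation

print(f"mean {mean:.1f} d (SD {sd:.1f}), "
      f"95% CI {mean - half_width:.1f}-{mean + half_width:.1f}")
```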
Results: A total of 219 outbreak news reports were released by the WHO, and 977 cases were included in our study; 68.8 % were males (mean (SD) age 53.2 (17.9) years) and 31.2 % females (mean (SD) age 53.2 (18.4) years). The youngest patient reported was 10 months old and the oldest 109 years old. Comorbidities were reported in 46.9 % (mean (SD) age 60.1 (15.4); 71.8 % male, 28.2 % female). Healthcare workers accounted for 15.6 % (mean (SD) age 38.6 (11.3); 53.3 % male, 46.7 % female; 19.7 % with comorbidities). A history of camel contact was reported in 12.1 % (mean (SD) age 56.7 (15.1); 91.7 % male, 8.3 % female; 77.1 % with comorbidities). Contact with a laboratory-confirmed MERS case was reported in 25.3 % (mean (SD) age 43.9 (17.0); 63.0 % male, 37.0 % female; 36.0 % with comorbidities; 30.5 % healthcare workers). Date of symptom onset and date of laboratory confirmation were reported in 86.2 % and 56.5 % respectively. The overall mean (SD) time from symptom onset to laboratory confirmation was 5.2 (4.2) days (95 % CI, 4.8-5.5). Conclusions: Our study was based on publicly available global surveillance data and has the advantage of providing a large-sample demographic description.

Pneumonia remains a common cause of both admission to and deterioration in the intensive care unit. In many cases no organism is identified, resulting in empiric treatment for the presumed causative pathogen. Even when an organism is identified, the delay in obtaining results is typically 48 to 72 hours. Both these factors can lead to unnecessary antibiotics and, of even greater concern, inappropriate initial therapy. Molecular diagnostic techniques have the potential to address both of these problems. We present here three cases which illustrate the utility of molecular diagnostics using a TaqMan array card (TAC). Three patients admitted to a university hospital general intensive care unit with radiographically confirmed severe pneumonia underwent bronchoscopy and lavage, with the samples analysed by both conventional microscopy and culture and by nucleic acid extraction. The nucleic acid extract was then placed on a TAC able to detect 48 microbial genes using real-time polymerase chain reaction (RT-PCR), directed towards a range of established respiratory pathogens. Results are available with a time-to-threshold value, which gives an indication of the abundance of that gene in the sample. The cases occurred amongst three adults (age range 19-49) who presented with clinical and radiographic evidence of pneumonia. Two cases were community-acquired, with the third being hospital-acquired. The organisms identified by the TAC were Mycoplasma pneumoniae, Aspergillus spp., Cryptosporidium parvum and Pneumocystis jiroveci (PCJ), with one patient having two organisms detected. M. pneumoniae and PCJ were not identified on standard cultures, whilst Cryptosporidium and Aspergillus spp. were both identified on conventional culture/microscopy 5 days after TAC identification. In all three cases, the results of the TAC assay led to a change in management, with institution of new antimicrobials targeting the organisms identified, combined with rationalisation of pre-existing antimicrobial treatments. Discussion: These three cases illustrate how conventional cultures may miss important or unexpected pathogens, or return results late in the course of the disease.
We believe that this technology has the potential to significantly impact the management of critically ill patients with lung infections, and we are currently planning a larger-scale evaluation of its use in critical care.

Introduction: Pneumonia is a major cause of ICU admission and mortality. Prompt investigation facilitates a tailored antimicrobial strategy, guides management, and aids prognostication. The British Thoracic Society (BTS) [1] suggests that specific tests be performed for patients with severe pneumonia: Legionella and pneumococcal urinary antigens, sputum and blood cultures, respiratory viral PCR swabs, and atypical serology testing. We suspected these tests were inconsistently undertaken or delayed in our ICU, and introduced a computerised pneumonia screen, 'BUNS' (Blood cultures and viral serology/Urinary antigens/Nasal +/- endotracheal viral swab/Sputum sample), to investigate this condition consistently. Methods: All patients with a primary diagnosis of pneumonia admitted to a UK district hospital ICU over a one-year period up to 31/10/14 were retrospectively reviewed to determine which investigations had been requested within 24 hours of admission. These were compared to the BTS guideline [1]. We subsequently implemented the 'BUNS pneumonia screen' within our electronic investigation system: a single click auto-generated request labels for all tests in the BTS guidelines. After implementation and staff education, we repeated the data collection between 1/2/15 and 1/11/15. Results: See Table 10. Conclusions: This study has shown that a computerised, auto-generated set of investigation requests, aided by an easily remembered acronym, led to an increased proportion of patients (85 % vs 28 %) receiving prompt and consistent 'gold standard' investigations for pneumonia. We believe this reduced duplication and delay in obtaining diagnostic results, thereby improving patient care. We suggest the 'Bundle of BUNS' could easily be replicated in other ICUs, and a similar system could be introduced for other presenting conditions.

Introduction: Despite the significant impact of nosocomial infections on the morbidity and mortality of patients staying in the intensive care unit (ICU), no study over the past 20 years has focused specifically on pneumonia following secondary peritonitis. Our objective was to determine the epidemiological features and in-hospital mortality of pneumonia in these patients. Methods: A prospective observational study involved 418 consecutive patients admitted to the ICU who had undergone laparotomy for secondary peritonitis. Univariate and multivariate analyses were performed to identify risk factors associated with mortality and with the development of pneumonia. Results: The incidence of pneumonia following secondary peritonitis was 9.6 %. Risk factors associated with the development of pneumonia were hospital-acquired peritonitis, >48 h of mechanical ventilation, and SOFA score. The onset of pneumonia was late in the majority of patients (about 16.8 days after the onset of peritonitis), and the etiological microorganisms responsible were different from those causing the peritonitis. The 90-day in-hospital mortality rate among pneumonia patients was 47.5 %. Independent factors associated with 30- and 90-day in-hospital mortality were pneumonia and SOFA score. Conclusions: Pneumonia in patients who have undergone surgery for secondary peritonitis increases 90-day mortality.
The onset of pneumonia following secondary peritonitis is usually late (about 16.8 days), with etiological microorganisms different from those responsible for the secondary peritonitis.

Introduction: The CURB-65 score is a convenient, well-validated tool to assess disease severity and aid the clinical management of community-acquired pneumonia (CAP) [1]. Recent literature, however, demonstrates that its use is variable and often inaccurate. Failure to use this tool may lead to over- or under-treatment and may result in inappropriate patient admission, thereby increasing healthcare costs [2]. The aim of the initial audit was to determine the frequency of use of the score in emergency medical care and to correlate this with clinical decision-making. The data were analysed and common pitfalls identified. After the implementation of various tools designed to address these issues, a repeat audit was performed to assess for any improvement. Methods: 83 case notes of patients who presented with CAP were retrospectively reviewed between March and May 2014. Based on these data, 3 key interventions were implemented. A teaching session on the calculation and practical application of the CURB-65 score was delivered to all emergency room clinicians. A CURB-65 pro forma was created and implemented in the electronic patient record system used at the hospital. Alongside this, a pre-designed set of electronic antibiotic prescription sets was created and incorporated into the electronic prescribing system in use at the audit site; these accounted for allergies, patient weight and renal/hepatic function. A further 50 case notes of patients presenting with CAP were then retrospectively reviewed during December 2014-Jan 2015 to assess the impact of the above interventions. Results: There was a significant increase in the number of case records containing a documented CURB-65 score after the implementation of the above interventions (43.4 % vs 92.0 %). There was an associated improvement in the accuracy of calculated scores (77.8 % vs 92.0 %). In turn, more patients received appropriate antibiotic therapy (86.7 % vs 100 %). Of note, 6 patients prior to the interventions were admitted despite a calculated CURB-65 score of 0-1, without an alternative reason being documented. After the implementation of these interventions, no patients were admitted with these scores. Conclusions: Staff education, tools to aid accurate CURB-65 calculation and pre-created antibiotic order sets led to improvements in the initial assessment and subsequent management of patients presenting with CAP. These changes should be considered for use on a wider scale.
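The CURB-65 score audited above is straightforward to compute: one point each for Confusion, Urea > 7 mmol/L, Respiratory rate >= 30/min, low Blood pressure (systolic < 90 or diastolic <= 60 mmHg) and age >= 65 years. A minimal sketch follows; the example patient is invented.

```python
# Minimal sketch of the CURB-65 calculation: one point per criterion.

def curb65(confusion, urea_mmol_l, resp_rate, sbp, dbp, age):
    return (int(confusion)                   # C: new-onset confusion
            + int(urea_mmol_l > 7)           # U: urea > 7 mmol/L
            + int(resp_rate >= 30)           # R: respiratory rate >= 30/min
            + int(sbp < 90 or dbp <= 60)     # B: low blood pressure
            + int(age >= 65))                # 65: age >= 65 years

# A hypothetical 70-year-old, alert, urea 8 mmol/L, RR 32, BP 85/50
# scores 4, i.e. severe CAP warranting inpatient care:
print(curb65(False, 8, 32, 85, 50, 70))  # -> 4
```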
Introduction: Community-acquired pneumonia (CAP) guidelines acknowledge respiratory viruses as a cause of CAP, but provide few recommendations regarding diagnosis or treatment due to lack of data [1]. New technology to detect viral pathogens by highly sensitive nucleic acid amplification tests has allowed identification of additional viruses from a small sample of respiratory secretions [2]. Our aim is to describe the incidence of viruses in mechanically ventilated patients with CAP in the ICU.

Results: 266 patients were ventilated for more than two days. The HVAP and PVAP rates per 1000 at-risk ventilation days (total diagnoses) were 5.0 (10) and 2.0 (4) respectively. Concordance between diagnoses was poor (Cohen's kappa -0.02 (95 % CI: -0.04 to -0.01)). HVAP was statistically associated with longer stay and ventilation, in contrast to PVAP.

Introduction: Lung ultrasound (LUS) is useful for VAP diagnosis [1] and monitoring [2]. A dynamic linear/arborescent air-bronchogram is specific for VAP. Methods: 2 patients with suspected VAP were examined by LUS before and after fiberoptic bronchoscopy (FBS), focusing on subpleural consolidations, lobar/hemilobar consolidations and air-bronchogram. Bronchoalveolar lavage (BAL) was performed. Results: Before FBS, the LUS pattern was not specific for VAP: both patients had only bilateral consolidations with no air-bronchogram (Fig. 16a, c). After FBS, the LUS pattern changed. In patient 1, the tissue-like pattern turned into B-lines (Fig. 16b); BAL was negative. In patient 2, the same tissue-like pattern was visualized after FBS, and a dynamic arborescent air-bronchogram appeared (Fig. 16d); BAL grew P. aeruginosa at 10^6 CFU/ml. Conclusions: LUS allows early bedside VAP diagnosis. If the pattern is non-specific, changes after FBS may be useful: reaeration points to atelectasis; persistence of complete loss of aeration with the appearance of a dynamic air-bronchogram points to VAP.

Methods: This retrospective study recruited patients admitted to the Intensive Care Unit of the Anesthesiology Department, Marmara University Hospital (Istanbul, Turkey), from January 2014 to September 2015. Patients' progress was followed until the 28th day after the diagnosis of VAP, when they were considered survivors. Patients who died before the 28th day were non-survivors; patients discharged from the ICU before the 28th day were also considered survivors. The APACHE II score was assessed during the first 24 h of admission; the CPIS and CEPPIS scores were assessed at the onset of VAP (day 1). The SOFA score, serum CRP, serum PCT and pro-BNP were assessed on days 1, 4 and 7 of VAP diagnosis and were correlated with 28-day survival and mortality. Results: A total of 44 patients were enrolled. Of them, 23 (52.2 %) died before day 28 after VAP diagnosis. The SOFA score was significantly lower in survivors at days 1, 4 and 7 compared with nonsurvivors. In terms of APACHE II score, CPIS and cause of admission, the two groups were comparable. There was no significant difference between nonsurvivors and survivors in PCT, CRP or leucocyte count at days 1 and 7. However, the leucocyte count, PCT and CRP levels on day 4 were significantly higher in nonsurvivors than in survivors, and pro-BNP levels at days 4 and 7 were also significantly higher in nonsurvivors. Conclusions: The biomarkers PCT, CRP and pro-BNP can predict mortality in VAP, as can the SOFA score. Although CPIS and CEPPIS have been described as useful scoring systems for the diagnosis of VAP, they did not differentiate between survivors and nonsurvivors. Compared with CPIS, CEPPIS, a score based on chest ultrasonography and procalcitonin levels, may be a better predictor for the diagnosis of VAP.

Introduction: The aim of our retrospective observational study was to test the hypothesis that a correlation exists between pRBC transfusion and the percentage of ventilated patients who developed VAP (%VP) in our combined medical and surgical community-hospital ICU. Methods: From January 2006 to June 2014, 620 patients were admitted to our ICU. We examined the percentage of ventilated patients who developed VAP and the following indices of pRBC transfusion per year from 2006 to 2014.
Indices examined were: total; per patient; per hospitalization day (HD); per ventilated patient (pts V); and per ventilation day (VD). Using the linear correlation method we computed the slope, the correlation coefficient (r) and the coefficient of determination (r2), and by linear regression with an ANOVA test we obtained the p value for %VP against pRBC transfusion. Conclusions: According to our data, there was no statistically significant correlation between the percentage of ventilated patients who developed VAP and pRBC transfusion. Our data suggest that even though pRBC transfusion may have an impact on immunosuppression and the development of infectious disease, its impact on the percentage of ventilated patients who develop VAP is not statistically significant.

Introduction: Ventilator-associated pneumonia (VAP) has a reported incidence of 9-27 % and a strong association with increased ICU length of stay and mortality [1]. Current evidence suggests that the application of a care 'bundle' may reduce the rate of VAP [2]. We aimed to assess the effects of a sequential multifaceted care bundle and audit program on VAP rates in our institution. Methods: The existing care bundle (head-up positioning, daily sedation holds) was supplemented by a 3-step process of interventions between December 2008 and October 2015, each accompanied by a dedicated staff education initiative: 1) guidance on stress ulcer prophylaxis, changing of respiratory circuitry and sedation holds; 2) introduction of a new endotracheal tube with continuous subglottic suction and cuff pressure monitoring; 3) new oral hygiene guidance. Data were analysed using MS Excel and a statistical process control chart designed for non-conformity attribute data with an unequal area of opportunity. Values are expressed as mean and upper control limits (UCLu). Results: A total of 26,480 consecutive ventilator days were assessed. Monthly VAP rates remained stable around a constant mean of 16.45 per 1000 ventilator days between December 2008 and September 2013. Following introduction of the oral hygiene guidance, a statistically significant shift in process occurred, identifying a special cause. This represents a reduction of the monthly VAP rate to 7.44 per 1000 ventilator days (Fig. 17). Conclusions: Introduction of a multidisciplinary stepwise VAP prevention initiative led to a significant reduction in mean monthly VAP rates over time.

The EVADE study: prevention of nosocomial pneumonia (NP) caused by P. aeruginosa with MEDI3902, a novel bispecific monoclonal antibody against P. aeruginosa virulence factors. J. Chastre, B.

Introduction: Pseudomonas aeruginosa is one of the leading causes of nosocomial infections in critically ill patients. The organism produces an array of factors that contribute to its virulence. Two such virulence factors are the Psl exopolysaccharide, which contributes to bacterial persistence, and the PcrV protein, which contributes to cytotoxicity. MEDI3902 is a novel bivalent, bispecific mAb that selectively binds to both factors, thereby inhibiting the cytotoxicity and immune evasion properties of the pathogen. Psl and PcrV target expression is independent of antibiotic susceptibility status; therefore, MEDI3902 also has the potential to be active against multi-drug-resistant strains of P. aeruginosa. MEDI3902 is being developed for the prevention of P. aeruginosa pneumonia. We describe the rationale and design of the EVADE study to investigate the use of MEDI3902 in mechanically ventilated (MV) subjects.
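The yearly correlation analysis in the VAP/pRBC abstract above (slope, r, r2 and a regression p value) maps directly onto an ordinary least-squares fit; for a single predictor, the regression p value coincides with the ANOVA F-test. A minimal sketch follows; the yearly figures are invented placeholders, not the unit's data.

```python
# Minimal sketch of a yearly %VP vs. pRBC-transfusion correlation:
# slope, r, r^2 and the regression p value. Values are hypothetical.
from scipy.stats import linregress

prbc_per_year = [410, 390, 452, 371, 405, 388, 420, 365, 398]         # units/year
vap_pct       = [12.1, 10.4, 13.0, 9.8, 11.5, 10.9, 12.4, 9.5, 11.0]  # %VP/year

fit = linregress(prbc_per_year, vap_pct)
print(f"slope={fit.slope:.4f}, r={fit.rvalue:.2f}, "
      f"r2={fit.rvalue**2:.2f}, p={fit.pvalue:.3f}")
```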
Introduction: Colistin, a last-line therapeutic agent for the treatment of multidrug-resistant gram-negative infections, is limited in its use due to nephrotoxicity. This retrospective cohort study evaluates risk factors for colistin-induced severe acute kidney injury (AKI) in critically ill patients. Methods: Patients who were admitted to a university hospital ICU, were without pre-existing kidney injury and received colistin therapy for 72 or more hours were included. Patient demographics, the source of infection and outcome were collected. AKI was evaluated by using the Introduction: Despite the administration of body-weight standardized starting doses of gentamicin, a wide range in peak concentration (Cpeak) is generally observed in critically ill patients, resulting from variable pharmacokinetics of gentamicin between patients and within a patient over time. This may hamper the efficacy of gentamicin treatment. The aims of this study were 1) to determine the percentage of patients that reached a target Cpeak level of 15-20 mg/L after the first dose as well as after subsequent doses during which therapeutic drug monitoring (TDM) was applied, and 2) to quantify the impact of several patient parameters on the volume of distribution (Vd) of gentamicin, which is correlated with Cpeak. Methods: Blood samples were prospectively collected and analysed by nonlinear mixed-effects modelling (NONMEM) to estimate Vd and Cpeak for every administration during a course of gentamicin therapy. Clinical data and routinely collected serum gentamicin levels from critically ill patients were analysed retrospectively with NONMEM to quantify the impact of patient parameters on gentamicin Vd and Cpeak. Results: With a median starting dose of 4.9 mg/kg, 39 % of Cpeak values were subtherapeutic after the first dose (n = 59 patients). In total, 52 % of 131 Cpeak values were not within the therapeutic range of 15-20 mg/L during the course of therapy. Of the 20 cases with subsequent Cpeak values and no dose adjustment, 6 (30 %) changed from non-therapeutic to therapeutic and 3 (15 %) changed from therapeutic to non-therapeutic Cpeak. Even after performing TDM, 24 % of subsequent doses resulted in subtherapeutic and 24 % in supratherapeutic levels. Analysis was performed retrospectively on 303 gentamicin concentration measurements from 44 critically ill patients receiving 174 doses. Albumin serum level was significantly (p < 0.001) associated with Vd and with Cpeak. Most patients with albumin levels below 15 g/L did not obtain therapeutic Cpeak. Conclusions: 39 % of gentamicin Cpeak values were below the therapeutic range after a first median dose of 4.9 mg/kg in critically ill patients, suggesting that higher doses are indicated. Patients with hypoalbuminaemia are at increased risk of suboptimal gentamicin Cpeak, suggesting that the gentamicin dose should be increased for patients with hypoalbuminaemia. Since gross fluctuations in Cpeak were observed over time within patients, Cpeak should be measured repeatedly when patients receive gentamicin treatment for more than 2 days. Introduction: Cefepime is a widely used antibiotic with significant but under-appreciated neurotoxicity, due to its ability to cross the blood brain barrier and antagonize GABA activity in a concentration-dependent fashion. Data suggest that 15 % of ICU patients treated with cefepime will experience neurotoxicity, with symptoms ranging from encephalopathy and myoclonus to seizures and coma.
Recognition of this adverse event is difficult since these symptoms are commonly experienced by the critically ill. We reviewed the published literature to further characterize this adverse drug event. Methods: A librarian-assisted search was conducted to create a list of all publications on cefepime-associated neurotoxicity from 1980-2015. Our search employed the CINAHL and MEDLINE databases for article identification, and the PRISMA-P protocol was followed to promote robust reporting of data. To determine eligibility for study inclusion, identified articles were independently assessed by two reviewers. Results: Thirty-five identified publications described cefepime-induced neurotoxicity in 137 patients. Patient characteristics included both ICU (17 %) and non-ICU patients (4 %), but patient location was unspecified in most cases (79 %). Symptoms included encephalopathy (75 %), myoclonus (41 %), seizures (11 %) and coma (1 %). Renal dysfunction was present in 93 (68 %) patients, and cefepime dosing was appropriate, excessive or unable to be assessed in 20 %, 41 % and 39 % of cases, respectively. Symptom onset was between 1-15 days (median 4) after initiation of therapy, with resolution within 3 days (median 2) of cessation of cefepime. Emergent hemodialysis was employed in one case, and 55 patients (40 %) required treatment with an antiepileptic medication. Serum levels were collected in 12 patients and exceeded the toxic threshold of 20 mcg/mL in 11 patients (mean 64 mcg/mL). Ninety-one patients (66 %) survived, 13 patients (9 %) died, and the individual outcomes of the remaining patients were unreported. Complete neurologic recovery was reported in 62 patients (45 %) and symptom improvement in 27 (20 %). No patient with reported neurologic recovery improved until cefepime was discontinued. Conclusions: Cefepime-associated neurotoxicity is an under-appreciated adverse drug event that is difficult to recognize in the critically ill. Early recognition of cefepime neurotoxicity is critical and may prevent the potentially devastating consequences of continued drug administration. Introduction: Patients admitted to an intensive care unit have a multifactorial disorder based on diseases in which cascades of immunomodulatory mediators may be released in response to pathogenic organisms. To treat the inflammatory reaction, antibiotic therapy is fundamental [1]. The effects of disease pattern and intensive care measures (e.g. sepsis) on heart rate and its variability are poorly understood. Haran B described that, with a high-resolution ECG, electrical instability during repolarisation could be detected shortly before cardiac arrest even in patients without prior heart pathology, something not recordable with a conventional ECG [2]. We therefore analysed changes in beat-to-beat cardiac activity during antibiotic therapy of intensive care patients with a high-resolution electrocardiogram. The results obtained may offer new insights into the development of alterations in the cardiac electrical activity of critically ill patients due to antibiotic therapy. Methods: The cardiac electrical activity of 14 intensive care unit patients was recorded at a 1000 Hz sampling rate and analysed during their antibiotic therapy. The patients received a Unasyn® infusion, which contains 1 g sulbactam, 2 g ampicillin and 230 mg sodium. Continuous ten-minute recordings were obtained (LabSystem Pro, Bard Electrophysiology, USA);
ten electrodes were fixed on the prepared skin to record leads I, II, III and V1 to V6, and the augmented leads aVR, aVL and aVF were reconstructed according to Einthoven's equation. Results: The results obtained from 14 treatments with Unasyn® demonstrate that, from the onset of the infusion, the QT interval increased by up to 39 ms (p < 0.05). This variation persisted for the first three minutes of therapy and returned to pre-infusion values over the next two minutes. Other ECG data remained unchanged during the time of treatment. Conclusions: Alterations in cardiac electrical activity (QT-interval prolongation) could be detected at the onset of antibiotic treatment with Unasyn®. In another of our studies, the similar antibiotic Tazonam® showed no significant beat-to-beat changes. With regard to the comorbidities of ICU patients, it seems reasonable that changes in cardiac electrical activity might be observed even earlier during their ICU stay. Introduction: Bacteriophages are environmental viruses. Lytic phages have the property of penetrating and multiplying inside their targeted bacteria until the release of new virions. Many empirical clinical data are available from Georgia and Poland, but no randomized controlled clinical trial is available. The Phagoburn study assesses the efficacy and tolerance of two cocktails of bacteriophages against Escherichia coli and Pseudomonas aeruginosa. Methods: Natural phages were isolated from hospital and Paris sewage. They were characterized by phenotype and genotype by Pherecydes-Pharma to exclude transfer of antibiotic resistance genes and lysogenic properties. PP0121 is a cocktail of 12 phages against E. coli, and PP1131 was developed against P. aeruginosa. The cocktails are applied topically in burn patients infected by E. coli or P. aeruginosa, through an alginate gauze (Algosteril) placed onto the infected wounds. The control arm is treated with silver sulfadiazine (standard treatment). The main goal is the comparison of the time to reduce the bacterial load by 2 quadrants on plates from daily eSwabs. 220 patients are expected. All patients are hospitalized in a burn unit to ensure a safe environment. Results: Phagoburn was funded by the European Commission (FP7) in 2013 for 3.8 million Euros. After transfer from research and development to a Master Phage Bank, bioproduction of phages was performed in GMP-like conditions by Clean Cells (Nantes, France). As systemic absorption of phages is supposed, an isolator was necessary to ensure a sterile and apyrogenic solution. The 3 European regulators (the French agency ANSM, AFMPS for Belgium and Swiss-Medic for Switzerland) allowed the first inclusion in July 2015. In December, 15 patients had already been included in 11 investigator centers. Conclusions: One century after the discovery of phages, Phagoburn is the first multicentric randomized controlled trial of phage therapy in humans. Results will be available in 2016. Phages could be an old but innovative weapon in the war against antimicrobial resistance, which could be the leading cause of mortality by 2050. Introduction: The management of invasive candidiasis (IC) remains a major challenge. Delayed antifungal therapy is a known independent mortality factor in septic shock attributed to IC [1]. Thus, empiric antifungal therapy (EAFT) can be indicated in septic patients at risk of IC. However, unwarranted administration of antifungals is implicated in the emergence of resistant Candida strains [2].
The purpose was to evaluate the impact of EAFT on 28-day survival in septic patients without documented Candida infection. Methods: A retrospective cohort study. Two groups of septic patients without a documented fungal infection were compared according to whether or not they were treated with EAFT. Included were patients hospitalized for more than seven days who developed sepsis, even when the origin of the sepsis had not been determined. Excluded were immunocompromised patients and patients treated with an antifungal adapted to a documented fungal infection. The analysis consisted of evaluating the impact of EAFT on 28-day survival. Introduction: Red blood cell (RBC) transfusion is associated with increased morbidity and mortality in the critically ill. We hypothesized that microparticles (MPs) from stored RBC bags can induce endothelial activation, either directly or mediated by immune cells. Methods: MPs were isolated by high-speed centrifugation from RBC transfusion bags. Human umbilical vein endothelial cells (HUVECs) were incubated for 6 hours with supernatant from RBC bags either containing MPs or depleted of MPs, with or without the addition of monocytes or granulocytes. Controls were incubated with PBS as a negative and TNF as a positive control. Expression of adhesion markers was measured by flow cytometry, and markers of coagulation and inflammation were measured in the culture medium. Monocytes were cultured with stained MPs and adherence was studied using confocal microscopy. Results: Supernatant from RBCs containing MPs up-regulated endothelial expression of ICAM and E-selectin by 7.8 [2.8-22.9] % and 2.6 [-1-7.7] %, respectively, compared to baseline. Up-regulation was absent when cells were stimulated with RBC supernatant depleted of MPs. Also, up-regulation depended on the presence of monocytes. MPs adhered to monocytes; this was partly abrogated after co-incubation with anti-CD11b or anti-CD18 antibodies (from 100 to 67 [57-72] % and to 56 [47-67] % of positive cells, respectively). RBC-derived extracellular vesicles (EVs) also strongly induced endothelial shedding of vWF. Conclusions: Up-regulation of endothelial cell adhesion markers and shedding of vWF antigen following RBC transfusion is caused by MPs and requires the presence of monocytes. Thereby, MPs from RBC units induce a pro-inflammatory and pro-coagulant endothelial cell response. Introduction: The development of thrombosis in ICU patients is one of the most frequent causes of unfavourable outcome. The different types of severe illness are accompanied by activation of cellular and biochemical processes. The aim of the study was to investigate the possible effect of cytokines on thrombosis in trauma and neurosurgical patients in the ICU. Methods: Eighty-four patients, 46 with trauma and 38 neurosurgical, were included in the study and divided into two groups according to thrombosis incidence. Demographic and clinical data were recorded and a single blood sample was collected on admission. Interleukins (IL)-6, -8 and -10 and tumor necrosis factor (TNF)-α were determined in serum by ELISA. Statistics were performed with GraphPad 5.0. Results: Patients with thrombosis showed significantly elevated proinflammatory cytokines (TNF-α, p < 0.01; IL-6, p < 0.05) and lower anti-inflammatory IL-10 release (p < 0.05) on admission, compared to those without thrombosis. IL-8 admission levels did not differ between groups (p > 0.05), possibly due to later expression. There were no significant differences in age or hospitalization days between the groups. Thrombosis was found to be correlated with poor outcome (p < 0.05).
Conclusions: The early activation of cytokines in trauma and neurosurgical patients seems to play an important role in thrombosis development and the patient's further outcome. Introduction: The incidence of venous thromboembolism (VTE) in acute burn patients has been reported to range from 0.4-23 % [1]. Standard enoxaparin prophylaxis dosing has been shown to provide anti-Xa levels below goal in a significant proportion of burn patients, resulting in a potential increase in thrombotic risk [2]. The objectives of this study were to evaluate the effectiveness of standard prophylactic enoxaparin dosing within an inpatient burn unit and to identify patient characteristics associated with lower initial anti-Xa levels. Methods: Patients admitted to the burn unit between November 2009 and July 2015 with an appropriately measured anti-Xa level were examined through retrospective chart review. Levels were considered appropriate if drawn 3-5 hours after at least three consecutive enoxaparin doses. Anti-Xa levels ranging from 0.1-0.4 U/mL were considered to be within the goal prophylactic range. Patient demographics and injury data, including age, sex, admission weight and burn percentage of total body surface area (%TBSA), were documented. The incidence of VTE and adverse events, including bleeding events attributed to enoxaparin, were also recorded. Results: Thirty-three acute burn injury patients treated with enoxaparin had at least one measured anti-Xa level during admission. Only 21 of these patients (63.6 %) had an initial anti-Xa level within the specified goal range. An additional 6 patients (18.2 %) were initially subtherapeutic but achieved therapeutic levels with dose adjustment, while 4 patients (12.1 %) never had a measured anti-Xa within the goal range prior to the discontinuation of enoxaparin or before discharge. The median enoxaparin dose required to achieve goal anti-Xa levels was 40 mg every 12 hours. Patients with an initial therapeutic level had an average burn size of 23.2 % TBSA and a mean admission weight of 106.7 kg, while those with an initial subtherapeutic level had an average burn size of 40 % TBSA and a mean admission weight of 114.1 kg. There was one VTE event among the study population, occurring in a patient with therapeutic anti-Xa levels. There were no documented bleeding events leading to the discontinuation of enoxaparin in any of the study participants. Conclusions: Standard enoxaparin dosing was shown to provide initial subtherapeutic anti-Xa levels in many patients with acute burn injury. In the overall study population, the enoxaparin dosing strategy was associated with a low incidence of VTE and bleeding complications. Introduction: The purpose of this study was to determine optimal cut-off values of haemoglobin, platelet count and fibrinogen at 24 hours after injury associated with mortality in trauma patients. Methods: We performed a retrospective analysis of patients who survived more than 24 hours after injury, from the J-OCTET (Japanese Observational study for Coagulation and Thrombolysis in Early Trauma) database. J-OCTET was a retrospective multicenter study investigating disorders of coagulation and thrombolysis in patients with severe trauma. Multivariable logistic regression models were developed to determine optimal cut-off values of hemoglobin, platelet count and fibrinogen at 24 hours after injury. We validated the models internally with bootstrapping to assess potential overfitting.
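Between the Methods and Results above, a minimal sketch of the kind of analysis the J-OCTET authors describe: a multivariable logistic model of mortality on the three 24-hour laboratory values, internally validated with bootstrapping. Everything here is synthetic and illustrative; it is not the study's actual model or data.

```python
# Sketch of multivariable logistic regression with bootstrap validation,
# on synthetic stand-ins for 24-h hemoglobin, platelet count and fibrinogen.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 722                                            # cohort size from the abstract
X = rng.normal([11.0, 180.0, 220.0], [2.0, 60.0, 80.0], size=(n, 3))
y = rng.binomial(1, 0.065, size=n)                 # ~6.5 % mortality, synthetic

model = LogisticRegression(max_iter=1000).fit(X, y)
apparent_auc = roc_auc_score(y, model.predict_proba(X)[:, 1])

boot_aucs = []                                     # optimism check via bootstrap
for _ in range(200):
    idx = rng.integers(0, n, n)                    # resample with replacement
    m = LogisticRegression(max_iter=1000).fit(X[idx], y[idx])
    boot_aucs.append(roc_auc_score(y, m.predict_proba(X)[:, 1]))
print(f"apparent AUC {apparent_auc:.3f}, bootstrap mean {np.mean(boot_aucs):.3f}")
```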
Results: There were 722 trauma patients included, with a median age of 57 years, a median injury severity score of 22, a median revised trauma score of 7.84 and an overall mortality of 6.5 %. The optimal models associated with mortality were hemoglobin < 10. Introduction: In patients with traumatic injuries, bleeding is the most frequent cause of preventable death after severe injury. Methods: The study involved 91 patients who entered the Odessa Regional Hospital with traumatic injuries: concomitant skeletal trauma and fractures of the femur and humerus. Patients were divided into 2 groups: the 1st group (n = 46) received, as treatment, prothrombin complex concentrate (PCC) at a dose of 1 ml/kg (25 IU/kg) and tranexamic acid (TXA) at a loading dose of 1 g over 10 minutes followed by an infusion of 1 g over 8 hours at the time of admission to the intensive care unit (ICU); the 2nd group (n = 45) received fresh frozen plasma (FFP) at a dose of 15 ml/kg and TXA at a loading dose of 1 g over 10 minutes followed by an infusion of 1 g over 8 hours. Evaluation of the functional state of the hemostasis system was carried out using low-frequency piezoelectric thromboelastography (LPTEG) on admission to hospital and 24 hours after the patient's admission to the ICU. Results: According to LPTEG indicators, trauma patients had statistically significant abnormalities in all parts of the hemostatic system: platelet aggregation (intensity of contact coagulation, ICC), coagulation (intensity of coagulation drive, ICD; clot maximum density, MA) and fibrinolytic activity (index of retraction and clot lysis, IRCL). Methods: Our research involved 51 patients with massive postpartum bleeding after Cesarean section who were divided into 2 groups: the 1st group contained 10 patients in whom the scheduled treatment of massive bleeding with coagulopathy was PCC at a dose of 1 ml/kg (25 IU/kg) and packed red blood cells (PRBC); the 2nd group (41 patients) received fresh frozen plasma (FFP) at a dose of 20 ml/kg and PRBC. Assessment of the functional state of the hemostasis system was carried out using low-frequency piezoelectric thromboelastography (LPTEG) on admission to hospital and every 2 hours after admission. Results: According to LPTEG indicators, patients with massive postpartum bleeding had abnormalities in all parts of the hemostatic system: platelet aggregation (intensity of contact coagulation, ICC) was reduced by 45.64 %, coagulation (intensity of coagulation drive, ICD) was 59.32 % below normal, clot maximum density (MA) was reduced by 88.15 % and fibrinolytic activity (index of retraction and clot lysis, IRCL). Introduction: Peri- and postoperative bleeding is a major cause of morbidity and mortality in cardiac surgery. This study evaluates the value of thrombin generation (TG) and a new platelet function analysis in the prediction of haemostatic problems. Methods: We studied 82 patients undergoing cardiac surgery in 2015. Blood samples were collected before surgery (T1) (n = 82) and upon arrival at the ICU (T2) (n = 67). TG was measured by Calibrated Automated Thrombography (CAT). Blood loss was recorded until 24 hours after surgery. Platelet function was assessed by a platelet activation test (PAC-t-UB), which estimates P-selectin expression and binding of fibrinogen to activated alpha IIb beta 3 after platelet activation of the GPVI, PAR-1 and P2Y12 receptors by specific agonists (P2Y12: ADP; PAR-1: TRAP; GPVI: CRP). Patients were divided into a high and a low blood loss group, with a cutoff value of 1 liter during the ICU stay.
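A minimal sketch of the blood-loss dichotomization just described, assuming per-patient arrays of blood loss and a TG parameter; the abstract does not name its statistical test, so a two-sample t-test is assumed here, and the values are hypothetical.

```python
# Split patients at the 1-liter blood-loss cutoff and compare a TG
# parameter (lagtime) between groups; all values are hypothetical.
import numpy as np
from scipy.stats import ttest_ind

blood_loss_ml = np.array([400, 1250, 800, 2100, 950, 1500, 600, 1800])
lagtime_min = np.array([2.4, 3.1, 2.6, 3.3, 2.5, 2.9, 2.3, 3.0])

high = lagtime_min[blood_loss_ml >= 1000]   # >= 1 L during ICU stay
low = lagtime_min[blood_loss_ml < 1000]
t_stat, p_value = ttest_ind(high, low)
print(f"lagtime, high vs low blood loss: t={t_stat:.2f}, p={p_value:.3f}")
```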
Results: The TG parameters lagtime and time-to-peak in pre-operatively sampled whole blood showed significant prolongation in the high blood loss group (2.53 ± 0.52 vs 2.88 ± 0.87, p = 0.027, and 4.67 ± 0.61 vs 5.12 ± 1.00, p = 0.017, respectively). The PAC-t-UB test showed significant differences between groups in P-selectin expression induced by the GPVI, PAR-1 and P2Y12 agonists at T1 (Table 16), but not in fibrinogen binding. Conclusions: Preoperative assessment of TG measured by CAT was predictive of blood loss following cardiac surgery. These findings may help in identifying patients at risk of postoperative bleeding complications. Introduction: Bleeding in severe thrombocytopenia (TCP) varies from mild to severe and may even be life-threatening when accompanied by hemodilution as a result of trauma and subsequent fluid resuscitation. The aim of this study was to evaluate the effect of fibrinogen and activated prothrombin complex concentrate (FEIBA) on clot quality and resistance to fibrinolysis in a model of TCP and hemodilution. Methods: Blood was obtained from healthy volunteers who signed an informed consent form approved by the local research ethics committee. TCP [(16 ± 4) x 10^6 mL^-1] was created by mixing platelet-poor plasma with packed cells. The samples were diluted or not to 60 % with TRIS/saline buffer and subjected to clotting in the presence of tPA. Blood was spiked with fibrinogen (3.0 mg/mL) and/or FEIBA (1 U/mL). Maximum clot firmness (MCF, mm) and lysis onset time (LOT, min) were evaluated using rotational thromboelastometry and presented as mean ± SD. Results: MCF in nondiluted TCP blood was 18 ± 5. Spiking the blood with fibrinogen and FEIBA increased it to 29 ± 4 and 27 ± 7, respectively (P < 0.001). Combined use of these agents increased MCF to 34 ± 4 (P < 0.01). Dilution of TCP blood reduced MCF to 9 ± 5 (P < 0.001). Spiking diluted blood with fibrinogen increased MCF to 15 ± 4 (P < 0.01), spiking with FEIBA increased it to 22 ± 4 (P < 0.01), and the combination of both agents had the greatest effect on MCF, which reached 30 ± 6 (P < 0.01) and did not differ from its level in nondiluted blood following combined spiking. LOT in nondiluted TCP blood was 13.1 ± 5.1. Spiking the blood with fibrinogen had no effect on LOT, while spiking with FEIBA prolonged it to 21.5 ± 7.3 (P < 0.05). Combined use of both agents further increased LOT to 33.0 ± 4.6 (P < 0.01). Dilution of blood, as well as spiking diluted blood with fibrinogen, had no significant effect on LOT, while spiking with FEIBA prolonged it to the same extent as in non-diluted blood (P < 0.05). Compared to FEIBA alone, the combination of both agents slightly prolonged LOT, but the difference did not reach statistical significance. Conclusions: Both in nondiluted and diluted TCP blood: (1) fibrinogen and FEIBA synergistically improve clot quality (firmness); (2) fibrinogen alone does not increase clot resistance to fibrinolysis but potentiates this effect of FEIBA when added in combination. Both agents could serve as an alternative therapeutic approach for bleeding patients with severe TCP. Introduction: Activated clotting time (ACT) is currently used for dose adjustment of heparin during CPB and has been considered the gold standard in this field. However, in difficult cases, such as prolonged cardiopulmonary bypass, consumption coagulopathy may also occur and blood components may be needed. Monitoring ACT alone is not adequate in such cases. The CoaguChek XS system is a portable instrument for monitoring oral anticoagulation therapy.
It determines the PTINR value from a drop of capillary blood, and it has been reported that its accuracy seems to be better at a hematocrit (Ht) >30 %. We compared the PTINR measured by CoaguChek XS (CCINR) with the PTINR analyzed by laboratory test (LabINR) after CPB. Methods: 34 patients who underwent CPB were enrolled. After weaning from CPB and heparin reversal by protamine administration, CCINR, LabINR, ACT, aPTT and Ht were measured. The correlations between CCINR and LabINR, and between ACT and aPTT, were assessed. Results: CCINR correlated well with LabINR (r = 0.81, Y = -0.163 + 1.191 X). Moreover, in the subgroup with Ht <30 % (N = 27, mean Ht 24.6 %), the correlation was considered acceptable (r = 0.706, Y = 0.047 + 1.058 X). On the other hand, the correlation between ACT and aPTT, widely considered "the gold standard", was inferior (r = 0.567, Y = 0.047 + 1.058 X). Conclusions: CCINR is a useful alternative to LabINR for guiding blood component therapy after CPB. Introduction: The International Society on Thrombosis and Haemostasis (ISTH) DIC scoring system has been developed as a simple tool to diagnose DIC. It is based on platelet count, prothrombin time (PT), fibrinogen level and fibrin degradation products (FDP) or D-Dimer results. However, the score is less useful in critical illness, as D-Dimers and FDP are usually elevated due to the underlying disease. Hence neither DIC scoring nor FDP or D-Dimer measurements are routinely performed in the ICU. Triggers to request D-Dimer measurements have not been established. Methods: This retrospective study included ICU patients admitted to the RLUH over a 4-year period. Demographic details were collected at admission along with routine blood results for the first 7 days. A truncated DIC score was calculated for all patients who had coagulation tests done for 7 days: platelet count >100 x 10^9/L = 0 points, >50 to <100 x 10^9/L = 1 point, <50 x 10^9/L = 2 points; PT <3 s prolonged = 0 points, >3 s but <6 s = 1 point, >6 s = 2 points; fibrinogen >1.5 g/L = 0 points, <1.5 g/L = 1 point. Patients with a truncated DIC score of ≥2 were considered at high risk for DIC. In this patient cohort, highly elevated D-Dimers would establish a diagnosis of overt DIC. Patients admitted after elective surgery and those who stayed in the ICU <7 days were excluded. Results: 464 patients were included in the analysis. ICU mortality was 25.4 %. Mean APACHE II was 19.8 ± 7.1. 272 patients had a maximum DIC score of 0 (167 patients) or 1 (105 patients) and were not at risk of overt DIC. However, 192 patients had a maximum DIC score of ≥2, with 14 patients displaying a DIC score of 5 even without D-Dimer results available. In none of the patients with a DIC score of ≥2 were D-Dimers or FDPs analysed. Only one patient with a DIC score of ≥2 had a normal platelet count, and two patients showed mild thrombocytopenia. All patients with severe thrombocytopenia and 73 of 122 patients with moderate thrombocytopenia had a DIC score of ≥2. Conclusions: DIC, although associated with outcome, remains underdiagnosed in critically ill patients. In patients with moderate or severe thrombocytopenia a truncated DIC score should be calculated. If the score is between 2 and 4, measurement of D-Dimers or FDP should be requested to fully establish the diagnosis of DIC. Introduction: Diagnosing thromboembolic disease represents a challenge in older patients with COPD. The ADJUST-PE study recently validated a new age-adjusted D-dimer cutoff (age x 10) in patients older than 50 years.
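Two small helpers follow, implementing the truncated DIC score exactly as defined in the abstract above and the ADJUST-PE age-adjusted D-dimer cutoff just introduced; the function names and example values are illustrative only.

```python
def truncated_dic_score(platelets_e9_per_l, pt_prolongation_s, fibrinogen_g_per_l):
    """Truncated ISTH DIC score (no D-dimer/FDP component), per the abstract."""
    score = 0
    if platelets_e9_per_l < 50:        # <50 x 10^9/L = 2 points
        score += 2
    elif platelets_e9_per_l < 100:     # 50-100 x 10^9/L = 1 point
        score += 1
    if pt_prolongation_s > 6:          # PT >6 s prolonged = 2 points
        score += 2
    elif pt_prolongation_s > 3:        # PT 3-6 s prolonged = 1 point
        score += 1
    if fibrinogen_g_per_l < 1.5:       # fibrinogen <1.5 g/L = 1 point
        score += 1
    return score                       # >=2 flags high risk; D-dimers/FDP confirm

def age_adjusted_ddimer_cutoff_ug_per_l(age_years):
    """ADJUST-PE cutoff: age x 10 ug/L above age 50, else the usual 500 ug/L."""
    return age_years * 10 if age_years > 50 else 500

print(truncated_dic_score(45, 4.5, 1.2))        # -> 4 (high risk)
print(age_adjusted_ddimer_cutoff_ug_per_l(78))  # -> 780
```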
However, when the results of that study are examined closely, the validity of this cutoff in patients with COPD appears unreliable. The objective of the current study was to specifically validate or invalidate the use of the age-adjusted D-dimer cutoff in patients with COPD aged 50 or older. Methods: Files from patients who visited the emergency department of Hôpital Sacré-Coeur de Montréal between March 2008 and May 2014 were retrospectively reviewed. Patients aged 50 and older with COPD and with a non-high pretest probability of thromboembolic disease were included. These patients all had a D-dimer measurement and a radiologic exam (pulmonary angioscan, ventilation/perfusion lung scan or lower extremity Doppler). The results are presented as proportions with 95 % confidence intervals (95 % CI) without continuity correction, together with sensitivity measurements. Results: 195 patients were included according to those criteria. 5 were excluded from the analysis because of a non-confirmed diagnosis. Among the 190 patients analyzed, 15 had a diagnosis of thromboembolic disease, giving a prevalence of 7.89 % (95 % CI 4.49-12.69 %). Of these, 4 had a negative D-dimer level with the conventional cutoff (<500 μg/L) and 15 (all of them) had a negative D-dimer level with the age-adjusted cutoff. The sensitivity of the conventional cutoff was 73.33 % (95 % CI 44.90-92.21 %) and the sensitivity of the age-adjusted cutoff was 0 % (95 % CI 0.00-21.80 %). Conclusions: The new age-adjusted D-dimer cutoff cannot be used safely in patients with COPD older than 50 years who have a non-high pretest probability of thromboembolic disease. Introduction: The aim of this study was to evaluate the impact of low hemoglobin (<7 g/dl) during cardiopulmonary bypass (CPB), caused by hemodilution, on postoperative mortality and morbidity. Methods: A total of 667 patients who underwent elective cardiac surgery procedures with the use of CPB in our department during a 2-year period were retrospectively studied. The lowest value of hemoglobin (nadir Hb) throughout CPB was recorded. Group A consisted of patients with Hb < 7 g/dl, while group B was the control group. The following perioperative outcome indices were compared between the 2 groups: increase in lactate (Lac) value at the end of CPB, acute kidney injury (AKI) defined by RIFLE criteria, postoperative use of non-invasive ventilation (NIV), prolonged mechanical ventilation (>48 hours), perioperative myocardial infarction (PMI), stroke, total red blood cell transfusion >3 units, atrial fibrillation (AF) and mortality. Statistical analysis was based on the chi-square test. Results: Group A comprised 162 patients and the control group 505. Results are shown in Table 17. Conclusions: The optimal hemoglobin during CPB has not been defined. Severe hemodilutional anemia (Hb < 7 g/dl) showed a statistically significant correlation with an increased Lac value at the end of CPB, postoperative AKI, use of NIV, prolonged mechanical ventilation and transfusion of >3 red blood cell units, and a strong correlation with mortality. Introduction: Some studies have found a correlation between increased RDW and mortality in ICU patients. However, RDW values can be increased by red blood cell transfusion (RBCT), erythropoiesis or hemolysis. In this study we investigated whether the prognostic value of RDW was influenced by RBCT in ICU patients. Methods: All patients admitted to the ICU over 8 months were enrolled in the study. Patients were analyzed in two groups (A if no RBCT was given and B if RBCT was received before ICU admission).
We measured RDW, reticulocyte index (RI) and haptoglobin on admission and 48 h thereafter. RDW was categorized in tertiles: <14 %; 14.1-15.8 %; >15.8 %. We collected 28-day mortality, changes in SOFA score and length of stay in the ICU. Results: 102 patients were enrolled (64 in group A and 38 in group B). Clinical variables are shown in Fig. 25. Cox regression analysis showed that high RDW values were significantly associated with 28-day mortality (Fig. 26). Patients in group B had a higher RDW level on admission (16.7 ± 2 % vs. 15.1 ± 2.2 %; p = 0.03). Nevertheless, the accuracy of RDW in predicting mortality was similar in the two groups (area under the ROC curve 0.697 for group A and 0.711 for group B). Patients with higher RDW also had a lower RI but similar levels of haptoglobin compared to the others. Conclusions: Among critically ill patients, RBCT was associated with higher RDW, but this did not influence the prognostic value of RDW. High RDW values could not be secondary to increased erythropoiesis, as suggested by the low RI. Reasons for admission in the paediatric intensive care unit and the need for blood and blood products transfusions Introduction: An important part of the puzzle of treating Pediatric Intensive Care Unit (PICU) patients (pts) is the transfusion of blood and blood products in order to stabilize critically ill children and improve their outcome. The purpose of the present study is to investigate the reasons for admission and the need for blood product transfusions in a cohort of PICU pts, and in specific diagnostic categories (sepsis, traumatic brain injuries, etc.). Methods: We retrospectively studied all consecutive admissions to the PICU during a two-year period (from 01.01.2012 to 31.12.2013). Data were collected through chart reviews for the following parameters: demographics, reason for admission, underlying diseases, treatment characteristics, blood and blood product transfusions, and outcome of the patients. The criteria for transfusion followed the Surviving Sepsis Campaign 2012 guidelines for the pediatric population and the 2013 European guidelines for the management of bleeding and coagulopathy following major trauma. In special cases the criterion for transfusion was clinically significant hemorrhage. The definitions of transfusion units were equivalent to 10 ml/kg for packed red cells (RPC) and fresh frozen plasma (FFP), and 0.1 unit/kg for platelets (PLT) and cryoprecipitate (CP). Results: A total of 233 PICU patients (124 boys and 109 girls) were recorded. There were 53 children with respiratory failure, 39 children with pathology of the nervous system, 31 with sepsis (mainly admitted from the pediatric oncology department), 24 with traumatic brain injuries, and others with miscellaneous reasons for admission. Almost 25 % of the children with respiratory and neurologic failure suffered from underlying diseases. A total of 1795 units (U) of transfusions were given, grouped as follows: RPC (385 U), FFP (453 U), PLT (804 U), CP (153 U). Conclusions: The most common reasons for admission to the pediatric intensive care unit were respiratory failure and pathology of the nervous system. These critically ill children presented on many occasions with anemia and thrombocytopenia, hemodynamic instability, and clotting and bleeding disorders (while they were infected), and the administration of blood and blood product derivatives was paramount for them.
Immediate recognition of transfusion indications and early, prompt intervention lead to better and faster recovery of these small patients. The implementation of a massive haemorrhage protocol (MHP) for the management of major trauma: a ten-year, single-centre study R. Mothukuri, C. Battle Methods: Trauma patients were studied, using the data collated for the Trauma Audit and Research Network. Variables included sex, age, Injury Severity Score (ISS), PS14 code (probability of survival), injury mechanism, use of tranexamic acid (TXA), volume of fluids/blood components given at the scene or in the ED, in-hospital mortality, and ICU and hospital length of stay. Patients were divided into two groups: patients admitted pre-MHP and patients admitted post-MHP introduction. Differences were analysed using Fisher's exact test and the Mann-Whitney U test. Results: A total of 731 patients were included, 376 in the pre-MHP and 355 in the post-MHP group. There were no significant differences in age, sex, ISS, PS14 code, injury mechanism or mortality between the two groups. The use of TXA was significantly higher in the post-MHP group (p < 0.001). The median amount of fluids given at the scene and in the ED was significantly higher (1.5 litres versus 1.0 litre) in the pre-MHP group (p < 0.001). A significantly higher number of units of prothrombin complex concentrate (PCC) and fresh frozen plasma (FFP) was given in the post-MHP group in the ED (both p < 0.05). There was no significant difference in the number of units of blood/plasma-reduced cells given in the ED between the two groups. Hospital and ICU length of stay were significantly reduced following the introduction of the MHP (both p < 0.001). Conclusions: The results of this study demonstrate that a well-defined MHP may contribute to reducing both ICU and hospital length of stay. In the post-MHP group, patients received significantly lower amounts of fluids at the scene and in the ED and there was a significant increase in the use of TXA. A significant increase in the amount of PCC and FFP given to patients in the ED was reported, with no associated increase in blood/plasma-reduced cells. Introduction: Irrespective of aetiology, major haemorrhage is associated with poor patient outcome and its management presents considerable clinical and logistic challenges. The introduction of in-hospital major haemorrhage protocols, following damage control resuscitation (DCR) principles, has demonstrated improved patient outcome and blood product delivery with reduced wastage of blood products [1, 2]. However, simple adaptation of hospital-derived protocols is inadequate for use by retrieval teams, and a tailored approach is required to address the specific challenges encountered. Methods: A protocol for major haemorrhage management by retrieval services was developed, based on the best available evidence. Within the protocol, integrated guidance was provided for retrieval teams and clinical coordinators. The clinical challenge of inter-mission variability in blood product availability was addressed by the use of two regimes within one retrieval team algorithm. The "Primary" regime is based upon the resources carried by the retrieval team and represents a minimum standard of major haemorrhage management for all patients. The "Inter-Facility Transfer" (IFT) regime provides guidance on delivering hospital-standard DCR, where possible, using locally available resources. The logistic challenges encountered by retrieval teams are extensive.
The retrieval clinical coordinator is instrumental in reducing delays, maximising the use of resources and maintaining handover of information. Their role was clearly defined and a checklist was developed in order to avoid omissions. The protocol was adopted by Retrieval Services Queensland for state-wide use by fixed and rotary wing retrieval teams. Conclusions: This novel protocol addresses many of the specific clinical and logistic challenges faced by retrieval teams. The protocol contains specific guidance yet provides flexibility to encompass the variety of major haemorrhage scenarios encountered, which may range from roadside polytrauma to the inter-facility transfer of an intensive care unit patient. The goal is a seamless transition of care with ongoing damage control resuscitation from the point of referral, during transfer, and upon arrival at the receiving centre. The impact of transfusion thresholds on mortality and cardiovascular events in patients with cardiovascular disease (non-cardiac surgery): a systematic review and meta-analysis Conclusions: Restrictive transfusion practice is associated with higher rates of acute coronary syndrome (ACS) in patients with cardiovascular disease. This supports the uncertainty highlighted in current guidelines and indicates the need for further research into best practice for this group. The relationship between poor pre-operative immune status and outcome from cardiac surgery is specific to the peri-operative antigenic threat S. Introduction: We have previously demonstrated that there is an inverse relationship between pre-operative levels of endotoxin core antibodies (EndoCAb) and the development of complications following cardiac surgery [1, 2]. It is not known whether this relationship is due to a protective effect against specific antigens or whether it is a reflection of generalised hypoimmunity. During the peri-operative period patients are at risk of exposure to endotoxin and staphylococcus, with an incidence of surgical site infection of approximately 1-3 %. Methods: 62 patients scheduled to undergo first-time aortic valve replacement had serum analysed for levels of EndoCAb, antibodies to staphylococcus (teichoic acid, α-toxin) and also to varicella as a non-specific immune measure. The primary outcome variable was post-operative length of stay, with secondary measures of the incidence of post-operative infection. All patients were risk scored by means of EuroSCORE. Conclusions: An inverse relationship was observed, as previously, between pre-operative EndoCAb levels and outcome. A similar relationship was also seen between anti-staphylococcal antibody levels and outcome, with a more pronounced infective component. The absence of a relationship with varicella antibody levels may imply from this study that the low pre-operative immunity relating to post-operative morbidity is specific to the peri-operative threat. Introduction: In view of the adverse consequences of post-operative atrial fibrillation (POAF), encompassing hemodynamic deterioration, thromboembolic hazards, an intolerably fast rate and mortality, preventing POAF seems an ideal goal. The incidence of POAF events is reported to be 30 % and 40 % in coronary artery bypass and valvular surgery, respectively [1, 2]. We aimed to assess the impact of a simple clinical practice guideline (CPG) for the prevention of POAF events and to highlight the incidence in an Asian population.
Methods: In a single-center retrospective study conducted over one year, we enrolled all patients undergoing cardiac surgery (267); patients with existing or prior AF were excluded. Patients were divided into two groups: group I before and group II after implementation of the CPG. Results: Both groups were matched regarding age, gender, smoking history, preoperative ejection fraction, EuroSCORE, urgency of surgery and underlying surgery type (valvular or coronary). There was a statistically significant difference in POAF events: 14.9 % in group I vs 8.3 % in group II (P = 0.05). The overall rate of POAF in all patients was 11.4 %, which is below the rate in European centers [1]. Patients with POAF required significantly more inotropic support (p = 0.04); in addition, POAF was associated with a prolonged length of mechanical ventilation and ICU stay (p = 0.001 and 0.02, respectively). Conclusions: The argument for rhythm control in POAF favors preventive measures that reduce the post-operative transition out of sinus rhythm, with the expected favorable effects of lessening blood stasis and the chance of left atrial thrombosis. Introduction: The aim of this study was to investigate any beneficial effect on postoperative mortality and morbidity when a single dose of dexamethasone is administered during cardiopulmonary bypass (CPB) in cardiac surgery patients. Methods: A total of 782 patients who underwent elective cardiac surgery procedures with the use of CPB in our department during a 28-month period were retrospectively studied. In 499 patients a single dose of 8 mg dexamethasone was given during CPB (group A), while it was not administered in 283 patients (group B). The following factors were compared between the 2 groups: low cardiac output syndrome (LCOS), prolonged mechanical ventilation (>48 hours), septicaemia, postoperative stroke, acute kidney injury (AKI) defined by RIFLE criteria, AKI requiring dialysis treatment, atrial fibrillation (AF), sternal wound infections and mortality. Statistical analysis was based on the chi-square test. Results: Group A consisted of 499 pts, mean age 65.38 ± 9.8, mean EuroSCORE II 1.86 ± 1.9; group B consisted of 283 pts, mean age 65.33 ± 11.7, mean EuroSCORE II 1.7 ± 1.4. Results are shown in Table 18. Conclusions: Dexamethasone administration during CPB had no protective impact on postoperative LCOS, septicemia, stroke, AKI, atrial fibrillation, sternal infections or mortality. There was a tendency towards protection against prolonged mechanical ventilation and the severe form of AKI requiring dialysis, but this was not statistically significant. Introduction: The intra-aortic balloon pump (IABP) is used in a variety of clinical settings in which myocardial function is reduced. In cardiac surgery, its role in clinical outcomes is debated due to the conflicting results of retrospective analyses and the limitations of a recent prospective study. The IABCS study aims to evaluate the effectiveness of prophylactic IABP in high-risk patients undergoing cardiac surgery. Methods: The IABCS study is a prospective, single-center, randomized controlled trial in high-risk patients scheduled for elective cardiac surgery at the Heart Institute/University of São Paulo. Inclusion criteria were additive EuroSCORE ≥6 or left ventricular ejection fraction (LVEF) ≤40 %. Eligible patients were randomly assigned, in a 1:1 ratio, to the IABP group or the control group.
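The abstract states only that assignment was random in a 1:1 ratio; as an illustration, a minimal sketch of one common way to generate such a sequence (permuted blocks, which is an assumption here, not the trial's stated method):

```python
import random

def permuted_block_allocation(n_patients, block_size=4, seed=2014):
    """1:1 allocation via permuted blocks; block size 4 is an assumption."""
    assert block_size % 2 == 0, "block size must be even for a 1:1 ratio"
    rng = random.Random(seed)
    sequence = []
    while len(sequence) < n_patients:
        block = ["IABP", "control"] * (block_size // 2)
        rng.shuffle(block)             # randomize order within each block
        sequence.extend(block)
    return sequence[:n_patients]

print(permuted_block_allocation(116)[:8])  # first 8 assignments
```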
Removal of the IABP catheter was accomplished 24 hours after the procedure under the following circumstances: cardiac index ≥2.2 L/min/m2 and dobutamine infusion dose ≤5 μg/kg/min. The catheter was immediately removed if a severe adverse event related to the procedure was detected. The primary outcome was the composite endpoint of mortality and major morbidity within 30 days after cardiac surgery, according to the modified Society of Thoracic Surgeons definition, which included: prolonged mechanical ventilation (>24 hours), stroke, mediastinitis, need for reoperation, cardiogenic shock, and acute renal failure. Results: A total of 116 patients were enrolled from April 2014 to September 2015. Fifty-two patients were assigned to the IABP group and 64 patients to the control group. The mean age was 64 ± 8 years in the IABP group and 67 ± 9 years in the control group (P = 0.06). The median LVEF was 40 % (31-45) in the IABP group and 40 % (35-55) in the control group (P = 0.873) and the median EuroSCORE was 6 (4-7) vs. 6 (4-7), P = 0.873, respectively. The primary outcome was observed in 40.4 % in the IABP group and 37.5 % in the control group (P = 0.751). Introduction: Acute kidney injury (AKI) after cardiovascular surgery is associated with an increase in morbidity and mortality. Atrial natriuretic peptide (ANP) is a potent endogenous natriuretic, diuretic and vasorelaxant peptide. However, its effectiveness in AKI is uncertain. The objective of this study was to evaluate the effects of ANP on renal function and medical costs in patients with AKI undergoing cardiovascular surgery. Methods: We conducted a multicenter prospective randomized placebo-controlled study in patients with AKI who underwent cardiovascular surgery. The definition of AKI used in this study was an increase in serum creatinine level ≥0.3 mg/dl from the preoperative level within 48 h after surgery. The patients were randomly assigned to receive a continuous infusion of low-dose ANP (0.02 μg/kg/min) or placebo. The infusion of ANP or placebo continued until the serum creatinine level decreased to the preoperative level. The primary endpoints were 1) changes in renal function during 90 days, measured by serum levels of creatinine and cystatin C and by creatinine clearance or estimated glomerular filtration rate, and 2) the need for renal replacement therapy during 90 days. The secondary endpoints were 1) length of ICU stay, 2) length of hospital stay, and 3) medical costs during 90 days. Results: Of the 77 enrolled patients, 37 were assigned to the ANP group (median age 72 yrs, male:female 24:13) and 40 to the placebo group (median age 74 yrs, male:female 31:9). ANP infusion did not significantly increase creatinine clearance or estimated glomerular filtration rate compared to placebo. There was no significant difference in serum levels of creatinine and cystatin C, renal replacement therapy rate (1 of 37 patients in the ANP group vs 3 of 40 patients in the placebo group), length of ICU and hospital stay, or medical costs over 90 days. Conclusions: Low-dose ANP infusion showed neither a renoprotective nor a cost-saving effect in the treatment of cardiac surgery-associated AKI. Clinical trial registration number: UMIN 000006812.
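A minimal sketch of the AKI definition used in the ANP study above (a rise in serum creatinine of at least 0.3 mg/dl above the preoperative value within 48 h of surgery); the data structure and values are illustrative assumptions.

```python
from datetime import datetime, timedelta

def meets_aki_definition(preop_creatinine_mg_dl, postop_series, surgery_end):
    """postop_series: iterable of (timestamp, creatinine in mg/dl) pairs."""
    window_end = surgery_end + timedelta(hours=48)
    return any(t <= window_end and cr - preop_creatinine_mg_dl >= 0.3
               for t, cr in postop_series)

surgery = datetime(2015, 3, 1, 14, 0)                     # hypothetical case
labs = [(surgery + timedelta(hours=12), 1.1),
        (surgery + timedelta(hours=36), 1.4)]
print(meets_aki_definition(1.0, labs, surgery))           # True: +0.4 at 36 h
```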
Acute kidney injury influence on high-sensitivity troponin measurements after cardiac surgery Introduction: The risk assessment value of cardiac troponin and other cardiac biomarkers in end-stage renal disease is not equivalent, and clinical decision making based on cardiac biomarkers in patients with renal disease needs justification in relation to patient management or outcomes [1]. Long-term outcome could be influenced by acute kidney injury (AKI) in cardiac surgery [2], but cardiac troponins need exploration in these settings. We aimed to assess the diagnostic performance of high-sensitivity troponin T (hsTnT) in the setting of cardiac surgery-induced AKI. Methods: A single-center observational retrospective study. A database was available for all patients. Based on the AKI Network definition, patients were divided into 2 groups: group I without AKI (259 patients) and group II with AKI (100 patients), in whom serial hsTnT and creatine kinase (CK)-MB measurements were followed. The two groups were compared and statistically analyzed. We enrolled 359 patients; patients with ESRD were excluded. Results: A total of 359 patients were enrolled, with a mean age of 55.1 ± 10.2 years. Both groups were matched regarding age, gender, body mass index, associated diabetes or hypertension, and EuroSCORE. The AKI group had significantly higher mortality: 6 % vs 1.7 % in group I (p = 0.026). Significant differences in hsTnT between the two groups persisted over the whole course, unparalleled by those of CK-MB. The AKI group was more often associated with heart failure (17.9 % vs 4.9 %, p < 0.001) and post-operative atrial fibrillation (12.4 % vs 2.9 %, p = 0.005). Lengths of ventilation and of ICU and hospital stay were significantly greater in the AKI group. Conclusions: Unlike the CK-MB profile, hsTnT showed significant differences between the two groups over the whole course, denoting possible delayed clearance in patients with AKI, which needs to be taken into consideration when interpreting post-operative myocardial events. Complex evaluation of endothelial dysfunction markers for prognosis of outcomes in patients undergoing cardiac surgery Introduction: The specific roles of endothelin include vasoconstriction, cardiac hypertrophy and remodeling [1]. It is reasonable to believe that the evaluation of endothelial dysfunction has diagnostic and prognostic value for outcome [2]. Methods: In a prospective trial (30 males aged 54 ± 5 years), a method for the prognosis of outcomes was examined. The dynamics of endothelin-1 (ET-1) levels and of nitric oxide metabolites were analyzed in the perioperative period, and we studied their correlations with clinical and laboratory data. Statistical analysis was carried out in SPSS 17.0. Results: Group I included 5 patients with multiple organ dysfunction syndrome in the early postoperative period. Group II included 25 patients with an uncomplicated postoperative course (Table 19). A high concentration of ET-1 and a low level of nitric oxide metabolites in the perioperative period indicate the predominance of a vasoconstrictor effect, which in turn leads to vascular wall damage and hypoperfusion of organs and tissues. Conclusions: High levels of ET-1 and an imbalance of nitric oxide metabolites, which characterize the functional state of the endothelium, are predictors of postoperative morbidity in cardiac surgery. New-onset atrial fibrillation in intensive care: incidence, management and outcome Introduction: Atrial fibrillation (AF) is a common arrhythmia in critically ill patients.
However, data evaluating AF in the non-cardiac intensive care setting are limited. The objective of this study was to describe the incidence, management and outcome of new-onset AF in patients admitted to non-cardiac intensive care. Methods: We performed a retrospective review of consecutive patients admitted to the intensive care unit (ICU) of a district general hospital over a 10-month period. Patients with pre-existing AF and ICU admissions for routine postoperative monitoring were excluded. Results: The study population consisted of 330 critically ill patients admitted to the ICU during the 10-month period. The incidence of new-onset AF was 10 % (n = 33). 55 % of patients with new-onset AF were male and the mean age was 73 ± 11 years. The medical notes were available for review for 26/33 patients. Patients with new-onset AF frequently had evidence of sepsis (81 %), respiratory failure (62 %), circulatory shock (38 %) and acute kidney injury (31 %). The mean ICU day on which AF occurred was 2.6 (range 1-10). A rhythm control strategy was used as first-line therapy for the majority of patients: 81 % received intravenous amiodarone and 12 % underwent electrical cardioversion for haemodynamic instability. 23 % received beta-blockers and 23 % digoxin as second-line therapy. 69 % of patients were anticoagulated with a heparin infusion or therapeutic-dose low-molecular-weight heparin. 31 % remained in AF at the time of ICU discharge. New-onset AF was associated with a longer ICU length of stay (median 6 days versus 3 days, p = 0.04) but not with hospital mortality. Conclusions: New-onset AF occurs in 10 % of patients admitted to non-cardiac intensive care and is associated with a longer ICU length of stay. Further multicentre studies are needed to determine the optimal management and anticoagulation strategies for new-onset AF in critically ill patients. Methods: Pulmonary hypertension was unexpectedly observed as a complication after intravenous infusion of a contrast agent through the central venous catheter. This contrast agent can induce allergic reactions leading to massive blockage of the pulmonary arteries and increased pressure in the pulmonary capillaries. A similar response was observed in three other animals. Mean arterial pressure (MAP), right atrial pressure (RAP) and pulmonary artery pressure (mPAP) were recorded continuously. A single spot measurement of the sublingual microcirculation (mci) was performed with the Cytocam-IDF imaging device fixed on a metal arm support. Results: Acute pulmonary hypertension was characterized by an increase in the mPAP, followed by high RAP and low MAP. The time course of the sublingual mci alterations was recorded simultaneously with the hemodynamic changes (Fig. 29). The progressive rise in RAP caused blood to flow backward into the large veins, inducing venous stasis and reducing microcirculatory perfusion. Despite the adrenergic response with normalization of MAP, no recovery of the sublingual mci was observed. Conclusions: Abnormalities of the sublingual mci during severe pulmonary hypertension were mainly associated with decreased perfusion pressure due to high RAP and hypotension. These different phases of acute hemodynamic changes could be detected with Cytocam-IDF imaging fixed on one sublingual spot. This study was supported in part by an Innovation grant from the Dutch Kidney Foundation (grant nr. 14OIP11). Assessment of levosimendan as a therapeutic option to recruit the microcirculation in cardiogenic shock: initial experience in a cardiac ICU A. Taha
Introduction: The presence of microcirculatory inadequacy despite resuscitation of hemodynamics strongly correlates with microcirculatory failure and poor outcome [1]. We sought to evaluate the response to levosimendan, an inotrope and vasodilator that has improved perfused capillary density in previous studies [2], and whether this beneficial effect can be utilized in the cardiogenic shock state. Recruitment of the microcirculation can be detected by the veno-arterial pCO2 difference (ΔpCO2), which increases early as the intracapillary blood velocity falls and could reflect the adequacy of microvascular blood flow [2]. Methods: A retrospective observational study of 40 patients with cardiogenic shock admitted to our cardiac ICU, enrolled between January 2013 and March 2015. Arterial and mixed-venous blood gases and hemodynamic variables were obtained at admission and every 2 hours for the initial 6 hours. The dPCO2, Ca-CvO2, dPCO2/Ca-CvO2 and Cv-aCO2/Da-vO2 quotients were calculated. Results: Complete data sets were obtained from 40 patients, including CI, cardiac output (CO), systemic vascular resistance (SVR), serum lactate, mixed venous saturation, arteriovenous oxygen content difference and dPCO2. Subsequently, Ca-CvO2, Cv-aCO2/Da-vO2 and the CI were examined and found to be statistically significantly different after starting levosimendan in cardiogenic shock patients (p < 0.001); comparison of the serial measurements showed a statistically significant difference between the initial samples and those taken 6 hrs after starting levosimendan (P < 0.05). Conclusions: In our clinical series, dPCO2 and Ca-CvO2 in patients with cardiogenic shock improved significantly after starting levosimendan, within 6 hours of initial resuscitation; therefore, it might be reasonable to consider levosimendan as a potential therapeutic option for microcirculatory recruitment in the cardiogenic shock state. Further studies are needed to confirm these findings. Introduction: Brain death is a serious condition that affects the brain parenchyma but also all other organs. The intensivist's task is to maintain organic homeostasis to allow the potential multiorgan donor (PMD) to reach multiorgan retrieval in the best conditions, protected from possible damage after the neurovegetative storm (NS). The NS causes severe hemodynamic instability resulting in peripheral hypoperfusion, hypotension, high lactate (LA) values, polyuria, catecholamine depletion and myocardial stunning. The aim of our prospective study was to evaluate the efficacy of terlipressin vs. norepinephrine in maintaining hemodynamic stability after the NS [1]. Methods: 10 adult patients hospitalized in intensive care with acute brain injury, a diagnosis of brain death, post NS, and a mean arterial pressure of less than 65 mmHg. The patients were divided into two groups: 5 patients in the norepinephrine group (infusion at 0.1 mcg/kg/min for 6 hours) and 5 patients in the terlipressin group (infusion at 1.5 mcg/kg/h for 6 hours). The following parameters were collected for each patient: mean arterial pressure (MAP) and LA values. The results are expressed as mean with standard deviation. The Fisher test was used for the comparison between the groups, with p ≤ 0.05 considered significant. Results: After 6 hours of treatment, the norepinephrine group had MAP 85 ± 4 mmHg and LA 4.1 ± 0.5 mmol/L, while the terlipressin group had MAP 108 ± 4 mmHg and LA 2.1 ± 0.2 mmol/L. Comparing the two groups, the p-value for each variable was: MAP p = 0.034, LA p = 0.021.
Introduction: Brain death is a serious condition that affects not only the brain parenchyma but also all other organs. The intensivist's task is to maintain organic homeostasis so that the potential multiorgan donor reaches organ retrieval in the best possible condition, protected from damage following the neurovegetative storm (NS). NS causes severe hemodynamic instability resulting in peripheral hypoperfusion, hypotension, high lactate (LA) values, polyuria, catecholamine depletion and cardiac stunning. The aim of our prospective study was to evaluate the efficacy of terlipressin vs. norepinephrine in maintaining hemodynamic stability after NS [1]. Methods: 10 adult patients hospitalized in intensive care with acute brain injury, a diagnosis of brain death after NS, and a mean arterial pressure of less than 65 mmHg. The patients were divided into two groups: 5 patients in the norepinephrine group (infusion at 0.1 mcg/kg/min for 6 hours) and 5 patients in the terlipressin group (infusion at 1.5 mcg/kg/h for 6 hours). The following parameters were collected for each patient: mean arterial pressure (MAP) and LA values. The results were expressed as mean with standard deviation. The Fisher test was used for comparison between the groups, with p ≤ 0.05 considered significant. Results: After 6 hours of treatment, the norepinephrine group had MAP 85 ± 4 mmHg and LA 4.1 ± 0.5 mmol/L, while the terlipressin group had MAP 108 ± 4 mmHg and LA 2.1 ± 0.2 mmol/L. Comparing the two groups, the p-values were: MAP p = 0.034, LA p = 0.021. Conclusions: We have shown that the use of terlipressin improved hemodynamic stability in brain death after NS, with a higher mean arterial pressure and a greater decrease in lactate compared with norepinephrine. This allowed an improvement in peripheral perfusion. Comparing the two groups, we reached statistical significance for each variable calculated.

Echocardiography in the potential heart donor exposed to substitution hormonotherapy. F. Righetti, E. Colombaroli. Intensive Care Unit, St. Boniface Hospital, Verona, Italy. Critical Care 2016, 20(Suppl 2):P147

Introduction: Brain death is a serious condition that affects the brain parenchyma and also the other organs. The intensivist's task is to maintain normal organic homeostasis for the potential donor; in this way, multiorgan retrieval can be reached under optimal conditions, protecting the organs from possible damage. The aim of this prospective randomized study was to evaluate the cardiac function of potential heart donors undergoing substitution hormonotherapy, using serial echocardiography [1]. Methods: 40 adult patients (pts) admitted to the intensive care unit with acute brain injury, bilateral non-reactive mydriasis, Glasgow Coma Scale 3, absence of all reflexes, and potential progression to brain death. Pts were divided randomly into two groups: a Hormonotherapy group (20 pts) treated with substitution hormonotherapy (triiodothyronine, vasopressin, methylprednisolone, insulin) and a Control group (20 pts) given standard treatment. In all pts, serial transesophageal echocardiographic evaluations were performed with a focus on systolic function, evaluated as the ejection fraction of the left ventricle by Simpson's method. The echocardiographic exams were performed at admission, after the neurovegetative storm, and within 12 hours after the diagnosis of brain death. The results were expressed as mean with standard deviation. The Fisher test was used for comparison between the two groups, with p ≤ 0.05 considered significant. Results: Pts in the two groups were statistically matched for sex (p = 0.28) and age (p = 0.37). All patients evolved to brain death. In the Hormonotherapy group, the ejection fraction was 54 ± 3 % at baseline, 28 ± 8 % after the neurovegetative storm, and 43 ± 6 % within 12 hours after brain death. In the Control group, the ejection fraction was 49 ± 7 % at baseline, 25 ± 6 % after the neurovegetative storm, and 34 ± 5 % within 12 hours after brain death. Comparing the two groups, we reached statistical significance (p = 0.028). Conclusions: Comparing the two groups, we noticed that both started from a similar baseline cardiac function. During the neurovegetative storm, we recorded similar myocardial stunning. However, the group that received substitution hormonotherapy showed recovery of systolic function to almost baseline level within 12 hours after brain death, whereas systolic dysfunction persisted in the control group. Considering these pts as potential heart donors, we have shown that substitution hormonotherapy improves systolic function despite the stunning that follows the neurovegetative storm.
Introduction: Bedside alarms sound when vital signs (VS) exceed thresholds, due either to real instability (alerts) or to artifact, leading to alarm fatigue. We used machine-learning (ML) algorithms to classify alarms as alerts or artifacts in VS data streams, to assess the impact of such a filter on the alert rate. Methods: Patients had VS monitoring data (heart rate [HR], respiratory rate [RR], oximetry [SpO2]) recorded at 1/20 Hz. We partitioned the data into training/validation (294 admissions; 22,980 hrs) and test sets (2,057 admissions; 156,177 hrs). Alarms are VS deviations beyond stability thresholds, annotated as real alerts or artifact. We trained a random forest ML algorithm to discriminate alerts from artifact for long alarms (>3 min) and evaluated performance in the test set on both long and short alarms (<3 min), according to how many artifact alarms could be removed from the total number of alarms (M1) and how many artifact alarms could be correctly filtered from the total number of artifact alarms (M2). Results: Figure 30 shows the calculation of M1 (overall alarm duration reduction = 31 %) and M2 (artifact alarm duration reduction = 94 %). The number of artifact alarms in the complete data set was estimated from the artifact prevalence in the annotated sample. Figure 31 shows the ROC curve of the ML algorithm for both short and long alarms.
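The abstract does not disclose its feature set, hyperparameters or operating point; under those stated assumptions, the sketch below shows the general shape of such a pipeline with scikit-learn: summary features per alarm epoch, a random forest fit on annotated long alarms, and a probability threshold for suppressing likely artifacts.

```python
# Sketch of an alert-vs-artifact alarm classifier in the spirit of the abstract.
# Feature choices, threshold and data here are hypothetical; the authors do not
# publish their exact pipeline.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def epoch_features(hr, rr, spo2):
    """Summary statistics of the vital-sign streams around one alarm epoch."""
    feats = []
    for sig in (hr, rr, spo2):
        sig = np.asarray(sig, dtype=float)
        feats += [sig.mean(), sig.std(), sig.min(), sig.max(),
                  np.median(np.abs(np.diff(sig)))]  # roughness of the signal
    return np.array(feats)

rng = np.random.default_rng(0)
# Toy training set: 200 annotated long alarms (1 = real alert, 0 = artifact).
X = np.stack([epoch_features(rng.normal(90, 8, 36),
                             rng.normal(22, 4, 36),
                             rng.normal(94, 3, 36)) for _ in range(200)])
y = rng.integers(0, 2, 200)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X, y)

# Filter rule: suppress an alarm when the predicted alert probability is low.
p_alert = clf.predict_proba(X[:5])[:, 1]
suppress = p_alert < 0.2   # 0.2 is an arbitrary illustrative operating point
print(suppress)
```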
Introduction: The use of peripherally-inserted central catheters (PICCs) has increased notably in hospitalized, critically ill and ambulatory patients. However, it is important to understand the rationale for, and the risks associated with, this technology. The aim of this study was to describe the patterns of use and complications of PICCs placed in the ICU setting. Methods: Prospective observational study carried out in a 500-bed general hospital with a 14-bed ICU, in which a specialized catheter-insertion unit is available. PICCs were always implanted in the middle third of the arm, following a protocol that includes a strict sterile insertion procedure and echo-radiological control. Data on patients with PICCs inserted from January 2013 to April 2015 were collected, including demographics, insertion characteristics, complications and outcomes. Descriptive data included frequencies and percentages for categorical variables, and mean and standard deviation (SD) or median and interquartile range (IQR, 25th-75th percentile) for quantitative variables.

Introduction: A normal central venous pressure (CVP) waveform contains five components: three peaks (a, c, v) and two descents (x, y). The characteristics and amplitude of the CVP waveform components change significantly during cardiovascular distress, and their visualization can provide invaluable additional hemodynamic information. We describe CVP tracings recorded during an episode of pulmonary hypertension in a porcine model of shock, aiming to characterize the CVP waveform morphology in conjunction with the simultaneous hemodynamic alterations. Methods: This is an animal case report in which the CVP waveform was recorded during an ongoing experimental porcine shock model monitored with a Swan-Ganz catheter. Pulmonary hypertension was unexpectedly observed as a complication after intravenous infusion of a contrast agent through the central venous catheter. A similar response was observed in three other animals. Together with the CVP tracings, recordings also included cardiac output, systolic/diastolic arterial pressure, heart rate, and systolic/diastolic pulmonary artery pressures. Results: The episode of pulmonary hypertension lasted 3-5 min with spontaneous recovery. The CVP changed from a baseline value of 4-6 mmHg to 16-18 mmHg. The CVP waveform tracings showed an abnormal, large 'a' wave complex resembling a cannon 'a' wave. Figure 32 shows the different patterns of CVP morphology, with the different components of the tracing recorded during the entire episode of pulmonary hypertension. A simultaneous ECG is shown to demonstrate the timing of the different components. Conclusions: The abnormal CVP waveform morphology during an episode of pulmonary hypertension shows a series of waves that can provide invaluable additional hemodynamic information. This study was supported in part by an Innovation grant from the Dutch Kidney Foundation (grant nr. 14OIP11).

Introduction: The use of ultrasound is recommended to secure central venous catheter (CVC) placement in the ICU. However, several surveys have shown that only 50 % of CVC procedures are performed using the ultrasound-guided technique. The reasons why physicians continue to use landmarks include a lack of training and the absence of an ultrasound device. We designed a survey to elicit information on physicians' characteristics, experience in CVC placement, training in the ultrasound technique, use of ultrasound for CVC placement, reasons for non-use of ultrasound, and their opinion on the need to continue teaching the landmark technique to residents. Methods: This survey (14 questions) was sent electronically by email to every physician belonging to the BoReal research group (8 university and 20 community ICUs in the north-west of France). The survey was sent to 289 physicians (166 seniors and 121 residents) working in those ICUs. Results: We received 190 responses (response rate 66 %). Among the respondents, 66 % were under 40 years old, 34 % were residents, 41 % had ≤5 years of experience in CVC placement, and 53 % declared placing more than one CVC per week. 71 % of the residents declared having learned both the landmark and ultrasound-guided techniques during their residency. Only 18 % reported always using ultrasound to place CVCs. The main reasons for not using ultrasound were that 1) they felt they did not need it (36 %); 2) an ultrasound machine was not available (33 %), while 3 % declared having no ultrasound device in their setting; and 3) 11 % reported a lack of training in the ultrasound technique. Coagulation abnormalities (64 %), obesity (54 %) and anatomical difficulties (52 %) were the main reported motivations for using ultrasound. 53 % of the respondents declared having faced, during the last 12 months, an urgent situation in which ultrasound was not available quickly enough. Finally, 91 % think that the landmark technique should still be taught to residents. Conclusions: Our survey shows that a large majority of intensivists still perform landmark procedures even though almost every physician has received training in the ultrasound technique. An ultrasound device is present in the large majority of institutions, but bedside availability remains limited. Despite the several guidelines, physicians still do not use ultrasound for every procedure and think that the landmark technique should still be taught to residents.
Introduction: The central venous-to-arterial carbon dioxide difference (Pv-aCO2, or CO2 gap) is recognized as an index of global tissue perfusion. Additionally, the ratio of Pv-aCO2 to the arterial-to-central venous O2 content difference (Ca-vO2) has been described as indicative of 'anaerobiosis' in shock. The aim of this study was to assess the predictive abilities of the Pv-aCO2 gap and the Pv-aCO2/Ca-vO2 ratio in patients undergoing resuscitation from shock. Methods: Critically ill patients with shock who required vasopressors or inotropes were included in a prospective, observational study. Serial measurements of the Pv-aCO2 gap, lactate, and Ca-vO2 were obtained at the time of starting the vasopressor/inotrope (time 0), then at 3, 6 and 12 hours. The study protocol was approved by the Hospital Research Ethics Committee (RAC No. 2151 048). Results: Fifty patients were enrolled. The mean APACHE II score was 26.9 ± 7.3. Thirty patients (60 %) had septic shock, 15 patients (30 %) had combined septic and cardiogenic shock, 3 patients (6 %) had hemorrhagic shock, and two patients (4 %) had cardiogenic shock. The 28-day mortality rate was 62 % (31 patients). Fifteen patients (30 %) developed acute kidney injury. Patients surviving to day 28 had a lower Pv-aCO2 gap at time 0 (7.5, IQR 7 vs. 4.8, IQR 5, p = 0.007) and demonstrated greater clearance of the Pv-aCO2 gap during resuscitation (-0.75, IQR 4.8 vs. +0.75, IQR 5.2, p = 0.024). A cut-off value of ≥6 mmHg significantly predicted 28-day mortality (75 % vs. 45.5 %, p = 0.033). In the subgroup of patients in whom a resuscitative target of ScvO2 70 % was achieved, a higher Pv-aCO2 gap was significantly associated with lower 28-day survival (6.7, IQR 6 vs. 3.7, IQR 4, p = 0.009). Pv-aCO2/Ca-vO2 ratios were significantly lower in patients surviving the ICU stay and to day 28 of follow-up; this relationship was observed. Conclusions: This study suggests that the Pv-aCO2 gap and the Pv-aCO2/Ca-vO2 ratio can be used to predict mortality in shock, especially in patients in whom the usual goals of resuscitation have been achieved. A Pv-aCO2 gap value of 6 mmHg may be used as a reasonable cut-off point.

Introduction: Measurements of cardiac output (CO), pulmonary artery occlusion pressure (PAOP) and systolic pulmonary artery pressure (SPAP) remain important parameters in the management of critically ill patients. The pulmonary artery catheter (PAC) with the thermodilution technique has been considered the clinical standard for measurement of CO. However, it is an invasive technique that can be associated with complications. Transthoracic echocardiography (TTE) is a non-invasive technique that allows calculation of CO and SPAP. We therefore compared CO and SPAP measured by TTE and by PAC. Methods: We prospectively included all patients in our ICU monitored with a PAC, sedated and under mechanical ventilation. We excluded patients with arrhythmias, severe valvulopathy or poor echogenicity. CO, PAOP, right atrial pressure (RAP) and SPAP were measured using the PAC. CO-PAC was the average of three thermodilutions. The TTE CO (CO-TTE) was derived from the Doppler stroke volume, the E/A ratio from transmitral flow, and Ea from mitral annular TDI. SPAP was estimated as the tricuspid gradient (SPAP-echo) plus RAP (estimated from the inferior vena cava diameter). The TTE and PAC measurements were performed simultaneously by two experienced operators. Linear regression and Bland-Altman analyses were performed. Results: 22 patients (36 measurements) were monitored with PAC; age 64 y/o (60-70), SAPS 61 (60-79), ejection fraction (EF) 58 % (46-61). The linear regression between CO-TTE and CO-PAC gave R2 = 0.78 (p < 0.001). Bland-Altman analysis showed a bias of -0.4 L/min with limits of agreement from -2.8 to 1.9 L/min. The linear regression between SPAP-echo and SPAP-PAC gave R2 = 0.52 (p < 0.001); the regression improved when SPAP-echo was calculated from the tricuspid gradient plus RAP measured by PAC (R2 = 0.62, p < 0.001). Bland-Altman analysis showed a bias of -3.7 mmHg with limits of agreement from -18.1 to 10.7 mmHg. The linear regressions between E/A and PAOP and between E/Ea and PAOP were weak: R2 = 0.33 (p < 0.001) and R2 = 0.42 (p < 0.001), respectively. Conclusions: This study confirms that CO-TTE correlates well with CO-PAC in sedated, mechanically ventilated critical care patients. The TTE estimation of SPAP also appears reliable. However, the TTE estimation of PAOP (E/Ea and E/A) is not consistent in this population of patients with unselected EF.
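Both the abstract above and the one that follows summarize device agreement with Bland-Altman statistics and the percentage error; a minimal numpy sketch, with made-up paired CO values, of how the bias, limits of agreement and percentage error (PE = 1.96 x SD of the differences / mean CO) are typically computed.

```python
# Minimal Bland-Altman sketch for paired cardiac output measurements.
# The CO pairs below are made up; the abstracts do not publish raw data.
import numpy as np

co_ref = np.array([4.2, 5.1, 3.8, 6.0, 4.9, 5.5])   # e.g. PAC thermodilution
co_test = np.array([4.0, 5.4, 3.5, 5.6, 5.1, 5.9])  # e.g. TTE / pulse contour

diff = co_test - co_ref
bias = diff.mean()                          # mean difference
sd = diff.std(ddof=1)                       # SD of the differences
loa = (bias - 1.96 * sd, bias + 1.96 * sd)  # 95 % limits of agreement

# Percentage error (Critchley & Critchley): 1.96*SD over the mean CO.
pe = 100 * 1.96 * sd / np.concatenate([co_ref, co_test]).mean()

print(f"bias = {bias:+.2f} L/min, LoA = {loa[0]:.2f} to {loa[1]:.2f} L/min, "
      f"PE = {pe:.0f} %")
```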
Introduction: The CNAP technology (CNSystems Medizintechnik AG, Graz, Austria) enables noninvasive continuous recording of the arterial pressure waveform based on the volume clamp method. Recently, the manufacturer released a novel algorithm for measuring cardiac output (CO) using pulse contour analysis of the arterial waveform. In this study, we compared CO measurements with CNAP (CNCO) against invasive CO measurements using intermittent pulmonary artery thermodilution (pulmonary artery catheter (PAC); PAC-CO) in postoperative cardiothoracic surgery patients. Methods: In this interim analysis, simultaneously obtained CNCO and PAC-CO measurements were analyzed in 30 patients during the first hours after cardiothoracic surgery. We performed 3 independent sets of 5 consecutive thermodilution measurements per patient. The average of the 3 closest of the 5 PAC-CO measurements was used for comparison with the average of the corresponding CNCO values. Three pairs of measurements were excluded due to artifacts, leaving 87 paired measurements for analysis. In addition, 25 cardiac output-modifying maneuvers were analyzed to evaluate trending ability. We conducted 2 separate comparative analyses: 1) CNCO calibrated to the first simultaneously measured PAC-CO value (CNCOcal) vs. PAC-CO, and 2) CNCO auto-calibrated to biometric patient data (CNCObio) vs. PAC-CO. Agreement between the two methods was assessed statistically by Bland-Altman analysis and by calculating the percentage error (PE). To evaluate trending ability, the concordance rate (CCR) was calculated, applying an exclusion zone of 0.5 L/min. Results: For CNCOcal, the Bland-Altman analysis showed a mean difference of -0.2 L/min, a standard deviation of ±0.5 L/min and limits of agreement of -1.1 to +0.7 L/min; the PE and CCR were 18 % and 100 %, respectively. For CNCObio, the Bland-Altman analysis showed a mean difference of +0.5 L/min, a standard deviation of ±1.1 L/min and limits of agreement of -1.6 to +2.6 L/min; the PE and CCR were 43 % and 94 %, respectively. Conclusions: In this clinical study of patients after cardiothoracic surgery, CNCOcal showed good agreement (PE 18 %) and good trending ability (CCR 100 %) when compared with intermittent pulmonary artery thermodilution. For CNCObio, we observed a higher PE (43 %) but acceptable trending ability (CCR 94 %).
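The trending statistic quoted above (CCR with a 0.5 L/min exclusion zone) can be reproduced along the following lines; this is a hedged sketch with illustrative data, since the abstract reports only the resulting rates and not the details of its concordance computation.

```python
# Concordance rate for trending ability: the fraction of paired CO changes that
# agree in direction, after excluding small changes (the exclusion zone).
# Data are illustrative; the abstract reports only the resulting CCR.
import numpy as np

def concordance_rate(d_ref, d_test, exclusion=0.5):
    d_ref, d_test = np.asarray(d_ref), np.asarray(d_test)
    # Drop maneuvers in which either device saw a change inside the zone.
    keep = (np.abs(d_ref) >= exclusion) & (np.abs(d_test) >= exclusion)
    agree = np.sign(d_ref[keep]) == np.sign(d_test[keep])
    return 100 * agree.mean()

# Paired CO changes (L/min) across CO-modifying maneuvers (hypothetical):
delta_pac = [0.9, -1.2, 0.6, -0.8, 0.3, 1.5]
delta_cnco = [0.7, -1.0, 0.8, -0.6, -0.2, 1.3]
print(f"CCR = {concordance_rate(delta_pac, delta_cnco):.0f} %")
```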
Introduction: Reliable hemodynamic monitoring and correct interpretation of the data are of crucial importance in patients (pts) with septic shock (SS) when choosing the treatment strategy in the Intensive Care Unit (ICU). The aim of this prospective study was to compare two techniques of hemodynamic monitoring to assess which is more reliable in the management of pts in SS [1]. Methods: 20 adult pts admitted to the ICU for SS. All pts were monitored with CPCCO (Pulsion Medical Systems) and serial TEE examinations were performed. Cardiac output (CO; L/min) was measured continuously with the CPCCO method (triple injection of 15 ml of 0.9 % saline solution at 4 °C into the CVC, with the thermodilution curve measured in the femoral artery through the dedicated catheter thermistor) and simultaneously with TEE (LVOT area multiplied by the velocity-time integral of aortic flow, multiplied by the heart rate) at admission, after a fluid challenge with 500 ml of crystalloid, and during the use of vasopressor and/or inotropic drugs (VId). The same operator measured the echocardiographic CO without being aware of the CO supplied by CPCCO. The results were expressed as mean with standard deviation. Measurements were compared and correlated using linear regression according to the Bland-Altman method. Statistical significance was set at p ≤ 0.05. Results: 120 measurements were performed: 60 with CPCCO and 60 with TEE. 40 measurements were taken at admission, 20 with CPCCO (1.8 ± 0.2) and 20 with TEE (1.75 ± 0.3); 40 after fluid challenge, 20 with CPCCO (2.5 ± 0.3) and 20 with TEE (2.7 ± 0.2); and 40 during the use of vasopressor and/or inotropic drugs, 20 with CPCCO (4.5 ± 0.3) and 20 with TEE (4.4 ± 0.2). There was good correlation and limited bias between the measurements provided by CPCCO and by TEE at admission (r = 0.992, bias = 0.15), after fluid challenge (r = 0.954, bias = 0.24), and during the use of VId (r = 0.967, bias = 0.33). These values were statistically significant (p = 0.026). Conclusions: This study showed that there is good correlation between the data provided by CPCCO and TEE. Both methods can be used reliably for hemodynamic monitoring in SS.

and was compared to stroke volume estimated from the difference between the end-diastolic and end-systolic volumes measured by the Simpson biplane method (S2D). We excluded patients with atrial fibrillation and any significant valvulopathy. A low end-diastolic volume (EDV) was defined as <70 cc and a low ejection fraction (EF) as <45 %. Results: We evaluated 147 exams. Twenty patients were excluded because of atrial fibrillation and 41 because of a significant valvulopathy. A total of 86 pairs of stroke volumes measured by the two methods were evaluated. By regression analysis, r = 0.51, r^2 = 0.25 and p < 0.01. The mean bias of stroke volume measured with Doppler (DT) over that estimated with S2D was 29 % (95 % CI 21.4-37.5, p < 0.01). This bias correlated negatively with mean SV (r^2 = 0.09, p < 0.01). A negative linear correlation was found between the bias and the end-diastolic volume and EF, but not the end-systolic volume (r^2 = 0.21, p < 0.01; r^2 = 0.15, p < 0.01; and r^2 = 0.01, p = 0.93, respectively). A higher bias was observed in patients with low EF (46 % ± 33 vs 25 % ± 36, p = 0.03) and in patients with low end-diastolic volume (64 % ± 33 vs 20 % ± 31, p < 0.01). Conclusions: A significant overestimation of stroke volume measured with Doppler ultrasound by transthoracic echocardiography compared to S2D was observed in critically ill patients. This overestimation was more prominent in patients with low end-diastolic volume and/or low EF. Given that neither of the two methods used can be considered a gold standard, further evaluation of DT should be made in this group of patients. Conversely, in everyday clinical practice, the combination of these two methods may be an easy way to identify possible errors in cardiac output measurements.
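The Doppler cardiac output referred to in both abstracts above follows the standard hydraulic formula (stroke volume = outflow-tract cross-sectional area x velocity-time integral; CO = SV x HR); a sketch with illustrative measurement values.

```python
# Doppler (echocardiographic) cardiac output from LVOT diameter, VTI and HR.
# Standard formula: CSA = pi*(D/2)^2, SV = CSA * VTI, CO = SV * HR.
# The measurement values below are illustrative.
import math

def doppler_co(lvot_diameter_cm: float, vti_cm: float, hr_bpm: float) -> float:
    csa_cm2 = math.pi * (lvot_diameter_cm / 2) ** 2  # outflow-tract area
    sv_ml = csa_cm2 * vti_cm                         # stroke volume (ml)
    return sv_ml * hr_bpm / 1000                     # cardiac output (L/min)

print(f"CO = {doppler_co(2.0, 20.0, 75):.1f} L/min")
# pi * 1 cm^2 * 20 cm = ~62.8 ml stroke volume; * 75 bpm = ~4.7 L/min
```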
One of these studies provided a correction formula (1). As a consequence, one of the manufacturers of commercially available TPTD devices requires information about the venous catheter position, and correction of GEDVI in the case of femoral access can be assumed. Methods: The aim of our study was to compare GEDVI derived from TPTD using jugular venous access (GEDVI-jug) with the uncorrected (GEDVI-fem-uncor) and corrected (GEDVI-fem-cor) GEDVI calculated by the PiCCO device, and with GEDVI-form calculated from the previously suggested formula (1). We recorded 31 datasets in 15 patients equipped with both jugular and femoral venous access. Each dataset consisted of three triplicate TPTDs, using the jugular venous access as the gold standard and the femoral access with and without the information about the femoral indicator injection (Bland-Altman analysis; Wilcoxon test for unpaired samples; SPSS 23.0). Results: 15 patients, 9 male, 6 female; age 60 ± 15 years, 174 ± 9 cm, 83 ± 17 kg. GEDVI-fem-uncor was significantly higher than GEDVI-jug (1026 ± 263 vs. 765 ± 145 ml/m2; p < 0.001), with a bias of +260 ± 182 ml/m2 and a percentage error (PE) of 40.6 %. GEDVI-fem-cor was slightly different from GEDVI-jug (811 ± 159 vs. 765 ± 145 ml/m2; p = 0.025), with a PE of 29.6 % and a bias of 46 ± 117 ml/m2, which was significantly lower (p < 0.001) than the bias for GEDVI-fem-uncor. GEDVI-form was not significantly different from GEDVI-jug (797 ± 162 vs. 765 ± 145 ml/m2; p = 0.063), with a PE of 26.1 % and a bias of 31 ± 108 ml/m2, which was significantly lower (p < 0.001) than the bias for GEDVI-fem-uncor but not than the bias for GEDVI-fem-cor (p = 0.557). Conclusions: This study confirms that GEDVI derived from uncorrected femoral indicator injection TPTD markedly overestimates GEDVI and also results in unacceptable imprecision. Correction by the PiCCO device and by the formula recently suggested (1) substantially reduces this bias.

Introduction: The purpose of this study was to describe the time taken to achieve the targets of the post-operative goal-directed therapy (GDT) protocol in postoperative patients admitted to the ICU at St George's Hospital, in order to explore whether the duration of this intervention can be reduced. Methods: This is a retrospective study of the post-operative GDT protocol, which essentially consists of the administration of fluids and inotropes in order to achieve an oxygen delivery index of 700 mL/min/m2. Data from a randomly selected month were collected and compared with the data from the 2009 audit. Data were explored graphically and analytically. Continuous variables are presented as means and standard deviation; categorical data are presented as numbers and proportions.

There were divergent conclusions on IJV area and poor correlation for the IJV ratio. The IJV ratio showed moderate-to-good validity in predicting low CVP; the AP-IJV showed high sensitivity and specificity for hyper- and hypovolemia; an increase in IJV area effectively ruled out elevated CVP. Two studies respected more than 80 % of the STARD items and four more than 80 % of the QUADAS items. Conclusions: Because few reports have been published on this topic, the conclusions of this review should be confirmed. Nevertheless, the quality of reporting and methodology of the collected studies was good. The IJVUE seems to have good reliability and accuracy in predicting low or high CVP. The AP-IJVD shows the best correlation with CVP.

Introduction: Dehydration is a relatively unrecognized disorder in the medical ICU, especially among older patients.
Our aim was to determine the incidence of dehydration on admission, as well as the risk of developing dehydration during hospitalization and its effect on patient outcome. Moreover, several laboratory parameters were studied in order to assess their correlation with the diagnosis. Methods: All patients entering the ICU during a 6-month period in 2015 were enrolled and screened for dehydration on day 1 and day 3 using clinical and laboratory parameters (BUN, Cr, BUN/Cr ratio, Hct, serum Na, urine specific gravity). BUN/Cr >20 was used as the diagnostic criterion. Serum osmolality was not used because it was not consistently available. Of a total of 332 patients, 38 were excluded due to GI haemorrhage. Finally, 294 patients were included (M/F 152/142, median age 74). The statistical methods used were the McNemar test (differences in dehydration between day 1 and day 3), binary logistic analysis (association of laboratory parameters with dehydration) and a multivariate adjusted binary logistic regression model (outcome). Results: Of 294 patients, 126 (42.9 %) were dehydrated on admission. Dehydration diagnosed clinically at admission and dehydration defined by BUN/Cr >20 were significantly associated (McNemar test; χ2 = 42.01; n = 246; p < 0.001; φ = 0.24). Dehydration was correctly recognised clinically in 45/126 (35.7 %). Patients with BUN/Cr >20 had 3 times higher odds of being accurately diagnosed clinically. No correlation could be found between dehydration and the values of Hct, serum Na or urine specific gravity. A statistically significant positive association was demonstrated for BUN (OR = 1.08; 95 % CI 1.04-1.11; p < 0.001) and Cr (OR = 1.00; 95 % CI 1.00-1.00; p = 0.048). On day 3, 133/249 (54.1 %) patients were dehydrated, which represents a significant increase (McNemar test, χ2 with continuity correction = 7.01; n = 246; p = 0.008; φ = 0.45). Overall, dehydration at any point of the stay was associated with an adverse outcome (death) (ORadj = 2.45, 95 % CI 1.38-4.34, p = 0.002). Conclusions: Dehydration is a common finding in the ICU and can easily be overlooked clinically, owing to clinicians' insensitivity to milder forms of dehydration, which could explain the relatively high rate of misclassification. Clinical and laboratory parameters are insufficient for diagnosis, which is consistent with previous observations. According to these data, hydration status changes during the ICU stay, and its fluctuations are associated with increased mortality, suggesting that maintaining an adequate hydration status is of core interest for patient outcome.
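The BUN/Cr criterion above translates directly into a screening rule; a small sketch, noting that the >20 threshold assumes both analytes in conventional units (mg/dl).

```python
# Dehydration screen from the BUN/creatinine ratio, as in the abstract's
# diagnostic criterion (BUN/Cr > 20; conventional units of mg/dl assumed).

def bun_cr_ratio(bun_mg_dl: float, cr_mg_dl: float) -> float:
    return bun_mg_dl / cr_mg_dl

def dehydrated(bun_mg_dl: float, cr_mg_dl: float, threshold: float = 20.0) -> bool:
    """Flag dehydration when BUN/Cr exceeds the threshold."""
    return bun_cr_ratio(bun_mg_dl, cr_mg_dl) > threshold

# Illustrative patient: BUN 28 mg/dl, creatinine 1.1 mg/dl -> ratio ~25.5.
print(dehydrated(28.0, 1.1))  # True
```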
Methods: Patients were randomly allocated to receive 2, 3, 4 or 5 ml/kg (body weight) of intravenous crystalloid over five minutes. Pmsf was measured from the arterial pressure after stopping blood flow in the arm with a pneumatic tourniquet inflated for one minute. Mean arterial pressure (MAP), central venous pressure (CVP), cardiac output (CO) and stroke volume (SV) were recorded at baseline and immediately after the fluid infusion. CO was measured with the LiDCOplus monitor. A positive response was defined as an increase of 10 % from baseline. From previous data, the least significant change (LSC) for Pmsf was 14 %. Medians were compared using the independent-samples median test, and proportions were compared using Fisher's exact test. Statistical significance was set at p < 0.05. Results: 40 patients were included, 40 % of whom were responders. The changes (Δ) in Pmsf and CO in each group are described in Table 20. There is marginal evidence that ΔPmsf is related to the dose of fluid, but ΔCO and the proportion of responders were not affected by the dose of fluid infused. Conclusions: The dose of fluid does not affect the change in CO or the proportion of responders to a fluid challenge in septic patients.
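The responder definition and the least significant change quoted in the Methods above reduce to simple percentage-change tests; a sketch with hypothetical pre/post readings (the threshold names are ours, the 10 % and 14 % values are the abstract's).

```python
# Fluid-challenge classification as in the abstract: a responder has a >=10 %
# rise in CO from baseline; a Pmsf change only exceeds measurement noise when
# it is beyond the least significant change (LSC, 14 % for Pmsf).

def pct_change(baseline: float, after: float) -> float:
    return 100 * (after - baseline) / baseline

def classify(co_base, co_post, pmsf_base, pmsf_post,
             responder_cut=10.0, pmsf_lsc=14.0):
    d_co = pct_change(co_base, co_post)
    d_pmsf = pct_change(pmsf_base, pmsf_post)
    return {
        "responder": d_co >= responder_cut,           # CO criterion
        "pmsf_change_real": abs(d_pmsf) >= pmsf_lsc,  # beyond the LSC
        "d_co_pct": round(d_co, 1),
        "d_pmsf_pct": round(d_pmsf, 1),
    }

# Hypothetical patient: CO 4.0 -> 4.6 L/min, Pmsf 18 -> 22 mmHg.
print(classify(4.0, 4.6, 18.0, 22.0))
# -> responder (+15 % CO) with a real Pmsf rise (+22 %)
```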
Fluid bolus practices in a large Australian intensive care unit. B. Avard 1, P. Zhang 2. 1 The Canberra Hospital, Hughes, ACT, Australia; 2 Australian National University Medical School, Canberra, Australia. Critical Care 2016, 20(Suppl 2):P167

Introduction: Fluid bolus administration is a ubiquitous therapy in intensive care; however, recent evidence has suggested that a more judicious approach may improve patient outcome. We aimed to describe the fluid bolus practices in our large Australian intensive care unit to inform the future focus of staff education. Methods: We conducted a retrospective audit of 234 critically ill patients who received fluid boluses and collected information on fluid type, volume, time administered, patient severity and prescriber details. Ethics approval was obtained. Data were analysed using IBM SPSS Statistics 21. Results: We found that a balanced salt solution was the fluid most commonly used for bolus therapy in our unit, independent of patient disease or severity. The majority of fluid was prescribed by junior doctors between 4 pm and 8 am, within 48 hours of ICU admission, though there was an additional peak of bolus administration at day 5 for those who remained in the unit at that time. Patients received on average 1.3 L of fluid in boluses. Conclusions: Fluid bolus use in the clinical environment is highly variable and does not always align with theoretical principles.

Introduction: Liberal late fluid management (LFM) has been associated with higher morbidity and mortality in different critically ill populations. In trauma, a restrictive fluid strategy is widely recommended during the early resuscitation phase, but LFM has not really been studied. The main goal of the present study was to assess the impact of liberal LFM on the duration of mechanical ventilation in a severe trauma population. Methods: A retrospective analysis of all consecutive severe trauma patients admitted to our regional trauma center from 2010 to 2013 was performed.

Introduction: Emergency department (ED) patients admitted to the hospital with community-acquired pneumonia (CAP) may be at particular risk of volume overload-related complications because of their infected lungs. This study used the Premier administrative database, containing 17 % of US hospital discharges in 2013, to examine the association between a higher amount of fluid administered on day 1, hospital mortality, and ventilator-free days (VFD). Methods: CAP was defined as a bacterial pneumonia diagnosis present on admission in ED patients receiving chest x-ray and parenteral antibiotics on day 1. Day 1 total fluid volume was stratified into quartiles. The primary outcome was hospital mortality, and the secondary outcome was VFDs over a 30-day period. To adjust for potential confounders, patients were stratified into 5 severity groups based on their modelled predicted mortality, and a propensity model for giving fluids was built using clinical characteristics, severity-of-illness scores, and ICD-9-CM codes present on admission for important acute and chronic conditions. Patients were then grouped by severity-of-illness quintiles of predicted risk of death for the CAP population. The association of fluid administration with outcome was assessed within each quintile of mortality risk using chi-square analysis. Results: 192,806 adult ED patients with CAP were analyzed. Patients were 51.3 % female with a mean age of 69. The overall mortality rate was 7.4 %, with an average of 24.6 VFDs. After propensity adjustment by severity of illness, the mean amount of fluid administered on day 1 ranged from 842 ml to 2,189 ml. Within the fifth severity quintile we found a significant difference between the hospital (22.5 %) and predicted (18.3 %) mortality rates in the highest fluid quartile. Similarly, within the fifth severity quintile, we found a significantly lower number of VFDs in the highest fluid quartile (16.1) compared with the rest of the quartiles (19.3-20.3). Conclusions: For adult ED patients with CAP, we found significant associations between day 1 fluids and both increased mortality and decreased VFDs in the highest fluid quartile and severity group, after adjusting for severity of illness and acuity level. This study may have identified a subset of ED patients who may benefit from a more restrictive fluid strategy when presenting to the hospital with CAP.
Introduction: Administration of fluids is routine in the management of ICU patients. However, controversy still exists with regard to the ideal type, dose and timing of IV fluids, and a large amount of evidence has shown that fluids can have deleterious effects on several organ functions, arising both from excessive amounts and from their non-physiological electrolyte composition (1). The aim of our study was to assess the negative impact of a positive fluid balance on outcome in critically ill patients. Methods: Retrospective study over one year (July 2014 to June 2015), including all patients admitted to the medico-surgical ICU for more than 48 hours. Demographic data and SAPS II and APACHE II scores were registered on the first day. Length of stay, duration of mechanical ventilation, use of renal replacement therapy, RIFLE score and mortality were noted. Daily and cumulative fluid balance was registered at 48 h, on the 7th, 14th and 21st days, and at discharge. Results: A total of 183 patients were included. The mortality rate was 41 %. Severity scores were 49 ± 17 points on SAPS II and 19 ± 9 points on APACHE II. Table 21 summarizes the cumulative fluid balance. We noted a higher rate of septic shock, a longer length of stay and a longer duration of mechanical ventilation in the non-survivors group. The use of renal replacement therapy was higher in the non-survivors group (1 vs 11, p = 0.01). RIFLE score and diuretic use were also higher in the non-survivors group (1 vs 2, p = 0.03 and 7 vs 16, p = 0.041). Non-survivors also showed a higher accumulated positive fluid balance. Conclusions: Fluid accumulation is very common throughout the course of a patient's ICU stay, and there is evidence of potentially deleterious effects of fluids and electrolytes on many organ systems. A more conservative fluid management strategy in ICU patients seems to be less deleterious, but randomized controlled trials addressing this issue are needed.

Regarding the cumulative fluid balance (cFB) and the respiratory system, it was found that patients with a PO2/FiO2 ratio ≤200 had a higher cFB between the 4th and 8th days. In the group of patients who required vasopressor support, cFB was higher on the 2nd and 3rd days (p = 0.04 and p = 0.036, respectively). The group of patients with a GFR <75 ml/min/1.73 m2 presented a slightly higher cFB than the group with a higher GFR (2nd and 4th days, p = 0.049 and p = 0.041, respectively). Conclusions: This study demonstrated that the cumulative fluid balance was higher in patients who had more severe dysfunction of the systems reviewed, and these dysfunctions were associated with an increased mortality rate. What remains to be established is whether fluid overload is only a biomarker that identifies patients at increased risk of death, or an iatrogenic condition of critical care that should be considered in daily care and actively treated and avoided.

Conclusions: Compared to Plasma-Lyte, a 1 L fluid bolus of saline was associated in the short term with a small increase in Na and Cl and a decrease in SID, SIG and AG. These changes were statistically significant but limited, and were not associated with differences in hemodynamics or UO.

The , p = 0.04). There was no significant difference between the groups in the total volume of fluids administered in the first 3 days (Image 1). Conclusions: This study did not show any negative influence of 6 % HES solution on mortality, incidence of AKI or need for RRT. Addition of HES solution to early infusion therapy is not associated with a reduction in the total fluid requirement.

) were performed by two blinded, independent investigators following the "electron microscopic tubular injury" (EMTI) score. The score contains the parameters 1) vacuolar alterations, 2) dissociation of epithelium and basal membrane, 3) epithelial cell injury, 4) intratubular precipitation and 5) intact microvilli cuticular layer. Each component of the score was quantified from 0 (no damage) to 3 (severe damage). Results: All groups developed USKD over time, with consecutive increases in EMTI scores. At BL2 + 6 h, the EMTI score of the HES group was significantly lower than those of the crystalloid and control groups (each p < 0.05), and at BL2 + 12 h the EMTI scores of the albumin and HES groups were significantly lower than that of the control group (each p < 0.01). 24 h after BL2, the EMTI score of the human albumin group was significantly lower than those of the control and crystalloid groups (each p < 0.05). There were no significant differences between the albumin and HES groups at any time. Conclusions: In the present ovine model of septic shock, all groups developed sepsis-associated USKD. Animals in the colloid groups (albumin and 6 % HES 130/0.4) showed significantly decreased injury scores compared with animals in the control and crystalloid groups. Ultrastructural kidney damage was most pronounced in the crystalloid and control groups in this study.
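The EMTI grading just described is a five-component ordinal scale, each item graded 0-3; below is a sketch of a scorer under the assumption, not stated in the abstract, that the component grades are summed into a 0-15 total.

```python
# Sketch of the "electron microscopic tubular injury" (EMTI) scoring described
# above. Each of the five components is graded 0 (no damage) to 3 (severe).
# Summing them into a 0-15 total is our assumption; the abstract does not state
# how the components are aggregated.

EMTI_COMPONENTS = (
    "vacuolar_alterations",
    "epithelium_basal_membrane_dissociation",
    "epithelial_cell_injury",
    "intratubular_precipitation",
    "microvilli_cuticular_layer",
)

def emti_score(grades: dict) -> int:
    total = 0
    for name in EMTI_COMPONENTS:
        g = grades[name]
        if not 0 <= g <= 3:
            raise ValueError(f"{name}: grade must be 0-3, got {g}")
        total += g
    return total

sample = {name: g for name, g in zip(EMTI_COMPONENTS, (2, 1, 2, 0, 1))}
print(emti_score(sample))  # 6
```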
Alterations of conjunctival microcirculation in a sheep model of haemorrhagic shock and resuscitation with 0.9 % saline or balanced tetrastarch. P.

Introduction: Early clinical trials and experimental data have reported that the use of unbuffered fluids may adversely influence gastrointestinal (GI) function when compared with buffered fluids. GI dysfunction and nutritional feeding intolerance are common among critically unwell patients. No previous studies have assessed the GI effects of different crystalloid fluids in critically ill patients. Objective: To evaluate the effect of Plasma-Lyte 148 (PL-148) compared with 0.9 % saline (saline) on gastrointestinal (GI) feeding intolerance in mechanically ventilated ICU patients receiving nasogastric (NG) feeding. Methods: We conducted a single-centre pilot study nested within a multicentre, double-blind, cluster-randomised, double-crossover trial. All adult patients who required crystalloid fluid as part of the SPLIT trial [1], were expected to require mechanical ventilation for >48 hours, and were receiving enteral nutrition exclusively by NG tube were eligible. The study ICU used either blinded saline or PL-148 for four alternating seven-week blocks. The primary outcome was the proportion of patients with GI feeding intolerance, defined as high gastric residual volumes (GRV), diarrhoea or vomiting while receiving NG feeding in the ICU. The proportions of patients with each of high GRV, diarrhoea and vomiting were secondary outcomes. Conclusions: Among mechanically ventilated patients receiving NG feeding, the use of PL-148 compared with saline did not reduce the proportion of patients developing GI feeding intolerance, but was associated with a decreased incidence of high GRV.

Introduction: Using buffered crystalloid fluids, such as Plasma-Lyte 148 (PL-148), may reduce post-operative blood loss and transfusion requirements compared with 0.9 % saline (saline) [2, 3]. However, no studies have assessed the effect of using different crystalloid fluids on postoperative blood loss after heart surgery. Objective: To evaluate the effect of PL-148 compared with saline on bleeding following heart surgery. Methods: We conducted a single-centre pilot study nested within a multicentre, double-blind, cluster-randomised, double-crossover trial. All adult heart surgery patients who required crystalloid fluid therapy in the ICU as part of the SPLIT trial [1] were eligible. The primary outcome was chest drain output from the time of arrival in the ICU until 12 hours postoperatively. The key secondary outcomes were the total volume of fluid in the chest drains from arrival in the ICU until 24 hours (if still in the ICU) and the proportions of patients requiring any of the following blood products: packed red blood cells, platelets, fresh frozen plasma or cryoprecipitate. Results: 251 patients were enrolled, with 131 assigned to PL-148 and 120 to saline. There was no difference in postoperative chest drain output at 12 hours (ratio of geometric means, 1.06; 95 % CI, 0.93 to 1.20; p = 0.42) or at 24 hours. In the PL-148 group, 40 of 131 patients (30.5 %) required blood products in the ICU, compared with 22 of 120 patients (18.3 %) in the saline group (OR, 1.96; 95 % CI, 1.08 to 3.54; p = 0.03). Conclusions: Among patients admitted to the ICU following heart surgery, the use of PL-148 compared with saline did not reduce the amount of post-operative blood loss, but was associated with increased use of blood products.
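The odds ratio and confidence interval reported above for blood-product use can be checked with the standard 2x2-table formulas (point estimate plus a Woolf log-scale interval); the sketch below reproduces the published OR 1.96 (95 % CI 1.08-3.54) from counts of 40/131 vs 22/120, the counts consistent with the reported 30.5 % and 18.3 %.

```python
# Odds ratio with a Woolf (log) 95 % CI for a 2x2 table. The counts are the
# blood-product outcomes from the abstract: 40/131 (PL-148) vs 22/120 (saline);
# the code reproduces the reported OR 1.96 (95 % CI 1.08-3.54).
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a/b = events/non-events in group 1; c/d = the same in group 2."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo, hi = (math.exp(math.log(or_) + s * z * se) for s in (-1, +1))
    return or_, lo, hi

or_, lo, hi = odds_ratio_ci(a=40, b=131 - 40, c=22, d=120 - 22)
print(f"OR = {or_:.2f} (95 % CI {lo:.2f} to {hi:.2f})")
```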
Introduction: Approximately 1 to 2 % of patients attending the Emergency Department have hypernatraemia (1). Hypernatraemia over 160 mmol/L is considered to be very severe and as such needs to be treated with caution (2). Extreme cases of hypernatraemia, where sodium levels are exceptionally high, are rare, especially in adults, and the most common pathophysiology in such cases is severe dehydration. We see this more in the elderly population, children, the institutionalised, and patients with dementia. We report a unique case of extreme hypernatraemia of 196 mmol/L and severe sepsis in a 39-year-old adult with Huntington's dementia, which presented a challenge in fluid management. Results: The hypernatraemia was thought to be caused by chronic severe dehydration from poor intake, and the sepsis was thought to have started as an inadequately treated urinary tract infection. The patient was initially treated aggressively with hypotonic saline and intravenous antibiotics, but was subsequently managed using a slower correction rate after the chronic nature of the hypernatraemia was identified. To our knowledge, this is the first reported case of extreme hypernatraemia and severe sepsis manifesting concomitantly in such a young patient. Conclusions: We highlight the difficulty of balancing the risks and benefits of rapid fluid resuscitation, necessitated by severe sepsis and acute kidney injury, against the complexities of fluid resuscitation in correcting such an extreme hypernatraemia. We advocate early identification of chronic hypernatraemia and effective administration of intravenous fluids in the Emergency Department, decided on a case-by-case basis. In cases of severe and extreme hypernatraemia in particular, accurate calculation of the free water deficit is essential to manage the patient effectively. In line with this, we must ensure that such patients are always weighed on admission.
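The free water deficit calculation the authors advocate follows the usual formula (deficit = total-body-water fraction x weight x (serum Na / 140 - 1)); a sketch using the case's sodium of 196 mmol/L and a hypothetical weight, since the patient's weight is not reported.

```python
# Free water deficit for hypernatraemia: TBW * (serum Na / 140 - 1),
# with TBW approximated as a fraction of body weight. The weight below is
# hypothetical; the case report does not state it.

def free_water_deficit(weight_kg: float, serum_na: float,
                       tbw_fraction: float = 0.6, target_na: float = 140.0):
    """Litres of free water needed to bring serum Na back to target."""
    return tbw_fraction * weight_kg * (serum_na / target_na - 1)

# Case sodium of 196 mmol/L; assuming a 60 kg adult:
print(f"{free_water_deficit(60.0, 196.0):.1f} L")  # 14.4 L
```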
Introduction: HAGMA is a commonly encountered acid-base disorder in clinical practice. Multiple mnemonics have been proposed to cover the common underlying causes: "GOLDMARK" (1), "KUSMALE", "KARMEL" and, of course, "MUDPILES". The "I" in "MUDPILES" represents some rare causes of HAGMA, including "I"ron, "I"soniazid and "I"nborn errors of metabolism. We present a rare case of HAGMA secondary to isoniazid (INH) overdose. Methods: A 41-year-old male was intubated for status epilepticus. He had a medical history of pulmonary tuberculosis (pTB) and had been treated with a course of rifampicin, INH and ethambutol for the preceding 6 months. Results: Arterial blood gas in the Emergency Department showed severe metabolic and respiratory acidosis: pH 6.76, pCO2 80, pO2 62, HCO3 11, BE -24. The initial anion gap was 26. Physical examination was unremarkable except for fever and hypotension. Neurological examination did not reveal any lateralising signs. A CT scan of the brain was normal. Mild renal impairment and lactic acidosis resolved rapidly with initial fluid resuscitation. Serum and urine toxicology for commonly encountered drugs was also negative. The patient regained orientation on day 3 of intubation and was promptly extubated. Further history revealed that he had consumed a bottle of isoniazid tablets after an altercation with his wife that same night. Conclusions: Tuberculosis is endemic in South East Asia. INH is one of the four drugs used in the treatment of this disease. Side effects of isoniazid range from mild hepatotoxicity to potentially fatal INH hepatitis. HAGMA refractory to conventional therapy is one of the hallmarks of INH toxicity. Though it is one of the rarer causes of HAGMA, prompt diagnosis and treatment with high-dose pyridoxine (2) reduces mortality and morbidity.
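The anion gap underpinning the HAGMA diagnosis is computed as Na - (Cl + HCO3); the sketch below reproduces the case's gap of 26 using illustrative sodium and chloride values, since only the bicarbonate of 11 is given in the abstract.

```python
# Anion gap for a high anion gap metabolic acidosis (HAGMA) work-up.
# AG = Na - (Cl + HCO3); values well above ~12 mmol/L suggest unmeasured anions.
# Na and Cl below are illustrative; the case reports HCO3 11 and an AG of 26.

def anion_gap(na: float, cl: float, hco3: float) -> float:
    return na - (cl + hco3)

ag = anion_gap(na=140.0, cl=103.0, hco3=11.0)
print(f"anion gap = {ag:.0f} mmol/L")  # 26 with these illustrative electrolytes
```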
Introduction: Acute kidney injury (AKI) is independently associated with increased mortality in the critically ill, especially when renal replacement therapy (RRT) is required. "Early initiation" of RRT, and its possible beneficial effect on mortality, has been the focus of many research protocols. However, published papers show no uniformity in the definition of early initiation. The main problem in determining the best moment of initiation is the lack of a distinctive parameter that can predict the RRT requirement. The question arises whether differences between patients with any stage of AKI and patients with RRT-requiring AKIN stage III can be identified by looking at the "classical parameters" (serum creatinine (SCr), urea, potassium, bicarbonate, pH, cumulative fluid balance), plasma neutrophil gelatinase-associated lipocalin (pNGAL) and urinary output (UO) at the time of the first rise in SCr (first AKI). Methods: This is a retrospective subset analysis performed on the NGAL study database [1] in adult critically ill patients with developing AKI. Data were collected and analyzed at the time of first AKI and at the highest level of serum creatinine (peak AKI). The Mann-Whitney U test was used to detect differences between the two groups. Results: A total of 59 patients developed some stage of AKI during the first 7 days after ICU admission; 15 patients eventually required RRT. At the time of first AKI, pNGAL and UO were the only parameters that differed significantly (p = 0.02 and p = 0.04, respectively) between the two groups. The "classical parameters" showed a significant difference between the groups only later, at the time of peak AKI (= time of RRT initiation) (Fig. 37). Conclusions: Critically ill patients with RRT-requiring AKI have a significantly lower urinary production and a significantly higher pNGAL concentration than patients with non-RRT-requiring AKI at the time of the first rise in SCr. The "classical parameters" fail to make this distinction at this very early stage of AKI. Therefore, plasma NGAL and UO might have the potential (perhaps in combination with the furosemide stress test) to become valuable parameters in future prospective protocols that intend to study the effect of early RRT initiation on hard outcome measures in critically ill patients.

Introduction: The choice of the right renal replacement therapy (RRT) for severe acute kidney injury in critically ill patients has been investigated many times in the last two decades. Although some questions have been answered, many different approaches are still used in current ICU practice. The authors review the treatment of acute kidney injury (AKI) in critically ill patients and the indications for SLED and continuous renal replacement therapy (CRRT) in critically ill patients in four Portuguese ICUs. Methods: Prospective observational study conducted in 4 ICUs in 3 Portuguese hospitals over seven months. All patients with AKI who needed renal replacement therapy were enrolled. Data on demographics and clinical characteristics of the patients were collected at baseline. Results: 127 patients were enrolled, with an average age of 61.3 ± 15.26 years, and the mortality rate was 46 %. 42 patients underwent SLED (33.07 %) and 85 underwent CRRT (66.93 %). The most common indication for CRRT was fluid overload (47.76 %). Most patients started either procedure within the first two days in the ICU. Most of the patients (59-60 %) were in organ failure according to the RIFLE criteria when they started RRT. Patients on CRRT were more hemodynamically unstable (norepinephrine 1.99 ± 5.36 on CRRT vs. 1.49 ± 4.56 on SLED). The mortality rate was 52.94 % on CRRT and 30.95 % on SLED. Conclusions: CRRT seems to be the ideal mode of renal replacement therapy for hemodynamically unstable, fluid-overloaded patients. However, in expert hands, the two treatments provide similar outcomes and can be complementary.

We wished to evaluate current practice in the region and consider the need for regional guidance on the management of acute kidney injury in our units. Methods: An electronic survey was sent to all the consultants in intensive care within the region. Responses were collected anonymously using internet-based survey software. Questions were based on the currently available guidance and evidence. Results: Invitations were sent to 139 consultants who have some intensive care commitment. There were 55 responses, giving a response rate of 39.6 %. Unit size varied greatly, from under 10 beds to over 20; the majority (43.6 %) had between 10 and 20 beds. 46.3 % of respondents had fewer than 10 patients receiving RRT per month, 40.7 % had between 10 and 25 patients per month, and 13 % had more than 25 patients per month. 43 % gave positive comments regarding the support they receive from renal services. Some areas were identified where interactions between renal physicians and critical care could be improved. Only 67 % of respondents were aware of an up-to-date guideline for the use of RRT in their unit. 1.9 % of respondents preferentially use the subclavian vein for siting renal access lines, and 9.3 % preferentially use the femoral veins. 31 % never have visits from a nephrologist. Conclusions: There is wide variety in practice, with some areas where practice could be improved, notably awareness of the available guidelines, choice of renal access line site, and working relationships between renal and critical care services. As part of a quality improvement programme, we plan to create a regional best-practice recommendations document that is relevant to the systems in which we work and specific to critical care. By doing this, we hope to improve outcomes for those receiving renal replacement therapy in our units.

Introduction: Acute kidney injury necessitating continuous renal replacement therapy (CRRT) in critically ill patients is associated with high mortality. The timing of CRRT remains a matter of debate. We investigated the effect of the timing of CRRT on 28-day mortality. Methods: A post-hoc analysis was performed on the multicenter data from the earlier published CASH study. In critically ill patients receiving CRRT between 2005 and 2011, the effect of variables at initiation of CRRT on 28-day mortality was evaluated. Univariate and multivariate Cox regression analyses were performed to determine renal and patient-specific variables associated with 28-day mortality. Results: Of the 139 patients evaluated for inclusion, 13 were excluded because of a history of chronic kidney disease, 5 because no creatinine at CRRT initiation was available, and 5 because of an admission creatinine below 50 μmol/L. In the 116 included patients, 28-day mortality was 37/116 (32 %). In univariate proportional hazards Cox regression analysis, ICU admission after cardiopulmonary resuscitation (HR 4.403, p = 0.043), SOFA score (HR 1.117, p = 0.024) and creatinine at CRRT initiation (HR 0.997, p = 0.020) were associated with 28-day mortality. Multivariate analysis demonstrated that age (HR 1.092, p = 0.050), admission weight (HR 1.036, p = 0.016), ICU admission because of respiratory failure (HR 9.275, p = 0.29) and creatinine at CRRT initiation (HR 0.990, p = 0.022) were associated with 28-day mortality. After correction for disease severity scores, such as the APACHE II score and SOFA score, and for markers of fluid overload, such as hematocrit and cumulative fluid balance, creatinine at initiation remained associated with lower 28-day mortality. Even after correction for admission creatinine, only creatinine at initiation remained associated with 28-day mortality. In ROC curve analysis, a CRRT initiation creatinine of 318 μmol/L was the best predictor of 28-day mortality. Conclusions: In this multicenter cohort, creatinine at CRRT initiation is an independent predictor of 28-day mortality, even after correction for admission creatinine, disease severity and hemodilution. These data therefore argue in favour of a time-dependent effect of CRRT on 28-day mortality and suggest that late CRRT initiation may have a better outcome than earlier CRRT initiation. However, further research is needed to confirm this hypothesis and reveal the underlying mechanisms.
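A creatinine cut-off such as the 318 μmol/L above is commonly taken from the ROC curve at the maximum Youden index; the abstract does not name its criterion, so the sketch below simply illustrates that common approach on synthetic data.

```python
# Picking an optimal cut-off from a ROC curve via the Youden index (J = TPR - FPR).
# Synthetic data; the abstract does not state which cut-off criterion was used.
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(1)
# Hypothetical creatinine at CRRT initiation (umol/L): survivors trend higher,
# mirroring the abstract's inverse association with mortality.
creat_dead = rng.normal(250, 70, 37)
creat_alive = rng.normal(380, 90, 79)
scores = np.concatenate([creat_dead, creat_alive])
died = np.concatenate([np.ones(37), np.zeros(79)])

# Lower creatinine predicts death here, so score with the negated value.
fpr, tpr, thr = roc_curve(died, -scores)
best = np.argmax(tpr - fpr)                 # maximum Youden index
print(f"optimal cut-off ~ {-thr[best]:.0f} umol/L")
```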
The impact of the Karnofsky performance scale on outcomes in acute kidney injury patients receiving renal replacement therapy in the intensive care unit. I. Elsayed 1, N. Ward 1

Introduction: CRRT with regional citrate anticoagulation (RCA) provides long filter running times and effective dialysis doses. The risk of hypophosphatemia (hypoPO4) is therefore high, and PO4 substitution is required. We investigated whether a separate infusion of PO4 or a PO4-containing dialysis fluid is more effective in avoiding severe hypoPO4. Methods: Retrospective-prospective observational study in a university hospital. CRRT was performed as RCA-CVVHD (30 ml/kg/h). PO4 was substituted either via a separate infusion (PO4-I; retrospective data) or by use of a PO4-containing dialysate (CiCa dialysate K4, Fresenius Medical Care) (PO4-D; prospective data). Primary outcome: incidence of severe hypoPO4. Secondary outcomes: maintenance of normal PO4 levels, duration of mechanical ventilation and length of stay in the ICU (with approval of the local ethics committee; NCT01946113).

Critically ill patients requiring CRRT but with a contraindication to heparin were included. Exclusion criteria: liver disease, liver injury, noradrenaline >0.5 mcg/kg/min, ideal body weight >90 kg, receiving >6 u/hr of insulin, sodium <120 or >160 mmol/l, pH >7.5 or bicarbonate >40 mmol/l. Primary outcome measure: filter time. Secondary outcome measures: calcium replacement; post-filter ionised calcium (iCa); acid-base status; and serum sodium levels. A dose of 25 or 35 mls/kg/min was chosen, with parallel changes to the filtration fraction from 20 to 25 % as required to manage metabolic alkalosis. If metabolic alkalosis persisted, RCA was stopped. Acid Citrate Dextrose Formula-A (ACD-A) (113 mmol/L of citrate) was administered at a weight-related rate aiming for a citrate concentration of 2.5-3 %. The replacement fluid was Accusol 35 (1.75 mmol/l calcium). A post-filter solution of 10 mmol/l calcium chloride was administered according to the post-filter ionized calcium (iCa). Bicarbonate, iCa and sodium were measured at 3- to 6-hourly intervals during therapy.

Introduction: The Ci-Ca CVVHD protocol has been successfully implemented in several European countries to conduct CRRT in patients with AKI [1]. This protocol recommends a calcium chloride solution (CaCl-S) for calcium supplementation. Due to the non-availability of CaCl-S, we modified this approach and used a calcium gluconate solution (CaGl-S) instead. Methods: The Ci-Ca CVVHD protocol [1] relies on modifying the calcium substitution to control s-iCa, which we applied analogously in our modified protocol. Furthermore, in the Ci-Ca CVVHD protocol the ratio between dialysate flow (Qd) and blood flow (Qb) is linked to the effect on the acid-base status and is typically set to 33 %, which we applied essentially throughout the study, using a dialysate flow of 3 l/h at a blood flow of 150 ml/min.
Systemic ionized calcium (s-iCa) and acid-base parameters were closely monitored to ensure safe application of our modified protocol. We retrospectively analyzed 14 critically ill patients. Results: With respect to systemic iCa, our patient group had low values at the start of Ci-Ca CVVHD, which might already explain the higher required amount of CaGl-S compared with literature data, assuming that a calcium deficit in the patients is being filled (see Fig. 39). Concerning the acid-base results, we did see a slight tendency towards alkalosis after prolonged Ci-Ca CVVHD with CaGl-S (see Fig. 40). This might well be attributed to gluconate metabolism yielding some additional bicarbonate. Conclusions: We successfully established the use of a CaGl-S with the Ci-Ca CVVHD protocol. Noting the slight trend towards alkalosis, we have decided to slightly reduce the blood flow to 130 ml/min. This will reduce citrate needs, and less citrate will be metabolized by the patient to bicarbonate.
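The 33 % dialysate-to-blood-flow ratio quoted in the Methods above can be verified directly once the units are aligned; a small sketch, which also shows the effect of the proposed blood-flow reduction.

```python
# Ci-Ca CVVHD flow-ratio check: dialysate flow (Qd) over blood flow (Qb),
# with both converted to L/h. 3 L/h dialysate at 150 ml/min blood flow
# gives the ~33 % ratio cited in the abstract.

def qd_qb_ratio(qd_l_per_h: float, qb_ml_per_min: float) -> float:
    qb_l_per_h = qb_ml_per_min * 60 / 1000   # ml/min -> L/h
    return 100 * qd_l_per_h / qb_l_per_h

print(f"{qd_qb_ratio(3.0, 150.0):.0f} %")    # 33 %
# Reducing blood flow to 130 ml/min (as the authors propose) raises the ratio:
print(f"{qd_qb_ratio(3.0, 130.0):.0f} %")    # ~38 %
```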
late use of CytoSorb therapy in critically ill patients Introduction: There is still uncertainty about whether interventions targeting blood pressure are effective in treating new-onset acute kidney injury (AKI) in critically ill patients. Moreover, little is known about the significance of blood pressure in AKI without sepsis. The purposes of this study are to explore which blood pressure parameter after the onset of AKI is associated with AKI stage progression, and to investigate whether the meaning of blood pressure in new-onset oliguric AKI differs between patients with and without sepsis. Methods: This is a retrospective cohort study including adult patients who stayed in the ICU (Jikei University Hospital, Tokyo, Japan) for 24 hours or more after the new onset of oliguric AKI. Oliguric AKI was defined as urine output less than 0. Introduction: Glomerular filtration rate (GFR) describes the flow rate of filtered fluid through the kidneys. Creatinine clearance (CrCl) is the volume of blood plasma that is cleared of serum creatinine (SCr) per unit time and is an important measure to approximate "true" GFR. CrCl can be estimated either by population-based mathematical equations or it can be measured using serum and urine creatinine concentrations after collection of urine over a certain period of time. Deterioration of CrCl/GFR identifies patients with acute kidney injury and is important for appropriate drug dosing. In our ICU practice the MDRD-eGFR equation is the standard of care for a "reliable" estimation of GFR. Recently, the CKD-EPI-SCr equation was introduced as a more accurate measure, especially in patients with higher GFRs. However, in critically ill patients there is no steady state of SCr, which is a prerequisite for estimating GFR irrespective of the mathematical formula used. We regularly treat patients with demonstrably highly impaired kidney function in spite of "normal" estimated GFR, leading to toxic levels of renally cleared drugs. We investigated the actual practice of GFR approximation in Dutch ICUs. Introduction: The serum-creatinine-based estimation of glomerular filtration rate (eGFR) using the MDRD equation is inaccurate in critically ill patients and tends to severely overestimate the "real-time clearance capacity", especially in catabolic states. Recently, the CKD-EPI-Serum-Creatinine and CKD-EPI-Cystatin-C equations were introduced as more precise alternatives for eGFR. Originally, all these equations were validated in patients with stable chronic kidney disease. Most Dutch hospital laboratories, ours included, still report MDRD-eGFR in their electronic patient data systems; however, many intend to, or already have, switched to CKD-EPI-Serum-Creatinine-eGFR. To date, (e)GFR is a key element in clinical decision making, even in the ICU setting. However, indiscriminate application of these mathematical equations may be questionable in the critically ill. Methods: In order to calculate GFR applying the MDRD, CKD-EPI-Serum-Creatinine and CKD-EPI-Cystatin-C formulae and assess their mutual comparability in an all-comer adult academic ICU cohort, we performed a post-hoc analysis of the NGAL-study. MDRD-eGFR, CKD-EPI-Serum-Creatinine-eGFR and CKD-EPI-Cystatin-C-eGFR were calculated using admission-day data (the two creatinine-based equations are sketched below). The Mann-Whitney U test was used for comparison between groups. Introduction: Patients hospitalized for septic shock have an increased risk of developing acute renal failure (ARF) and requiring renal replacement therapy (RRT).
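For reference, a minimal sketch of the two serum-creatinine-based eGFR equations compared in the preceding abstracts (the IDMS-traceable MDRD equation and CKD-EPI 2009); the function names and the example patient are illustrative.

```python
# eGFR sketches: creatinine in mg/dL, age in years, result in ml/min/1.73 m2.
def egfr_mdrd(scr, age, female, black=False):
    egfr = 175.0 * scr ** -1.154 * age ** -0.203
    if female:
        egfr *= 0.742
    if black:
        egfr *= 1.212
    return egfr

def egfr_ckdepi_scr(scr, age, female, black=False):
    kappa = 0.7 if female else 0.9       # sex-specific creatinine knot
    alpha = -0.329 if female else -0.411
    egfr = (141.0 * min(scr / kappa, 1.0) ** alpha
            * max(scr / kappa, 1.0) ** -1.209 * 0.993 ** age)
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159
    return egfr

# Example: a 60-year-old man with a creatinine of 1.2 mg/dL
print(round(egfr_mdrd(1.2, 60, female=False)),
      round(egfr_ckdepi_scr(1.2, 60, female=False)))
```

Note that both formulas assume a steady state of serum creatinine, which, as the abstracts stress, rarely holds in the critically ill.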
The best delay to start RRT is a matter of debate. A new urine test, the Nephrocheck™, has been validated. It corresponds to the urine concentration product of two markers of tubular injury (TIMP2 and IGFBP7) associated with a risk of developing ARF within 12 hours. The aim of our study was to analyze the capacity of urine TIMP2*IGFBP7 to predict the absence of evolution toward AKIN 3 in patients hospitalized for septic shock who present with ARF AKIN 1 or 2. Methods: Every patient admitted for septic shock in our medical ICU who presented with ARF AKIN 1 or 2 within the first 6 hours following the start of catecholamines was prospectively included. Exclusion criteria were pre-terminal chronic renal failure or chronic dialysis, anuria, and patients AKIN 0 or AKIN 3. The TIMP2*IGFBP7 urine assay was performed at baseline (H0) and H12. Clinical and biological parameters were collected at H0, H12 and H24. The performance of the Nephrocheck™ to predict the absence of evolution toward the AKIN 3 level within the first 24 hours was tested for the baseline measurement and for the evolution of the Nephrocheck™ between baseline and H12 (delta H0-H12). Septic patients were less likely to recover to a lower RIFLE category compared to non-septic patients (improvement between day 3 and day 7 in 22 vs. 32 %, p < 0.0001). Dialysis incidence was higher in septic patients compared to non-septic patients (21 % vs. 5 %, p < 0.0001). In-hospital mortality rates were also higher (33 % vs. 14 %, p < 0.0001), increasing to 51 % in septic and 35 % in non-septic AKI-F patients. Hospital mortality of septic AKI-F patients was similar to that of septic patients with pre-existing CRF (HR 1.6, 95 % CI 1.3-2.0, compared to 1.8, 95 % CI 1.5-2.1 in septic AKI-F patients). Septic patients with AKI-R or AKI-I that improved between day 3 and day 7 after admission showed a mortality similar to septic patients who did not develop AKI, while mortality remained elevated in AKI-F patients that improved between day 3 and 7. Conclusions: AKI in sepsis is more severe, less likely to recover and associated with a higher mortality, compared to non-septic AKI. Introduction: Gynae-oncological surgery involves a patient population that is elderly, undergoing intra-abdominal surgery (sometimes with tumour extension to the ureters), with long anaesthetic and surgical times and extensive blood loss. All of these factors have been highlighted by Kidney Disease: Improving Global Outcomes (KDIGO) as factors for developing acute kidney injury (AKI) [1]. The incidence of AKI in the gynaecological population is thought to be about 13 % [2]. Enhanced recovery after surgery (ERAS) pathways incorporate the use of goal-directed fluid therapy with cardiac output monitoring perioperatively. Our study aim was to examine the incidence of AKI before and after the introduction of an ERAS pathway in our centre. Methods: We retrospectively reviewed patients undergoing major gynae-oncological surgery before and after the introduction of an ERAS pathway. There was no patient exclusion in our cohort. We used the KDIGO definition of AKI, taking the patient's preoperative serum creatinine as baseline. Serum creatinine on admission to the critical care unit was taken as day 1. Data were collected for days 1, 3, 7 and 30. The main outcome of the study was the incidence of AKI pre and post ERAS introduction. Contingency analysis using Yates' chi-squared test was performed (see the sketch below).
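Before the results, a minimal sketch of the Yates-corrected chi-squared contingency analysis named above; the 2x2 counts are reconstructed from the reported percentages (13/127 = 10.24 % pre ERAS, 30/332 = 9.04 % post ERAS) purely for illustration, and scipy is assumed.

```python
# Yates-corrected chi-squared test on a 2x2 AKI-by-pathway contingency table.
from scipy.stats import chi2_contingency

table = [[13, 127 - 13],    # pre ERAS:  AKI, no AKI
         [30, 332 - 30]]    # post ERAS: AKI, no AKI
chi2, p, dof, expected = chi2_contingency(table, correction=True)  # Yates continuity correction
print(f"chi2 = {chi2:.3f}, p = {p:.3f}")  # a large p is consistent with "not statistically different"
```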
Results: Our single-centre cohort included 459 patients, of whom 127 were in the group before the introduction of ERAS (pre ERAS) and 332 in the post-introduction group (post ERAS). The overall incidence of AKI was 10.24 % in the pre ERAS group and 9.04 % in the post ERAS group, which was not statistically different. At day 1 the incidence of AKI was 3.94 % in the pre ERAS group and 1.52 % in the post ERAS group (p-value 0.22). At day 3 it was 8.2 % in the pre group and 6.48 % in the post group (p-value 0.67). Analysis showed no statistical difference between the incidence of AKI in the pre and post ERAS groups at day 1, 3, 7 or 30. Conclusions: The incidence of AKI in our cohort of patients was lower than cited, at about 10 %. There was no statistical difference in the incidence of AKI after the introduction of an ERAS pathway. Thus, we observed no reduction in the incidence of AKI with the introduction of an ERAS pathway. Introduction: Acute kidney injury (AKI) is an independent contributor to morbidity and mortality [1]. In view of this we evaluated factors that may be associated with the development of AKI after hip and knee arthroplasty and studied the subsequent effect AKI had on length of stay (LOS) and mortality [2]. It is possible that this may be secondary to the routine post-operative admission of patients with risk factors to the HDU environment. Risk factors found to be associated with AKI were consistent with the results of previous studies [3]. Even stage 1 AKI resulted in an increased mortality at one year. Interpretation of our findings is limited by confounding. Early recognition of risk factors will aid identification of patients at increased risk of AKI. Subsequent pre-optimisation and normalisation of reversible risk factors where possible, as well as planning of appropriate post-operative care, may reduce the burden of AKI. Introduction: Acute kidney injury (AKI) has been identified as a common and clinically significant complication after major trauma requiring ICU admission [1]. Methods: We performed a retrospective, single-centre observational study of major trauma admissions to the Royal London Hospital Adult Critical Care Unit over a 30-month period to assess the incidence and associations of AKI in this population. AKI was defined by the KDIGO creatinine criteria (sketched below). Trauma-related AKI was defined as AKI by the KDIGO creatinine criteria occurring in the seven days after admission. In the absence of a known baseline creatinine, the admission value was used. Results: 858 patients admitted to the ICU after major trauma were included in the study; 81 % were male. Median age was 41 (IQR: 27-56), median New Injury Severity Score was 34 (22-50) and median ICU admission APACHE II score was 12 (8-16). Overall, AKI incidence was 16 %. AKI was associated with greater hospital mortality (33 % vs. 16 %, p < 0.001) and hospital length of stay (Fig. 45). In multivariable logistic regression accounting for illness severity scoring, AKI remained associated with increased in-hospital mortality (OR = 1.9, 1.2-3.2); however, when broken down by AKI category, only AKI-3 was significantly associated (OR = 3.4, 1.6-7.3). Conclusions: Acute kidney injury is a common finding in critically ill patients after major trauma and is associated with greater illness severity, rates of death and length of stay.
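A minimal sketch of the KDIGO creatinine-based staging used in the trauma abstract above (urine-output criteria omitted); the thresholds follow the published KDIGO definition, and the fallback to the admission creatinine mirrors the abstract's method.

```python
# KDIGO AKI staging from creatinine alone (umol/L); urine-output criteria not shown.
def kdigo_stage(scr, baseline, rise_48h=None, on_rrt=False):
    """scr and baseline in umol/L; rise_48h is the absolute rise within 48 h, if known."""
    ratio = scr / baseline
    if on_rrt or ratio >= 3.0 or scr >= 353.6:   # 353.6 umol/L = 4.0 mg/dL
        return 3
    if ratio >= 2.0:
        return 2
    if ratio >= 1.5 or (rise_48h is not None and rise_48h >= 26.5):  # 26.5 umol/L = 0.3 mg/dL
        return 1
    return 0

print(kdigo_stage(scr=180.0, baseline=80.0))                  # 2.25x baseline -> stage 2
print(kdigo_stage(scr=100.0, baseline=80.0, rise_48h=28.0))   # small absolute 48-h rise -> stage 1
```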
Introduction: We have demonstrated that even mild and transient perioperative acute kidney injury is associated with significantly worse survival in the year after major surgery (see accompanying abstract). However, confounding effects of major illness on serum creatinine levels may cause the severity of AKI to be underestimated and recovery from AKI to be overestimated [1]. Methods: From a population of 1897 patients undergoing major surgery, of whom 128 had AKI, we examined baseline and hospital-discharge creatinine-based estimated GFR (CKD-EPI formula) and the trajectory of serum creatinine in patients who were hospitalized for ≥5 days after surgery. Results: In the 1836 patients who survived to hospital discharge, the median duration of hospitalization was 7 days (IQR: 4-11), and 11 days (6-25) in those patients who had post-operative AKI. In all hospital survivors, mean eGFR rose significantly from baseline to discharge: 88.4 to 94.7 ml/min/1.73 m² (p < 0.001). However, in those with AKI, discharge eGFR was similar to baseline: 76.4 to 74.5 (p = 0.6). 1225 patients stayed ≥5 days after surgery; in those with AKI, mean creatinine rose rapidly and eventually settled near baseline, whereas overall creatinine fell progressively in the five days after surgery and was 10 % below baseline at discharge (Fig. 46). In a regression analysis, a higher discharge eGFR significantly correlated with a longer hospital length of stay (p < 0.001); however, no such relationship existed for baseline eGFR (p = 0.79). Conclusions: A baseline reduction in serum creatinine after major surgery may confound our ability to identify AKI and to assess its severity and recovery. Importantly, these effects may be more pronounced in patients with longer hospital admissions, who are at higher risk of complications including AKI. Introduction: Despite advances in critical care diagnosis and treatment, the incidence of acute kidney injury (AKI) remains high. In cardiac surgery, AKI is reported to be around 30 %, with need for renal replacement therapy in up to 1 % [1]. We aimed to characterize the differences in premorbid status, etiology, intraoperative and postoperative influencing factors for AKI development, and the patients' secondary outcomes. Methods: Single-centre retrospective observational study. AKI was defined according to the Acute Kidney Injury Network as an acute (within 48 h) deterioration in renal function with a serum creatinine rise of greater than 0.3 mg/dL (26.4 μmol/L) from baseline or a 1.5-fold increment from baseline [2]. Patients were divided into 2 groups: group I without AKI (544 patients) and group II with AKI (181 patients). The patients' admission data were recorded, as well as the secondary outcome measures. Results: The mean age was 52.7 ± 11.6 years. We found a high incidence of AKI in our setting (25 %). The AKI group matched the other group with respect to age, gender, body mass index, and the association of diabetes or hypertension. The AKI group had a significantly higher additive EuroSCORE (6 ± 4 vs 4 ± 2) and a lower ejection fraction (47 ± 11 vs 50 ± 9) (P = 0.00 and 0.00, respectively). Lengths of ventilation, and of stay in ICU and in hospital, were significantly higher in the AKI group. Post-operative atrial fibrillation was more frequent in the AKI group (29.8 % vs 10.5 %, p = 0.001). The AKI group had significantly higher mortality (6 % vs 1.7 % in group I, p = 0.026).
Interestingly, AKI was more prominent in the Asian group than in the Arab group: 92 (50.8 %) vs 77 (42.5 %), P = 0.04. The independent risk factors for AKI in our population were additive EuroSCORE, cardiopulmonary bypass time, post-operative low hemoglobin, postoperative white cell count and total blood loss. Conclusions: Cardiac surgery-induced AKI is highly prevalent and prognostically important. Therapies targeting preoperative anemia, cardiopulmonary bypass time, and perioperative red blood cell transfusions may protect against this complication. Introduction: Acute kidney injury is a common complication after major surgery that has been associated with adverse outcomes. Methods: A retrospective study of major non-cardiac surgery in a teaching hospital over an 18-month period, examining the incidence of AKI with one-year follow-up. AKI was defined by KDIGO creatinine criteria. Major surgery was defined as inpatient procedures of >1 h duration. To assess the independent effect of AKI on survival, a multivariable Cox proportional-hazards survival model was developed (a sketch of such a model follows this paragraph). Results: 1897 patients were included; post-operative AKI occurred in 128 (6.8 %), of whom 101 had KDIGO stage 1 (5.4 %), 19 stage 2 (1 %) and 8 stage 3 (0.4 %). Patients who developed AKI were significantly older, had a lower baseline eGFR, were in a higher ASA class and spent longer in hospital post-operatively compared to those without AKI. Hospital mortality occurred in 17/128 (13.3 %) of patients who developed AKI post-operatively compared to only 16/1741 (0.9 %) of those who did not (p < 0.001). Moreover, within a year of surgery 34/94 (27 %) of patients who developed post-operative AKI had died, compared to only 106/1741 (6 %) who had not had AKI (Fig. 47), p < 0.001. In the multivariable model accounting for multiple confounders, AKI was associated with a hazard ratio for death of 3.0 (1.91-4.72) in the year after surgery. Importantly, AKI defined by only a 26 μmol/L creatinine rise, and AKI with recovery, remained associated with an ongoing increased risk of death to one year. Conclusions: In major surgery patients we have demonstrated an association between even mild acute kidney injury and an increased risk of death in the following year. Introduction: Early detection of acute renal failure in intensive care units is important for determining the severity of disease and the grade of organ dysfunction, for discharging the patient from the intensive care unit, and for decreasing the mortality rate. The Acute Dialysis Quality Initiative workgroup proposed a classification system for acute kidney injury (AKI) identified by the acronym RIFLE (Risk, Injury, Failure, Loss of kidney function, and End-stage kidney disease). The purpose of our study was to assess acute renal failure development in intensive care unit patients, the factors affecting it, and the effect of those factors on mortality, using the RIFLE score. Methods: The age, height, weight, gender, diagnosis, comorbid diseases, reason for admission to intensive care and intensive care stay were recorded. Also, on the first day, the seventh day and the 14th day after admission to intensive care, the APACHE II score, SOFA score, RIFLE score, biochemical parameters (albumin, prealbumin, urea, creatinine, cholesterol, HCO3- level), triceps thickness and waist circumference measurement were all recorded. Patients were grouped into AKI and non-AKI. Patients in the non-AKI group had no oliguria and no obvious creatinine rise, while patients in the AKI group had oliguria and an obvious creatinine rise.
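A minimal sketch of the kind of multivariable Cox proportional-hazards model used in the surgery abstract above. The lifelines package, the synthetic data frame and all variable names are illustrative assumptions, not the authors' analysis.

```python
# Cox proportional-hazards sketch: hazard ratio for death within a year, adjusted
# for age and baseline eGFR. Synthetic data only.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "time_days": rng.exponential(300, n).clip(1, 365),  # follow-up capped at one year
    "died": rng.integers(0, 2, n),                      # event indicator
    "aki": rng.integers(0, 2, n),                       # post-operative AKI yes/no
    "age": rng.normal(65, 10, n),
    "baseline_egfr": rng.normal(85, 20, n),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time_days", event_col="died")
# exp(coef) for "aki" is the adjusted hazard ratio, analogous to the reported HR 3.0
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
```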
The AKI group was assessed by RIFLE score according to hourly urine output and creatinine rise, and separated into three groups: R = Risk, I = Injury, F = Failure. Results: Our study was carried out with 502 patients. 39.2 % of the patients were in the acute kidney injury group, while 60.8 % were in the non-acute kidney injury group. The acute kidney injury group consisted of 21.3 % (n = 107) R group, 12.4 % (n = 62) I group and 5.6 % (n = 28) F group. Renal failure development was related to older age, short body height, excessive weight, existence of chronic disease and a long intensive care unit stay. Renal failure was also related to high urea, creatinine and HCO3- levels, and to low cholesterol, albumin and prealbumin levels. Renal failure was not related to the triceps thickness measurement. Intensive care unit stay and mortality were higher in patients who developed renal failure. Conclusions: We think that intensive care unit patients with older age, excessive weight, chronic diseases, high urea, creatinine and HCO3- levels, and low cholesterol, albumin and prealbumin levels are especially prone to renal failure. Those patients should be carefully followed for renal failure. Patient survival rates at 36 and 48 months were 96.9 ± 0.002 % and 96.7 ± 0.002 %, respectively. Mean graft survival time was 47.87 ± 0.07 months in these patients, and graft survival rates at 3, 6, 9, 12, 24, 36 and 48 months were 99.2 ± 0.001 %, 99 ± 0.001 %, 98.7 ± 0.001 %, 98.5 ± 0.001 %, 97.6 ± 0.002 %, 96.7 ± 0.002 % and 96.3 ± 0.003 %, respectively. Conclusions: In recent years there has been a significant increase in live-kidney transplantations in our country, due to inadequate organ procurement from cadavers. We observed quite high patient and graft survival times and a low chronic rejection incidence in our live-kidney transplantation patients. Although there is a higher quality of life and better graft function in live-kidney transplantations compared to cadaver-kidney transplantations, cadaver-kidney transplantations should be increased. Introduction: Difficult intubation and extubation remain the most important problems in anaesthesia of patients with neck phlegmon [1, 3]. Old guidelines for airway management [2, 3] do not fully cover anaesthesia procedures in patients with neck phlegmon. We aimed to create a safe algorithm for intubation and extubation in these patients. Methods: An uncontrolled prospective cohort clinical study was performed in 75 patients with neck phlegmon. In the main group (n = 38), intubation was performed using a fibreoptic bronchoscope or a Flaplight laryngoscope with video adapter; tracheostomy was not performed, and sedation was by dexmedetomidine infusion. In the control group (n = 37), intubation was performed with a conventional laryngoscope and early tracheostomy followed by surgery; sedation was by thiopental sodium or propofol. Results: The Flaplight laryngoscope upgraded with a video adapter was more efficient in cases of difficult intubation. Early tracheostomy was associated with an absolute risk of mediastinitis of 28.0 % (95 % CI: 0.14-0.47) in comparison with patients without tracheostomy. Dexmedetomidine was the more effective method of sedation, reducing cardiovascular reactions from 86 % to 9.5 % (χ² = 24.44; p < 0.01). In the control group there was 13.5 % post-extubation ARF, in comparison with 0.3 % in the main group. Postponed extubation was performed from 10 to 72 hours after the decrease of inflammation, laryngeal edema and trismus. Intubation algorithm: 1. Evaluate the risk of difficult intubation, trismus and stridor; 2.
Perform laryngoscopy and classify by Cormack-Lehane: for grade III-IV, use a fibreoptic bronchoscope; 3. In case of mouth opening of up to 1.5 cm, where a fibreoptic bronchoscope cannot be used, perform tracheostomy under local anaesthesia; 4. In a cannot-ventilate, cannot-intubate situation, perform a surgical airway. Extubation algorithm: 1. Postponed extubation (10-72 hrs) is recommended; 2. For sedation, use dexmedetomidine infusion at a rate of 1.5-3 mcg/kg/h. Conclusions: An algorithm for the anaesthesiologist's actions was created to provide safe airway management in the intra- and postoperative period in patients with neck phlegmon, which reduced the incidence of difficult intubations from 51.4 % to 7.9 %, failed intubations from 19 % to 0.3 %, mediastinitis from 27 % to 2.6 %, deaths from 32.4 % to 2.6 % and mortality due to tracheostomy from 32 % to 6.3 %. Nasal high flow oxygen for acute respiratory failure: a systematic review R. Pugh, S. Bhandari Glan Clwyd Hospital, Rhyl, UK Critical Care 2016, 20(Suppl 2):P220 Introduction: Nasal high flow oxygen (NHFO) is an attractive therapy for acute hypoxemic respiratory failure (AHRF), with reported comfort, and theoretical wash-out of CO2 and generation of PEEP. However, its clinical efficacy is uncertain. We undertook a systematic review of current evidence to support its use in adults. Methods: Medline and EMBASE were searched using the terms: nasal AND (high-flow OR high flow) AND (oxygen OR oxygen therapy). RCTs were included if comparison was made between NHFO and standard O2 therapy, or NHFO and CPAP or NIV, for AHRF in adults, and the primary outcomes (intubation and mortality) were reported. Results: We identified 468 potentially relevant studies. 4 RCTs met the inclusion criteria. There were important differences in patient groups, interventions and outcome measurements, precluding meta-analysis. There were no significant differences in intubation rate or mortality between intervention groups, with the exception of Frat (2015), who demonstrated significantly reduced 90-day mortality with NHFO compared with standard O2 and NIV groups, and a lower intubation rate on post hoc analysis of patients with the lowest PF ratio. Conclusions: There is only limited evidence from RCTs at present to support use of NHFO rather than conventional therapy for acute hypoxemic respiratory failure in adults, and further study of patients at risk of requiring invasive ventilation is needed. Setting optimal flow rate during high flow nasal cannula support: preliminary results T. Mauri 1, C. Turrini 2, T. Langer 1, P. Taccone Introduction: High Flow Nasal Cannula (HFNC) is a non-invasive respiratory support that might impact major clinical outcomes of acute respiratory failure patients [1]. We present preliminary data from a clinical study that aims to describe the physiological effects of HFNC at different flow rates. Methods: We performed a prospective randomized cross-over study. This indicates that each set of data (n = 10) is normally distributed and that dosing is therefore consistent. It may be concluded from this study that delivery of aerosol to adults at both high and low gas flow rates via NHF therapy is an efficient and highly reproducible means of administration (a sketch of the normality check follows).
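The aerosol-delivery fragment above rests on the claim that each set of delivered doses (n = 10) is normally distributed. A minimal sketch of how such a claim is commonly checked with the Shapiro-Wilk test, on synthetic replicate doses, with scipy assumed:

```python
# Shapiro-Wilk normality check on hypothetical replicate aerosol doses.
import numpy as np
from scipy.stats import shapiro

rng = np.random.default_rng(2)
doses = rng.normal(loc=1.0, scale=0.05, size=10)  # hypothetical delivered doses (mg)
stat, p = shapiro(doses)
# p > 0.05: no evidence against normality, supporting "dosing is consistent"
print(f"W = {stat:.3f}, p = {p:.3f}")
```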
Introduction: We present an improved theoretical version of a valve for transtracheal ventilation acting as a bi-directional manual respiratory pump, in which a combination of low flow during inspiration (by reducing gas supply to the valve) and increased flow during expiration (by increasing gas supply to the valve) permits a more effective Venturi effect and efficient expiration, with low total gas consumption. Methods: The theoretical performance of the valve was modeled mathematically, and the model was tested in vitro with a standard valve but with variable flow rates (Fig. 52). Results: It was shown that, by increasing flow during expiration, the valve would permit shortening of the expiratory time and achievement of higher minute volumes (7 L/min of gas or higher), compared with ventilation through a similar transtracheal cannula without variable flows (about 4 L/min). Variable flow would shorten the inspiratory time, provide efficient expiratory aid, and permit I:E ratios of 1:1 or even inverse-ratio ventilation (the arithmetic is sketched below). Conclusions: Satisfactory lung ventilation can be assured with transtracheal ventilation using a bidirectional manual respiration valve with variable gas flow. Conclusions: Long-term survival in patients selected for PMV with tracheostomy after failure to wean was not significantly different from that of weaned patients with tracheostomy. This suggests that survival of these patients depends more on the underlying disease(s) than on the ventilation itself. Despite the physical QoL scores being low in both groups, mental QoL was still acceptable. Recently, ultrasound has emerged as a potentially useful tool to assist PDT, as it helps to identify the most appropriate location for the tracheal puncture site and guide needle insertion into the trachea, and may significantly reduce procedure-related complications. Methods: An open-label, parallel, noninferiority, randomized controlled trial was conducted comparing ultrasound-guided PDT with bronchoscopy-guided PDT in mechanically ventilated critically ill patients with an indication for tracheostomy. The primary outcome was procedure failure, defined as a composite end-point of conversion to surgical tracheostomy; associated use of bronchoscopy in the case of ultrasound-guided PDT; associated use of ultrasound in the case of bronchoscopy-guided PDT; or occurrence of a major complication. Introduction: Many of the patients admitted to the ICU receive a tracheostomy and are discharged with it to the general ward. Insufficient skills and experience of staff caring for tracheostomy patients may lead to suboptimal care and increased morbidity and mortality. Tracheostomy outreach teams and appropriate staff training are the keys to improving quality of care, decreasing adverse events and achieving safe decannulation in these patients. We conducted a retrospective study in a medical ICU in a tertiary care hospital that does not have a step-down unit. A team of two intensivists followed up daily the patients discharged with tracheostomy from the ICU to the general ward, until they were discharged from the hospital. Training sessions for the ward staff (doctors, nurses and nursing assistants) about tracheostomy care were held every week. We analyzed clinical and epidemiologic variables, time to decannulation, complications related to the tracheostomy, decannulation failures, readmissions and mortality.
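A minimal sketch of the arithmetic behind the variable-flow valve abstract above: with a fixed inspiratory flow, shortening expiration (via expiratory flow assist) raises the achievable respiratory rate and hence the minute volume. All numbers are illustrative assumptions, not the authors' model.

```python
# Minute volume as a function of inspiratory flow and I:E timing.
def minute_volume(insp_flow_l_min, ti_s, te_s):
    vt = insp_flow_l_min / 60.0 * ti_s   # tidal volume delivered during inspiration, L
    rr = 60.0 / (ti_s + te_s)            # breaths per minute
    return vt * rr

# Slow passive expiration (I:E = 1:2) vs. assisted expiration (I:E = 1:1)
print(minute_volume(15, ti_s=1.0, te_s=2.0))   # ~5.0 L/min
print(minute_volume(15, ti_s=1.0, te_s=1.0))   # ~7.5 L/min
```

The jump from roughly 5 to 7.5 L/min in this toy calculation mirrors the direction of the abstract's reported improvement from about 4 to 7 L/min.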
Results: From February 2012 to October 2015, a total of 2568 patients were admitted to our ICU; during this period, 100 patients were discharged to the regular ward with a tracheostomy tube in place. Of these, 68 % had neurological damage (due to head injury, ischemic and hemorrhagic events), 14 % chronic obstructive pulmonary disease, 8 % severe myopathy, 6 % postanoxic encephalopathy, 2 % Guillain-Barré syndrome and 2 % amyotrophic lateral sclerosis. 10 % of the patients with a long length of stay had complications because of the tracheostomy (tracheal stenosis, granuloma and malacia), all of which were resolved. There were 2 readmissions to the ICU due to cannula obstruction. Decannulation on the ward was possible in 36 % of the patients; none of them had to be recannulated and all survived. Global mortality was 23 %, but excluding the patients under palliative care, mortality decreased to 5 %, and the causes of death were not directly related to the tracheostomy. Conclusions: In hospitals without a step-down unit and with a high healthcare demand, monitoring of tracheostomized patients by a skilled team allows safe decannulation on the ward. This would allow earlier discharge of these patients from the ICU and therefore decrease the length of stay, the complications associated with ICU admission, and accordingly the costs. On the other hand, once on the ward, the team can detect complications early, solve them, and create a safe atmosphere for the patient, the family and the staff in charge. In the 2nd group, our study revealed the following complications: subcutaneous emphysema in 6 cases (17.1 %), pneumomediastinum in 2 cases (5.7 %), lobar emphysema in 1 case (2.8 %) and suppuration in 2 cases (5.7 %); all were resolved. All children received special critical care under the protocol for prophylaxis of ventilator-associated pneumonia: gentle suctioning of secretions with closed suction systems after cuff inflation, positioning at 30°, and oral hygiene 2-3 times/day, mandatory for each child. Also, daily fluid and electrolyte rebalancing, calculation of individual energy needs and symptomatic therapy were performed. Along with improvement of the general condition and appearance or improvement of the deglutition reflex, oral nutrition with semi-solid preparations recommenced in 5 children (22.5 %) from the first day after tracheostomy; in 7 cases nutrition recommenced on days 3-5 together with partial parenteral nutrition until the full resumption of oral nutrition. Also worth noting is the significant decrease in the length of stay on artificial ventilation in the 2nd group of children (22 ± 0.9 days) compared to the 1st group (37 ± 1.7 days). Despite the complex therapeutic effort, due to the severity of the underlying disease, 15 children (24.5 %) died in the 1st group and 3 children (8.5 %) in the 2nd group. Conclusions: 1. The application of tracheotomy in patients with endotracheal intubation and prolonged artificial ventilation reduces the infant mortality rate. 2. In the 2nd group we noticed a decrease in the duration of treatment, which leads to a reduction in the cost of treating the child in the intensive care unit. Introduction: Acute cardiogenic pulmonary edema (ACPE) is a common cause of acute respiratory failure. Non-invasive continuous positive airway pressure (CPAP) has been validated as an effective treatment in addition to pharmacological therapy.
The aim of this study was to compare non-invasive pressure support ventilation (PSV-NIV) and CPAP in this patient setting. Methods: From 1 September 2015 to 15 November 2015, 24 patients were admitted to the emergency department (ED) for ACPE. All patients were treated with standard medical therapy. In addition, 12 patients were treated with CPAP (group A) and 12 were treated with NIV-PSV (group B). Arterial blood gas analysis was performed at admission (t0) and after 30 minutes of ventilation treatment (t30′) to evaluate pH, pO2/FiO2, pCO2 and lactate clearance ((lactate t0 - lactate t30′)/lactate t0; see the sketch below). Results: We enrolled 24 patients (10 men and 14 women). The median age was 74 years. Baseline characteristics of the patients were: systolic blood pressure 180 mmHg, diastolic blood pressure 102 mmHg, pO2/FiO2 181, pH 7.21, pCO2 53 mmHg, lactate 3.4 mEq/L. After 30 minutes of ventilation, we noticed in group A an improvement of 52 % in pO2/FiO2 ratio, an increase of 0.02 points in pH, a reduction of 10 % in pCO2 and a lactate clearance of 28 %; in group B, an improvement of 120 % in pO2/FiO2 ratio, an increase of 0.13 points in pH, a reduction of 20 % in pCO2 and a lactate clearance of 35 %. No patients required endotracheal intubation. Two patients in group A, after 30 minutes of CPAP, were shifted to PSV-NIV because of an increase in pCO2. Conclusions: The preliminary results of our ongoing trial show a better outcome at t30′ in patients affected by ACPE treated with PSV-NIV compared to those treated with CPAP, in terms of lactate clearance and improvement of pO2/FiO2 ratio, pH and pCO2. No differences in side effects between the two groups were shown. Introduction: Non-invasive ventilation (NIV) is commonly used as a first-line therapy for immunocompromised patients with acute respiratory failure, although it may not be appropriate for every patient. Failure of NIV is an independent predictor of mortality, and delayed endotracheal intubation may worsen prognosis. We report our center's experience and outcomes for patients with active haematologic malignancy treated with NIV. Eight patients had a "do not intubate" status due to their comorbidities. The most frequent reason to perform NIV was bacterial pneumonia (44 %), followed by pneumocystosis (30 %), ARDS (16 %), flu (6 %) and tuberculosis (2 %). Median time on NIV was 3.5 days. We had a NIV success rate of 60 % (14 patients were intubated, 5 did not have an intubation indication and 1 patient died from another cause). The mortality rate was 28 %. We found a statistically significant association between the severity scores SAPS II and APACHE and both NIV outcome and death (higher scores were associated with failure of NIV and death). We also found that patients with hematological neoplasia under chemotherapy had higher rates of NIV failure and death. HIV infection was associated with death but not with NIV failure. Conclusions: The use of NIV is nowadays more frequent beyond its classical indications. We found that higher severity scores and hematological neoplasia under chemotherapy were associated with unsuccessful NIV. However, NIV can be used in these patients as a comfort measure. In this prospective observational study, we aimed to investigate the effect of frailty on the application and results of NIV in the intensive care unit (ICU). Methods: We included patients over 50 years old who were admitted to the medical ICUs of our hospital.
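A worked instance of the lactate-clearance formula from the ACPE abstract above, using the reported baseline of 3.4 mEq/L and the group A clearance of 28 %; the 30-minute value is back-calculated for illustration.

```python
# Lactate clearance = (lactate_t0 - lactate_t30) / lactate_t0.
def lactate_clearance(lac_t0, lac_t30):
    return (lac_t0 - lac_t30) / lac_t0

lac_t0 = 3.4                      # mEq/L at admission (reported baseline)
lac_t30 = lac_t0 * (1 - 0.28)     # value implied by a 28 % clearance (~2.45 mEq/L)
print(f"clearance = {lactate_clearance(lac_t0, lac_t30):.0%}")
```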
For the detection of patient frailty, the Clinical Frailty Scale (CFS) and a comprehensive geriatric assessment were used. Patients with CFS ≥5 were considered frail. The successful NIV group (NSG) was defined by achieving at least two of the following: PaO2 > 60 mmHg, PaCO2 < 50 mmHg, pH 7.35-7.45, improvement of respiratory effort, recovery of consciousness. The relationship between intubation and frailty was also assessed. Application problems were defined as: cooperation problems, hearing problems, delirium/agitation (Richmond Agitation-Sedation Scale > 1), claustrophobia, Alzheimer's disease, problems related to the chin, mouth and teeth, and leak (>35 L/min). Results: We included 85 patients who underwent NIV in two ICUs. The mean age of the patients was 72.5 ± 10.6 years, 60 % were male, and the mean APACHE II score was 20.6 ± 5.7 (8-34). The mean CFS of the patients was 4.6 ± 1.6 and 42 % were frail (CFS ≥5). There were 63 (75 %) patients in the NSG and 22 (25 %) in the NIV failure group (NFG). APACHE II and SOFA scores of the NFG were higher, and the NSG had more non-frail patients (CFS < 5) (41 vs 22, p = 0.011). Fifty-three (62.4 %) patients had NIV application problems, and the mean CFS of this group was higher (4.9 ± 1.6 vs 4.0 ± 1.7, p = 0.028). Previous home NIV use (p = 0.025), APACHE II (p = 0.001) and SOFA (p = 0.012) were detected as factors affecting NIV success in univariate analysis. Frail patients (CFS ≥5) had significantly higher rates of application problems (80 % vs 57 %, p = 0.038), intubation (31 % vs 8 %, p = 0.007) and mortality (30 % vs 4 %, p = 0.001), and significantly lower NIV success rates (61 % vs 85 %, p = 0.011). In logistic regression analyses, frailty was the only independent risk factor for NIV failure (OR 3.5, p = 0.02). Conclusions: These results showed that frailty is associated with NIV application problems and failure in the ICU, and increases the risk of NIV failure more than threefold. Conclusions: Metabolic alkalosis (MA) may be seen in up to 30 % of patients with hypercapnic respiratory failure requiring NIV, and may increase the risk of a longer hospital stay in these patients. Asynchrony index and breathing patterns of acute exacerbation COPD patients assisted with noninvasive pressure support ventilation and neurally adjusted ventilatory assist N. Kongpolprom, C. Sittipunt Chulalongkorn University, Bangkok, Thailand Critical Care 2016, 20(Suppl 2):P238 Introduction: Noninvasive ventilation (NIV) is commonly used for patients with acute exacerbation of COPD (AECOPD) with acute respiratory failure (ARF). Several new modes of NIV are currently available for ventilatory assistance; however, it is not known which is best. Theoretically, neurally adjusted ventilatory assist (NAVA) improves asynchrony and preserves natural breathing [1], which might be beneficial with respect to dynamic hyperinflation. Methods: A pilot crossover study comparing asynchrony index and breathing patterns between AECOPD patients assisted with NIV-PS and those assisted with NIV-NAVA was conducted. AECOPD patients with ARF using NIV-PS were recruited. The clinically stable patients were randomly ventilated with NIV-PS or NIV-NAVA for 30 minutes, and then switched to the other mode after a wash-out period. One patient could be enrolled in more than one comparative sequence. Breathing patterns, 6 types of asynchrony and electrical diaphragmatic activity (EAdi) were continuously recorded during the last 20-minute period of each mode.
Continuous data were reported as medians (IQR) or means (SD) and compared using an unpaired t-test or a non-parametric equivalent (Table 32). Conclusions: Despite the limitations of our study (retrospective, observational and with limited data availability), we think our lower HFOV mortality, despite worse baseline oxygenation, when compared to OSCILLATE and OSCAR is notable. This may reflect lower HFOV mean airway pressures in shock patients, as well as having experienced personnel available 24 hours a day in our institution. We conclude there may still be a role for HFOV in severe ARDS if patients are properly selected and careful attention is paid to mean airway pressures. Introduction: Multiple studies have shown that low tidal volume, lung-protective ventilation strategies may reduce pulmonary complications in non-ARDS ICU patients. In busy ICUs, discrepancies may arise between ventilator management goals and actual practice. We developed a goal-directed mechanical ventilation order set that included physician-specified lung-protective ventilation and oxygenation goals to be implemented by our respiratory therapists (RTs). Our primary outcomes were to determine whether a simple respiratory therapist-driven order set with predefined goals regarding oxygenation and ventilation could be implemented with good compliance, and to determine compliance with the ARDSNet PEEP/FiO2 table. Introduction: Over 60 % of patients admitted to an Intensive Care Unit (ICU) will require Endotracheal Tube (ETT) intubation and mechanical ventilation [1]. The aim was to perform an audit of ventilation regimes in critically ill patients admitted to Whiston Hospital. Its main objectives were to evaluate the current management of ventilated patients on the ICU at Whiston Hospital in comparison to the current local standards of best practice, with the main emphasis on IBW (Ideal Body Weight) documentation and reduction of initial tidal volume (TV) from traditional guidelines [2]. This study assesses the ability of lung aeration scoring [3] using LUS to predict successful weaning in critically ill patients. Methods: This prospective, observational and non-interventional study was performed in a general ICU in a large university hospital over a period of 6 months. We included all patients who needed MV for >48 hours (n = 50) and were eligible for a Spontaneous Breathing Trial (SBT). LUS was performed during the SBT and the lung aeration score was calculated over 12 lung regions. Points were allocated according to the worst ultrasound pattern observed: N (normal lung pattern) = 0, B1 lines (multiple well-defined B lines) = 1, B2 lines (multiple coalescent B lines) = 2, C (consolidation or atelectasis) = 3. Relevant data, including demographics and the LUS aeration score, were collected in a standardized case report form (Table 33). Patients were divided into two groups, Group F and Group S, based on failure and success of weaning, respectively. The LUS aeration score was applied to both groups to assess its ability to predict weaning failure (the scoring is sketched below). Results: Lung aeration scores were significantly higher in patients in Group F compared to Group S (p < 0.01). The sensitivity and specificity for failed extubation for a LUS score of >18 were 88.89 % and 86.96 %, respectively. Conclusions: The lung aeration score using lung ultrasound can predict weaning failure in critically ill patients. Our study showed that a LUS score of >18 has good predictive value. However, larger randomised controlled trials are required to validate this pilot study.
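A minimal sketch of the 12-region lung ultrasound aeration score from the weaning abstract above: each region is scored 0 (normal) to 3 (consolidation) by the worst pattern seen, and in that study a total above 18 predicted weaning failure. The example patterns are hypothetical.

```python
# 12-region LUS aeration score, summing per-region pattern points.
PATTERN_POINTS = {"N": 0, "B1": 1, "B2": 2, "C": 3}

def lus_aeration_score(region_patterns):
    """region_patterns: worst pattern per region, 12 entries, e.g. ['N', 'B1', ...]."""
    assert len(region_patterns) == 12, "the score is defined over 12 lung regions"
    return sum(PATTERN_POINTS[p] for p in region_patterns)

patterns = ["N", "B1", "B2", "C", "B2", "B2", "B1", "C", "B2", "B2", "B1", "N"]
score = lus_aeration_score(patterns)  # 19 for this hypothetical patient
print(score, "-> predicted weaning failure" if score > 18 else "-> predicted success")
```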
Introduction: Clinicians are routinely faced with the challenges of managing ventilatory care and the weaning process during the clinical course of illness. Almost 40 % of mechanical ventilation time is spent on weaning [1], and in the last decade automatic ventilator control and ventilation modes have been designed to help weaning from mechanical ventilation. In many studies, automated weaning significantly decreased weaning time in critically ill patients when compared with a physician-controlled weaning process [2, 3]. We conducted a prospective randomized trial comparing automated weaning with a standardized weaning protocol in a multidisciplinary ICU. Methods: From April 2014 to March 2015, we enrolled critically ill adults requiring more than 48 hours of mechanical ventilation. Patients were randomized at admission according to the availability of SmartCare-equipped ventilators (8 Evita XL in our unit: 4 with SmartCare and 4 without). We included patients who tolerated at least 24 hours of pressure support with PEEP equal to or less than 5 cmH2O and were not yet ready to undergo a spontaneous breathing trial; they were assigned to be weaned either by SmartCare (SC group) or by physician-controlled weaning according to local guidelines (PCW group). Weaning duration, total duration of mechanical ventilation and the success of weaning were the primary endpoints. Results: Forty-four patients were enrolled, 22 in each group. There was no difference between the two groups concerning age, sex, neurological or respiratory history, diagnosis or severity at study entry. Weaning duration and total mechanical ventilation time were similar between the two groups: respectively, 6 ± 7 days in the PCW group vs 5 ± 4 days in the SC group (p = 0.63), and 11 [3-37] days vs 10 [3-23] days in the SC group. In the PCW group we noted only one extubation failure, and there were no reintubations in the SC group. Conclusions: Although automated weaning using SmartCare did not show a significant benefit concerning duration of mechanical ventilation or duration of weaning when compared to physician-controlled weaning according to local protocols, it certainly offers a gain of time for the medical staff in an ICU that has neither respiratory physiotherapists nor critical care specialty nurses. Introduction: Mechanical ventilation (MV) is a vital support therapy used in a significant proportion of critically ill patients. The right time to successfully discontinue this therapy is a challenge for the intensivist. Echocardiographic evaluation of diastolic dysfunction, the diaphragm and the lung has become an invaluable tool for weaning-from-MV protocols, especially in patients with difficult or prolonged weaning from MV. We propose a mathematical model of an ultrasound protocol for weaning from MV that integrates the three modalities. Methods: Based on the current literature, we developed a score justified by a mathematical model based on inequalities. Results: When the risk of failure rises, the weaning process should be suspended; when the risk of failure is low, the weaning process should be continued. Conclusions: The use of mathematical models for decision making is of great importance; it sets an objective parameter within the existing evaluations. We propose the use of inequalities to set solution intervals for the three points of care for ultrasound-guided weaning from MV.
With this, the proposed inequalities generate an area of certainty within the proposed values and the solution intervals. Introduction: Our aim was to assess diaphragm ultrasonography as a tool to predict the outcome of weaning. Methods: We enrolled 5 intubated patients who were admitted to our Intensive Care Unit at the Saint Antony Hospital of Padua; a Nutrivent nasogastric tube was placed in each to measure diaphragm contractility as the transdiaphragmatic pressure (ΔPdi), with the following formula: ΔPdi = gastric pressure - esophageal pressure. We performed diaphragmatic sonography with the M-mode technique. We calculated the diaphragm contraction speed as the slope (Scdi) of the curve traced by the diaphragm contraction during the inspiratory phase of the spontaneous breathing trials. The correlation between the mean ΔPdi and the mean Scdi was evaluated using Pearson's method together with a linear regression analysis. Results: We found a significant correlation between ΔPdi and Scdi, with a Pearson coefficient ρ = -0.851 and a coefficient of determination R² = 0.718, as shown in Fig. 60. Conclusions: The Scdi calculated during trials of spontaneous breathing represents a bedside, standardized and reproducible tool to predict the outcome of weaning. Methods of titrating PEEP to safely recruit the lungs have included achieving adequate SpO2, attaining best compliance, or increasing PEEP while monitoring esophageal manometry to establish a positive end-expiratory transpulmonary pressure (Ptp). Bedside ultrasonography is available and useful as a diagnostic tool in the care of critically ill patients. Preliminary work identified marked differences in excursions between the dorsal and ventral diaphragm during lung-protective tidal ventilation in ARDS patients. We hypothesize that lung recruitment in the supine patient may be detected non-invasively by increased dorsal diaphragmatic excursions, indicating that dependent lung regions may be recruiting with adequate PEEP. Methods: In this proof-of-concept study, we enrolled 7 ARDS patients treated with invasive mechanical ventilation and neuromuscular blockade in the supine position. We measured Ptp using esophageal manometry, and ventral/dorsal diaphragm excursions using anatomic M-mode ultrasonography, as the applied PEEP was changed (set by the treating physician) as follows: +3, +6, -3, and -6 cmH2O. A standard lung history was established by recruitment maneuvers between each PEEP change, and the sequence of PEEP changes was randomized for each subject. We used linear mixed-effects models to evaluate the association between applied PEEP and both Ptp and dorsal diaphragmatic excursion (DDE). Results: Acceptable ultrasound images of diaphragm excursion were obtained in 6 of 7 patients; one patient had a large hepatic cyst precluding diaphragm ultrasound, and data from this subject were excluded from the analysis. At enrollment, the mean (±SD) P/F ratio was 156 ± 61 mmHg, mean baseline PEEP was 12.7 ± 2.2 cmH2O, and mean SpO2 was 94 ± 2.2 %. Ventral diaphragmatic excursions were unchanged by PEEP. Increasing PEEP resulted in corresponding increases in Ptp and DDE (p = 0.0006, p = 0.005). The transition from a negative to a positive Ptp with increasing PEEP occurred as DDE was observed to increase markedly. Conclusions: This exploratory study suggests an association between Ptp and DDE as dorsal lung is recruited by increasing PEEP.
If validated in a larger sample, ultrasound evaluation of dorsal diaphragmatic movement may be a non-invasive method for estimating lung recruitment in ARDS. Introduction: Pulse oximetry (SpO2) is intensively used as a surrogate for arterial oxygenation (SaO2); however, little is known about its accuracy in daily ICU practice. Disturbed perfusion may lead to a poor signal, as indicated by a lower perfusion index (PFI) value, and so compromise its accuracy. We studied SpO2 accuracy in relation to the PFI. Methods: Over a 5-month period, 281 patients were retrospectively studied, yielding 1281 concomitantly measured values of SpO2 (Philips M1191BL/M1194A), SaO2 and PFI. Inotrope use, pH, MAP, mechanical ventilation and body temperature were studied as independent variables. PFI values were categorized as low or poor (PFI < 1.0), intermediate or moderate (1.0 < PFI < 2.5), and reliable or high (PFI > 2.5). Data collection: all SaO2, SpO2 and PFI measurements during the first 3 days after admission were collected unless the patient was invasively ventilated, in which case all measurements were collected until extubation. Statistical analysis: the linear correlation of the SpO2 and SaO2 values was assessed by Pearson's method, and the mean SpO2-SaO2 difference (Δ) was assessed using the Bland and Altman method (see the sketch below). Results: Statistical analysis showed an overall correlation (r = 0.685; p < 0.01) between SaO2 and SpO2. The Bland and Altman analysis revealed limits of agreement around the general mean (±2 SD) giving a Δ of ±6 % (Fig. 61). SpO2 overestimated the SaO2 value in 48.2 % of all cases and underestimated SaO2 in 31.4 %. Over a wide range of PFI values (median 1.4, range 0.1-19.2) we found moderate agreement between SpO2 and SaO2, with slightly better agreement for higher PFI values (Fig. 62). Even with a PFI > 2.5, 15.9 % of all measurements showed a difference between SpO2 and SaO2 of more than ±2 %. The other independent variables showed only very weak associations (r < 0.2) with the Δ SpO2-SaO2, except for mechanical ventilation (ANOVA p = 0.002). Conclusions: We found a clinically relevant lack of agreement between SpO2 and SaO2 measurements. In contrast to general expectations, SpO2 exceeded SaO2 in 48.2 % of all measurements. The PFI is of little help, as differences may be large even with a high (good) PFI value: 15.9 % of concurrent measurements of SpO2 and SaO2 differed by more than ±2 %. These findings may influence daily practice on how to adjust mechanical ventilation based on SpO2 measurements. (Fig. 63a). Correlation was poor between the capnograph's RR and EtCO2 measurements: normal EtCO2 coincided with adequate RR just 24.9 % of the time, and none of the low RR measurements were indicative of a high EtCO2 (Fig. 63b). When using RR as a proxy for MV, it was also noted that low MV is observed at a wide range of RRs, with only 15.5 % of all low MV events captured by a low EtCO2-based RR (Fig. 63c). Conclusions: This study confirmed that EtCO2 is an inadequate proxy for MV in non-intubated patients. EtCO2-based RR is a poor proxy for EtCO2 and an even worse proxy for MV. Introduction: Capnography (EtCO2) is unreliable for monitoring respiratory status in non-intubated patients.
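A minimal sketch of the Bland-Altman agreement analysis used in the SpO2/SaO2 abstract above: bias is the mean paired difference, and the limits of agreement are bias ± 2 SD (the abstract's roughly ±6 %). The paired values here are synthetic.

```python
# Bland-Altman bias and limits of agreement for paired SpO2/SaO2 values.
import numpy as np

rng = np.random.default_rng(3)
sao2 = rng.uniform(85, 100, 1281)
spo2 = sao2 + rng.normal(0.5, 3.0, 1281)   # hypothetical oximeter error

diff = spo2 - sao2
bias = diff.mean()
loa_low, loa_high = bias - 2 * diff.std(), bias + 2 * diff.std()
print(f"bias = {bias:.2f} %, limits of agreement = [{loa_low:.2f}, {loa_high:.2f}] %")
```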
We used a noninvasive respiratory volume monitor (RVM), which provides accurate measurements of minute ventilation (MV), tidal volume and respiratory rate (RR) in non-intubated patients, to compare the relationship between EtCO2 and MV in intubated patients under general anesthesia (GA), non-intubated patients under spinal anesthesia (SA) and awake volunteers. Methods: RVM data (ExSpiron, Respiratory Motion Inc., Waltham, MA) were collected from 153 patients in 3 groups: Group 1, 54 patients, age 65.2 ± 12.1 yrs, BMI 31.2 ± 6.3 kg/m2, surgery with GA; Group 2, 60 patients, 68.9 ± 9.0 yrs, 30.5 ± 5.8 kg/m2, surgery with SA; Group 3, 39 volunteers, 48.5 ± 13.8 yrs, 27.9 ± 8.6 kg/m2, coached to breathe at varied RRs. EtCO2 data for Groups 1 and 2 were collected via the ventilator (Draeger Apollo), with an ET tube in Group 1 and a sampling nasal cannula in Group 2. Group 3 EtCO2 data were collected via capnograph (Capnostream20, Covidien) with nasal cannula. Deming regression quantified the relationship between EtCO2 and MV. EtCO2 sensitivity and mean EtCO2 were compared across cohorts by unpaired t-tests. Results: EtCO2 sensitivity was significantly higher in intubated patients vs volunteers (-83.5° ± 9.7° vs -30.1° ± 16.1°, p < 0.001, Fig. 64a). In non-intubated patients, the sensitivity was not normally distributed and spanned the range of intubated patients and volunteers. Measured EtCO2 values were higher in Group 1 than in Groups 2 and 3 (37.2 ± 4.4 mmHg vs 23.0 ± 5.2 mmHg vs 31.0 ± 4.0 mmHg, respectively, p < 0.001, Fig. 64b). EtCO2 measurements were normally distributed in all groups. Conclusions: Our results show that EtCO2 lacks adequate sensitivity to changes in MV and introduces measurement bias with the nasal cannula. Respiratory volume monitoring provides early warning of respiratory depression and can be used to reduce false alarms in non-intubated patients. Methods: RVM and SpO2 data were collected from 240 patients (age 66.8 ± 10.3 yrs, BMI 29.6 ± 5.7 kg/m2) at 1-min intervals. "Predicted" MV (MVPRED) was calculated for each patient based on body surface area. "Low SpO2" (alarm condition) was an SpO2 < 90 %; a "desaturation event" was "Low SpO2" for ≥2 min; and a "false alarm" was "Low SpO2" for <2 min. "Un-safe MV" was defined as MV < 40 % MVPRED. MFANOVA was used to evaluate differences in the clinical populations, and unpaired one-sided t-tests were used to compare measurements across groups. Results: Of 80 alarm conditions, 62 (78 %) were false alarms. The other 18 events (≥2 min) occurred in 15 patients (Fig. 65a). The RVM showed that 11/18 desaturation events were due to patient motion with high MV, and only 7 events were "true desaturation" events (Fig. 65b). Each true desaturation event was preceded by "un-safe MV" by 16.7 ± 4.6 min (mean ± SEM), and the severity of "un-safe MV" was strongly correlated with the time delay (r = 0.77, p < 0.05). While MFANOVA found no difference in the demographics of the populations with true desaturation events vs. false alarms (p > 0.2 for height, weight, age, BMI, sex), the length of stay in the PACU for the true desaturation group was significantly longer (176 ± 9 min vs. 134 ± 18 min, p < 0.05). Conclusions: This study showed that >90 % of SpO2 alarms in the PACU were likely false. MV monitoring gives advance warning of respiratory depression with fewer false alarms. Introduction: Edi-derived indices provide reliable predictors of successful separation from the ventilator only during the SBT [1-3]. Their performance is not better than that of the RSBI [3].
We previously investigated, in a physiological study, the relationship between the Edi peak and the Edi area under the curve (P/I Index); we concluded that the P/I Index can give important information on the balance between respiratory drive and respiratory demand, and may predict weanability at the clinical support level [4]. The aims of the study were: 1) to validate the P/I Index as a weaning predictor at clinical ventilatory support, and 2) to compare the sensitivity of the P/I Index during NAVA in predicting SBT success with respect to: Edi peak, neuro-ventilatory efficiency (NVE) and RSBI. Methods: We prospectively enrolled 54 patients ventilated with NAVA meeting criteria for SBT. For each patient, we identified 2 steps: 1) enrollment, T0; 2) end of SBT. We recorded the RSBI, Edi peak, and two Edi-derived weaning indices: NVE and P/I Index. SBT failure and success were defined according to the clinical protocol in use. Results: 34 patients completed the SBT and were successfully extubated; 11 patients failed the SBT; 9 patients failed the first weaning step before the trial (subgroup A). At the end of the SBT, the differences between the S and F groups were statistically significant for RSBI and P/I. Conclusions: The P/I Index predicts the ability to sustain an SBT under ventilator assistance and during the SBT. Subgroup A may represent a group of patients not adequately assisted, where the classic indices and NVE are not able to predict the inability to start the SBT. Further investigations are needed to understand whether the P/I Index may provide information on the adequacy of assistance during NAVA ventilation. Introduction: Opioid use decreases respiratory drive and can cause opioid-induced respiratory depression (OIRD), which is challenging to recognize in non-intubated patients from respiratory rate (RR) alone. Using a non-invasive respiratory volume monitor (RVM) that measures minute ventilation (MV), tidal volume (TV) and RR, we examined whether RR alone can identify LowMV episodes (used to define OIRD) in the post-anesthesia care unit (PACU). Methods: After written informed consent, MV, TV and RR measurements were collected via RVM (ExSpiron, Respiratory Motion, Waltham, MA) from 358 patients (age: 67.1, 19-91 yrs; BMI: 29.8, 19-49 kg/m2) undergoing elective joint replacement surgery. 205 patients received PCA opioids in the PACU: 96 received hydromorphone (0.2 mg) and 109 received morphine (1 mg per dose). RVM measurements were calculated from 30-second segments in the 30 minutes before and after the first dose. Predicted MV (MVPRED) was calculated from each patient's BSA. LowMV was defined as MV < 40 % MVPRED for >1 min. LowRR was defined as RR < 6 breaths/min. Sensitivity and specificity analysis correlated LowRR with LowMV (see the sketch below). Results: For patients receiving opioids, LowMV occurred 13.7 % of the time before and 17.8 % of the time after a dose. Importantly, LowRR coincided with less than 12.5 % of all LowMV episodes. On average, patients experienced 1.5 ± 0.2 LowMV episodes before opioid dosing (range: 0-9 episodes; duration: 2.17 ± 0.13 min) and 1.8 ± 0.2 LowMV episodes (0-9 episodes; 2.40 ± 0.12 min) after opioid dosing (p > 0.5). In both groups, a low-RR alarm set to below 6 breaths/min would miss 88.3 % of LowMV episodes, yielding a sensitivity of 12.4 %, specificity 98.5 %, positive predictive value 60.5 %, and negative predictive value 85.8 %. Figure 67 shows a plot of paired MV and RR measurements. Conclusions: LowRR measurements alone provide an insufficient assessment of respiratory status. Our data suggest that LowMV is primarily related to a decrease in TV and not RR.
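A minimal sketch of the sensitivity/specificity arithmetic behind the OIRD abstract above, treating LowRR as the test and LowMV as the reference condition. The 2x2 counts are hypothetical, chosen only to approximately reproduce the reported pattern (high specificity, very low sensitivity).

```python
# Confusion-matrix metrics for LowRR as a detector of LowMV episodes.
tp, fn = 25, 177     # LowMV episodes with / without a concurrent LowRR alarm
fp, tn = 16, 1050    # LowRR alarms without LowMV / quiet true negatives

sensitivity = tp / (tp + fn)   # ~12.4 %: most LowMV episodes are missed
specificity = tn / (tn + fp)   # ~98.5 %: alarms are rare when MV is adequate
ppv = tp / (tp + fp)
npv = tn / (tn + fn)
print(f"sens {sensitivity:.1%}, spec {specificity:.1%}, PPV {ppv:.1%}, NPV {npv:.1%}")
```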
Introduction: Although regional compliance (C) and resistance (R) can vary considerably between different lung regions, traditional pulmonary function tests can measure only global values. To overcome this drawback we recently proposed a novel EIT-based method to estimate regional expiratory time constants (τ = tau). In this abstract, regional values are compared with global ones. Methods: In ten intubated patients with hypoxemic and hypercapnic lung failure, time constants (τ = R·C) were estimated from global and regional volume EIT signals obtained with the Swisstom BB2 (Swisstom, Switzerland). A first-order exponential decay model was fitted to each regional and global impedance curve using MATLAB (The MathWorks, USA). The spread and mean of regional τ values are depicted in Fig. 68 and compared with the single τ value derived from the global signal. Results: ARDS lungs exhaled "faster" whereas COPD lungs were "slower", the latter reflecting the predominance of airway obstruction. ARDS patients showed only minor regional differences, and global τ corresponded perfectly with the mean τ derived from the different lung regions. In contrast, COPD patients showed a large spread of their regional τ values. Nonetheless, the mean of the regional τ values corresponded well with global τ. Conclusions: The mean expiratory time constant derived from regional EIT signals was similar to the value calculated from the global EIT signal in both patients with short (ARDS) and long (COPD) expiration time requirements. As a sign of heterogeneous disease, the latter showed a wide spread of regional τ values. Introduction: Passive exhalation shows a characteristic exponential course with asymptotic approximation of the end-expiratory lung volume. It can be described as V(t) = Vinsp · e^(-t/(R · C)), where V is the lung volume at time t, Vinsp the volume at end inspiration, R the resistance and C the compliance of the lungs. The product R · C is called τ (tau). Although R and C can vary considerably between different lung regions, traditional pulmonary function tests provide only global values. Therefore, we recently proposed a novel method to obtain regional τ values using electrical impedance tomography (EIT). As this approach can be challenging in the presence of small ventilation amplitudes, noisy signals and heterogeneity, we determined its robustness in this study. Methods: In 8 intubated patients with hypoxemic or hypercapnic respiratory failure, we measured EIT signals at 50 images per second using the Swisstom BB2, recording 30 consecutive breaths during steady-state conditions. Regional breath-by-breath τ values (Fig. 69) were estimated by fitting an exponential model to each regional curve. Only τ values within the range of 0.05 to 3 s stemming from appropriately fitted curves (R2 > 0.6) were considered, thereby excluding poorly ventilated and poorly fitted lung areas. To estimate the robustness of τ, we calculated the regional coefficient of variation (CV) over all breaths as standard deviation divided by mean. Results: Mean and median CV values of each patient were below 5 %; however, some pixels showed values of up to 25 %, mainly at the lung boundaries or close to poorly ventilated areas. Conclusions: The suggested approach for calculating regional expiratory τ values provided robust results with low coefficients of variation. However, certain lung regions at the lungs' boundary showed high variability. Therefore, to increase reliability, such pixels should be excluded from future calculations; the fitting and screening steps are sketched below.
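The exponential fitting step described in the two EIT abstracts above can be made concrete with a small sketch: fit V(t) = Vinsp · e^(-t/τ) to one expiratory curve, keep the value only if the fit and the time constant are plausible (R2 > 0.6, 0.05 s < τ < 3 s, as stated above), and summarize breath-by-breath robustness as CV = SD/mean. This is a sketch under those stated assumptions, not the authors' MATLAB implementation; scipy's curve_fit stands in for the fitting routine.

```python
# Sketch of regional expiratory time-constant estimation from an EIT curve.
# Function names and the synthetic data are assumptions for illustration.
import numpy as np
from scipy.optimize import curve_fit

def expiration(t, v_insp, tau):
    """First-order exponential decay model V(t) = Vinsp * exp(-t / tau)."""
    return v_insp * np.exp(-t / tau)

def fit_tau(t, v, tau_range=(0.05, 3.0), r2_min=0.6):
    """Fit one expiratory curve; return tau or None if the fit is poor."""
    popt, _ = curve_fit(expiration, t, v, p0=(v[0], 0.5), maxfev=5000)
    r2 = 1.0 - np.sum((v - expiration(t, *popt))**2) / np.sum((v - v.mean())**2)
    tau = popt[1]
    return tau if (r2 > r2_min and tau_range[0] < tau < tau_range[1]) else None

def coeff_var(taus):
    """Breath-by-breath robustness: CV = SD / mean of accepted tau values."""
    taus = np.asarray([x for x in taus if x is not None])
    return taus.std() / taus.mean()

# Toy usage: one synthetic expiration sampled at 50 Hz (the EIT frame rate),
# true tau = 0.8 s plus a little noise.
t = np.arange(0.0, 2.0, 0.02)
v = 0.4 * np.exp(-t / 0.8) + np.random.normal(0.0, 0.005, t.size)
print(fit_tau(t, v))
```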
Introduction: Obstructive airway diseases such as asthma or COPD are increasing in numbers and frequently lead to severe exacerbations with the need for non-invasive or invasive mechanical ventilation. Severe COPD in particular is often characterized by inhomogeneous ventilation. In case of mechanical ventilation in obstructive pulmonary diseases or ARDS, modern ICU ventilators provide accurate measurements of pressure-volume curves on a global, but not on a regional, level. Electrical impedance tomography (EIT), in contrast, is an established non-invasive tool to measure regional ventilation. The aim of our study was to calculate an EIT-based (32x32 pixels, 50 frames per second) expiratory time constant τ (tau) for each image pixel and a global one for the entire lung, and to validate it against a global volume signal measured at the airway opening. Methods: In 12 intubated patients with hypoxemic and/or hypercapnic respiratory failure we performed EIT and pneumatic volume measurements (Swisstom BB2, Switzerland) at different PEEP levels, recording 100 consecutive breaths during steady-state conditions with breathing frequencies between 14 and 16/min. Regional breath-by-breath τ values were obtained by fitting an exponential function in a nonlinear least-squares manner to the EIT curve of each pixel. The same algorithm was applied to the pneumatic volume curve and correlated with the mean τ of all regional EIT measurements. Results: The mean τ calculated from the regional EIT signals showed a strong correlation with the global τ calculated from the pneumatic volume signal. In 2773 breaths, Pearson r was 0.8752 (0.8662 to 0.8836) and r2 0.7660. Patients with severe COPD typically showed longer time constants than those with ARDS. The mean τ calculated from the global EIT signal also showed a strong correlation with the global τ calculated from the pneumatic volume signal (Pearson r: 0.8978). Conclusions: The mean τ calculated from the regional EIT signals shows a very strong correlation with the global pneumatic volume signal, establishing EIT as a valid method to calculate global and regional expiratory time constants. EIT therefore provides for the first time a regional breath-by-breath measurement of airflow obstruction and may help to guide mechanical ventilation, especially in obstructive pulmonary diseases. Introduction: We have shown in pigs and ARDS patients that lung elastance can be determined without esophageal pressure as the change in PEEP divided by the change in end-expiratory lung volume, deltaPEEP/deltaEELV [1, 2]. The hypothesis is that this is an effect of contra-directional forces of the expanding rib cage and recoil of the lung, which create a negative pleural pressure that persists also at an increased PEEP/EELV. Esophageal pressure, used as a surrogate for pleural pressure (PPL), is unreliable for determining absolute end-expiratory esophageal pressure because of extra-pleural effects on measurements. The aim of the present study was to evaluate the effect of an expanding rib cage on absolute pleural pressure changes during PEEP steps in a respiratory system model. Methods: A model with a recoiling elastic lung and an expansive chest wall complex was built (Fig. 70). Lung elastance could be varied between 46, 24, and 17 cmH2O/L. In volume control, PEEP was increased from 0 to 4, 8, and 12 cmH2O, while airway and pleural pressures were measured and deltaEELV was obtained by spirometry; a worked example of the elastance calculation follows below.
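Before the results, a worked example of the elastance estimate that the model is built to test, EL = deltaPEEP/deltaEELV; the numbers below are illustrative assumptions, not measurements from the study.

```python
# Worked example of lung elastance from a PEEP step, EL = deltaPEEP / deltaEELV.
# The values below are invented for illustration, not study data.
delta_peep = 4.0      # cmH2O, PEEP step (e.g. 0 -> 4 cmH2O)
delta_eelv = 0.17     # L, change in end-expiratory lung volume by spirometry

el = delta_peep / delta_eelv   # lung elastance in cmH2O/L
cl = delta_eelv / delta_peep   # lung compliance in L/cmH2O
print(f"EL = {el:.1f} cmH2O/L, CL = {cl * 1000:.0f} mL/cmH2O")
# -> EL = 23.5 cmH2O/L, close to the 24 cmH2O/L setting available in the model.
```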
Results: When PEEP was increased, end-expiratory pleural pressure increased during the first expiration but then declined back to baseline, while end-expiratory transpulmonary pressure (PTPEE) increased until it equaled the change in end-expiratory airway pressure (deltaPEEP = deltaPTPEE) (Fig. 71). There was a good correlation between transpulmonary pressure at all lung volume levels derived from PAW-PPL (pleural pressure) and transpulmonary pressure derived from lung elastance calculated as deltaPEEP/deltaEELV, r2 = 0.99, y = 1.01x. Conclusions: As end-expiratory pleural pressure remains unchanged when increasing PEEP, end-expiratory transpulmonary pressure changes as much as PEEP, and lung compliance can be determined as deltaEELV/deltaPEEP. (transudate or exudate). The presence of pleural or abdominal effusion is a frequent finding in patients in the ICU and the Internal Medicine Department. It is possible to distinguish between transudate and exudate by Light's criteria [1, 2]. Pursuing an acid-base assessment of the fluid, we have noticed an increase in the value of lactate, beyond the blood range, in the cases that were diagnosed as exudate by the calculation of Light's criteria [4]. The advantages of this test are that it is simple, executable by the physician, and its result is available within a few minutes. Methods: We collected data from 30 patients who had a clinical indication for thoracentesis or paracentesis. For each patient we performed arterial blood gas analysis with lactate, total protein and serum LDH dosage; acid-base assessment of the fluid with lactate, total protein and LDH dosage; and cytology and bacterial culture on the fluid. For every patient we calculated the "liquid/serum" lactate ratio in order to measure any increase present in the pleural or abdominal effusion. The diagnosis of the nature of the liquid effusion (exudate or transudate) was made by Light's criteria. A statistical analysis was carried out by performing a ROC curve in order to find the best cut-off for the liquid/serum lactate ratio predicting the presence of an exudate. We performed a ROC curve to predict the presence of exudate by the liquid/serum lactate ratio and obtained an AuROC of 0.69 with a best cut-off value of 2.02 (sensitivity 0.73, specificity 0.73). When we performed the same analysis only on pleural effusion patients we obtained an AuROC of 0.78 with a best cut-off value of 2.13 (sensitivity 0.7, specificity 1.0). Conclusions: The liquid/serum lactate ratio seems to be a promising tool to predict the presence of an exudate, in particular in pleural effusions. Further studies are needed to confirm these findings; a sketch of the ROC analysis follows below. Results: 8048 records were reviewed and 79 met the inclusion criteria. A control group of patients without a diagnosis of pulmonary fibrosis was used for comparison (7954). Results showed 54 % of patients with pulmonary fibrosis died before discharge from the unit compared to 18.8 % in the control group, p = 0.0001 (Table 34). Similarly, 63.3 % of patients with pulmonary fibrosis died before hospital discharge compared to 26.3 % of the control group, p = 0.0001 (Table 35). Conclusions: The results from this study are consistent with previous findings. The risk of death prior to discharge from our intensive care unit is almost 30 % higher for patients with pulmonary fibrosis. From the data presented above, it may be prudent to consider that patients with pulmonary fibrosis presenting with no reversible causes of acute respiratory failure may be less likely to benefit from admission to the ICU.
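The ROC analysis described in the effusion abstract above can be sketched as follows. The arrays are placeholders, not the study's 30 patients, and Youden's J is one common cut-off rule; the abstract does not state which rule was actually used.

```python
# Sketch: liquid/serum lactate ratio as a predictor of exudate, with Light's
# criteria as the reference standard. Data arrays are invented placeholders.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

ratio = np.array([1.1, 2.5, 0.9, 3.0, 2.2, 1.3])   # liquid/serum lactate ratio
exudate = np.array([0, 1, 0, 1, 1, 0])             # 1 = exudate by Light

auc = roc_auc_score(exudate, ratio)
fpr, tpr, thresholds = roc_curve(exudate, ratio)

# Youden's J (tpr - fpr) as one common way to pick the "best" cut-off:
best = np.argmax(tpr - fpr)
print(f"AUC = {auc:.2f}, cut-off = {thresholds[best]:.2f}, "
      f"sensitivity = {tpr[best]:.2f}, specificity = {1 - fpr[best]:.2f}")
```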
Introduction: For patients with ARDS supported on ECMO, the optimal sedation and analgesia strategy is unknown. Our objective was to characterize pharmacological practice in a high-volume ECMO center and describe sedation depth, incidence of delirium and mobilization in this population. Methods: We conducted a retrospective study to describe medication (sedative, analgesic, paralytic, antipsychotic) administration in 60 patients treated with VV-ECMO for ARDS at Toronto General Hospital. We recorded Sedation-Agitation Scale (SAS) scores, medications administered, routes and doses, CAM-ICU scores, and mobilization while on ECMO. Results (interim): 23 adults (19 males, 48 ± 11 years, APACHE II 31 ± 5) with severe ARDS (P/F 72 ± 19), mainly from pulmonary sepsis (91 %), were treated with VV-ECMO from Jan 2012-July 2014. Median duration of VV-ECMO was 9 days (IQR 5-11). At 48 hr post ECMO initiation all patients were deeply sedated (SAS ≤ 3), with 96 % and 100 % receiving a sedative (midazolam 87 %, propofol 26 %) and opioid (fentanyl 91 %) infusion, respectively. Within 24 hr prior to ECMO discontinuation, among patients who survived ECMO (78 %), 44 % were still deeply sedated. At this time, 89 % were receiving sedatives (67 % infusion vs 22 % intermittent) and 83 % were receiving opioids (72 % infusion vs 11 % intermittent). At 48 hr post ECMO discontinuation, 83 % of patients reached light levels of sedation (SAS > 3). At this time point only 61 % were still receiving sedatives (33 % infusion vs 29 % intermittent) and 50 % were receiving opioids (33 % infusion vs 17 % intermittent); 22 % were not receiving any sedative or opioid. While on ECMO, 57 % of patients had cisatracurium infusions for a median duration of 16 hrs (IQR 10-76). 88 % had ≥1 positive CAM-ICU score for delirium. Haloperidol and quetiapine were the primary antipsychotics used throughout ECMO (30 % and 57 %, respectively). 57 % of patients underwent physical therapy while on ECMO. Of the patients referred for vv-ECMO, a proportion were accepted into the service; the remaining 50 (11 %) were transferred to another national centre due to the lack of ECMO beds at the studied centre. The mean age and duration of mechanical ventilation before referral were significantly higher in the futile group (56.1 years, 5.1 days) compared to those who were accepted (42.9 years, 3.1 days) or 'too well' (46.5 years, 3.3 days) (p < 0.05). The futile group had higher rates of multi-organ failure and severe immunosuppression. Gender, body mass index, lung injury scores and degree of hypoxaemia were similar in all 3 groups. Interestingly, 60 % of patients who were too well had higher lung injury scores (Murray score ≥3) compared to 45 % and 41 % in those who were futile and accepted, respectively. The survival of the 'futile' group was poor, whilst those who were too well and accepted had similar outcomes. At 180 days, 44 % of those who were too well had died, whilst 15 % of those who were futile survived. We present the first report on the outcomes of patients who were referred but not accepted for vv-ECMO in the context of a national centralised referral system. The identical survival at 30, 60 and 180 days between the accepted patients and the ones deemed too well warrants further investigation. The method of [2] was used to develop a risk prediction model, with cross-validation applied to select the optimal tuning parameter. Model discrimination was assessed using the area under the receiver operating curve (AUC) and calibration was assessed using several common indices; a sketch of this type of pipeline follows below. Results: 141 patients were included.
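To make the modelling approach just described concrete, here is a sketch assuming a penalized (LASSO-type) logistic regression whose tuning parameter is chosen by cross-validation, with a bootstrap CI for the AUC. The abstract does not name the exact method of reference [2], so the estimator choice and the toy data are assumptions.

```python
# Sketch: penalized logistic regression with cross-validated tuning parameter
# and a bootstrap 95 % CI for the AUC. All data below are invented.
import numpy as np
from sklearn.linear_model import LogisticRegressionCV
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
X = rng.normal(size=(141, 8))          # 8 pre-ECMO parameters (toy data)
y = rng.integers(0, 2, size=141)       # 48-h mortality (toy labels)

model = LogisticRegressionCV(Cs=10, cv=5, penalty="l1", solver="liblinear",
                             scoring="roc_auc").fit(X, y)

# Bootstrap 95 % CI for the apparent AUC:
aucs = []
for _ in range(1000):
    idx = rng.integers(0, len(y), len(y))
    if len(np.unique(y[idx])) < 2:
        continue                       # skip degenerate resamples
    aucs.append(roc_auc_score(y[idx], model.predict_proba(X[idx])[:, 1]))
print(np.percentile(aucs, [2.5, 97.5]))
```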
48-h mortality was 16 % (22 patients), with an in-hospital mortality of 43 %. The model demonstrated good discriminatory power, AUC = 0.79 (bootstrap 95 % CI = 0.68 to 0.88), and good calibration. Arterial pH showed the strongest association with mortality in our model (Fig. 72). Conclusions: In a small retrospective cohort study, an 8-parameter score of pre-ECMO variables predicted mortality with an AUC of 0.789. Whilst validation and optimisation in a larger cohort is required, the use of pre-ECMO variables corresponds with medical decision-making in real time, enhancing clinical utility in the acute setting. Lung function six months post extracorporeal membrane oxygenation (ECMO) for severe acute respiratory failure in adult survivors C. Introduction: Acute lung injury and acute respiratory distress syndrome (ALI/ARDS) is characterized by tight junction (TJ) barrier disruption and changes in the composition and integrity of claudins (CLDN), which are components of TJs. CLDN5 is highly expressed in lung parenchyma and augments permeability, although its regulation by inflammatory stimuli remains elusive. This study aims to investigate CLDN5 expression and its regulation by TNF in LPS-induced ALI/ARDS in mice. Methods: Lung injury was induced by intratracheal LPS administration in adult male C57BL/6 and TNF-/- mice. Mice underwent bronchoalveolar lavage (BAL) and were sacrificed at 6 h and 24 h after LPS administration. The histological score of ALI was assessed in lung tissues. CLDN5 cellular localization was evaluated by immunohistochemistry, while its expression levels were verified by Western blotting. Results: In mouse lung tissue, following LPS administration, CLDN5 immunoreactivity was distributed along the alveolar epithelium and vascular endothelium. CLDN5 protein expression levels were increased 3-fold along with ALI characteristics (augmented BALF cellularity, protein content and ALI score). In TNF-/- mice, CLDN5 expression was significantly reduced as early as 6 h after LPS administration, with a consequent increase in BALF protein content. In addition, increased CLDN5 expression was observed 24 h after LPS administration. The biphasic action of TNF on CLDN5 expression was dependent on ERK1/2 expression levels. Conclusions: In LPS-induced ALI/ARDS in mice, CLDN5 was increased in association with ALI severity. Moreover, TNF regulates CLDN5 expression and TJ permeability, dependent in part on the ERK1/2 pathway. Our findings highlight the role of claudins, and especially CLDN5, in elucidating the pathophysiology of this form of injury. Cell counts in endobronchial aspirate to assess airway inflammation in ARDS patients: a pilot study S. Spadaro 1, I. Introduction: Adult respiratory distress syndrome (ARDS) is characterized by a breakdown of capillary integrity, leading to the leakage of protein-rich fluid and the accumulation of inflammatory cells within the airspaces. Bronchoalveolar lavage (BAL) is the commonly used method to obtain lower respiratory tract samples and to define the role of cells in the inflammatory response in ARDS. However, there is a need to assess the feasibility of using relatively non- Introduction: Constipation in critically ill patients is a common problem and has been associated with a number of significant complications, including prolonged mechanical ventilation, increased length of stay, compromise of the intestinal protective barrier and difficulty establishing enteral feeding [1].
Our intensive care unit (ICU) bowel care guideline specifies initiation of treatment for constipation, defined as a patient not opening their bowels for three consecutive days. Methods: We retrospectively audited consecutive ICU patients over a two-month period to establish the incidence of constipation as well as adherence to the bowel care guideline. Patients were excluded if they had stayed on the ICU for less than three days, had undergone recent bowel surgery or were on an alternative bowel care protocol (e.g. encephalopathy, spinal or brain injury patients). Data collected included bowel care prescriptions, rectal examinations, episodes of bowel opening and stool type. Results: Of the 37 patients included, 43 % suffered a period of constipation during their ICU stay. Treatment for patients who had not had their bowels opened for three consecutive days was initiated in only 44 % of patients. Treatment strategies for constipation varied and were partially or fully compliant with guidelines in only 37.5 % of constipated patients. Despite similar APACHE II scores and ages across the two groups, at least one episode of failure to feed via the enteral route occurred in a higher proportion of constipated patients (69 %) than non-constipated patients (35 %). Conclusions: The incidence of constipation on our unit is decreasing; however, it remains a common problem, which seems to impact on establishing enteral feeding. As it can cause potentially serious complications in critical care patients, compliance with the guideline should be improved. We have changed policy to include starting prophylactic laxatives [2], unless contraindicated, in all new admissions, with earlier escalation of treatment. Continued education and promotion of the new guideline is ongoing in our unit. Methods: Adult patients were recruited prior to theatre on their day of surgery. Fasting times were calculated using a verbal questionnaire and noting the anaesthetic start time. Results: All patients (100 %) exceeded the ESA fasting guidelines for solids, with some fasting for greater than 24 hours before an elective procedure. Fasting times for liquids were shorter, but on average were in excess of 2 hours. Conclusions: Patients, irrespective of whether they arrive on the day of surgery or are inpatients, fast for similar times, all longer than the ESA guidelines. Patients and their families, as well as hospital staff, should be aware of current guidelines. Current protocols should be adjusted to improve compliance. Introduction: The ICU is a setting where the cost-benefit ratio is a major concern. Introduction: Although described more than 70 years ago, the refeeding syndrome remains understudied, with the lack of a standardized definition and treatment recommendations. The aim of this systematic review was to provide evidence in regard to a standardized definition, incidence rate and time course of occurrence, association with adverse clinical outcomes, risk factors, and therapeutic strategies to prevent or treat this condition. Methods: We searched MEDLINE and EMBASE for interventional and observational clinical trials focusing on refeeding syndrome, excluding case reports and reviews. We extracted data based on a predefined case report form including bias assessment. Results: Out of 2206 potential abstracts, 44 records with a total of 6269 patients were included in the final analysis, comprising 2 interventional studies and 16 studies of anorexic patients.
Definitions of refeeding syndrome were highly heterogeneous, with most studies relying on electrolyte disturbances only and others also including clinical symptoms. Incidence rates varied between 0 % and 80 % depending on the definition and patient population. Occurrence was mostly within the first 72 hours of the start of therapy. Most of the risk factors are in accordance with the NICE guidelines, but older age and enteral feeding were additionally described. Associations of refeeding syndrome with adverse outcomes remain unclear, as does the effect of preventive measures and treatment algorithms. Conclusions: Although there is consensus in regard to risk factors and the timing of occurrence of refeeding syndrome, there is wide variation in definition, reported incidence rates and management recommendations (preventive measures and treatment). Further research is warranted to fill this gap. After participating in the EPaNIC trial, we changed the protocol for the administration of parenteral nutrition (PN) in our ICU [1]. The administration of PN supplementing enteral nutrition (EN) in case of insufficient caloric intake was postponed from day 3 after ICU admission to day 8, in line with the "late-PN group" of the EPaNIC trial. One of the exceptions is patients with a BMI less than 18, where starting PN depends on the individual decision of the attending intensive care physician. The protocol for the administration of EN and glycaemic control remained unchanged. Intravenous micronutrients are prescribed for all patients from day 3 after ICU admission in case of inadequate caloric intake. Two years after implementing the new protocol, we conducted a survey investigating compliance with the new protocol. We conducted a retrospective analysis of all patients admitted to a 6-bed medical unit of our ICU from 01/04/2013 to 30/11/2013. Of 220 admitted patients, 123 were considered critically ill since they received tight glycaemic control. The other patients were excluded from the analysis since they stayed for less than 24 hrs in the ICU, mostly for monitoring purposes. Only 6 out of 123 patients received PN (4.88 %). In 4 cases PN was started at day 8 after ICU admission in accordance with the new protocol. PN was started earlier in 2 patients suffering from a haematological-oncological disease complicated by neutropenic sepsis and a BMI less than 18, who were already receiving PN on the ward. Since a caloric intake of less than ¾ of the caloric target at day 3 after ICU admission was considered the cut-off for starting PN in the old protocol, 88.62 % of the patients would have met the criteria for receiving PN according to that protocol. There is good compliance with the revised protocol, whereby PN is only started at day 8 after ICU admission in our medical ICU unit, apart from some predefined exceptions. This significantly reduces the number of patients receiving PN compared to the old protocol. Several studies have shown that exclusive enteral nutrition is mostly ineffective in providing adequate energy intake in critically ill patients. For this reason, ESPEN reviewed its guidelines in 2009 in an attempt to improve this practice by recommending that all patients who are receiving less than their targeted enteral feeding after 2 days should be considered for parenteral nutrition. The aim of this study is to evaluate whether Portuguese ICUs changed their nutritional strategy according to ESPEN's guidelines.
Observational prospective study conducted in eleven Portuguese ICUs of nine general hospitals. Patients 18 years of age or older were eligible if they were ventilated and had a length of stay (LOS) in the ICU greater than 7 days. Demographic data, along with the energy intake in the first 7 to 10 days and the type of nutritional support used, were collected from the selected patients. Results: 130 patients were enrolled, 63.8 % were male, median age 64 ± 16 (19-91), median BMI 27.9 ± 5.9 (18.8-49), ICU LOS 15.4 ± 6.1 days, mortality rate 26.9 % (35). 70 % of patients were admitted for medical reasons, 31.5 % had normal weight, and the remaining patients were either overweight or obese. Energy intake in the first 10 days was 12.4 ± 6.6 kcal/kg/day (0-34.1); excluding the first 3 days it rose to 15.6 ± 7.4 kcal/kg/day (0-34.1). Nutritional support was evaluated over a period of 1,223 days, on 80 % of which patients received nutritional support, 66 % by the enteral route and 14 % by the parenteral route. Energy needs in a critically ill patient can vary between 20-25 kcal/kg/day. In our study a large proportion of Portuguese ICU patients received inadequate nutrition. Recent randomized clinical trials support the recommendation that "less nutrition is better". However, it is not uncommon for critically ill patients to be starved, and this is deleterious and should be avoided. Should we provide more protein to critically ill patients? Introduction: Appropriate nutrition delivery in the ICU is a topical issue. Most studies have focused on energy supply by enteral or parenteral nutrition. A few studies have now also focused on protein supply. Studies also agree on the importance of an adequate protein supply, 1.2-2.0 g/kg/day, for outcome. The aim of this study is to evaluate the amount of protein given to critically ill patients in Portuguese ICUs. Methods: Observational prospective study, conducted in eleven Portuguese ICUs of nine general hospitals. Patients 18 years of age or older were eligible if they were ventilated and had a length of stay (LOS) in the ICU greater than 7 days. Demographic data, along with the energy intake in the first 7 to 10 days and the type of nutritional support used, were collected. Results: 130 patients were enrolled, 63.8 % were male, median age 64 ± 16 (19-91), median BMI 27.9 ± 5.9 (18.8-49), ICU LOS 15.4 ± 6.1 days, mortality rate 26.9 % (35). 70 % of patients were admitted for medical reasons, 31.5 % had normal weight, and the remaining patients were either overweight or obese. The energy supply in the first 10 days was 12.4 ± 6.6 kcal/kg/day (0-34.1). Carbohydrates were the main source of calories, 1.5 ± 0.93 g/kg/day (0-4.2); lipids, 0.4 ± 0.3 g/kg/day (0-1.37); protein/amino acid administration, 0.32 ± 0.27 g/kg/day (0-1.11). Regarding additional supplements besides artificial nutrition, the impact on overall energy intake of propofol was 1 ± 0.6 kcal/kg/day (0-9.7), while that of glucose solutions was 1.3 ± 1 kcal/kg/day (0-8). We may safely conclude that fewer calories is better in the ICU, but scientific recognition of the importance of protein is growing, and although optimal protein dosing studies are not available, expert opinion supports administering in excess of 1.2 g/kg/day. At present, in most Portuguese ICUs, the majority of critically ill patients receive less than half of the recommended protein intake.
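The per-kilogram intake figures in the two Portuguese surveys above come from simple arithmetic, sketched below. The propofol factor of roughly 1.1 kcal/mL for 1 % propofol is a stated assumption here, not a number from the abstracts, and the example values are invented.

```python
# Sketch of the daily intake arithmetic: kcal/kg/day and protein g/kg/day,
# adding propofol's caloric contribution (assumed ~1.1 kcal/mL for 1 % propofol).
PROPOFOL_KCAL_PER_ML = 1.1   # assumption, typical value for 1 % emulsion

def daily_intake(kcal_nutrition, protein_g, propofol_ml, glucose_kcal, weight_kg):
    """Return (kcal/kg/day, protein g/kg/day) for one patient-day."""
    kcal = kcal_nutrition + propofol_ml * PROPOFOL_KCAL_PER_ML + glucose_kcal
    return kcal / weight_kg, protein_g / weight_kg

# Toy example: 900 kcal EN, 25 g protein, 50 mL propofol, 100 kcal glucose, 75 kg
kcal_kg, prot_kg = daily_intake(900, 25, 50, 100, 75)
print(f"{kcal_kg:.1f} kcal/kg/day, {prot_kg:.2f} g/kg/day")
# -> about 14.1 kcal/kg/day and 0.33 g/kg/day, in the range the surveys report.
```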
An audit was carried out to determine the amount of protein, in grams per kilogram body weight per day (g/kgBW/d), being provided to adult, Level 3, critically ill patients. International evidence-based guidelines recommend 1.2 g/kgBW/d [1]. Protein is an integral component of nutritional therapy in critically ill patients, which aims to minimise muscle wasting [2]. Methods: A prospective audit was carried out in 2015 in a 21-bed, Level 3, adult intensive care unit within a university teaching hospital and major trauma centre. Inclusion criteria included all patients who were enterally fed. Exclusion criteria included parenteral nutrition, less than 4 days of enteral nutrition and incomplete data. Protein provision was measured against 1.2 g/kgBW/d. Propofol was not recorded, which was a limitation of the audit. The audit was not subject to ethical approval according to Trust standards. Of 308 admissions, 111 patients were audited, 27 were excluded and 84 patients were included in the data analysis (n = 84). Mean protein provision was 0.87 g/kgBW/d. Mean protein percentage delivery was 78 %. Protein deficits became more pronounced when measured against 'higher' disease-specific recommendations. Feeding protocols were adhered to in 6 % of patients. At initial assessment the nutritional products met protein requirements in 55 % of patients. On average, patients experienced 20 hours of enteral nutrition stoppages. Patients received a mean of 0.87 g/kgBW/d of protein compared to the minimum of 1.2 g recommended by international guidelines. There were wide variations in protein delivery between patient groups. Reasons for not meeting protein provision were multifactorial. Dietitians are expertly skilled within the MDT to improve nutritional care within the ICU. Vitamin D deficiency is commonly found in hospitalized patients and is associated with multiple adverse clinical events, including increased morbidity and mortality. Although this condition is also frequently found in critically ill patients, the correlation with clinical outcomes remains unclear. This report demonstrates the prevalence of vitamin D deficiency and the associated clinical outcomes among medical critically ill patients in Thailand. A prospective observational study was conducted over 6 months in the medical ICU of Songklanakarind hospital. Demographic data and clinical outcomes, including 28-day mortality, ventilator days, and ICU and hospital length of stay, were collected. Serum 25-hydroxyvitamin D (25(OH)D) was measured within 24 hours after ICU admission. A 25(OH)D level less than 20 ng/ml is generally defined as a deficiency state. The correlation of 25(OH)D status on admission with clinical outcomes was then analyzed with the chi-square test or Student's t-test. A p-value less than 0.05 was defined as statistically significant. Of the 116 critically ill patients, the prevalence of 25(OH)D deficiency was 64.65 %. Patients with 25(OH)D deficiency were statistically significantly younger, more often female, and had higher blood glucose levels, a higher rate of respiratory arrest 24 h prior to ICU admission and a higher requirement for mechanical ventilator support. 25(OH)D deficiency was not significantly correlated with 28-day all-cause mortality. However, there was a trend toward increased vasopressor use, a higher rate of ICU infection and a longer hospital length of stay among the deficiency group. Our study demonstrated that the prevalence of vitamin D deficiency in the Thai critically ill population was high (64.65 %).
We also found significantly higher rates of respiratory arrest and mechanical ventilator dependence in the deficiency group. However, vitamin D deficiency in Thai critical illness was not significantly associated with ICU morbidity and mortality. Vitamin D deficiency has been associated with several adverse outcomes, mainly in the outpatient setting. The objective of this study was to examine the prevalence of vitamin D deficiency and its association with the risk of adverse clinical outcomes in a large prospective cohort of medical inpatients. We collected clinical data on measured 25(OH)D levels in adult medical patients upon hospital admission and followed them for 30 days. Regression analyses were adjusted for age, gender and comorbidities. In this comprehensive and large medical inpatient cohort, vitamin D deficiency was highly prevalent and strongly associated with adverse clinical outcome. Interventional research is needed to prove the effect of vitamin D supplementation on these outcomes. Omega-3 fatty acids in patients undergoing cardiac surgery: a systematic review and meta-analysis P. Langlois 1, W. Manzanares 2 Over the last few years, supplementation with omega-3 polyunsaturated fatty acids (n-3 PUFA) has emerged as a therapeutic option for patients undergoing cardiac surgery due to their immunomodulatory properties and anti-arrhythmic action. Nonetheless, there is a paucity of data supporting the effectiveness of n-3 PUFA in the treatment of cardiac surgery patients. So far, several randomized controlled trials (RCTs) have assessed the effect of perioperative n-3 PUFA in preventing postoperative atrial fibrillation (POAF), although their efficacy remains controversial. Therefore, we conducted an updated systematic review and meta-analysis evaluating the effects of perioperative oral/enteral n-3 PUFA and intravenous (IV) fish oil lipid emulsions on relevant clinical outcomes for cardiac surgery patients. We included RCTs enrolling adult patients undergoing cardiac surgery which evaluated oral/enteral and parenteral n-3 PUFA compared to a placebo and reported clinically important outcomes. According to eligibility criteria, original studies were abstracted in duplicate. Intensive care unit (ICU) length of stay (LOS) was the primary outcome; secondary outcomes were the incidence of POAF, duration of mechanical ventilation (MV) and hospital LOS. Hypothesis-generating subgroup analysis was performed to identify potentially more beneficial treatment strategies. A total of 11 RCTs (n = 2846 patients) met the inclusion criteria. When the data from 4 trials were aggregated, n-3 PUFA had no effect on ICU LOS (WMD -4.43, 95 % CI -13.43, 4.48, P = 0.34, heterogeneity I2 = 43 %). However, n-3 PUFA were associated with a trend towards a reduction in POAF (RR 0.89, 95 % CI 0.76, 1.05, P = 0.17; I2 = 34 %, P = 0.13). In addition, in the oral/enteral-based trials, n-3 PUFA showed a tendency towards a reduction in POAF (RR 0.87, 95 % CI 0.74, 1.03, P = 0.11; I2 = 47 %, P = 0.07; Fig. 75). The test for subgroup differences on overall POAF showed a trend (P = 0.16). There was no effect of n-3 PUFA on MV days or hospital LOS. In patients undergoing cardiac surgery, supplementation with omega-3 fatty acids does not improve any clinical outcome in the postoperative period. Long-term hospitalized patients in intensive care units (ICUs) are at high risk of developing major depressive syndrome or post-traumatic stress disorder (PSD).
There are correlations between the degree of depression and survival in these patients. The incidence of depression and PSD is three times higher in intensive care patients than in the general population. 5-hydroxytryptophan, a precursor of serotonin, can be used for the prevention of depression in these patients. We conducted a randomized controlled trial on patients hospitalized in our ICU for non-surgical causes. We measured serotonin levels on the admission day, at 7 days and at 14 days. Patients with psychiatric disorders and malignancies were excluded, as were those being treated with drugs that interfere with the metabolism of serotonin. Selected patients were randomly assigned either to receive placebo (first group, N = 30) or to receive 300 mg of 5-hydroxytryptophan (5-HTP) (second group, N = 30). Psychiatric evaluation was performed using the Clinician-Administered Posttraumatic Stress Disorder Scale. At baseline, no significant differences were recorded between the groups regarding serotonin plasma levels (243 ± 47.75 and 252.6 ± 38.59 μg/l, P > 0.05) or age (55 ± 12.38 and 54.66 ± 12.11, P > 0.05). After the first determination we observed a decrease in serotonin levels in the placebo group (184.46 ± 37.57) compared with the 5-HTP group (229.26 ± 39.35) (p < 0.001, t = 3.18). Delirium was present in 3 patients in the placebo group and 1 patient in the 5-HTP group. After 14 days, serotonin levels were lower in the placebo group than in the treatment group (P < 0.001), and delirium was absent in the latter. At the same time point, mean endpoint scores on the Clinician-Administered Posttraumatic Stress Disorder Scale were higher in the placebo group than in the treatment group (P = 0.006). Several studies suggest that depression can increase mortality in critically ill patients. The use of antidepressants in these patients may be contraindicated; therefore 5-HTP could become an alternative therapy for depression in ICU patients. Selenium is an essential trace element with antioxidant and immunomodulatory effects. Several randomized controlled trials (RCTs) and meta-analyses have demonstrated that parenteral selenium may be able to improve clinical outcomes in intensive care unit (ICU) patients, and new trials have been published since our last update. Thus, we updated our data with this systematic review on parenteral selenium as a single strategy or in combination with other antioxidant micronutrients in the critically ill. We included RCTs that evaluated clinical outcomes associated with intravenous selenium as a single or combined strategy in parenterally or enterally fed patients. Overall mortality was the primary outcome; secondary outcomes were infections, ICU length of stay (LOS), hospital LOS, and ventilator days. Subgroup analyses were done to elucidate the effects of selenium on mortality and infections (Table 41). Results: 21 trials met our inclusion criteria (n = 4129). When the results of the 20 trials reporting mortality were aggregated, no statistically significant reduction in mortality was found (RR 0.99, 95 % CI 0.99, 1.08, P = 0.75, I2 = 0 %). In addition, there was no significant effect of parenteral selenium on infections (P = 0.15). There was no effect of IV selenium therapy on ICU LOS, hospital LOS, or ventilator days. The results of the subgroup analysis are found in Table 41; a sketch of the underlying pooling method follows below.
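Pooled estimates such as "RR 0.99, 95 % CI ..., I2 = 0 %" in the meta-analyses above are typically produced by inverse-variance pooling of log risk ratios. The sketch below implements DerSimonian-Laird random-effects pooling as one plausible reading of the methods; the trial counts are invented placeholders, not data from the included RCTs.

```python
# Sketch: DerSimonian-Laird random-effects meta-analysis of risk ratios.
# Inputs are per-trial 2x2 counts; the numbers below are placeholders.
import numpy as np

def pool_rr(events_t, n_t, events_c, n_c):
    """Pooled RR with 95 % CI and I^2 (DerSimonian-Laird)."""
    e_t, e_c = np.asarray(events_t, float), np.asarray(events_c, float)
    n_t, n_c = np.asarray(n_t, float), np.asarray(n_c, float)
    log_rr = np.log((e_t / n_t) / (e_c / n_c))
    var = 1/e_t - 1/n_t + 1/e_c - 1/n_c           # variance of each log RR
    w = 1 / var                                   # fixed-effect weights
    q = np.sum(w * (log_rr - np.sum(w * log_rr) / np.sum(w))**2)
    df = len(log_rr) - 1
    tau2 = max(0.0, (q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_re = 1 / (var + tau2)                       # random-effects weights
    mu = np.sum(w_re * log_rr) / np.sum(w_re)
    se = np.sqrt(1 / np.sum(w_re))
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return np.exp(mu), np.exp(mu - 1.96*se), np.exp(mu + 1.96*se), i2

# Toy usage with three invented trials (events, total in treated and control):
print(pool_rr([10, 8, 15], [100, 90, 120], [12, 9, 20], [100, 95, 118]))
```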
Parenteral selenium as monotherapy or in combination with other antioxidant micronutrients has no effect on overall mortality or infections, nor on ICU LOS, hospital LOS, or ventilator days in critically ill patients. Probiotics in the critically ill: an updated systematic review and meta-analysis P. Langlois 1, M. Lemieux 2, I. Aramendi 3, D. Heyland 2, W. Manzanares 3 Critical illness is characterized by changes in the microbiology of the intestinal tract, leading to a loss of commensal flora and an overgrowth of pathogenic bacteria. Probiotics are living non-pathogenic bacteria that colonize the intestine and provide benefits to the host. In 2012, we demonstrated that probiotics may be able to reduce infections and may influence intensive care unit (ICU) mortality [1]. Over the last three years different randomized controlled trials (RCTs) have been published. Therefore, the aim of this systematic review is to update our previous data on probiotics in the critically ill. We included RCTs enrolling critically ill adults, which evaluated probiotics compared to a placebo and reported clinically important outcomes. According to eligibility criteria, original studies were abstracted in duplicate by two reviewers. Overall infections were the primary outcome; secondary outcomes were overall mortality, ventilator-associated pneumonia (VAP), diarrhea, ICU length of stay (LOS), and hospital LOS. A total of 29 RCTs (n = 2737) met the inclusion criteria, including 6 new trials since our last update. When the results of trials reporting infections were aggregated, probiotics were associated with a significant reduction in infections (RR 0.82, 95 % CI 0.69, 0.97, P = 0.02; heterogeneity I2 = 41 %), particularly among patients at higher risk of death (RR 0.75, 95 % CI 0.61, 0.99, P = 0.04; I2 = 58 %; Fig. 76). Moreover, probiotics were associated with a significant reduction in the incidence of VAP (RR 0.74, 95 % CI 0.58, 0.96, P = 0.02, I2 = 29 %). Finally, probiotics had no effect on mortality, diarrhea, or hospital LOS. Conclusions: Probiotics are associated with a significant reduction in overall infections, as well as a reduction in the incidence of VAP. The relationship between diabetes and pancreatic cancer has been discussed. However, the effects of poorly controlled diabetes on pancreatic cancer have never been evaluated. We addressed the strength of the association between diabetes and pancreatic cancer. Fig. 76 (Abstract P295): Effects of probiotics on infections. Data from 1,000,000 National Health Insurance beneficiaries were utilized. The study cohort consisted of 42,581 diabetic patients and 672,750 unexposed subjects. Among the patients with diabetes, 1082 had been admitted with hyperglycemic crisis episodes. All adult beneficiaries were followed from 1 January 2005 to 31 December 2012 to evaluate whether pancreatic cancer was diagnosed. Cox regression models were applied to compare the hazards adjusted for potential confounders. After controlling for age, gender, urbanization level, socioeconomic status, liver cirrhosis, hypertension, coronary artery disease, hyperlipidemia, malignancy, smoking, chronic obstructive pulmonary disease, obesity, history of alcohol intoxication, chronic renal insufficiency, biliary tract disease, chronic pancreatitis and Charlson Comorbidity Index score, the adjusted hazard ratio of pancreatic cancer was 2.48 (95 % confidence interval, 1.84-3.34) in patients with diabetes.
In patients with a history of hyperglycemic crisis episodes, the hazard ratio of pancreatic cancer was significantly higher (hazard ratio, 3.60; 95 % confidence interval, 1.15-11.25). This cohort study provided evidence for the relationship between diabetes and pancreatic cancer. Moreover, diabetes with hyperglycemic crisis episodes is associated with a higher risk of pancreatic cancer. Hypoglycemia is associated with increased mortality in critically ill patients. The aim of this study was to assess glucose control in an intensive care unit and the incidence of hypoglycemia depending on the insulin protocol. Single-center prospective observational study. All patients admitted to 2 ICUs between January and April 2014 for more than five days were enrolled. Data on demographics and clinical characteristics of patients were collected at baseline. All blood glucose measurements during the first 5 days were recorded. Three insulin protocols were used: two using subcutaneous Actrapid insulin depending on whether the patient was diabetic (DP) or not (NDP), and an intensive protocol (IP) using IV insulin depending solely on the glycemic value, with a target value of 180 mg/dL. A glucose value of 70 mg/dL or less was considered moderate hypoglycemia and 40 mg/dL or less severe hypoglycemia. SPSS was used for statistical analysis. Of all measurements, some were under 70 mg/dL and 5/1720 (0.29 %) were under 40 mg/dL. Depending on the insulin protocol, the incidence of moderate hypoglycemia was 2.88 % in DP, 0.98 % in NDP and 0.96 % in IP; for severe hypoglycemia it was 0.89 % in DP, 0.25 % in NDP and 0.21 % in IP. As usual, the incidence of hypoglycemia was higher in the diabetic population. With the new target for glycemic control in intensive therapy, the incidence of hypoglycemia is now very low in this group. In contrast, the protocol for NDP achieved better glycemic control than the intensive protocol. Severity of disease is two-dimensionally correlated with blood glucose, including blood glucose variability, especially in moderately to severely ill patients with glucose intolerance. M. Hoshino 1, Y. Haraguchi 2, S. Kajiwara 1, T. Mitsuhashi 3, T. Tsubata 2, M. Aida 1 Elucidation of the relationship between severity of disease and blood glucose (BG), including BG variability, is considered significant for determining BG targets. The purpose of this study is to analyze this relationship, because such an analysis has not been fully performed. One hundred and forty-nine patients with glucose intolerance were analyzed during the first week after ICU admission. Studied items were: 1) the maximum value of the SOFA score (SOFAmax), 2) the mean of BG (BGm) and 3) the standard deviation of BG (BGsd). BG was measured basically every 6 hours and insulin was used when BG was above 140 mg/dL. The correlation between SOFAmax and BG (BGm, BGsd) was analyzed according to SOFAmax levels. Conclusions: 1) BG control of hyperglycemia is one of the measures that reduce BG variability. 2) BG targets were considered obtainable from the analysis of the two-dimensional relationship between severity of disease and BG. 3) Strict BG control may be more effective in more severely (moderately to severely) ill patients, considering the results indicating a higher correlation coefficient (R) in more severe patients. 4) Glucose administration increasing BG and BG variability might have a beneficial effect not only for hypoglycemic patients but also for a part of the normoglycemic patients, judging from the BG targets listed above.
5) Low BG variability in a part of the severe patients may be related to lower variability of physiologic activity. 6) Analysis of the relationship between severity of disease and BG based on fine BG control, including by use of an artificial pancreas, was considered to lead to better BG control and outcomes. Various insulin protocols have been applied to control blood glucose (BG) in critical patients, mostly based on short-acting insulin, which consumes considerable manpower and resources. A previous study yielded a 66.7 % success rate of glycemic control with 80 % dose conversion from the short-acting insulin dose, with a mean of differences of 26.5 mg/dL. The aim of this study was to determine whether a single subcutaneous glargine injection with 100 % dose transition is non-inferior to continuous regular insulin (RI) infusion for BG control in critically ill patients (the non-inferiority margin, taken as 85 % of the mean of differences from the previous study, was 22.525 mg/dL). All eligible participants admitted to the critical care unit who required a constant rate of continuous RI infusion per the standard MICU insulin infusion protocol for 24 consecutive hours were switched to receive a single dose of glargine insulin subcutaneously at 100 % of their total daily insulin requirement, calculated as the cumulative dose of RI required in the previous 24 hours. The continuous insulin infusion was discontinued 2 hours after the transition. BG was monitored every 2 hours for the following 24 hours or until study termination or discontinuation. Of the 20 cases included, 12 achieved good glycemic control. In the success subgroup, no BG level exceeded the mean glucose during RI infusion by more than 22.525 mg/dL. Only one mild hypoglycemia event was observed, which resolved spontaneously. Among the 8 participants in the failure subgroup, half developed hyperglycemia by the 8th hour. This study also found that the median RI dose differed: 1.8 mg/dL among success cases and 2.5 mg/dL among failure cases. A single subcutaneous glargine injection with 100 % dose transition was non-inferior to continuous RI infusion for BG control in critically ill patients, with a 60 % success rate. A higher pre-treatment RI dose was related to a failure outcome. Target levels of glycemia in critically ill patients have fluctuated since 2001, as evidence initially indicated that tight glycemic control (80-110 mg/dl) led to the lowest morbidity and mortality; however, subsequent studies demonstrated minimal clinical benefit combined with greater morbidity and mortality, so nowadays the target glucose level approaches 150-180 mg/dL. The aim of this study was to assess glucose control in critically ill patients in Portuguese ICUs and evaluate the incidence of hyperglycemia and hypoglycemia. Prospective observational study conducted in eleven ICUs in nine Portuguese hospitals over six months. Included participants were patients who were expected to have a length of stay (LOS) in the ICU of 7 days or more. Data on demographics and clinical characteristics of patients were collected at baseline. All patients were assigned a target glucose of 180 mg/dL or less. All blood glucose measurements during the first 7 days were recorded. A glucose value of <40 mg/dL was considered severe hypoglycemia. Results: 155 patients were included, of whom 60.6 % were men, with an average age of 62 ± 17; ICU LOS was 15.2 ± 6.4 days and the mortality rate was 25 %.
Using a target glucose level of 180 mg/dL, uncontrolled hyperglycemia was avoided in over 72 % of cases, with fewer hypoglycemic episodes. As diabetics had more hyperglycemic and hypoglycemic events, diabetic patients should perhaps have different targets. The aim of this study was to investigate whether the duration of hyperglycemia (hours per 24-hour cycle) intraoperatively and on the first ICU day has any impact on postoperative outcome after cardiac surgery. Methods: This long-range observational study was carried out in our department from January 2013 until June 2014 and consisted of data from 615 cardiac surgery patients. Based on glucose levels on the day of operation (24 hours), patients were divided into 5 groups depending on the hours of hyperglycemia. We investigated the possible relationship between the duration of hyperglycemia and the following postoperative complications: deep sternal wound infection, acute kidney injury, need for intravenous drip infusion of a diuretic solution, and renal replacement therapy requirement in the ICU. The study was conducted by an independent investigator and the results were processed by a statistician with the statistical program SPSS 20. Results are shown in Table 42. Hyperglycemia had a statistically significant relationship with AKI, the need for IV diuretics and the RRT requirement. In our study, hyperglycemia had no impact on postoperative sternal wound infections. Diabetic ketoacidosis (DKA) is a life-threatening acute complication of diabetes that presents with high-anion-gap metabolic acidosis, hyperglycemia and ketonemia (1). The aim of this study is to determine lactate levels and associated clinical outcomes in DKA patients admitted to the medical intensive care unit (ICU). A total of thirty-one (31) patients admitted to the medical ICU of our tertiary care university hospital with DKA over the last thirty (30) months were included in the study retrospectively. Hyperlactatemia was defined as serum lactate >2.5 mmol/L. The mean age of patients with DKA was 50.4 ± 24.1 years; 45.2 % (n = 14) were men and 54.8 % (n = 17) were women. Twelve (38.7 %) patients had type 1 diabetes and 19 (61.3 %) had type 2. The mean serum lactate was 2.93 ± 1.81 mmol/L (0.89-9.20). Of the 31 patients, 14 (45.2 %) had elevated serum lactate levels (>2.5 mmol/L), and 6 (19.4 %) had high lactate levels (>4 mmol/L). No correlation was found between serum lactate levels and mortality or length of ICU and hospital stay. Moreover, lactate levels were not correlated with the Acute Physiology and Chronic Health Evaluation II (APACHE II) score, blood pressure, serum glucose or HbA1C levels. An elevated lactate level is commonly found in DKA patients (2). However, it is not associated with worse clinical outcomes or disease severity (3). Thus, elevated lactate levels in DKA patients should not be used by clinicians to define a high-risk population. The London Chest Hospital and The Heart Hospital merged with St Bartholomew's Hospital to become the Barts Heart Centre, receiving all Heart Attack Centre (HAC) activations in the North Central and North East London areas. This created the largest cardiac intensive care unit in Europe, with 42 beds currently (expanding to 58 next year). We are reporting the changes that have occurred following the merger of two HAC services in adjacent regions. Data were collected from the Intensive Care National Audit and Research Centre (ICNARC) database as well as the Myocardial Ischaemia National Audit Programme (MINAP).
The ICNARC and MINAP databases were interrogated to identify the numbers of HAC activations, pPCI (primary percutaneous coronary intervention) procedures, and admissions to Level 3 intensive care. We compared data from the five months following the merger (May-September 2015) with the same months in 2014. High dependency and coronary care admissions were excluded. In-unit mortality and length of stay were examined. Results are expressed as median and interquartile range (IQR) for non-normally distributed data. Ethics approval was sought but deemed not necessary. Between May and September 2014 there were 628 HAC activation patients admitted to the legacy sites of the London Chest Hospital and Heart Hospital. Following the merger there was an increase of 246 patients (39 %). Of the HAC activations at the legacy sites from May-September 2014, 282 (45 %) patients underwent primary percutaneous coronary intervention (pPCI), compared to 254 (29 %) patients in the five months post-merger. ICNARC data were not collected at the Heart Hospital and therefore data are only available from one legacy site. The ITU admission rate at the London Chest was 4.9 % compared to 6.4 % in the new centre. The length of stay on intensive care decreased following the merger from a median of 3.9 days (IQR 2.1-8.0) to a median of 2.9 days (IQR 1.6-7.0). In-unit mortality decreased in May-September 2015 to 41.1 % from 45.2 % the previous year. To our knowledge this is a unique situation in which three specialised London hospitals have merged to produce one large cardiac centre. The rise in patient numbers yet fall in the pPCI rate is interesting, and further analysis is needed to understand this trend. In-hospital cardiac arrests (CAs) are not sudden events but, in most cases, the result of a slow and progressive deterioration of the patient. The aim of this study was the systematic analysis of CA incidence, primary causes and initial rhythm with regard to location in the hospital, in order to identify potential obstacles and to undertake preventive measures to enhance outcome. Methods: 177 in-hospital CAs in which resuscitation was attempted were included in the study, and data were recorded according to the Utstein Style template for in-hospital CA. Age and gender of patients, primary causes, initial rhythm and location of CA were recorded and reviewed. Results are depicted in Tables 1 & 2. The vast majority of CAs (42.3 %) occurred in areas supervised by the cardiology department. Together with the "cardio-surgical" CAs, the percentage of cardiology-related CAs rises to 58.2 %. The numbers of CAs in the ED and in the RIU were also significant. The mean age of CA patients was similar in all locations, whereas there were significant differences as far as primary causes and initial rhythm analysis are concerned. These interesting results merit further study, a larger number of patients and detailed analysis of all related parameters in order to draw safe conclusions and to be able to undertake preventive measures. Patients with nosocomial cardiopulmonary arrest (CPA) have aggravated clinical signs before CPA occurs. The early recognition of these signs and quick treatment may reduce the death rate in these patients. We examined the background factors and various biomarkers of patients who were resuscitated in the general medicine ward. Eighteen patients who were admitted to the ICU after resuscitation in the general medicine ward were studied.
Patients were classified as "fair prognosis" (FP group: 12 patients) if the patient survived more than 28 days or "poor prognosis" (PP group: 6 patients) if the patient died within 28 days. Blood samples were drawn for measuring blood chemistry and sepsis-related biomarkers (procalcitonin, presepsin, endotoxin activity assay (EAA)) immediately after ICU admission. Statistical analysis was performed with the Mann-Whitney U-test and the chi-square test or Fisher's test. A value of p < 0.05 indicated statistical significance. The underlying diseases were as follows: sepsis (5), acute respiratory distress syndrome (3), lethal arrhythmia (2), ileus (1), suffocation (1), amyotrophic lateral sclerosis (1), sudden unexpected death in epilepsy (1), unknown (4). The SOFA scores of the FP group and PP group were 10 ± 3 (10) and 14 ± 3 (14), respectively; the difference between the groups was statistically significant (p < 0.05). The APACHE II scores of the FP group and PP group were 34 ± 7 (35) and 40 ± 8 (44), respectively; however, there was no significant difference between the groups. Procalcitonin levels of the FP group and PP group were 9 ± 13 (6) and 22 ± 26 (15), respectively; again there was no significant difference between the groups. IL-6 levels of the FP group and PP group were 1096 ± 2851 (98.8) and 19,475 ± 24,603 (11,305), respectively; the difference between the groups was statistically significant (p < 0.05). The EAA level of the FP group was 0.41 ± 0.16 (0.46), whereas that of the PP group was 0.54 ± 0.26 (0.49); however, there was no significant difference in EAA levels between the groups. In our institute, the 28-day survival rate of patients who were resuscitated in the general medicine ward was 66.7 %. The factors affecting patient outcome were the SOFA score and IL-6. However, we could not evaluate the duration of CPA; it will therefore be necessary to examine this further in the future. Teaching basic life support (BLS) to secondary school students has proved feasible but has been little evaluated. We conducted a study to evaluate the efficacy of a one-hour BLS training course for secondary school students. From September 2014 to June 2015, six secondary schools were included in the study. Two of them received a one-hour BLS session conducted by an intensivist assisted by 3 medical students. Sessions contained a theoretical lecture and practical simulation-based training using low-fidelity manikins (MiniAnnePlus®, Laerdal). Two to three months later, both trained and non-trained students were assessed using a serious game reproducing a real-life cardiac arrest situation (3D real-time simulation software, Stayingalive®, ILUMENS-Dassault system). The primary outcome was the time taken to complete the serious game (from onset of cardiac arrest to shock delivery and recovery). Secondary outcomes included knowledge of the emergency call, hand placement during chest compressions and electrode placement of the automated external defibrillator. Among 199 students assessed on the serious game, 72 had received the one-hour BLS session. Children's ages did not differ between the 2 groups (12 [11-12] vs 12 [11-12] years old, p = 0.7). The total game completion time was 169 (±28) seconds in the trained group versus 185 (±32) seconds in the non-trained group (p = 0.0006) (Fig. 80). Knowledge of the emergency call did not differ between the two groups (92 % in the trained group vs 86 % in the non-trained group, p = 0.2). Hands were more often properly placed during chest compression in the trained group (92 % vs 73 %, p = 0.002).
Electrodes of the automated external defibrillator were more often properly placed in the trained group (44 % vs 10 %, p = 0.0001). A one-hour BLS training session with a low-fidelity manikin significantly improved young students' performance during a 3D real-life simulation session.

People have unrealistic expectations of the outcomes of inpatient adult cardiopulmonary resuscitation (CPR). We conducted a survey of clinical staff's and the general public's perceptions and knowledge of CPR compared to local and national data. Methods: 276 members of the public and staff were surveyed at 2 hospitals and a GP surgery in South East London. Results: 75.2 % of the public knew what CPR was and 29.5 % felt they had enough CPR knowledge. Knowledge did not appear to be influenced by having attended university. Compared to the national figure of 45.6 % for initial CPR survival, the public predicted the correct survival range more accurately (see Fig. 81) [1]. Staff results were more comparable to their local hospital data. Hospital A's initial survival rate was 33 %: 15 % of staff compared to 6.6 % of the public chose the correct survival range. In hospital B, every doctor underestimated the initial CPR survival figure (55 %). Nurses were closer to the figure, with the most popular response range being 30-40 % compared to 10-20 % for doctors. GPs were most pessimistic at predicting general CPR survival rates and nurses were most optimistic. 87 % of staff wanted CPR if they were healthy; this dropped to 29 % if they were to have a chronic illness that affected them daily. Public figures were 62 % and 48.8 % respectively. People of African origin were less likely to want CPR if they had a chronic illness, compared to Caucasians. Those who went to university were more likely to want to discuss CPR with a doctor. Less than 3 % of those surveyed had a do not attempt resuscitation (DNAR) order in place. Staff are not better at predicting initial CPR survival rates. The public felt that they need more knowledge about CPR; the most popular learning modality was online learning. Staff opinions on whether they would want CPR are strongly influenced by the presence of a chronic disease causing a daily impact.

Pediatric out-of-hospital cardiac arrest (OHCA) is a devastating event with low survival rates, partly due to lack of bystander cardiopulmonary resuscitation (CPR) (1). While telephone dispatcher-assisted CPR instructions (T-CPR) improve the frequency and quality of bystander CPR for OHCA in adults (2), this support remains undeveloped in pediatrics. We aimed to assess the effectiveness of a new pediatric T-CPR protocol in both previously trained and untrained bystanders and, secondarily, the utility of the ventilations. Adults with no CPR experience were recruited in a public movie center and among bachelor nursing students in the Liège district. Volunteers were randomly assigned either to «T-CPR» or to «no T-CPR» and were submitted to a 6-minute infant OHCA scenario. CPR performance data were extracted from video and from the infant manikin using the Laerdal Simpad® SkillReporter. T-CPR significantly increased CPR attempts (74.1 % of the U-NG group performed chest compressions, compared to 100 % of the other groups; p = 0.0007) and overall CPR performance, but median time to first chest compression was longer in the guided groups (p < 0.0001). The proportion of volunteers who managed to deliver ventilation during CPR was higher (p < 0.0001) in the guided groups (81.2 % for the U-G group and 83.3 % for the T-G group) than in the unguided groups (0 % for the U-NG group and 46.1 % for the T-NG group).
While the tidal volume delivered was too high in each group, the fraction of time required to give ventilations was significantly higher in the guided groups. Conclusions: T-CPR instructions have the potential to increase the number and the performance of bystander CPR attempts in trained or untrained volunteers (3). Mouth-to-mouth ventilations were responsible for major interruptions in chest compressions. Moreover, we identified an overall tendency to hyperventilation.

(Fig. 81 shows the predicted survival rates of initial CPR, staff vs public.)

Dantrolene is a therapeutic drug for patients with malignant hyperthermia. Interestingly, dantrolene has antiarrhythmic properties and might be an alternative for patients with VT (ventricular tachycardia) or VF (ventricular fibrillation). However, experimental resuscitation studies comparing dantrolene and amiodarone are still missing. The aim of this study was to evaluate ROSC (return of spontaneous circulation) rates for dantrolene versus amiodarone in a pig model of sustained (8 min) cardiac arrest under VF. Ventricular fibrillation was induced in anesthetized pigs (dantrolene, n = 14; amiodarone, n = 14; saline, n = 10). After 8 min of untreated VF, chest compressions and ventilation were started and one of the drugs (amiodarone 5 mg/kg, dantrolene 2.5 mg/kg or saline (sham)) was applied in a blinded fashion. After 4 min of CPR without epinephrine or other drugs, defibrillation with 200 J was performed. Standardized resuscitation including mechanical CPR and defibrillation according to protocol was applied in all animals. ROSC rates, cardiovascular and blood gas parameters, and cerebral oximetry and perfusion measurements were recorded during the experiments. Initial ROSC rates were 10 of 14 animals in the dantrolene group vs. 5 of 14 animals in the amiodarone group (3 of 10 in the saline group). Persistence of ROSC until 120 min of follow-up was shown in 6 animals of the dantrolene group, 4 animals of the amiodarone group and 2 animals of the saline (sham) group. However, results were not statistically significant between groups. Hemodynamic data and cerebral perfusion as well as cerebral oximetry were comparable between surviving animals. The pharmacological effect of dantrolene is comparable to amiodarone regarding ROSC in our experimental model. Hemodynamic parameters did not show any relevant differences between the two groups. Future studies should evaluate dantrolene for CPR in a bundle approach including epinephrine and a post-ROSC therapy protocol.

Therapeutic hypothermia (TH) improves short-term physical neurological outcome and survival in post-arrest patients. However, there are limited data on long-term survival and functional neurological outcome in the survivors. We reviewed our hospital's database of post-arrest patients undergoing TH who survived to discharge with consciousness (Glasgow-Pittsburgh Cerebral Performance Category: CPC 1-3) from 2006 to 2014. All hospital survivors were assessed for vital status and date of death on January 31st, 2015 by checking death certificates from the national registry system. We contacted the survivors or their relatives by phone or mail and scheduled a follow-up visit to evaluate their ability. Patients who were unable to visit were assessed for functional disability by phone interview or using recorded follow-up data. The functional neurological outcome was scored on a disability rating scale.
[1] This scale consisted of: 1) arousability, awareness and responsivity; 2) cognitive ability for self-care activities; 3) dependence on others; and 4) psychosocial adaptability. The level of disability was defined as follows: 0, none; 1, mild; 2-3.5, partial; 4-6, moderate; 7-11, moderately severe; 12-16, severe; 17-21, extremely severe; 22-24, vegetative state; and 25-29, extreme vegetative state. In addition, survival rates at 6 months, 1 year and 2 years after discharge were analyzed. Of 51 patients treated with TH, 27 patients survived to hospital discharge. Seventeen of the hospital survivors were conscious: 6, 3 and 8 patients with CPC at discharge of 1, 2 and 3, respectively. Five of them passed away later. Approximately 78.6 %, 76.9 % and 75 % of awake patients survived at 6 months, 1 year and 2 years after discharge, respectively. The majority (3/5) of the patients who died did so within 6 months, with severe functional disability (score 24-26). The patients' disability scores are shown in the table. One-third of awake patients with CPC 2 and 3 who still survived at 6 months after discharge finally recovered to normal physical and cognitive function, while most patients with CPC 1 returned to normal function or minimal disability. The long-term survival rate of conscious survivors treated with TH was high. Most hospital survivors alive longer than 6 months after discharge had good functional capability.

Introduction: Kidney disease after out-of-hospital cardiac arrest (OHCA) has previously been incompletely described. We examined the occurrence of chronic kidney disease (CKD) and acute kidney injury (AKI) in OHCA patients, and the impact of AKI on six-month mortality and neurological outcome in patients with or without renal replacement therapy (RRT). Prospective cohort study at Oslo University Hospital, Oslo, Norway, between September 8, 2010, and January 13, 2014. Resuscitated comatose OHCA patients admitted to an intensive care unit were included. Patients were treated according to a standardized treatment protocol including therapeutic hypothermia to 33°C for 24 hours. CKD and AKI were classified according to the Kidney Disease: Improving Global Outcomes (KDIGO) guidelines. Patients with previously known CKD were excluded from further detailed analyses. Main outcomes were six-month mortality and good neurological outcome defined as Cerebral Performance Category 1-2. [1, 2].

Adoption of this approach will increase ICU admissions at regional PPCI centers. In this observational cohort study we analysed the ICU service dependency of PPCI patients following OHCA. All adult patients admitted to our intensive care unit between April 2014 and March 2015 were analysed. Patients admitted from the PPCI pathway following OHCA were compared with all other unplanned admissions during the study period. Data regarding outcome and dependency were collected from the ICNARC (Intensive Care National Audit & Research Centre, UK) database. Paired t-test and Chi-square test were performed to estimate statistical significance. The numbers of OHCA-PPCI and unplanned admissions during the study period were 51 and 630, respectively. Median length of ICU stay was significantly higher for the PPCI patients (4.25 vs 3.70 days, p = 0.02). Furthermore, PPCI patients had higher ICNARC severity scores (29.3 vs 20; p < 0.0001).
PPCI patients had significantly higher levels of dependency, evidenced by higher levels of organ support (3.2 vs 2 organs; p < 0.0001), more days of level 3 care (4.1 vs 2.6 days; p = 0.012), more days ventilated (4 vs 2.4 days; p = 0.005), and higher requirements for advanced cardiovascular (CVS) support (1.6 vs 0.6 days; p < 0.0001). There were no significant differences in dialysis requirement or hospital length of stay. Overall ICU mortality rates were higher in the PPCI group (46.9 % vs 30.7 %, p = 0.02). Despite representing only 8 % of the total number of patients admitted during the study period, PPCI patients accounted for 12 % of total ICU level 3 bed days, 13 % of ventilated days and 19 % of advanced CVS days. Overall, patients undergoing PPCI post OHCA had significantly greater levels of dependence than other patients admitted to the ICU. This high level of dependency needs to be considered when contemplating expansion of PCI services to non-STEMI OHCA patients with suspected cardiac causes.

We established a multi-center, prospective cohort that included Utstein data on the prehospital phase and treatment, and data after hospital arrival. The purpose of this study was to determine the most important indicators of prognosis in patients with return of spontaneous circulation (ROSC) following out-of-hospital cardiopulmonary arrest (OHCA) and to develop a best outcome prediction model. All consecutive patients suffering OHCA who were transported to institutions participating in this registry were prospectively recorded from July 2012 to December 2013. Criteria for inclusion in this study were a witnessed cardiac arrest, age greater than 17 years, presumed cardiac origin of the arrest, and successful ROSC. Multivariate logistic regression (MLR) analysis was used to develop the best prediction model. The dependent variables were favorable outcome (Cerebral Performance Category: CPC 1-2) and poor outcome (CPC 3-5) at 1 month after the event. The explanatory variables concerned patient characteristics and resuscitation. Subjects comprised 147 patients with VF and 177 patients with pulseless electrical activity (PEA)/asystole. The percentage of favorable outcomes was 42.2 % (62/147) in VF and 11.3 % (20/177) in PEA/asystole. The most important prognostic indicators of favorable outcome identified by MLR were age (p < 0.01), time from collapse to ROSC (TROSC) (p < 0.01), base deficit (p = 0.06) and presence of bystander cardiopulmonary resuscitation (p = 0.09) for VF, and age (p = 0.05), TROSC (p < 0.01) and base deficit (p < 0.05) for PEA/asystole. Areas under the receiver-operating characteristic curves were 0.866 for VF and 0.896 for PEA/asystole, respectively. A model based on four selected indicators showed a high predictive value for favorable outcome in OHCA patients with ROSC.

Near infrared spectroscopy (NIRS) could be a useful non-invasive monitor of cerebral perfusion during cardiopulmonary resuscitation (CPR). The aim of our study was to investigate and evaluate NIRS during CPR in a porcine model of cardiac arrest (CA). Methods: 24 pigs under general anesthesia and mechanical ventilation were included in this study. In all study animals ventricular fibrillation was induced by application of electrical current via a pacing wire. After 7 min of CA, CPR was initiated with the use of the LUCAS mechanical device and after 5 more min, the initial CA rhythm was analyzed and advanced life support interventions were provided according to the 2015 ERC guidelines.
Regional cerebral oxygenation (rSO2) was recorded before and after resuscitation at 1-min intervals by an INVOS oximeter via a SomaSensor electrode placed on the pigs' heads. Pigs were divided into two groups depending on whether return of spontaneous circulation occurred or not: G-A, no ROSC (n = 15) and G-B, ROSC (n = 9). G-A and G-B were compared at 6 phases: P0 (baseline), P1 (7 min after CA, no flow), P2 (5 min after CPR initiation) and thereafter every 5 min (P3-P5). For the statistical analysis repeated measures ANOVA was used (SPSS 21). Duration of CPR was 40 min for G-A and until ROSC for G-B. Directly after CA, a decrease in rSO2 was recorded, which persisted throughout the whole resuscitation attempt (Table). There were no statistically significant differences between the study groups. After ROSC, rSO2 initially remained low and then increased gradually over time. According to our study results, rSO2 alterations are useful in identifying CA immediately but do not provide any other useful information during CPR.

While the new 2015 AHA cardiopulmonary resuscitation guideline recommends that cardiac compression be performed on the sternum at the level of the nipple line, there is individual variation in heart position. Regional cerebral oxygen saturation (rSO2), which reflects the proportion of low-oxygenated cerebral blood in real time, can be measured non-invasively and quantitatively using a near-infrared spectroscopy (NIRS) sensor placed on the forehead, exploiting the higher absorption of near-infrared light by low-oxygenated blood. Therefore, rSO2 potentially enables us to monitor the effectiveness of cardiopulmonary resuscitation. The purpose of this study was to validate rSO2 monitoring of the effectiveness of chest compression by investigating the relationship between arterial blood pressure and rSO2 in patients with out-of-hospital cardiac arrest (OHCA). This was a prospective observational study conducted at a tertiary emergency medical care center in Japan. Systolic arterial pressure (SAP) and mean arterial pressure (MAP), obtained through invasive arterial pressure monitoring, and rSO2 were measured every 3 minutes during the cardiopulmonary resuscitation of OHCA patients, and the relationships between the absolute values of the measurements were compared using two types of correlation coefficient, within and between patients, in order to perform a statistical analysis that accounts for repeated measurements taken from a single patient. There were 38 patients out of a total of 169 patients with OHCA who fulfilled the inclusion criteria of the study, and 16 patients were finally included. The total number of measurement points for these patients was 29. The median patient age was 82 years old (IQR 71-87). There were 14 male patients (88 %), and 81 % of patients had witnesses at the time of cardiopulmonary arrest. The two types of correlation coefficient between arterial pressure and rSO2 showed that the within-patient coefficient of correlation between MAP and rSO2 was 0.20 (95 % CI: -1.8-0.53), whereas the coefficient of correlation between patients was 0.82 (95 % CI: 0.14-0.72). There was a correlation between rSO2 and MAP during sternal chest compression in patients with OHCA, indicating that rSO2 may be useful to monitor cerebral blood flow during chest compression. Our results suggest that sternal chest compression under rSO2 monitoring may improve the outcome of patients with OHCA.

Testing EEG reactivity is a standard procedure during EEG registration in unconscious ICU patients.
It has been suggested that EEG reactivity be used as a marker of poor outcome after cardiac arrest [1]. However, there is no clearly defined protocol for how to test reactivity. In this study we investigated which stimulus type most effectively evokes a cortical response in cardiac arrest patients. Prospective cohort study in patients monitored with continuous EEG after cardiac arrest in two Dutch ICUs. Twice a day a very strict stimulus protocol including five stimulus types (in this order: clapping, yelling the patient's name, passive eye opening, nasal tickle and sternal rub) was executed. Each stimulus was applied 3 times for 5 s with an interval of 30 s. Each stimulus response was individually scored as reactive, non-reactive, doubtful or unscorable (e.g. too much noise) by three independent, blinded raters (JH, MvP, MT-C). Reactivity was defined as a change in the amplitude or frequency of the EEG upon stimulation, excluding muscle artifacts. Isoelectric EEGs were excluded beforehand. Stimulus responses on which two out of three raters agreed were included in the analysis. The stimulation protocol was executed 72 times in 29 patients, resulting in 1050 independently rated stimulus responses. 42 stimuli were excluded because of disagreement between raters. The intraclass correlation coefficient was 0.38. The response to 91 stimuli in 19 patients was found to be reactive. The stimulus most often inducing EEG reactivity was clapping (29 %). Passive eye opening least often induced EEG reactivity (16 %).

The serum concentration of neuron-specific enolase (NSE) has been established as a highly specific predictor of poor outcome after cardiac arrest, but no clear cut-off has been identified. We prospectively included all out-of-hospital cardiac arrest patients admitted to our intensive care unit (ICU) from 2011 to 2014. They underwent targeted temperature management (34°C) for 24 hours. Outcome was assessed according to the Cerebral Performance Category (CPC) at ICU discharge. CPC 1 to 3 was considered a good outcome and CPC 4 to 5 a poor outcome. Recent guidelines suggest sampling NSE at 72 hours after cardiac arrest. Results: 132 patients were admitted from 2011 to 2014; 54 patients had a good outcome while 78 patients had a poor outcome according to CPC. The NSE serum concentration was sampled on the 3rd day after cardiac arrest. In the first group the NSE value was 22.3 mcg/l; in the second group it was 74.5 mcg/l. Nevertheless, 8 patients had a poor outcome with an NSE serum concentration < 40 mcg/l, and 3 patients had a good outcome with NSE > 60 mcg/l (Fig. 83). We observed great variability of NSE values in patients with poor outcome (Fig. 84). Conclusions: NSE could be an important biomarker of poor outcome after cardiac arrest. Further investigation is necessary to clarify the timing of sampling and the cut-off.

Anoxic-ischemic encephalopathy after cardiac arrest is a common cause of coma requiring intensive care of survivors. Certain malignant electroencephalographic (EEG) patterns have been shown to correlate with poor prognosis. Neuron-specific enolase (NSE) released after cardiac arrest is regarded as a severity indicator of postanoxic neuronal injury. We investigated the EEG findings in post-cardiac arrest patients and correlated these findings with NSE levels and outcome scales. Methods: 34 Egyptian patients resuscitated from cardiac arrest were subjected to EEG, NSE measurement, and Glasgow Outcome Scale (GOS) assessment.
EEG data: EEG monitoring was done within the first day of admission for about 30 minutes and repeated on the seventh day after cardiac arrest. EEG was classified on days 1 and 7 post-arrest according to Young et al. (1): I. delta/theta > 50 % of the recording; II. triphasic waves; III. burst-suppression pattern, with or without epileptiform activity; IV. alpha/theta/spindle pattern coma (no reactivity); V. generalized suppression (< 20 microvolts). NSE was sampled on both day 1 and day 3 after cardiac arrest. Serum concentrations of NSE were measured using an ELISA technique. There was a statistically significant negative correlation between the EEG coma scale on days 1 and 7 and GOS (p < 0.001). There was a statistically significant negative correlation between NSE on days 1 and 3 and GOS (p < 0.001) (Table 44). There was a statistically significant positive correlation between the EEG scale on days 1 and 7 and neuron-specific enolase on days 1 and 3 (Table 45). In post-cardiopulmonary arrest patients, there is a significant correlation between the EEG coma scale and the Glasgow Outcome Scale; certain malignant EEG patterns are correlated with high serum levels of NSE, and both are good predictors of poor outcome in these patients.

We report the experience of a 10-bed intensive care unit (ICU) introducing a targeted temperature management (TTM) strategy following cardiac arrest based on the TTM trial investigation protocol [1], including pre- and post-intervention data. Our unit averages 4 cardiac arrest cases a month; at this low frequency, and given recent updates to the TTM evidence, we were concerned that our temperature management practice could be inconsistent or outdated. Cases were identified retrospectively from the Scottish national ICU audit database. All patients with a diagnosis of cardiac or respiratory arrest were selected for full case note review. A baseline survey from May-November 2014 yielded 21 cases, of which 3 sets of notes were unobtainable. Data collected included documented target temperature (TT), use of active cooling, hourly temperature data over the first 24 hours in ICU, and eventual outcome. A new TTM strategy, based on the TTM study protocol, was then introduced in February 2015. This was presented at the local ICU meeting and copies were made available at each bed space for staff to refer to. Subsequently, all cases from March-September 2015 were identified by the same method and the same data set collected, totalling 20 cases of which 1 set of notes was unobtainable. Baseline data indicated no consistent TTM strategy, with documented TTs ranging from 35 to 37°C and no target documented in 50 %. Only 28 % were actively cooled, while 50 % were pyrexial (defined as > 37.5°C) and all but one patient (94 %) exceeded the TTM target of 36°C by at least 0.5°C. Prior to the intervention, the average time spent above 37.5°C was 5.8 hours and the average time above 36.5°C was 11.8 hours. Following the intervention, TTs ranged from 36-36.5°C, with an increased proportion of patients (12, 63 %) having the desired TT of 36°C. A decreased proportion (26 %) regrettably still had no documented TT; however, significantly more patients (13, 68 %) were actively cooled. Only 4 patients (21 %) were pyrexial, while 12 (63 %) still exceeded 36.5°C. Encouragingly, after the introduction of the protocol, the average time spent above 37.5°C was 1.15 hours (an 80 % reduction) and the average time above 36.5°C was 4.2 hours (a 64 % reduction).
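To make the audit computation concrete, the following is a minimal sketch (not the audit's actual code; the data and variable names are hypothetical) of how the time above each temperature threshold could be tallied from the hourly temperature records described above:

    from typing import List

    def hours_above(temps_hourly: List[float], threshold: float) -> int:
        """Count hours above a threshold, assuming one reading per hour
        over the first 24 hours in ICU, so each reading above the
        threshold contributes one hour."""
        return sum(1 for t in temps_hourly if t > threshold)

    # Hypothetical 24-hour temperature trace for one patient (degrees C).
    patient_temps = [36.2, 36.5, 36.9, 37.2, 37.6, 37.8, 37.4, 37.1,
                     36.8, 36.6, 36.5, 36.4, 36.3, 36.3, 36.2, 36.2,
                     36.1, 36.1, 36.0, 36.0, 36.0, 36.0, 36.0, 36.0]

    print(hours_above(patient_temps, 37.5))  # hours pyrexial (> 37.5 C)
    print(hours_above(patient_temps, 36.5))  # hours above 36.5 C

Averaging such per-patient counts across each cohort would yield the reported averages (for example, 5.8 hours above 37.5°C before the intervention and 1.15 hours after).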
Following the changes to TTM evidence in recent years, our promising initial data show that introduction of a TTM strategy in a small unit with infrequent cases is achievable, and that compliance with targets can be significantly improved with a simple intervention protocol.

Regardless of recent advances in cardiopulmonary resuscitation and post-resuscitation care, most patients who die during their hospital stay after out-of-hospital cardiac arrest (OHCA) do so due to post-anoxic neurological injury. Near-infrared spectroscopy (NIRS) provides information on regional brain oxygenation by measuring the regional cerebral oxygen saturation (SctO2) and offers the possibility to detect changes in the balance between oxygen delivery and uptake in the frontal lobe of the brain.

Therapeutic hypothermia (TH) improves neurological outcome and survival in comatose patients with ROSC after cardiac arrest [1]. Our institute developed a hypothermia protocol and has used it as standard practice for 8 years. We aimed to evaluate the clinical outcomes of patients undergoing TH and analyze factors associated with these outcomes.

Targeted temperature management after out-of-hospital cardiac arrest remains a therapeutic approach recommended by the Australian Resuscitation Council. Since this is one of the top five reasons for admission to our major tertiary Australian intensive care unit, we aimed to audit our current practice with regard to time to target temperature, time at target temperature and rewarming practices. We conducted a retrospective review of 54 patients admitted to our intensive care unit after return of spontaneous circulation (ROSC) from out-of-hospital cardiac arrest, collecting data from our electronic medical records system 'Metavision' and analysing them using SPSS v22.0. Ethics approval was obtained. Only 16 % of patients achieved target temperature within 4 hours, 25 % had still not achieved the target within 16 hours of ROSC, and 44 % were maintained at the target for 12-24 hours. The target was chosen as 32-34 degrees in 73 % of patients. Rewarming occurred at a rate of < 0.5 degrees per hour in less than 50 % of the patients. Despite these findings, there was no correlation between neurological outcome and adherence or non-adherence to the current evidence-based recommendations. These data show that our unit has been unable to consistently apply targeted temperature management in "real world practice" according to evidence-based guidelines; however, this has not resulted in adverse outcomes for our patient population.

Mild therapeutic hypothermia (MTH) improves neurological outcome and survival after out-of-hospital cardiac arrest (OOH-CA) from VT/VF [1]. However, this technique is not universally adopted, partially due to technical difficulties related to the implementation of practice protocols [2]. We present our experience over one year since the introduction of an institutional MTH protocol, testing its feasibility. was 7.2 days. At 3 months, all patients were at home and free from neurological alterations. The study points out the feasibility of MTH protocol implementation in this particular hospital setting. We overcame the difficulty of rapidly inducing and safely maintaining MTH by mixing two cooling techniques: times of induction and rewarming were acceptable, the maintenance phase was characterized by temperature stability and a lack of life-threatening events, and clinical outcome was encouraging.
Out-of-hospital cardiac arrest (OHCA) patients treated with targeted temperature management (TTM) may have substantial difficulty in ventilator weaning due to multiple organ failure, including post-TTM neurologic injury. The ability to predict the clinical course of TTM patients regarding duration of ventilation is important for clinical decision-making, for both safer extubation and planning early tracheostomy. However, predictive factors for ventilator weaning after TTM remain unclear. Hypothesizing that weaning difficulty is associated with resuscitation conditions at admission, we explored whether failure to wean can also be predicted at admission. The purpose of this study was to examine which factors predict ventilator weaning difficulty. We performed a retrospective cohort study of OHCA patients brought to the emergency room at St. Luke's International Hospital in Tokyo, Japan, who underwent TTM between January 2006 and July 2015. The primary outcome was days to weaning from admission to the intensive care unit. Using the electronic medical record, we collected patient characteristics, resuscitation conditions, and examination data at admission. After characterizing weaning success using descriptive statistics, the relationship between ventilator weaning during hospitalization and resuscitation conditions was assessed with Cox regression. Of 115 OHCA patients who completed TTM, median time to weaning from ventilation was 6 days (IQR 4-8).

Introduction: c-Fos is widely used to detect pathogenesis in CNS disorders. We examined changes in c-Fos immunoreactivity in the paraventricular nucleus of the hypothalamus (PVNH) and the paraventricular nucleus of the thalamus (PVNT) after myocardial infarction (MI) in rats. Infarction in the left ventricle was examined by Masson's trichrome staining. Neuronal degeneration (damage/death) was examined for 56 days after MI using cresyl violet (CV) and Fluoro-Jade B (F-J B) histofluorescence staining. Changes in c-Fos immunoreactivity were examined by immunohistochemistry for c-Fos. The average infarct size of the left ventricle circumference was about 44 % after MI. Neuronal damage/death was not detected in either the PVNH or the PVNT after MI. c-Fos immunoreactive (+) cells were hardly found in either nucleus in the sham-group. However, c-Fos+ cells were increased in both nuclei after MI, peaking in the PVNH and PVNT at 3 days and 14 days after MI, respectively. At 56 days after MI, c-Fos+ cells were barely found in either nucleus. These results show that MI dramatically induced c-Fos immunoreactivity in the PVNH and PVNT, and suggest that the increase in c-Fos expression may be associated with brain stress caused by MI.

Cerebrovascular diseases are a leading cause of morbidity, mortality and disability [1]. Elevated troponin levels after acute ischemic stroke (AIS) are common and are associated with increased risk of death and cardiac complications [2]. The aim of this study was to investigate whether elevated levels of serum cardiac troponin I (cTnI) in AIS patients can predict higher in-hospital mortality. We retrospectively investigated 218 patients between November 2012 and November 2014. They were admitted to the ICU and diagnosed with AIS according to the TOAST classification. They represented 8.51 % of the total of 2561 patients hospitalized in the ICU in the same period. Brain CT or MRI was performed on admission. The severity of the neurological deficit was scored based on the NIHSS. The functional condition of patients was assessed at discharge by the modified Rankin scale (mRS).
Troponin I was monitored on the 1st, 3rd and 5th days. Inclusion criteria were age over 18, AIS of less than 24 hours' duration, CT/MRI on admission and troponin I on the 1st, 3rd and 5th days. Exclusion criteria were SAH or ICH, renal failure, acute coronary syndrome and acute PE. Patients were divided into two groups: discharged alive and died in hospital. The average age of the patients was 75.27 ± 12.6 (SD) years. The study showed increased levels of cTnI in patients with AIS who died in hospital. These results give us grounds to propose routine monitoring of cTnI for early stratification of high-risk patients.

It is well known that neurons in the dentate gyrus (DG) of the hippocampus are resistant to short periods of ischemia. Hyperthermia is a proven risk factor for cerebral ischemia, can produce more extensive brain damage, and is related to mortality rates. The aim of this study was to examine the effect of hyperthermic conditioning (H) on neuronal death, gliosis and expression of SODs as anti-oxidative enzymes in the gerbil DG following 5 min of transient cerebral ischemia. The animals were randomly assigned to 4 groups: 1) the (N + sham)-group was given a sham operation with normothermia (N); 2) the (N + ischemia)-group was given 5 min of transient ischemia with N; 3) the (H + sham)-group was given a sham operation with H; 4) the (H + ischemia)-group was given 5 min of transient cerebral ischemia with H. H (39 ± 0.5°C) was induced by subjecting the animals to a heating pad for 30 min before and during the operation. In the (N + ischemia)-groups, significant neuronal death was observed in the polymorphic layer (PL) from 1 day after ischemia-reperfusion. In the (H + ischemia)-groups, neuronal death was also observed in the PL from 1 day post-ischemia; the degree of neuronal death was more severe than in the (N + ischemia)-groups. In addition, we examined the gliosis of astrocytes and microglia using anti-glial fibrillary acidic protein (GFAP) and anti-ionized calcium-binding adapter molecule 1 (Iba-1). GFAP+ and Iba-1+ glial cells were much more activated in the (H + ischemia)-groups than in the (N + ischemia)-groups. On the other hand, immunoreactivity and levels of SOD1, rather than SOD2, were significantly lower in the (H + ischemia)-groups than in the (N + ischemia)-groups. In brief, on the basis of our findings, we suggest that cerebral ischemic insult with hyperthermic conditioning brings about more severe neuronal damage and gliosis in the polymorphic layer through reducing SOD1 expression rather than SOD2 expression in the DG.

Remote ischemic postconditioning (RIPoC) has been proven to provide potent protection of the heart and brain against ischemia-reperfusion injury. However, although the evidence for cerebral protection with RIPoC is compelling, RIPoC-mediated neuroprotection against transient cerebral ischemic insult is still mired in controversy. In this study, we examined the effect of RIPoC, induced by sublethal transient hind limb ischemia, on neuronal death in the hippocampus following 5 min of transient cerebral ischemia in gerbils. Animals were randomly assigned to sham-, ischemia-, sham plus (+) RIPoC- and ischemia + RIPoC-groups. RIPoC was induced by three cycles of 5 min and 10 min occlusion-reperfusion of both femoral arteries at predetermined points in time (0, 1, 3, 6, 12 and 24 h after transient cerebral ischemia). CV staining, F-J B histofluorescence staining and NeuN immunohistochemistry were carried out to examine neuroprotection in the RIPoC-mediated hippocampus 5 days after ischemia-reperfusion.
In the ischemia-group, we found a significant loss of pyramidal neurons in the stratum pyramidale (SP) of the hippocampal CA1 region at 5 days post-ischemia compared with the sham-group. In all of the ischemia + RIPoC-groups, the loss of pyramidal neurons in the CA1 region at 5 days post-ischemia was not different from that in the ischemia-group. Our present findings indicate that RIPoC does not prevent hippocampal CA1 pyramidal neurons from neuronal death induced by transient cerebral ischemia.

Brain death and admission diagnosis in a neurologic intensive care unit: a correlation? A. Marudi1, S. Baroni1, A. Gaspari2, E. Bertellini1. The aim of the study was to evaluate which acute brain injury (ABI) was correlated with brain death (BD) in our neurologic intensive care unit (NICU). Ischemic stroke accounted for 46 admissions (9.9 %), brain cancers for 9 (1.9 %) and other reasons for 7 (1.5 %). 155 (63.3 %) patients with brain haemorrhage developed brain death (p < 0.01) (Fig. 86), while only 20 (21.9 %) anoxic brain injury patients experienced brain death (p = 0.052) (Fig. 87). The incidence of brain death was higher for patients admitted to the NICU with cerebral haemorrhage than for patients with anoxic brain injury.

Brain magnetic resonance imaging findings in patients with septic shock. G. Orhun1, E. Senturk1, P. E. Ozcan1, S. Sencer2, C. Ulusoy3, E. Tuzun3, F. Esen1. Brain imaging in septic shock patients showing alterations in neurological status is unremarkable most of the time. The current study provides findings from magnetic resonance imaging of the brain in septic shock. Twenty patients with septic shock and symptoms of brain dysfunction (acute alterations in mental status, delirium, coma, seizures and focal neurological deficits) [median age 54 years (26 to 65), APACHE II: 21 (15 to 29), SOFA: 8 (1 to 15)] underwent brain magnetic resonance imaging (MRI) including gradient echo, T1-weighted, fluid-attenuated inversion recovery (FLAIR), T2-weighted and diffusion isotropic images, and mapping of the apparent diffusion coefficient. MRI findings were classified based on lesion types and localizations. Blood was withdrawn simultaneously for biomarker analysis. Neurological recovery of all patients was evaluated using the Glasgow Outcome Scale (GOS) at discharge. None of the patients' brain imaging was normal at the time of the diagnosis of neurological alterations, which mostly included delirium. Twelve patients showed white matter hyperintensity (leukoencephalopathy), three patients showed ischemic lesions, and findings of posterior reversible encephalopathy syndrome were evident in three patients. Unexpected findings of cerebral atrophy not related to age were seen in seven patients. The lesions were correlated with disease severity and GOS. No unexpected events were encountered during transport and MRI scanning. Our study indicates the importance of brain imaging in severe sepsis and septic shock. MRI can be an important imaging method in these patients, in whom lesions were associated with disease severity and poor outcome.

Valproic acid (VPA) is widely used as an antiepileptic drug and is known to induce hepatotoxicity and hyperammonemic encephalopathy. These toxic side effects are produced by a decrease in carnitine blood levels, for which valproic acid is directly responsible. The aim of this report is to demonstrate the benefits of L-carnitine supplementation in patients with acute valproic acid intoxication.
Patients chronically treated with VPA show low levels of carnitine due to depletion of its deposits; therefore, L-carnitine supplementation appears appropriate. This was a randomized controlled trial conducted in the Intensive Care-Toxicology Unit of the Clinical Emergency Hospital in Bucharest, which included all patients admitted for acute VPA poisoning (VPA > 150 μg/l) in 2014. The patients were randomly allocated into 2 groups to receive standard therapy or 1800 mg of L-carnitine/day together with standard therapy for 3 days. Plasma levels of valproic acid and ammonemia were determined every six hours. Data were statistically analyzed and the results were considered statistically significant at p < 0.05. A total of 62 patients (34 in the standard group and 28 in the L-carnitine group) fulfilled the inclusion criteria. L-carnitine supplementation resulted in significant reductions in ammonemia determined after 24 hours (47.9 ± 6 vs. 61.9 ± 11.39 μmol/L) compared with baseline (p < 0.001), whereas this parameter remained high in the standard group. The trend was similar for plasma VPA levels (p < 0.05). The studied parameters were not influenced by gender, but were correlated with age (p = 0.01). The use of L-carnitine in the treatment of VPA poisoning accelerates the elimination of VPA and facilitates the decrease in plasma ammonia levels, thereby reducing systemic toxicity and the risk of encephalopathy. Further controlled, extended trials are required to better elucidate the therapeutic and prophylactic roles of L-carnitine in the management of VPA toxicity.

Introduction: EEG reactivity has been reported as a predictor of outcome in comatose patients with post-anoxic encephalopathy [1, 2]. However, visual analysis of EEG reactivity has high inter-rater variability and is time consuming [3, 4]. Therefore, analysis of EEG reactivity may benefit from a quantitative approach enabling automatic interpretation. Prospective cohort study in comatose ICU patients with continuous EEG background patterns in two Dutch ICUs. Reactivity to auditory, visual and sensory stimuli was tested following a strict protocol. Visual analysis was regarded as the gold standard. Reactivity was defined as a change in amplitude or frequency in the EEG after stimulation, excluding muscle activity. Stimulus responses were scored by three independent experts (JH, MvP, MT-C); responses on which all three experts agreed were used. Data were separated into an independent training set and validation set. Spectral characteristics of the EEG before and after each stimulus were compared. These were determined by both parametric and non-parametric methods. Epochs of 2, 5 and 10 seconds before and after the start of the stimulus were analyzed and averaged over 1, 3, 5 and 7 channels. In the training set and validation set, 190 and 163 stimuli of 13 and 7 patients, respectively, were included. The prevalence of a reactive response was 8.3 % in the training set and 3.1 % in the validation set. The non-parametric power spectral density with a difference of 30 %, in 5-second epochs, averaged over 5 channels, was found to be the optimal method for automatic analysis. This resulted in a sensitivity of 92.9 % (95 % CI 87.6-96. Conclusions: Automatic analysis of EEG reactivity based on differences in spectral characteristics detects reactivity. With further research we will optimize this method to increase the objectivity of EEG reactivity testing.
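As an illustration of the kind of spectral comparison this abstract describes, here is a minimal sketch, assuming NumPy/SciPy, of a Welch power-spectral-density comparison of 5-second epochs averaged over 5 channels, with a 30 % power difference flagged as "reactive". The sampling rate, array names and exact decision rule are illustrative assumptions, not the study's implementation:

    import numpy as np
    from scipy.signal import welch

    FS = 256  # assumed EEG sampling frequency in Hz

    def epoch_power(eeg_epoch: np.ndarray) -> float:
        """Mean total spectral power of a (channels x samples) epoch,
        using Welch's non-parametric PSD estimate."""
        freqs, psd = welch(eeg_epoch, fs=FS, nperseg=FS, axis=-1)
        # Sum power over frequency, then average over channels.
        return float(psd.sum(axis=-1).mean())

    def is_reactive(pre: np.ndarray, post: np.ndarray,
                    rel_diff: float = 0.30) -> bool:
        """Flag reactivity when total power changes by more than
        rel_diff between the pre- and post-stimulus epochs."""
        p_pre, p_post = epoch_power(pre), epoch_power(post)
        return abs(p_post - p_pre) / p_pre > rel_diff

    # Hypothetical 5-channel, 5-second epochs around one stimulus.
    rng = np.random.default_rng(0)
    pre_epoch = rng.normal(scale=10.0, size=(5, 5 * FS))
    post_epoch = rng.normal(scale=14.0, size=(5, 5 * FS))
    print(is_reactive(pre_epoch, post_epoch))

In practice such a detector would be tuned on the training set (choice of epoch length, channel count and threshold) and then evaluated on the held-out validation set, as the abstract reports.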
Accurate prognostic models, including common intensive care severity scores such as APACHE II (Acute Physiology and Chronic Health Evaluation II), SAPS II (Simplified Acute Physiology Score II) and SOFA (Sequential Organ Failure Assessment), are of high importance for quality assurance and research in intensive care [1-3]. The aim of our study was to investigate their usefulness in predicting mid-term mortality after spontaneous intracerebral hemorrhage (ICH), and whether the scores are of any extra value compared to a basic model comprising only age and level of consciousness. We included adult patients with spontaneous ICH, treated in Finnish intensive care units (ICUs) in 2003-2012, from a nationwide ICU database. Logistic regression was used to customize APACHE II, SAPS II and SOFA for six-month mortality prediction. We created a reference model based on age and Glasgow Coma Scale (GCS) score for comparison. We used a temporal split-sample technique for internal validation. We tested model performance by assessing discrimination (area under the curve [AUC]) and calibration (Hosmer-Lemeshow Ĉ and GiViTI calibration tests). In total, 3,218 patients were included (1,238 in the development and 1,980 in the validation cohort). Both APACHE II and SAPS II showed good discrimination (AUC 0.84 and 0.85, respectively) but did not outperform the reference model (AUC 0.85; p = 0.133 compared to APACHE II, p = 0.141 compared to SAPS II). SOFA, however, displayed significantly poorer performance than the other models (AUC 0.76). All models showed poor calibration (p < 0.05). Age was found to be a particularly strong predictor of death in patients with GCS 9-12: the mortality for patients aged < 40 was only 8 %, yet was as high as 43 % for patients aged > 79. Both APACHE II- and SAPS II-based models showed good discrimination but poor calibration for predicting six-month mortality in patients with spontaneous ICH treated in the ICU. The performance of a simple prognostic model composed of only age and GCS was comparable to the more complex scoring systems and, based on our results, could replace them in this patient group.

The commonest symptom of non-traumatic subarachnoid haemorrhage (SAH) is sudden-onset severe headache. This condition presents the Emergency Physician with a significant diagnostic challenge, as headache represents 1-2 % of ED attendances while SAH accounts for only 1-3 % of these headaches [1, 2]. Conventional practice remains lumbar puncture (LP) sample analysis in patients with a negative CT scan [1]. Recent series in Europe and the United States have demonstrated wide variation in practice as to whether an LP is undertaken, with no clear guidance on LP omission [2]. We propose a clinical decision tool integrating recent evidence on this patient group. Our search strategy included Medline, Embase, Google Scholar, international guidelines and reference searches, as well as the authors' collections on the subject. Multiple risk factors have been identified for SAH, including loss of consciousness, family history, hypertension, polycystic kidneys, excessive alcohol intake and smoking. The aetiology of SAH in the majority (> 85 %) of patients is an aneurysm in the circle of Willis [2]. CT sensitivity within eligible studies ranged from 91-100 % [2]; the largest study, of a prospective design including 3132 patients, reported 100 % sensitivity if CT was performed within 6 hours of symptom onset [1].
Recent work in the UK including 2248 patients who underwent an LP concluded that LP is of low diagnostic yield, and that the investigation of choice in patients with equivocal LP findings or recurrent symptoms is CT angiography [2]. The majority of patients presenting with acute severe headache can have SAH excluded with a negative CT scan if seen within 6 hours of onset and the images are reported by a radiologist. A number of patients with delayed presentation or risk factors for SAH benefit from CT angiography unless contraindicated. Our findings will be presented as an evidence-based clinical decision tool.

Aneurysmal subarachnoid haemorrhage (aSAH) is a significant cause of morbidity and mortality throughout the world. The primary goal of treatment is occlusion of the ruptured aneurysm to prevent rebleeding. To reduce the rate of this adverse event, current guidelines recommend that surgical clipping or endovascular coiling be performed as early as feasible. However, this strategy may be associated with several disadvantages, and the timing of the procedure remains controversial. ; P = 0.03). There was no correlation between time from admission to procedure and ICU, IMCU or hospital length of stay. Only one patient was readmitted to the ICU, and in three it was necessary to repeat an invasive procedure. In our cohort of aSAH patients treated with endovascular and surgical methods and requiring intensive care, we found a high mortality rate mainly related to early procedure. Furthermore, we speculate that these findings could have a significant impact on the organizational strategies and cost savings of healthcare organizations in some countries.

Restrictive red blood cell transfusion (RBCTx) strategies are advocated in most critical care populations. Whether this is standard practice and applies to the aneurysmal subarachnoid hemorrhage (aSAH) patient population is unclear. We aimed to describe transfusion practices in an aSAH population in Canadian hospitals, including hemoglobin (Hb) triggers and predictors of RBCTx, in preparation for a transfusion trial. Methods: This is a multi-centre retrospective cohort study conducted at four Canadian academic tertiary care centres. Population: all adult aSAH patients admitted to the study hospitals from January 1, 2012 to December 31, 2013. Patients were identified from hospital administrative discharge records and existing local SAH databases. The diagnosis was confirmed by the reported presence of blood in the subarachnoid space (on either imaging or lumbar puncture) due to a ruptured cerebral aneurysm (as demonstrated on angiography, neuro-imaging or autopsy). Data collection: trained abstractors collected demographic data, aSAH characteristics, administration of RBC transfusion, daily hemoglobin concentrations, other major aSAH co-interventions, and outcomes using a pre-tested case report form in reference to a standardized operations manual. In our retrospective study of aSAH patients, we observed that RBCTx was uncommon (19.2 % of patients) and mostly reserved for patients with significant anemia (Hb < 80 g/L).

Little is published about anemia in aneurysmal subarachnoid hemorrhage (aSAH). We aimed to describe the disease burden of an aSAH population in Canadian hospitals, and the prevalence and incidence of moderate anemia (hemoglobin < 100 g/L), in preparation for a transfusion trial. This is a multi-centre retrospective cohort study conducted at four Canadian academic tertiary care centres.
Population: all adult aSAH patients admitted to the study hospitals from January 1, 2012 to December 31, 2013. Patients were identified from hospital administrative discharge records and existing local SAH databases. The diagnosis was confirmed by the presence of blood in the subarachnoid space (on either imaging or lumbar puncture) due to a ruptured cerebral aneurysm (as demonstrated on any of angiography, neuro-imaging or autopsy). Data collection: trained abstractors collected demographic data, aSAH characteristics, daily hemoglobin concentrations and nadir hemoglobin, other major aSAH co-interventions, and outcomes using a pre-tested case report form in reference to a standardized operations manual. Moderate anemia (hemoglobin < 100 g/L) is common in patients admitted with aneurysmal subarachnoid hemorrhage. Less than 20 % of aSAH patients receive an RBC transfusion.

Acute aSAH is associated with significant mortality, and with morbidity in survivors. The neutrophil-to-lymphocyte ratio (NLR) has been investigated as a biomarker of poor outcome in various conditions. The prognostic value of NLR and of C-reactive protein (CRP) was investigated in patients with SAH. Our hypothesis was that the profile of NLR from admission to the 5th hospital day could predict the development of symptomatic vasospasm or delayed cerebral ischemia (DCI). We retrospectively reviewed data from consecutive SAH patients (Jan 1, 2004 - Jan 1, 2015) and recorded demographics, neurological examination, radiological and transcranial Doppler findings, length of stay (LOS) in the ICU, and mortality. NLR values were calculated from admission to the 5th hospital day. Medians were compared between groups using the Mann-Whitney U test. Multivariate logistic regression analysis using a stepwise backward method was used to predict the occurrence of symptomatic vasospasm and DCI. Receiver operating characteristic (ROC) curves were generated to assess the accuracy of NLR in predicting these complications.

Patients with aneurysmal subarachnoid hemorrhage (aSAH) have an increased risk of developing nosocomial infections, which are associated with poor outcome. The aim of our study was to describe the incidence and features of infections in aSAH patients and to evaluate their impact on ICU and hospital length of stay (LOS). We retrospectively analyzed data from an observational trial on aSAH. All patients with aSAH admitted to our ICU were included. Demographic, clinical, radiological and microbiological data were recorded. Infections were diagnosed according to current guidelines. Patients who died within 72 hours of admission were excluded from the analysis. Statistical analysis was performed using Prism. Significance was set at a p value ≤ 0.05. Results: 159 patients were included; median age was 59 years and 30 % were male. The aneurysm was located in the anterior circulation in 139 patients and was secured by coiling in 76 %. 74 patients (47 %) experienced 98 septic episodes: the incidence was low (31 %) in less severe aSAH (World Federation of Neurosurgical Societies (WFNS) grades 1 to 3) and higher in more severe aSAH (WFNS 4-5, 78 %; p < 0.001). Diagnosis was made at 4 ± 1 days from admission. VAP (79 %) and urinary tract infections (10 %) were the most prevalent infections. One patient had septic shock. The most common pathogens were Haemophilus influenzae, Escherichia coli and Pseudomonas aeruginosa, accounting for 37 % of infections, while multidrug-resistant bacteria accounted for less than 5 %. ICU and hospital LOS were significantly longer in septic compared to non-septic patients, independently of WFNS category, as shown in Fig. 88.
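The group comparison reported here was performed in Prism; purely for illustration, and with hypothetical numbers, the same test can be expressed as a Mann-Whitney U comparison of length of stay within one WFNS stratum:

    from scipy.stats import mannwhitneyu

    # Hypothetical ICU LOS values (days) for patients in the WFNS 4-5 stratum.
    los_septic = [18, 22, 15, 27, 19, 24, 21]
    los_non_septic = [9, 11, 7, 13, 10, 8]

    # Two-sided Mann-Whitney U test, significance taken at p <= 0.05.
    stat, p_value = mannwhitneyu(los_septic, los_non_septic,
                                 alternative="two-sided")
    print(f"U = {stat}, p = {p_value:.4f}")

Repeating the test within each WFNS stratum is one simple way to check, as the abstract states, that the LOS difference holds independently of aSAH severity.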
Conclusions: ICU-acquired infections are common, especially in severe aSAH patients, and are predominantly VAP caused by Gram-negative bacteria. Infections result in longer ICU and hospital LOS.

Mechanical ventilation is often necessary in patients with poor-grade subarachnoid hemorrhage (SAH), and supra-normal oxygen levels are frequently achieved in the acute phase, both in the prehospital setting and in the intensive care unit. However, the exact cerebral physiologic effects of normobaric hyperoxia have not been investigated in this setting in humans. Cerebral microdialysis (CMD) is a unique technique that allows the chemistry of the extracellular brain interstitial fluid to be monitored continuously at the patient's bedside. Retrospective analysis of a prospective database of comatose patients after SAH monitored with CMD as part of standard care. variables were within normal ranges and did not differ between the two conditions. Our findings in comatose patients with SAH monitored with intracerebral microdialysis suggest that hyperoxia was associated with increased secondary cerebral damage. Our study indicates that the use of supra-normal oxygen levels (PaO2 > 150 mmHg) may be harmful in the acute ICU phase following SAH.

Admission to an intensive care unit (ICU) is routine practice following elective craniotomy for brain surgery. The rationale for this practice is the high risk of surgery on this vital organ and the fact that the ICU allows early detection of serious complications. However, several studies have shown that many of these patients require minimal ICU interventions [1]. In addition, ICU resources are scarce and costly. Intermediate care units provide more monitoring than general wards and less than the ICU, with evidence supporting benefits in cost and patient outcomes [2]. The aim of this study was to compare outcomes of patients who underwent elective craniotomy and were admitted postoperatively to intensive care or intermediate post-operative care unit (IMCU) settings. We conducted a prospective observational study over a one-year period (November 2014 to October 2015) of patients who had undergone elective craniotomy for brain surgery and were admitted postoperatively to the ICU or IMCU of a tertiary 750-bed hospital. The main variables analyzed included demographics, comorbidities, ASA class, APACHE II score, duration of anesthesia and surgery, surgical procedure and diagnosis, and ICU and IMCU length of stay. Study outcomes were mortality, readmission to ICU and re-operation rates. One hundred and fourteen patients were included in the study (mean age 53.

Introduction: Traumatic brain injury (TBI) may affect the pharmacokinetics of anti-epileptic drugs (AEDs), leading to potentially ineffective post-traumatic seizure (PTS) prophylaxis. Levetiracetam (LEV) is increasingly utilized for PTS prophylaxis. Recent studies suggest that neurocritical care patients may require higher LEV doses to achieve therapeutic trough levels of 12-20 mcg/mL [1, 2]. The aim of this study was to describe the pharmacokinetics of LEV in the early (less than or equal to 7 days after injury) post-TBI period and to evaluate the incidence of seizures. This retrospective chart review included adult patients with a diagnosis of severe TBI who had at least one LEV level drawn during an intensive care unit (ICU) admission from January 2010 to October 2015.
Patients were evaluated for the following outcomes: LEV trough levels (drawn at steady state), LEV dosing regimen, percentage of patients reaching the LEV trough goal, LEV duration of therapy, diagnosis of seizure during hospital admission, hospital and ICU length of stay (LOS), and in-hospital mortality. Evaluation of 26 patients yielded a median LEV trough of 9.7 mcg/mL (IQR 6.75-18.8), with only 19.2 % of patients within the trough goal (57.6 % below goal, 23.1 % above goal). A subset of 15 patients had two consecutive LEV levels, which allowed for pharmacokinetic evaluation: median maximum concentration (Cmax) = 32.2 mcg/mL, minimum concentration (Cmin) = 8.1 mcg/mL, volume of distribution = 28.5 L, and half-life = 7.4 hours. The most common maintenance dose was 1 g twice daily (57.6 % of patients), followed by 1.5 g twice daily (23.1 %). Median duration of LEV therapy was 7 days. Seizures occurred in 7 patients during admission (26.9 %), with the majority of seizures occurring in the first 24 hours of admission. Median ICU and hospital LOS were 11 and 15 days, respectively, with an in-hospital mortality rate of 11.5 %. The majority of LEV trough levels were subtherapeutic during the early post-TBI period despite more aggressive initial dosing compared to previous studies. Linear regression suggests that the optimal dose of LEV for patients with normal renal function is 1.5 g every 8 hours. Further research is necessary to determine the impact of augmented renal clearance on LEV levels in patients with severe TBI.

The BrainIT database contains the physiological and outcome data of 261 TBI patients [1]. We aimed to identify distinct patient groups distinguishable by the trajectory of their physiology over time using cluster analysis, a form of data mining [2]. We hypothesised that the resulting groups would have a good or poor outcome, measured by the extended Glasgow Outcome Score (eGOS). This would indicate physiological trajectories associated with a good or poor prognosis. The MREC for Scotland approved the use of BrainIT data for scientific purposes in 2002, and the need for informed consent was waived. We first cleaned the database, which left 155 patients. These patients were then clustered based on 24-hour trajectories of CPP, ICP, HR, SaO2 and mean BP. To do this, we applied the general linear mixed model, a form of cluster analysis [2], to the data to create three algorithms: Algorithm 1 clustered patients on minute-by-minute physiological data, Algorithm 2 used the same data with six outlying patients removed, and Algorithm 3 used hour-by-hour data, with the same six outlying patients removed as in Algorithm 2. Each algorithm's resulting clusters were paired with admission data such as first GCS and outcome data as measured by eGOS. Algorithm 1 identified six outliers with distinct and extreme physiological trajectories and eGOS (n = 149, eGOS 5; n = 3, eGOS 7; n = 2, eGOS 1; n = 1, eGOS 1). The patients in Algorithm 2 did not separate into clusters (n = 149, eGOS = 5). Algorithm 3 revealed three clusters (n = 65, eGOS 5; n = 58, eGOS 5; n = 26, eGOS 6) with similar physiological trajectories and outcomes. However, as well as having a slightly better outcome, the cluster with an eGOS of 6 was older than the other two clusters and had a higher CPP and mean BP. Clustering TBI patients based on physiological trajectories is possible. We identified outliers with distinct eGOS based on minute-by-minute physiology.
The model was not able to identify groups of patients with distinct outcomes when clustering hour-by-hour data with the outliers removed. Changes of pressure inside the cranium occur only when the intracranial compartment has reached its compliance limit, and changes below that point are not monitored. A second weakness of ICP monitoring is that local damage may have progressed long before ICP reaches an alarming value. Electrical bioimpedance (BI) measurement is real-time and has high sensitivity to minor volume shifts related to metabolism or hypertonic saline (HS) infusion (1). We used a four-electrode BI catheter connected to a multi-frequency bioimpedance measurement device (Smartimplant Ltd.) (2). BI was measured continuously at frequencies from 100 Hz to 100 kHz. All data were recorded simultaneously with ICP and arterial pressure (AP) values. In ischemic tissue the extracellular volume decreases and cytotoxic edema develops; in the case of microvascular and traumatic damage, vasogenic edema develops and the extracellular space expands (Fig. 90). By administering HS we can monitor changes in intracellular and extracellular volumes. Conclusions: BI is a fast and responsive indicator of any volume change at the tissue level and is a useful method for assessing brain tissue status in the ICU, as an indicator of the effectiveness of HS administration and of any rebound effect. We aimed to evaluate RBC transfusion frequency, its determinants and associated clinical outcomes. We conducted a retrospective multicenter cohort study using data from the National Trauma Registry (NTR) of Canada. All patients admitted with a moderate or severe traumatic brain injury to one of the 114 participating centers from 2005 to 2013 were eligible. The registry contains information on age, gender, trauma severity, discharge status and discharge disposition. Data from the Discharge Abstract Database, containing information on blood products, comorbidities, interventions and complications, were linked to NTR observations. We conducted multivariate robust Poisson regression with a random effect at the center level (a schematic of this modelling setup follows below). Missing data were handled through multiple imputation techniques. Among 7062 patients suffering from traumatic brain injury, 1991 (28.19 %; 95 % CI 27.16 to 29.25) received RBC transfusions over the course of their hospital stay, with frequencies varying from 0 to 43 % between centers. Female gender, age 55-65 (in reference to patients aged < 55), anemia, coagulopathy, sepsis, bleeding, hypovolemic shock, presence of multiple other comorbidities, face, thoracic, abdominal, spine, lower-extremity and skin injuries, as well as invasive interventions, were associated with a higher frequency of RBC transfusions. Trauma severity and invasive interventions explained 76 % of the observed variation. Mortality, complications and discharge to destinations other than home were increased in patients who received RBC transfusions. ICU length of stay and hospital length of stay were also longer in patients who received RBC transfusions. When stratified, patients who were anemic on admission showed neither benefit nor disadvantage on any outcome relative to RBC transfusions. Similar results were obtained for the stratum of patients who were diagnosed with sepsis on admission. Conclusions: RBC transfusion is common in patients with traumatic brain injury and is potentially associated with unfavourable outcomes. Important variation between centers was observed, highlighting the need for stronger evidence on optimal transfusion practices in this population.
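The "robust Poisson regression with a center-level random effect" described above can be approximated in Python with a cluster-aware marginal Poisson model; a sketch under assumed column and file names (a true random-effects fit would need a dedicated mixed-model package, e.g. R's lme4):

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical analysis file, one row per patient; column names are assumed:
# transfused (0/1), age_grp, anemia (0/1), iss, center
df = pd.read_csv("tbi_registry.csv")

# Marginal Poisson model with exchangeable within-center correlation;
# exp(coef) approximates the risk ratio for each predictor
model = smf.gee(
    "transfused ~ C(age_grp) + anemia + iss",
    groups="center",
    data=df,
    family=sm.families.Poisson(),
    cov_struct=sm.cov_struct.Exchangeable(),
)
result = model.fit()
print(result.summary())
```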
Considering the potential impact of RBC transfusion on brain oxygenation in this population, rigorous trials evaluating Hb triggers for transfusion are needed. In general intensive care units (ICUs), the non-inferiority of restrictive transfusion strategies has been observed. Due to the risk of secondary cerebral lesions in traumatic brain injury, concerns have been raised regarding the impact of low hemoglobin and hypoxia in this population. Data remain scarce and clinical equipoise persists. We conducted a cohort study of patients with moderate/severe traumatic brain injury admitted to the ICU of a level I trauma center (Halifax). Data related to pre-transfusion hemoglobin levels and red blood cell transfusions were collected. The association between hemoglobin level, red blood cell transfusion and mortality was evaluated using a robust Poisson model. We also constructed proportional hazard models with time-dependent variables (hemoglobin and transfusion) to evaluate survival. Neurological and non-neurological trauma complications, as well as length of stay (LOS) in the ICU or hospital, were considered secondary outcomes. We included 215 patients (78 % male; mean age: 45 years). A third of patients were transfused over their ICU stay (n = 66). The median pre-transfusion hemoglobin level in patients who were transfused was 81 g/L (IQR 67 to 100), compared with 110 g/L (IQR 93 to 123) in patients who did not receive red blood cells. Worse outcomes were significantly more frequent in patients who were transfused in adjusted models. A non-statistically significant trend toward higher risk ratios for unfavorable outcomes following transfusion in strata of patients with higher hemoglobin levels was observed. No significant modification of the transfusion effect by age, presence of comorbidities, or traumatic brain injury severity was observed. In our cohort, red blood cell transfusions were associated with worse outcomes in patients with traumatic brain injury, an association that tended to increase with hemoglobin levels. However, residual confounding from comorbidities and traumatic brain injury severity is possible. Penetrating gunshot wounds to the head (PGWH) are associated with high mortality and morbidity. Getulio Vargas Hospital is a public trauma center in Rio de Janeiro with a high volume of neurocritical patients and an unusual number of civilian patients admitted with PGWH. The aim of this study was to describe patients with PGWH admitted to the intensive care unit (ICU), investigating clinical characteristics, complications and management that could be related to a better outcome in this setting. We retrospectively assessed the hospital records of every patient with PGWH admitted to the ICU from October 1st, 2014 to September 30th, 2015. Exploratory analysis of clinical data, imaging results, treatment, complications and outcomes was performed. In the period of the study, 1789 patients were admitted to the ICU and 13 PGWH patients were included. There were 10 male patients (77 %) and 3 female (23 %), with a mean age of 30 years (range 14-64). There were no self-inflicted lesions (all PGWH were the result of aggression). Glasgow Coma Scale at admission was 8 or less in 10 patients. On admission, 7 patients were anisocoric, 8 presented with shock, and 5 had associated injuries from other gunshot wounds (such as to the limbs or thorax). Mean SAPS 3 was 67 (range 35-94) and mean APACHE II was 26 (range 8-37).
CT scan findings included midline shift in 8 patients, single-lobe haemorrhage in 7 (such as the frontal or parietal lobe), and bleeding in more than one lobe in 6. Six patients had subarachnoid and 3 had intraventricular hemorrhage. Early surgery was performed in 10 patients (mainly decompressive craniectomy). The mean hospital length of stay was 21 days (range 2 to 136 days). Six patients had wound infection, and three had infection at other sites. The mortality rate for the entire group was 54 % (7 out of 13). Four patients progressed to brain death. Of the six patients discharged from the hospital, four had a good outcome (defined as modified Rankin scores of 0-3) and two a bad outcome (modified Rankin of 4). In agreement with previous reports, our results showed that the surviving group consisted mostly of hemodynamically stable patients, included all those with GCS above 8, and had lower SAPS 3 and APACHE II scores. Patients with multiple gunshot wounds, even to non-vital organs, had worse outcomes. Previous studies have found proadrenomedullin (ProADM), an inflammatory blood marker, to provide additional prognostic information for risk stratification. We aimed to translate ProADM cut-off levels into an easy-to-use emergency department (ED) triage algorithm to improve current risk assessments and clinical outcome prediction in medical patients. In this large multi-national, prospective, observational study from Switzerland, France and the United States [1], we combined two ProADM cut-off values with a five-level ED risk assessment tool (the Manchester Triage System, MTS) to further risk-stratify medical ED patients [2]. The proposed ProADM-based triage algorithm allows a more accurate prediction of adverse clinical outcome in medical ED patients at high risk. To prove the safety and efficacy of this new ED triage algorithm, an intervention study involving rapid ProADM point-of-care measurement is mandatory. Simulation training within Emergency Medicine is growing, with various peer-reviewed journals promoting simulation as a valuable educational tool [1, 2]. High-fidelity mannequins are commonly used, but considerable barriers remain, particularly a lack of human interaction. We suggest that a live patient during a point-of-care simulation teaching series could both create more realistic scenarios and develop non-technical skills within the multi-disciplinary team. In addition, studies have shown that curriculum mapping optimises teaching in medical education and results in improved student satisfaction [3]. Thus, we set out to develop an innovative Emergency Medicine point-of-care simulation programme using live patients with an associated curriculum map. We developed and implemented a six-month point-of-care simulation programme within the Royal Cornwall Hospital Emergency Department covering critical illness, paediatric, and major trauma presentations. A live, clinically trained actor, rather than a high-fidelity mannequin, acted as the patient in suitable weekly simulations. A curriculum map was designed for each session detailing the specific curriculum topics covered. Feedback was collected from 12 Emergency Medicine doctors using visual analogue scale questionnaires. Using a curriculum map, doctors' knowledge of specific curriculum learning increased from 42 % to 88 %, an increase of 46 percentage points. 100 % stated that a curriculum map was useful/very useful. 100 % found interacting with a live patient a positive learning experience.
92 % stated it improved realism and the learning experience compared to a high-fidelity mannequin. 100 % felt a live patient was superior for non-technical skills. Our research suggests that interaction with a live, simulated patient creates a realistic emergency medicine simulation, providing the opportunity to develop non-technical skills within the multi-disciplinary team. We appreciate that particular scenarios may not be suitable for this approach. A comprehensive curriculum map in this context is valued by trainees and improves trainee familiarity with their curriculum. Further work will involve evaluating any clinical/patient outcome improvements from participating in this simulation programme. In situ simulation is increasingly recognised as an effective means of delivering education to clinicians within their own work environment [1, 2]. We have developed the InSIM program, a novel in situ simulation program aimed at junior trainees within our ICU. The program focuses on developing both the clinical and non-clinical skills of trainees, improving multi-disciplinary teamwork and identifying areas for logistical improvement within our unit. The simulation team developed three simulation scenarios: Alarming Ventilator; Massive Haemorrhage; and Tracheostomy Emergency. Each scenario has learning points addressing both clinical and non-technical skills and involves a structured debrief following the scenario. The initial program objective was to deliver a total of 15 simulations, including junior medical and nursing staff within each scenario. A total of eight of the 15 planned sessions were delivered. Feedback on the scenarios indicates that participants 'Strongly Agree' (72 %) or 'Agree' (18 %) that the scenarios were useful and improved their understanding of the clinical subject (Strongly Agree 60 %, Agree 40 %). They also 'Strongly Agree' (50 %) or 'Agree' (50 %) that discussion of the non-clinical aspects of the scenarios was useful. When asked to identify one thing they learnt from each scenario, multiple respondents identified non-technical aspects such as the importance of communication within the team, situational awareness and closed-loop feedback. A number of important logistical issues were identified through the simulation program. These included a lack of knowledge of the contents of the difficult airway trolley and a lack of familiarity with the correct procedure for activating the hospital's Massive Haemorrhage Protocol. We have been able to develop and deliver an in situ simulation program to our junior trainees within our ICU. The program has been well received and has increased junior trainees' understanding of, and confidence in dealing with, a number of emergent scenarios. It has also helped to develop and increase their awareness of key non-technical skills in these situations. Finally, it has identified a number of logistical issues within our intensive care unit, which are being addressed through quality improvement projects. The cardiac troponin test has a key role in the diagnosis, prognosis and risk stratification of acute coronary syndrome (ACS). Overutilization and inappropriate requests have created a heavy workload for laboratory staff, increased costs to the health care system, and unnecessarily increased length of stay and costs for patients. The aim of this study was to make all troponin requests more accurate and standardized, without causing any burden to patients, doctors or the laboratory.
Methods: A total of 1073 serum troponin-T requests in the emergency department (ED) of Hamad General Hospital were reviewed retrospectively during the period from the start of October 2014 to the end of December 2014. Data analysis included sex, race, age, history and physical examination findings, electrocardiographic (ECG) records, and any other aids relevant to the decision to request troponin; our results were then compared with other studies and international results. Of the 1073 cases, only 208 patients had elevated serum troponin-T levels, representing about 19.38 %; of these, only 83 (7.7 % of the whole studied population) were proven to have acute coronary syndrome, i.e., a cardiac cause for their chest pain. Although patients who describe chest pain to the emergency physician represent an immediate challenge, we conclude that there is overutilization of the troponin test, which increases workload and costs. The majority of chest pain complaints are not due to ACS. Non-cardiac chest pain is the second most common reason for presentation to the ED and accounts for approximately 2 to 5 percent of all visits worldwide. Unnecessary and inappropriate requests for serum troponin should be reduced. Reducing these requests eases the workload and cuts test and labor costs; reducing unnecessary testing can also shorten stay times, which benefits patients. Examination time in the emergency department (ED) can influence a patient's prognosis. Door-to-balloon time is one of the best-known quality indicators related to patient outcome. However, there are few time-based quality indicators in the ED beyond door-to-needle time and door-to-antibiotics time. Other door-to-examination or door-to-intervention times could serve as ED quality indicators, to be used for the evaluation, improvement and maintenance of the ED. Our aim was to develop a real-time monitor to register and manage patient flow and times, from which new ED time quality indicators can be built; this could also lead toward hospital automation. We developed a monitor system, connected to the electronic health record (EHR), that tracks patient flow in the manner of a logistics system and shows it in real time on a large display in the ED. It can show time components such as door-to-intervention time. We named it the Time Tracking Monitor (TTM). Results: Our TTM system had three major strengths. First, it showed time data per examination or per intervention in real time, helping medical staff share a view of patient flow and potentially preventing mistakes. Second, these times were compared in real time against world-standard, time-period-based quality indicators such as door-to-balloon time, with an alert if a time ran past its limit. Third, because the TTM also functions as a database, we can analyze the relationship between diseases and time data; such analysis allows quality comparison among individual patients, within a hospital setting, and among hospitals. Managing patient flow has remained hard because of the variety of patients and diseases in the ED, even with extensive automatic data accumulation through the EHR. However, this monitor can draw medical staff's attention to examination times and their importance. Syncope remains a common Emergency Department (ED) presentation [1]. Almost 50 % of syncope cases are likely caused by a reflex response with excellent prognosis [1, 2].
Unfortunately, up to 33 % of patients with syncope will be discharged with no clear diagnosis and are known to have an elevated risk of morbidity and mortality from cardiac causes [1-3]. We set out to develop a focused-echocardiography-based algorithm to risk-stratify this patient group. Literature searches were carried out by the authors using Medline 1946-2015, Google Scholar, the Cochrane Collection and international guidelines, as well as reference searches. Our search converged on the role of comprehensive echocardiography in the evaluation of syncope, in parallel with the utility of focused echocardiography in detecting structural cardiac disease. In patients with syncope, echocardiography is an imaging modality with high sensitivity and specificity for clinically significant structural disease [3]. Current guidelines only recommend formal echocardiographic evaluation in patients with clinically suspected structural cardiac disease [3]. Clinical assessment tools, including composite risk scores, have repeatedly been demonstrated to be poor at identifying structural heart disease [3]. Syncopal events in this context are associated with a significant increase in mortality [1-3]. Studies have demonstrated that emergency physicians can attain a high level of proficiency in echocardiography, allowing diagnostic evaluation of complex systolic and diastolic dysfunction and, in operators with > 250 examinations, accurate identification of structural disease [3]. Focused echocardiography can be safely integrated into the clinical assessment process in the presence of an experienced operator and yields findings superior to clinical assessment alone [3]. Clinical evaluation, laboratory investigations, and electrocardiography continue to be core investigations for patients with syncope. Focused echocardiography by an appropriately trained operator offers an additional safety net for the cohort of patients who are discharged without a clear diagnosis. Our paper assimilates these findings into a novel focused-echocardiography-based algorithm for ED evaluation of syncope. This real-time ultrasound (US) guided study aimed to determine the rate of pneumothorax progression after 14-gauge (G) cannula insertion in a self-ventilating swine model. Standard treatment of suspected tension pneumothorax includes emergent insertion of a large-bore catheter into the pleural cavity [1]. Widely accepted practice is that the catheter is left in situ, open-ended and not capped, based on a quoted dictum that air will only be significantly entrained into the pleural space via a wound greater than about two-thirds the diameter of the trachea [2]. Three swine were consecutively anaesthetised using total intravenous anaesthesia and remained self-ventilating, without endotracheal intubation, at all times. A single 14G cannula (BD Venflon) was inserted sequentially into the second intercostal space of each hemithorax under US guidance. Real-time US was used to identify the development of pneumothorax. Lung-point shift was used to track pneumothorax expansion within pre-mapped numeric chest wall segments. In the event of respiratory distress, air was aspirated through the cannula. Insertion of an uncapped 14G cannula resulted in a significant pneumothorax developing in <2 minutes. Each insertion (n = 3) resulted in the lung-point moving laterally by >10 cm in <2 minutes. At 2 minutes, the swine were observed to develop significant tachypnoea (respiratory rate >50/min) and showed signs of respiratory distress.
Upon aspiration of 800-1200 ml of air, normal physiology was restored. During aspiration, the lung-point was observed to shift medially and apically over time, confirming lung re-inflation. Despite introductory training, few use IO devices on a regular basis [2]. There is increasing recognition that lifesaving technical skills that are used in emergencies but required infrequently need regular practice to prevent 'psychomotor skill fatigue' [3, 4]. We adopted a novel technique for technical skill training, used initially for difficult airway training [4], to train anaesthetic and ICU staff in EZ-IO® access. A theatre trolley is converted into a mobile training trolley with EZ-IO® training devices, tea and cakes. The trolley is operated by two anaesthetists and visits theatres during normal working hours: one anaesthetist takes over the care of a stable patient in theatre while the second delivers a 15-minute training session in the anaesthetic room to the listed anaesthetist and nurse/operating department practitioner (ODP). Training covers EZ-IO® device locations and all practical aspects of their use, and is followed by tea, cake and a questionnaire. The EZ-IO® tea trolley was used to train a mixture of 36 consultants, juniors, nurses and ODPs. 97 % reported an increase in their confidence using the EZ-IO® system and 100 % found the 'tea trolley' a useful teaching method (mean Likert 4.9, 5 = most useful). Following training, 100 % of participants were able to correctly identify the locations of EZ-IO® devices for their clinical areas and the correct anatomical sites for IO insertion. We have utilised the 'tea trolley' to improve EZ-IO® awareness and skills and to provide mobile training sessions requiring minimal manpower and time commitment from participants who ordinarily have little time to practise. It is a transferable training approach which could be applied to many resuscitation skills and utilised in many departments. Latrodectus, the widely known black widow, is a genus of spider characterised by neurotoxic venom and a worldwide distribution. Poisoning with soluble barium salts is a rare and potentially fatal cause of severe hypokalemic paralysis. There is no agreement on adequate management, particularly on the use of renal support. We present a case of barium intoxication successfully treated with hemodiafiltration, resulting in rapid clinical improvement and undetectable barium levels. Case report/Results: A 21-year-old male with a history of depression was admitted to our hospital emergency department two hours after the voluntary ingestion of a non-quantified amount of barium chloride, which the patient had procured and purchased online as a means for a suicide attempt. The clinical picture consisted of vomiting, diarrhea, and progressive generalized muscle weakness requiring tracheal intubation and mechanical ventilation, with subsequent ICU admission. Severe hypokalemia was documented (minimum K+ 1.2 mmol/L), complicated by frequent ventricular premature beats with a prolonged QT interval (corrected value 640 ms). Magnesium sulfate and a K+ infusion were administered, but serum K+ remained low. Serum K+ was then 2.2 mmol/L and the barium level was 0.04 mmol/L (toxic range). Continuous veno-venous hemodiafiltration (CVVHDF) was started. This resulted in a rapid decrease in barium levels and normalization of serum K+.
Clinical improvement ensued with the correction of hypokalemia and restoration of muscular function, allowing spontaneous ventilation and extubation 6 hours later. Barium was persistently undetectable from 12 hours after the beginning of CVVHDF (<0.006 mmol/L). The patient was later transferred to the Psychiatry department, after an uneventful hospital stay. There are very limited published data on the use of renal replacement techniques in acute barium intoxication; reported use is associated with a faster reduction in barium half-life and quicker clinical improvement. A transcellular K+ shift and the barium concentration itself both affect muscle weakness, implicating a possible direct effect of barium on either skeletal muscle or neuromuscular transmission. This patient presented with symptoms typical of severe barium intoxication, non-responsive to K+ supplementation. Clinicians in the ICU should remain aware of the potential ingestion of unusual chemicals such as barium chloride in the appropriate clinical context, as the risk of availability through internet purchases may continue to rise in the future. With this unique case we add to the scanty existing evidence in the literature and strongly advocate the use of dialysis, targeting both barium removal and the correction of hypokalemia. Consent to publish: Written informed consent was obtained from the patient for publication of this abstract. The Central Statistics Office Ireland (CSO) reported, following the 2011 census, that there was a 9.6 % increase in the number of people cycling to work compared with 2006. This leads to a higher prevalence of injuries and hospital attendances. The Road Safety Authority (RSA) concludes from its most recent research that there were 639 cycling-associated collisions in 2012. We hypothesise that both the true number of cycling injuries and the major trauma that results from cycling collisions are being under-reported, and that this is placing a significant demand on Emergency Departments (EDs) and Intensive Care Units (ICUs) in Ireland. This is a retrospective review of patients with a cycling-related injury presenting to Saint Vincent's University Hospital (SVUH) from 1st of January to 31st of December 2014. Subjects were identified by interrogating the Maxims Clinical© patient information system in use at the ED. Triage records were searched for the keywords: "bike", "cycling", "cyclist", "pushbike", "bicycle". SVUH participates in the Trauma Audit and Research Network (TARN), and data such as the injury severity score (ISS) were extracted from this database for patients who fulfilled the criteria for TARN. The total number of cycling-associated injuries attending our ED in 2014 was 534, accounting for just over 1. The majority of burns are non-battle injuries, small in size and accessible to prevention [2]. For battle-related burn injuries, the anatomical topography can be explained by the personal protective equipment, and the higher severity scores by the associated trauma and mechanism. The initial management of burns with multiple trauma remains attention to the priorities of circulation, airway and breathing. The treatment of either the burn or the associated injuries may be compromised by their combined presence, and a team approach is essential to their optimal management [3]. Only patients with major burns that required hospitalization were included, so this study represents only the visible tip of the iceberg. All military care providers should be familiar with the assessment and treatment of burns in military settings.
Burn injuries are amongst the most severe physical and psychological insults a patient can experience, and morbidity is extensive, with variable mortality [1]. Studies have repeatedly confirmed factors associated with high mortality, which include increasing age, extent of burn and the presence or absence of inhalational injury [2]. Predicting mortality from burns is useful in identifying those who may benefit from treatment or those in whom initiation of treatment is futile and not in the best interests of the patient. Treatment in an Intensive Care Unit (ICU) often necessitates uncomfortable and painful procedures for patients throughout their admission. There is growing evidence to suggest that chronic pain is becoming increasingly recognised as a long-term problem for patients following an ICU admission [1]. Intensive Care Syndrome: Promoting Independence and Return to Employment (InS:PIRE) is a five-week rehabilitation programme for patients and their caregivers after ICU discharge at Glasgow Royal Infirmary. This study investigated the incidence and location of chronic pain in patients discharged from ICU and classified the analgesics prescribed according to the World Health Organization (WHO) analgesic ladder. Methods: The InS:PIRE programme involved individual sessions for patients and their caregivers with a physiotherapist and a pharmacist, along with interventions from medical, nursing, psychology and community services. The physiotherapist documented the incidence and location of pain during the assessment. The pharmacist recorded all analgesic medications prescribed prior to admission and at the clinic visit. The patient's analgesic medication was classified according to the WHO pain ladder from zero to three, zero being no pain medication and three being treatment with a strong opioid. Data were collected as part of an evaluation of a quality improvement initiative; therefore ethics approval was waived. Data were collected from 47 of the 48 patients who attended the rehabilitation clinic (median age 52 (IQR 44-57), median ICU LOS 15 days (IQR 9-25), median APACHE II 23 (IQR 18-27); 32 of the patients (67 %) were men). Prior to admission to ICU, 43 % of patients were taking analgesics, and this increased to 81 % at the time of the clinic visit. The number of patients at step two and above on the WHO pain ladder also increased, from 34 % to 56 %. Of the patients seen at the InS:PIRE clinic, two-thirds stated that they had new pain since their ICU admission. Despite the increase in the number and strength of analgesics prescribed, almost a quarter of patients still complained of pain at their clinic visit. These results confirm that pain continues to be a significant problem in this patient group. Raising awareness in primary care of the incidence of chronic pain, and improving its management, is essential to the recovery process following an ICU admission. Despite the development and availability of effective analgesic procedures, pain is still underestimated and poorly managed, especially in critical care areas. Nurses' knowledge about pain plays a significant role in effective clinical decision making for pain management. The objective of this study was to describe the level of knowledge and practice of nurses in pain management, and to highlight the possible barriers that nurses face in assessing and managing acute pain in the critical care areas (ICU and ER). This was a cross-sectional study.
A semi-structured questionnaire was distributed to nurses in the critical care areas of a tertiary care hospital. These nurses have direct contact with patient care in clinical settings and must provide pain assessment and management to their patients. Measurements included demographic data, work ranking, level of education and years of experience, as well as the nurses' actual level of knowledge and practice and the perceived barriers to pain assessment and management during their clinical work. Although the critical care nurses who participated in this study had good exposure to different critical care specialties, inadequate pain management remained significant. The level of knowledge and the perceived barriers play a major role in the adequacy of pain management practiced in the hospital, and both need to improve. The Critical-Care Pain Observation Tool (CPOT) and the Behavioral Pain Scale (BPS) are behavioral pain assessments for unconscious intensive care unit (ICU) patients. The aim was to determine the validity and reliability of the Turkish version of the CPOT in mechanically ventilated adult ICU patients. This prospective observational cohort study included 50 mechanically ventilated mixed ICU patients who were unable to report pain. The study was conducted in a tertiary ICU in a university hospital in Turkey. After obtaining permission from the author of the original study (1), the CPOT was translated into Turkish and language validity was assessed according to reports obtained from 10 senior intensivists. Pain was assessed before and during non-painful and painful routine care procedures (touching with a wet towel and suctioning, respectively) using the CPOT and the BPS, performed by a resident and an intensivist concomitantly. Test reliability, inter-rater reliability, and validity of the CPOT and the BPS were evaluated. Thirty-three of the patients (66 %) were male. The mean age was 57.4 years and the mean APACHE II score was 18.7. Glasgow Coma Score was equal to or less than 8 in 66 % of all patients. A total of 100 assessments performed by a resident and an intensivist were recorded from 50 patients using the CPOT and the BPS. Scores on the CPOT and the BPS during the painful procedures were both significantly higher than those during the non-painful procedures. Agreement between the CPOT and the BPS during painful and non-painful stimuli ranged as follows: sensitivity 66.7-90.3 %; specificity 89.7-97.9 %; kappa 0.712-0.892. Agreement between resident and intensivist during painful and non-painful stimuli ranged from 97 % to 100 %, with kappa between 0.904 and 1.0. The Turkish version of the CPOT showed good correlation with the BPS. Inter-rater reliability between resident and intensivist was good. The study showed that the Turkish versions of the BPS and CPOT are reliable and valid tools to assess pain in intubated, unconscious, mechanically ventilated critically ill patients in daily clinical practice. Assessing and managing pain and sedation levels in patients who are mechanically ventilated in the intensive care unit is often challenging but essential in order to provide holistic, patient-centred critical care [1, 2]. The aim of this audit was to evaluate the assessment of pain and sedation in the intensive care unit at Milton Keynes University Hospital NHS Foundation Trust, by examining current clinical practice and highlighting areas for improvement. Guidelines were utilised to formulate audit criteria and standards.
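For context, the inter-rater agreement statistics used in the CPOT validation above (kappa between resident and intensivist) can be computed directly; a minimal sketch with purely illustrative ratings, not the study's data:

```python
from sklearn.metrics import cohen_kappa_score

# Illustrative paired ratings only (1 = pain present, 0 = absent),
# one entry per assessment
resident    = [1, 0, 1, 1, 0, 0, 1, 0, 1, 0]
intensivist = [1, 0, 1, 0, 0, 0, 1, 0, 1, 0]

kappa = cohen_kappa_score(resident, intensivist)
print(f"kappa = {kappa:.3f}")  # agreement beyond chance; 1.0 is perfect
```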
A proforma was developed for use in 50 adult patient cases who were mechanically ventilated in the intensive care unit over a 3-month period between August and October 2015. All data were coded and entered into a spreadsheet for analysis. Initial recommendations for change have been made. Results: 90 % of patient cases had a pain assessment documented within 6 hours of admission, scored utilising the VAS. 73 % of initial pain assessments were scored by nursing staff, with self-reporting occurring in the remaining 27 %. When pain scores were subsequently evaluated over a following 24-hour period, the mean number of scores documented was n = 5. When sedation scores were evaluated utilising the RASS over the same 24-hour period, the mean number of scores documented was n = 6 (ranging from -5 to +2). Of note, 60 % of patients did not have a formal sedation hold, with 66 % having no reason for this documented. In addition, only 46 % of patients had a goal RASS documented on the daily ward round, and of these 39 % met their target RASS. This audit has raised awareness of the need for improvement in the assessment of pain and sedation in the intensive care unit. As a result, a new clinical guideline has been developed, and an educational intervention will be designed to integrate this into daily clinical practice. A second audit cycle will take place in 2016. As the underlying diseases and causes of agitation vary between patients, sedative agents should be selected individually. Currently, there is a lack of guidelines for sedative use in the medical intensive care unit (ICU) at Ramathibodi Hospital. Sedative prescribing performed by critical care pharmacists during patient care rounds, based on appropriate guidelines and patient data, might shorten the ICU length of stay (LOS). This study set out to investigate the impact of pharmacist interventions on ICU LOS, hospital LOS, ventilator days and mortality in medical intensive care patients at Ramathibodi Hospital. A before-after study in the medical ICU (total of 8 beds) of the university hospital was conducted on mechanically ventilated patients receiving sedative agents. Baseline characteristics, active problems and APACHE II score were used as the criteria to match patients between two groups: the retrospective group (no pharmacist interventions) and the prospective group (pharmacist interventions). Medical chart reviews were performed and data were collected over a 2-year period in the retrospective group. In the prospective group, pharmacists made interventions with the team to select sedative agents for individual patients based on underlying disease, active problems, renal and hepatic function, and causes of patient agitation. There were 156 mechanically ventilated patients prescribed sedative agents: 66 and 90 patients in the prospective and retrospective groups, respectively. The median ICU LOS was reduced from 10.0 days in the retrospective group to 6.5 days in the prospective group (p = 0.002). The median hospital stay and ventilator days were also significantly different between the two groups. The median hospital stay was reduced from 30.50 days in the retrospective group to 17.50 days in the prospective group (p < 0.001). Median ventilator days were 14 and 8.5 in the retrospective and prospective groups, respectively (p = 0.008). Mortality remained unchanged: 53.03 % in the prospective group compared to 46.67 % in the retrospective group (p = 0.432).
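Between-group comparisons of skewed LOS distributions such as those above are usually summarised as medians and tested with a rank-based test; the abstract does not name the test used, so the following Mann-Whitney U sketch is only illustrative, with synthetic placeholder values:

```python
from scipy.stats import mannwhitneyu

# Synthetic placeholder LOS values (days); not the study's data
los_retrospective = [12, 9, 15, 10, 8, 20, 11]
los_prospective = [7, 6, 9, 5, 8, 6, 10]

# Two-sided rank-based comparison of the two groups
stat, p = mannwhitneyu(los_retrospective, los_prospective, alternative="two-sided")
print(f"U = {stat}, p = {p:.3f}")
```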
Pharmacist participation in the critical care team resulted in a significant reduction in the duration of ICU stay, hospital LOS and ventilator days, but not in mortality, in mechanically ventilated patients receiving sedation. Agitation is a syndrome characterized by the acute onset of central nervous system dysfunction, identified by several features including a change or fluctuation in baseline mental status and either disorganized thinking or an altered level of consciousness. Agitation occurs frequently in critically ill patients in the intensive care unit (ICU). Methods: A retrospective study with waiver of consent was conducted in a trauma intensive care unit (ICU) over one year. ICU patients were enrolled in the study and classified based on their agitation status. The depth of sedation, along with the Ramsay Sedation Scale (RS), Glasgow Coma Scale (GCS), type of injury and observation of vital signs, was clinically evaluated. One hundred and two patients (n = 102) were enrolled during the period of the study. Among those patients, forty-six (n = 46) were agitated. Comparing the two groups of patients, agitated patients had a higher incidence of infection (p < 0.02) than non-agitated patients. In addition, there were significant differences in the type of sedation used (p < 0.001), ventilator-free days (p < 0.001) and length of stay. Agitation in ICU patients is associated with several adverse outcomes, including prolonged stay, nosocomial infections, and unplanned extubations. The aim of our retrospective observational study was to test the hypothesis that a correlation exists between the percentage of ventilated patients who developed VAP (% VP) and indexes of the use of the main sedative and neuromuscular blocking agents in our mixed medical and surgical ICU in a community hospital. Results are shown in Table 46. Conclusions: According to our data, there was no statistically significant correlation detected between the percentage of ventilated patients who developed VAP and the use of midazolam, propofol or cisatracurium. On the other hand, some (though not all) patients had an increased demand for sedative agents. Our data suggest that the percentage of ventilated patients who developed VAP is independent of the use of the main sedative and neuromuscular blocking agents in ICU ventilated patients. Sedation is routine in UK emergency departments (EDs); to maintain and improve our quality of care, and to compare local practice with national and international standards, a standardised and sensitive tool is needed to detect and report adverse events (AEs). The World SIVA International Sedation Taskforce tool [1] incorporates physiological thresholds, clinical interventions, and overall outcome to grade AEs from minimal to sentinel. We incorporated this tool into our sedation documentation, and report its effect on the detection and documentation of AEs. All notes from patients undergoing sedation in our ED between 1/1/14 and 30/4/14 were retrospectively reviewed for documentation of AEs, as well as for evidence of adverse events that were not formally recorded. After implementation of a new sedation proforma incorporating the SIVA tool [1], the data collection process was repeated retrospectively between 1/10/14 and 30/1/15. See Table 47. The SIVA tool highlighted that AEs within our ED had gone unreported. Its incorporation improved awareness and documentation of AEs, but further education is needed to improve the accuracy of reporting.
Utilising this tool should facilitate more sensitive and standardised data collection, allowing regular review and improvement of practice, as well as comparison with national and international standards. Sedative drug use per ventilated hour is strongly associated with length of mechanical ventilation (LoMV), and the increase in these metrics over the last 3 years is significant and strongly associated. Sedation in the ICU using sevoflurane via the AnaConDa system enables a desirable sedation level with numerous advantages over propofol/midazolam-based sedation. In patients with severe ARDS, inhaled nitric oxide (NO) improves oxygenation and reduces right heart afterload. Both therapies are used simultaneously in our ICU in ARDS patients and in patients with right heart failure being ventilated for various reasons. Since sevoflurane can interact with certain gas monitoring (e.g., Quark RMR indirect calorimetry), we examined potential interactions of the gas readings with NO. This cohort of patients belonged to a group with high mortality and long length of stay as a result of prolonged ventilation, associated with difficult-to-manage sedation and/or delirium. Further studies should evaluate the optimal duration of treatment, to facilitate de-escalation in those patients not obtaining benefit. Patients with alcohol withdrawal syndrome (AWS) have an increased risk of PTSD [1], especially when treated in an intensive care unit (ICU) with high benzodiazepine (BZD) doses. We hypothesized that adding dexmedetomidine to BZD for the sedation of AWS patients in the ICU would decrease the incidence of PTSD. A prospective cohort study was conducted in a mixed ICU during July 2014 - July 2015. AWS patients were assigned to 2 groups: group D (dexmedetomidine) and group C (control). In group D, a dexmedetomidine infusion was given at a dose of 0. Evidence from experimental studies suggests that ketamine may be beneficial for analgosedation in acute respiratory failure due to bronchospasmolytic and immunomodulatory effects. However, limited data exist on whether analgosedation with ketamine is safe to use and whether it decreases the requirements for respiratory support. Over a 6-year period we retrospectively identified patients who received ketamine for hypoxic respiratory failure after admission to critical care. Ketamine dose, mean arterial pressure, heart rate, blood biochemistry, ventilator settings, and sedation score were recorded over the first 5 days of ketamine treatment. Complications (delirium, arrhythmias, malabsorption of feed) were documented. Chest X-ray findings were documented on the day ketamine was commenced. Ketamine was safe for analgosedation in severe hypoxic respiratory failure caused by COPD or asthma. Chest infection was a common trigger. Common complications of critical care (constipation, delayed gastric transport, abdominal distension, delirium, atrial fibrillation) were not more frequently observed than reported for critically ill patients. Many health professionals believe that the lunar cycle influences the incidence of psychiatric presentations to hospitals [1, 2]. Research has shown that the circa-lunar cycle can influence the objective quality of sleep [3]. However, more robust research, largely involving the effect of the full moon on emergency psychiatric presentations, has failed to find any correlation [4]. The aim of this novel study was to examine whether the incidence of delirium in intensive care patients was influenced by the lunar cycle. The study was conducted over six months in a general intensive care unit.
Data were collected on the day of the full moon and of the new moon of each month, and 24 hours before and after. Delirium was assessed using the confusion assessment method for the intensive care unit (CAM-ICU). Patients who were too sedated to be assessed by CAM-ICU were excluded. Data on 92 patients were collected. There was no significant difference between the groups in baseline characteristics or in interventions known to increase the risk of delirium. Sedative drugs used in the two groups were similar, apart from the fact that patients in the full moon group received more benzodiazepines (0 % vs. 10 %) and more haloperidol (0 % vs. 12 %). There was no statistically significant difference in the incidence of delirium between the new moon group (CAM-ICU positive 20 %) and the full moon group (CAM-ICU positive 14 %). Our study found no association between the incidence of delirium and the full moon in intensive care patients. We hypothesise that the belief that the full moon is associated with increased psychiatric morbidity persists because of confirmation bias. This may explain why patients received more sedative drugs during the full moon than during the new moon. Cardiac surgery is associated with a high incidence of neurological complications, including stroke, delirium and cognitive impairment. The mechanisms involved include embolism and reduced cerebral blood flow during bypass. There are no data evaluating the impact of cardiac surgery on cerebral autoregulation and its potential relation to neurological outcomes. The aim of this prospective observational single-centre study was to evaluate the impact of cardiac surgery on dynamic cerebral autoregulation (dCA) and to assess its potential association with postoperative delirium. Adults with a EuroSCORE > 6 or left ventricular ejection fraction < 40 % undergoing coronary artery bypass graft (CABG) surgery with bypass were included. Cerebral blood flow velocity (CBFV, transcranial Doppler, middle cerebral artery), end-tidal CO2 (EtCO2, infrared capnograph), and blood pressure (BP, Finapres or intra-arterial) were continuously recorded supine during 5 minutes at rest preoperatively (T1), after 24 h (T2) and 7 days after surgery (T3). Diagnosis of delirium was made using the confusion assessment method for ICU (CAM-ICU). The autoregulation index (ARI) was estimated from the CBFV response to a step change in BP, derived by transfer function analysis using standard template curves. Changes in ARI at T1, T2 and T3 were assessed with repeated-measures ANOVA. Thirty-three patients (23 male), mean age 64.5 (SD 10.9) years, had data of acceptable quality. CBFV step responses at T2 were markedly different from those at T1 and T3 (Fig. 96), with corresponding values of ARI (Fig. 97). Introduction: This study investigated risk factors for prolonged intensive care unit (ICU) length of stay (LOS) in an elective surgical patient cohort. LOS is an important outcome as a marker of ICU resource consumption, and there is increasing pressure to use beds more efficiently. If we can estimate the common LOS for common ICU patient groups, such as elective procedure patients, and identify risk factors for prolonged LOS in these groups, then beds can be booked in a more prospective way. This could help improve the flow efficiency of these patients through intensive care beds. We recommend routine assessment of high-risk patients in the recovery room to determine suitability for admission to the CCU.
We have introduced a proforma outlining peri-operative risk factors to allow consistent assessment of suitability for ward-based care. Effect of obesity on mortality in surgical critically ill patients. P. Wacharasint. Introduction: Obesity is one of the major risk factors for a number of chronic diseases, including diabetes, cardiovascular disease and cancer. While obesity itself is a chronic inflammatory condition, previous studies showed that obese patients with severe infection had significantly lower mortality compared to septic patients with normal BMI [1]. Whether obesity confers a "protective" effect in surgical critically ill patients is not clearly defined. We hypothesized that surgical critically ill patients, whose disease pathologies differ from those of septic medical patients, may also show a "protective" effect of obesity. Methods: We conducted a retrospective cohort analysis using the THAI-SICU study databases, which recruited 4652 Thai patients admitted to the surgical intensive care units of nine university-based hospitals in Thailand [2]. All 4579 patients whose body mass index (BMI) and 28-day mortality were available were included in this analysis. The patients were categorized into 4 groups based on their BMI (kg/m2): underweight (BMI <18.5), normal BMI (18.5-24.9), overweight (25-29.9), and obese (>30). The primary outcome was 28-day mortality, and the secondary outcome was the incidence of systemic inflammatory response syndrome (SIRS). Introduction: Our objective was to identify factors associated with high mortality on admission to a mixed ICU in a resource-constrained environment, and to examine the impact of high-risk admissions. Currently we admit around 1200 patients per year to our mixed general university hospital ICU. Our predicted mortality is higher than the national average (ICNARC). Methods: This was a retrospective review of 6570 consecutive admissions from 2009 to 2014. We analysed our existing ICNARC dataset, together with referring specialty, lead-time bias and CPR prior to admission. Logistic regression was used to describe predicted hospital mortality using age, APACHE II score, referring specialty and pre-admission CPR. The subset admitted post-CPR was further analysed. These models were then fit-tested using ROC curves. Cost of ICU stay was estimated using mean duration of stay and our current mean ICU cost. Daily meetings between oncologists and the ICU team (OR = 0.69 (0.52-0.91)) were associated with lower mortality. Conversely, mortality was higher in ICUs holding training programs in critical care (OR = 1.38 (1.05-1.81)). The number of protocols (OR = 1.52 (1.11-2.07)) and meetings between oncologists and the ICU team (OR = 4.70 (1.15-19.22)) were also independently associated with more efficient resource use (both low SMR and SRU). Admission to ICUs in cancer centers, compared to general hospitals, and annual case volume had no impact on mortality or resource use. Conclusions: Organizational aspects, namely the implementation of protocols and the presence of clinical pharmacists in the ICU, together with close collaboration between oncologists and the ICU team, are targets to improve mortality and resource use in critically ill cancer patients. Introduction: The number of cancer patients in the ICU has grown steadily. Treatment of the underlying disease is gradually improving, with earlier diagnosis and more effective treatment leading to a better prognosis.
The prospect is that these numbers will reach higher proportions over the years. But how should one decide whether or not to admit such patients to an ICU? Based on this question, we decided to retrospectively evaluate (over nearly two years) the results of hospitalization of patients with or without cancer, and to compare their major clinical outcomes. This study aimed to improve knowledge of the main clinical outcomes of cancer patients in the ICU of a large private hospital. Can we break the paradigm by showing that cancer patients admitted to the ICU may have an evolution not significantly different from that of patients without cancer? Methods: We evaluated 2341 patients admitted to the ICU during the period from Jan 2014 to Oct 2015. This was an observational study that retrospectively assessed the data of all adult patients admitted to the ICU (excluding terminally ill patients). We separated them into two groups: (1) with an oncologic disease; (2) without an oncologic disease. The groups were then compared with each other in terms of age, sex, SAPS 3 severity score, ICU and hospital length of stay, and final outcome in the ICU and hospital. Data were collected from the Epimed System and the MV system, tabulated in Excel and analysed statistically. Results: The group (1) Introduction: Hemato-oncological patients with neutropenic sepsis or septic shock are a growing concern in oncological intensive care units (ICUs). Acute Physiology and Chronic Health Evaluation II (APACHE II) and Simplified Acute Physiology Score II (SAPS II) are used to predict prognosis in critical patients based on clinical and analytical criteria obtained in the first 24 hours after admission. Through this study we aimed to analyse which factors predict mortality in a cohort of febrile neutropenic (FN) patients with haematological malignancies. Methods: We carried out a retrospective study which evaluated all hemato-oncological patients with FN admitted to an ICU in a comprehensive cancer center between January 2011 and December 2014. Baseline clinical and demographic information was described using descriptive statistics. Factors predicting in-ICU mortality were identified using univariate logistic regression. Variables with a p value <0.20 were included in a multivariate analysis (a sketch of this two-stage approach follows below). Results: A total of 54 patients were included for analysis. The median age at admission was 48 years and 51.9 % were male (N = 28). The baseline cause of immunosuppression was allogeneic stem cell transplant (N = 20) or recent chemotherapy (N = 34). The SAPS II and APACHE II scores had a median of 64 and 28.5, respectively. Multiple organ dysfunction was observed in 79.63 % of cases (N = 43). The most frequent single-organ dysfunctions were respiratory, in 94.44 % (N = 51), renal, in 62.96 % (N = 34), and cardiovascular, in 57.40 % of patients (N = 31). Renal replacement therapy was required in 11 cases. Invasive ventilation was performed in 81.48 % of patients (N = 44), with a median duration of 168 hours. The median ICU length of stay was 8 days. In-ICU mortality was 31.48 % (N = 17). In univariate analysis, APACHE II (OR 1.279, p < 0.01), SAPS II (OR 1.082, p < 0.01), renal dysfunction (OR 3.403, p < 0.05) and invasive ventilation (IV) (OR 8.7, p < 0.20) were associated with in-ICU mortality. In a multivariate analysis, APACHE II remained statistically significant (OR 1.546, p = 0.018, AUC 0.84), as did IV in an exact logistic analysis.
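A minimal sketch of the two-stage modelling described in the Methods above (univariate screening at p < 0.20, then multivariate logistic regression); the file and column names are assumptions for illustration:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("fn_cohort.csv")  # hypothetical file; 'died' = in-ICU death (0/1)
candidates = ["apache_ii", "saps_ii", "renal_dysfunction", "invasive_ventilation"]

# Stage 1: univariate screening, retain predictors with p < 0.20
retained = []
for var in candidates:
    fit = smf.logit(f"died ~ {var}", data=df).fit(disp=0)
    if fit.pvalues[var] < 0.20:
        retained.append(var)

# Stage 2: multivariate model on the retained predictors;
# exponentiated coefficients are odds ratios
multi = smf.logit("died ~ " + " + ".join(retained), data=df).fit(disp=0)
print(np.exp(multi.params))
```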
Conclusions: A high APACHE II score and IV seem to predict an elevated mortality risk in ICU patients with FN and haematological malignancies. SAPS II appears to be a less important predictor of mortality in this setting. However, it should be noted that these scores should not be used to ascertain prognosis in individual cases. Alopecia in survivors of critical illness: a qualitative study. C. Battle, K. James, P. Temblett. Morriston Hospital, Swansea, UK. Critical Care 2016, 20(Suppl 2):P404. Introduction: Alopecia in adult survivors of critical illness has not previously been well researched. During acute critical illness, alopecia is of minimal concern to an ICU team, as survival is the primary aim. In the recovery phase of illness, however, alopecia can prove distressing (1). The aim of this study was to investigate the incidence and nature of patient-reported alopecia in a cohort of critical illness survivors. Methods: This was a single-centre, qualitative survey study completed in the ICU Follow-up Clinic of a teaching hospital in Wales. All patients who had been discharged home for at least 12 weeks, following an ICU stay of three or more days, and who attended the Follow-Up Clinic, were invited to complete the survey. The survey was adapted from one used in a previous study and included questions regarding patient demographics and alopecia (2). Statistical analysis included numbers (%) and medians (IQR). Comparisons between patients with and without alopecia were made using Fisher's exact test and the Mann-Whitney U test. Due to the anonymity of the surveys, the Wales REC 6 confirmed that ethical approval was not needed for this study. Results: The survey was completed by 75 patients attending the clinic between July and December 2014. Median age was 65 years and 35 % of patients were male. Alopecia was reported by 13 (17 %) patients, and the hair loss was gradual in 12 (92 %) of the 13. The severity of hair loss in this cohort ranged from less than 10 % to over 50 %, with the location of hair loss varying between patients. In all but one of the patients, hair loss occurred gradually, starting between one and six weeks post-ICU discharge, and lasted for between one and seven months. On analysis, no significant differences were found between the patients with and without alopecia in the demographic variables. Conclusions: Limited research exists examining the incidence and nature of patient-reported alopecia in adult survivors of critical illness. The type, amount and location of hair loss varied between patients in this study. The main limitation of this study is the survey design, which resulted in a lack of information regarding potential predictors of alopecia. The incidence reported in this study, however, suggests that further research is warranted. Introduction: South Wales is an area of relative socio-economic deprivation, with a myriad of well-recognised associated mental health problems. Patients with mental health disorders are commonly admitted to an ICU as a direct consequence of their illness, such as a suicide attempt, or due to indirect consequences, such as an intravenous drug overdose in drug addicts. As a result of these findings, we set out to ascertain the effect of pre-existing mental health issues, including addiction, on ICU admission. Methods: A prospective study design was used in which data were collected on all admissions to the general ICU of a large teaching hospital in South Wales over a two-month period.
All adult patients were screened with the following questions: Was the admission a direct consequence of a mental health disorder or recreational/illicit psychoactive substance use? Is there a co-existing mental health diagnosis present prior to admission? ICD-10 codes were used to define mental health disorders. Outcomes studied included in-hospital mortality, ventilator days and length of ICU stay. Data analysis included Fisher's exact and Mann-Whitney U tests. Statistical significance was set at p < 0.05. Ethical approval was waived for this service evaluation. Results: A total of 192 patients were screened, with 51 (27 %) patients admitted with a mental health disorder (15 patients as a direct consequence and 36 patients with an existing mental health problem). The patients with a mental health disorder were significantly younger than the controls (p < 0.05); however, there was no difference in APACHE II scores. There were no significant differences in mortality, mechanical ventilation days or ICU length of stay between the two groups. There was a significant difference in the mortality rate between the patients admitted as a direct consequence of a mental health disorder and the control group (p < 0.05). Conclusions: Patients with mental health problems appear to have worse outcomes than those without. The mortality of those admitted to ICU as a direct consequence of their mental health problems is significantly higher than that of controls. The reasons for this are unclear and likely multi-factorial. A future multi-centre prospective study is planned in Wales to investigate the national impact of mental health disorders on ICU resources. Introduction: ICU illness severity scores are used in ICUs for outcome prediction and to guide the use of human and financial resources. The applicability of illness severity scores to octogenarians in a medical ICU remains unclear. Methods: A retrospective analysis of 244 consecutive octogenarians admitted to our MICU in 2014 was performed, calculating APACHE II and IV scores. Predicted mortality was compared to actual mortality. Actual and predicted length of stay were also compared. Results: Of 1075 admitted patients, there were 244 octogenarians (23 %), with 28 % ICU mortality. ICU mortality in younger patients was 19 % (p < .005). Both APACHE II and IV predicted 50 % mortality, thus overestimating it by nearly 45 %. Actual LOS (3 days) was statistically shorter than predicted (6 days), p < 0.005. Conclusions: APACHE II and APACHE IV correlate poorly with observed mortality when applied to octogenarians in a MICU. There is a discrepancy between predicted and actual LOS. Introduction: Advanced age is associated with increased illness severity, leading to increased mortality when critical care is required. Although the average life expectancy in India is 68 years, and only 0.9 % of the population is above 80 years of age, ICUs in major cities like ours admit a substantial proportion of octogenarians. Since the ICU outcomes of octogenarians in the Indian setting are not known, we conducted this retrospective review. Methods: This retrospective review was conducted in a 19-bed multidisciplinary ICU (mixed medical, surgical and coronary care) in the state of Tamilnadu in south India. Data were extracted from our ICU patient database for a one-year period from Sept 2014 to Aug 2015. There were 470 admissions during this period, of which 53 patients were octogenarians, making up 11.3 % of the total admissions.
The demographics and outcomes of octogenarians were compared with the rest of the ICU admissions with full APACHE IV data available during this period. Results: The average age of octogenarians admitted to our ICU was 84 years. Unlike the rest of the admissions, there were more females (55 %) amongst the octogenarians. Their average APACHE IV score was 57.6, with a predicted mortality of 13.9 %. This correlated well with the actual mortality of 11.3 %. There was no statistically significant difference in the length of ICU stay or mortality between the octogenarians and the rest of the admissions. The number of cardiac patients in the two groups was also comparable. Conclusions: 11.3 % of all our ICU admissions were octogenarians. Compared to the western population, octogenarians admitted to our ICU seem to have a lower predicted and actual mortality, indicating less illness severity. This could be due to a combination of patient factors, healthcare factors, healthcare delivery location and cultural factors. Published western data may not be helpful for discussing outcomes with the patient and the family, or for decision making before or after ICU admission. Introduction: Compared to younger patients, octogenarians experience higher short- and long-term mortality rates, and their admission rates have been increasing over the last decade. The purpose of this study was to evaluate ICU and hospital mortality and the mortality associated with ICU readmission. Methods: Single-center retrospective study in a tertiary academic hospital. Charts of 244 octogenarians consecutively admitted to the medical ICU were analyzed. Pre-ICU performance was assessed, and ICU mortality (APACHE IV), rate of readmissions and hospital mortality were analyzed. Results: In 2014, 1075 patients were admitted to our MICU, 244 of whom were aged 80 or more, the oldest being 98. ICU mortality among octogenarians was 28 % and was significantly higher compared to the younger patients (19 % mortality; p < 0.0005). None of the patients aged 80 or more diagnosed with sepsis survived the ICU. In-hospital mortality of octogenarians was 43 %. 10 patients (4 %) were readmitted to the ICU, none of whom survived. Pre-admission performance status was not predictive of ICU or hospital mortality. Conclusions: Octogenarians have higher ICU mortality rates compared to younger patients, and their in-hospital mortality doubles. Readmission to the ICU in the analyzed period was associated with 100 % mortality. Pre-ICU performance status was an independent risk factor for mortality. Careful evaluation is warranted before readmitting an elderly patient to a MICU. We encourage making a clear therapeutic plan after ICU discharge, with explicit recommendations concerning potential ICU readmission, re-introduction of invasive or non-invasive mechanical ventilation, vasopressors, renal replacement therapy and antibiotic treatment. Introduction: Many elderly patients are admitted to intensive care units (ICUs) despite known poor outcomes and frequent patient preference to avoid unnecessary prolongation of life. The REALISTIC-80 (Realities, Expectations and Attitudes to LIfe Support Technologies in Intensive Care for Octogenarians) trial studied the outcomes of very elderly patients admitted to ICUs in Canada. Not only was mortality high, but long-term physical and functional recovery was also poor, raising questions about the appropriateness of ICU admission in this group. These findings also carry significant economic implications.
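The predicted-versus-observed mortality comparisons in the two octogenarian abstracts above amount to computing a standardized mortality ratio (SMR): observed deaths divided by score-predicted deaths, with values near 1 indicating good calibration. A minimal sketch follows, using the cohort-level rates quoted in the abstracts; the helper function itself is hypothetical, not taken from either study.

def smr(observed_rate: float, predicted_rate: float) -> float:
    """Standardized mortality ratio from cohort-level mortality rates."""
    return observed_rate / predicted_rate

# MICU cohort: APACHE II/IV predicted ~50 % mortality, observed 28 %.
print(round(smr(0.28, 0.50), 2))    # 0.56 -> marked overestimation
# Indian cohort: APACHE IV predicted 13.9 %, observed 11.3 %.
print(round(smr(0.113, 0.139), 2))  # 0.81 -> reasonable calibration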
Using data from the REALISTIC-80 trial, we examined the cost of ICU admission for the very elderly, the factors affecting this cost, and the potential savings of reducing admissions that are either undesired by the patient or medically futile. Methods: This multicentre, prospective, observational cohort study included patients ≥ 80 years old admitted to 22 Canadian ICUs from 2009-2013. Overall cost of care per patient was determined by length of stay and direct-variable costs for ICU admission. Using both univariate and logistic regression models, we investigated potentially predictive factors influencing cost, including patient outcome (death vs. survival), frailty, residence in a nursing home, and the presence of advance directives. An exploratory analysis was performed to illustrate the potential cost savings of reducing the number of ICU admissions in this population. Results: In total, 3,064 patients ≥ 80 years old were admitted to ICU, of whom 1,917 were eligible and 610 were enrolled in the study. The average age was 84 years; median length of stay was 5 days in ICU and 21 days in hospital. Mortality was 14% in ICU, 26% in hospital and 41% at 12 months. For non-survivors, the median time from admission to death was 12 days. Of enrolled patients, 240 (39%) remained in ICU for 7 days or more; 99 (41%) of these died in hospital. At 1 year, 28% of patients had meaningful recovery as measured by the Short Form-36, a survey of general health status that includes both physical and mental health component scales. The results of the cost analysis are pending and will be presented at the conference. Conclusions: In this study, we will demonstrate the significant economic costs of critical care in very elderly patients admitted to ICU, a population for whom this level of care is often unwanted or futile. More importantly, we expect to illustrate the potential cost savings of reducing these admissions, which will be crucial for informing clinical and policymaking decisions. The very elderly in intensive care: relationship between acuity of illness and long-term mortality A. Results: 78 patients were included, with a median age of 87 (range 85-96). 22 patients (28%) died during their ICU stay, and a further 12 (15%) died on the ward before discharge, giving a total of 34 (44%) hospital deaths. 6- and 24-month mortality were 54% and 60%, respectively. Medical patients had the highest acuity scores and the highest mortality at all end-points (Fig. 102). High APACHE III scores (Fig. 103) and increasing numbers of organs supported correlated with shorter time from ICU admission to death (p < 0.0001 in both cases). Of those who required support of 4-5 organs (8 patients), none survived their hospital admission. Conclusions: Almost half of patients aged over 85 admitted to ICU died before discharge from hospital, but a large proportion of patients who survived 6 months were still alive at two years. High APACHE III scores and organ support were associated with higher mortality. These results suggest that intensive care is of benefit to selected patients aged over 85, and that acuity of illness is a useful indicator of long-term survival. Introduction: Intensive Care Unit-acquired weakness (ICU-AW) is a frequent complication of critically ill patients, with an approximate incidence of 25%-50%. The diagnosis is clinical and consists of assessing the strength of various muscle groups in the upper and lower extremities.
This problem represents one of the greatest burdens patients face after surviving ICU care. Objectives: To assess the incidence and identify clinical factors associated with ICU-AW in an oncology center. Methods: Retrospectively collected data on patients admitted to the ICU between January 2013 and December 2014 were reviewed. Patients admitted to the ICU, aged ≥ 18 years, mechanically ventilated for ≥ 48 hours and with a final diagnosis of ICU-acquired weakness were selected. Predictive factors of ICU-AW were identified using multivariate logistic regression analysis. Results: Of the 177 patients included, 20.3% developed ICU-AW. The mean age was 57.3 years and 58.3% were female. The median time of mechanical ventilation was 343.5 hours. 55.6% of patients had solid tumors and 44.4% had hematological malignancies. Glucocorticoids and neuromuscular blocking agents were administered in 88.9% and 38.9% of patients, respectively. Septic shock developed in 72.2% of patients and was found to be a predictive factor for developing ICU-AW (OR 4.15, p value 0.018). 66.7% of patients started or continued physical rehabilitation during the hospital length of stay, approximately 12 days after admission to the ICU. 44.4% of patients with ICU-AW died 37 days after admission to the ICU. 8 patients (22.2%) with ICU-AW were re-evaluated by physical and rehabilitation medicine 144 days after hospital discharge. Significant improvement was noted in their physical status while in the program of physical therapy. Conclusions: ICU-AW is a relatively frequent problem, and septic shock was found to be a predictive factor for the development of this entity. Early mobilization is an important intervention to decrease weakness and physical deconditioning in critically ill patients. The 'obesity paradox' of critical illness refers to better survival with a higher BMI. We hypothesized that fat mobilized from excess adipose tissue during critical illness provides energy more efficiently than exogenous macronutrients and could prevent lean tissue wasting. Methods: In a centrally-catheterized mouse model of cecal ligation and puncture (CLP)-induced prolonged septic critical illness, body weight and composition, and muscle wasting were assessed in lean and premorbidly obese mice, each with fasting and parenteral feeding [healthy mice: lean n = 8, obese n = 9; fed CLP mice: lean n = 7, obese n = 10; fasted CLP mice: lean n = 9, obese n = 9]. Muscle weakness was assessed in a second mouse experiment examining ex vivo muscle force [healthy mice: lean n = 17, obese n = 15; fed CLP mice: lean n = 15, obese n = 15]. Mice were generated by providing 12-week-old male C57BL/6J mice with ad libitum 10% fat chow or 45% fat chow for 12 weeks prior to the septic insult. Also, in matched lean (BMI ≤ 25 kg/m2) and overweight/obese (BMI > 25 kg/m2) prolonged critically ill patients and healthy controls, we compared markers of muscle wasting (m. vastus lateralis biopsies (n = 102) and m. rectus abdominis biopsies (n = 86)) as well as muscle weakness, quantified by Medical Research Council sum scores (n = 278). Results: Five days of sepsis reduced body weight similarly in lean and obese mice, with more fat loss in the obese (p ≤ 0.03). Lean CLP mice, but not the obese, showed reduced muscle mass (p ≤ 0.04), muscle protein content (p ≤ 0.06), myofiber size (p < 0.01), and muscle and hepatic triglyceride content (p ≤ 0.06), irrespective of administered feeding.
Obese CLP mice maintained normal maximal muscle force, whereas in lean CLP mice maximal muscle force decreased (p < 0.01) and recovered less from fatigue (p < 0.01). These differences between lean and obese CLP mice coincided with signs of more effective hepatic fatty acid and glycerol metabolism, and ketogenesis, in the obese. Overweight/obese critically ill patients likewise showed preserved myofiber size, while myofiber size was reduced in lean patients (p = 0.02 in m. vastus lateralis biopsies, p = 0.01 in m. rectus abdominis biopsies). Furthermore, fewer overweight/obese patients suffered from muscle weakness, assessed 8 days post-ICU admission (p < 0.01). Conclusions: In conclusion, during critical illness premorbid obesity, but not nutrition, facilitated utilization of stored lipids and attenuated muscle wasting and weakness. Introduction: The aim of this study was to evaluate the most suitable physical outcome measures to be used with critical care patients following discharge. ICU survivors experience physical problems such as reduced exercise capacity and intensive care-acquired weakness. The NICE guideline 'Rehabilitation after critical illness' (1) recommends the use of outcome measures but does not provide any specific guidance. A recent Cochrane review noted wide variability in the measures used following ICU discharge (2). Methods: Discharged ICU patients attended a five-week multidisciplinary programme. Patients' physical function was assessed during the programme, and at 6 months and 12 months post discharge. Three outcome measures were included in the initial two cohorts. The Six-Minute Walk Test (6MWT) and the Incremental Shuttle Walk Test (ISWT) were chosen as they have been used within the critical care follow-up setting (2). The Chester Step Test (CST) is widely thought to be a good indicator of ability to return to work (one of the programme's primary aims). Ethics approval was waived as the programme was part of a quality improvement initiative. Results: Data were collected for the initial patients attending the programme (n = 13): median age was 52 (IQR = 38-72), median ICU LOS was 19 days (IQR = 4-91), median APACHE II was 23 (IQR = 19-41) and 11 were men. One patient was so physically debilitated that the CST and ISWT could not be completed; however, a score was achieved using the 6MWT. Another patient almost failed to achieve level 1 of the ISWT. Subsequent patients in this project (total n = 47) have therefore all been tested using the 6MWT. Good inter-rater and intra-rater reliability and validity have been reported for the 6MWT (3). Conclusions: Exercise capacity measurement is not achievable for some patients with either the ISWT or the CST due to the severity of their physical debilitation. Anxiety, post-traumatic stress disorder and depression are common psychological problems post discharge (4); therefore, using a test paced by a bleep is not appropriate. The 6MWT is thus the most appropriate physical outcome measure to be used with critical care patients post discharge. Introduction: We aimed to improve the active mobilisation of Intensive Care (ICU) patients through a quality improvement (QI) project. Only 50% of ICU patients return to work within one year of discharge [1]: there are ongoing physical, psychological and cognitive problems after ICU discharge [1, 2]. Active mobilisation shortens hospital stay, increases return to independent function [3] and reduces delirium [3].
UK ICU standards, introduced in 2013, direct 45 minutes/day of active mobilisation in suitable patients. Methods: Our ICU selected a multidisciplinary (MD) team in January 2014 to lead the mobilisation QI project, and a commitment was made to the weekly collection and presentation of data. We used the ADEPT (aim, data, evidence, process, team) format. Our initial aim was 20 minutes of active mobilisation daily in 95% of suitable ICU patients. Patients had to be able to obey commands, to have achieved a degree of cardiovascular stability, and to have no musculoskeletal injuries precluding mobilisation. Vasopressor use and invasive ventilation per se were not barriers to active mobilisation. Agreed forms of active mobilisation were active limb exercises (a booklet of exercises was developed), bed-edge sit (dangle), sitting out of bed in a chair, standing and walking. Baseline data were collected and serial Plan, Do, Study, Act (PDSA) cycles were carried out. Results: Baseline data (March-April 2014) showed 34% daily mobilisation in the target group. Performance improved to a median of 95% by November 2014 and was maintained from January to November 2015 at 94%. Our daily mobilisation aim was increased to 30 minutes in June 2015 and to 45 minutes in October 2015. We saw a reduction of 1.2 days in our ventilator length of stay by September 2014 as better reliability in active mobilisation was achieved. Conclusions: The results of this QI project show that a MD approach to mobilisation can achieve results. The weekly data collection and discussion proved essential in advancing success. We had no extra resources or new funding to help us increase mobilisation time; we used our data and successive PDSA cycles to achieve success. ...despite receiving supportive care and physiotherapy. Such weakness contributes to poor outcome. Early mobilization may enhance functional recovery, reduce days on mechanical ventilation, shorten ICU and hospital length of stay, decrease the incidence of delirium and improve survival. A large proportion of ICU patients are on vasoactive drug therapy, but there are few data in the literature about the safety of mobilization in this group. Our study therefore aimed to show that early mobilization, even of patients using low doses of vasoactive drugs, is safe and feasible. Methods: We assigned patients who were or were not using low-dose vasoactive drugs to receive 2 daily sessions of physiotherapy during their ICU length of stay and analyzed their tolerance. The exercises consisted of 3 levels of mobilization: level 1 was passive exercises, level 2 was active exercises and level 3 was orthostatic positioning and walking. Heart rate, respiratory rate, peripheral oxygen saturation and MAP were evaluated at 3 moments: before the physiotherapy session, immediately after, and 30 minutes after. Results: 154 patients were followed from November 3, 2015 to November 20, 2015. 50.6% of patients were men. Mean age was 71.1 in the vasoactive group and 73.4 in the control group. Mean SAPS 3 score was 73.6 in the vasoactive group and 62.7 in the control group (p < 0.05). The vasoactive drugs used were 53.5% norepinephrine, 45.1% nitroprusside and 1.4% nitroglycerine. We got 34.7% of patients in the vasoactive group out of bed and into a seated position, versus 57.3% in the other group. There were no statistically significant complications. During the exercises there were changes in MAP, but with no major associated event and no significant increase in vasoactive drug dose (p > 0.05).
The only statistically significant difference between the two groups was an increase in MAP in the vasoactive group immediately after mobilization (from a mean MAP of 84.02 to 87.0). The mean dosage in mcg/kg/min was 0.0117 for norepinephrine, 0.098 for nitroprusside and 0.278 for nitroglycerin. Conclusions: Our study showed that mobilization of patients on low-dose vasoactive drug therapy may be possible and safe, with few complications and no major changes in MAP. These data are important to encourage early mobilization of patients in the ICU and to improve their outcomes. Introduction: During an intensive care stay, patients often have their chronic medications withheld for a variety of reasons and new drugs commenced [1]. As patients are often under the care of a number of different medical teams during their admission, there is potential for these changes to be inadvertently continued [2]. Intensive Care Syndrome: Promoting Independence and Return to Employment (InS:PIRE) is a five-week rehabilitation programme for patients and their caregivers after ICU (Intensive Care Unit) discharge at Glasgow Royal Infirmary. Within this programme, a medication review by the critical care pharmacist provided an opportunity to identify and resolve any pharmaceutical care issues, and also an opportunity to educate patients and their caregivers about changes to their medication. Methods: During the medication review we identified ongoing pharmaceutical care issues, which were communicated to the patient's primary care physician (GP) by letter or telephone call. The patients were also encouraged to discuss any issues raised with their GP. The significance of the interventions was classified from those not likely to be of clinical benefit to the patient through to those which prevented serious therapeutic failure. Results: Data were collected from 47 of the 48 patients who attended the clinic (median age was 52 (IQR 44-57), median ICU LOS was 15 (IQR 9-25), median APACHE II was 23 (IQR 18-27) and 32 of the patients were men (67%)). The pharmacist made 69 recommendations, including 20 relating to drugs which had been withheld and not restarted; dose adjustments were suggested on 13 occasions and new drug recommendations were made for 10 patients. The duration of treatment for new medications started during hospital admission was clarified on 12 occasions. Lastly, adverse drug effects were reported on 4 occasions and the incorrect drug was prescribed on 2 occasions. Of the interventions made, 58% were considered to be of moderate to high impact. Conclusions: The pharmacist identified pharmaceutical care issues with 18.6% of the prescribed medications. Just over half of the patients reported that they had not been made aware of any alterations to their prescribed medication on discharge. A pharmacy intervention is therefore an essential part of an intensive care rehabilitation programme, to address any medication-related problems, provide education and ensure patients gain optimal benefit from their medication. Are daily blood tests on the intensive care unit necessary? Introduction: Daily blood tests are a long-running feature of ICU care and are regarded as an essential diagnostic tool by medical staff; however, the risk of iatrogenic anaemia is a very real one and there are significant costs involved [1]. Methods: Existing guidelines are intended to limit excessive testing in stable patients.
We audited adherence to these guidelines and also recorded trends in haematocrit and haemoglobin to identify cases of iatrogenic anaemia on the unit. In order to establish trends and to exclude stable post-surgical patients, we limited eligible patients to those who were on the unit for 3 days or more. On six randomly assigned days over a 2-month period, all eligible patients on the unit had a proforma completed retrospectively for all days back to their admission to the unit, or up to seven days previously, whichever was sooner. The presence of arterial or central access was also noted, as these facilitate obtaining samples. Results: We obtained 101 patient-days' worth of data across 20 patients who were considered eligible. Our results demonstrated that 54% of tests done were required; more importantly, 46% of tests carried out were deemed clinically unnecessary by a panel of three ICU physicians, and 48% were ordered despite the unit guidelines stating otherwise. The inpatient cost of carrying out the unnecessary blood tests added up to £842 and the amount of blood wasted totalled 1052 ml. Based on these figures, the average volume of unnecessary blood taken from a patient per week is 73 ml. Only 2 patients selected had no central or arterial line for part of their inspected days, and this had no effect, as they had tests taken daily anyway. Conclusions: Stopping short of suggesting an opt-in approach to lab tests as other units have done [2], we are recommending a period of education for existing nursing and medical staff, and also for new trainees on the Intensive Care Unit. We are also recommending removal or alteration of the "Critical Care" test set that is currently available from our electronic requesting system, in the hope that this will encourage more critical thinking about which tests are appropriate for the patient being cared for. Measuring urine output in ward patients: is it helpful? B. Avard 1, A. Pavli 1, X. Gee 2 1 The Canberra Hospital, Hughes, ACT, Australia; 2 ANU Medical School, Canberra, Australia Critical Care 2016, 20(Suppl 2):P423 Introduction: The significance of tracking urine output as a marker for patient deterioration on general wards remains unclear. We aimed to explore the utility of measuring urine output in patients from both medical and surgical wards in a large metropolitan hospital in Australia, including those who deteriorated and required Intensive Care support and a cohort who did not. Of note, this hospital had a well-embedded program to recognise and respond to deteriorating patients, which included a modified early warning score and a track & trigger response. Methods: A retrospective cross-sectional analysis was performed on 440 patients admitted to our hospital over a six-month period. We excluded patients with premorbid renal insufficiency or requiring dialysis prior to admission, as these patients would have skewed the results. We collected data on urine output, modified early warning scores, the timing of recognition of and response to deterioration for patients admitted to ICU, patient length of stay and mortality. All data were subsequently analysed using SPSS software. Ethics approval was obtained for this research. Results: Only 9% of the ward patients had urine output measured, irrespective of whether they deteriorated to the point of ICU admission or remained on the ward.
Despite this, those patients who had a low urine output recorded on the ward and who were subsequently admitted to ICU continued to have a lower mean output in the first 24 hours of their ICU stay. A statistically significant association was found between reduced urine output and increased hospital mortality, with a mortality rate of 7.9% in those with reduced output compared with 1.3% with normal output among patients who were not subsequently admitted to ICU, and an even more pronounced correlation in those who deteriorated to the point of requiring ICU admission. 25% of those patients who received a medical review in response to deterioration might not have received a review if urine output had not been monitored and included in the activation criteria. Conclusions: This research suggests that urine output remains a vital component of our recognition of deteriorating patients, and that our focus should be on improving routine measurement of this parameter rather than removing it from modified early warning scores. More reliable measurement may allow further investigation of whether earlier intervention based on this parameter improves patient outcome. Introduction: Pressure ulcers (PU) are common and serious complications in critically ill patients. This study aimed to evaluate the incidence of, and risk factors associated with, PU development in an adult mixed intensive care unit (ICU) in Turkey. Methods: Ethics approval was obtained prior to the start of the study. A prospective cohort study was performed from December 2013 to July 2014. Patients were screened daily until discharge or death, over a consecutive 28-day period. We collected data on demographic and baseline variables, co-existing diseases, serum albumin level, repositioning frequency, presence of mechanical ventilation, Acute Physiology and Chronic Health Evaluation II (APACHE II) score, Sequential Organ Failure Assessment (SOFA) score, Glasgow Coma Score (GCS) and vasoactive medications. Results: A total of 194 patients were examined in the study, 30 (15.4 %) of whom developed a PU. The factors associated with PU development were: serum albumin level (p = 0.0004), generalized edema (p = 0.004), mechanical ventilation use (p = 0.0007), sedation (p = 0.01) and GCS (p = 0.00001). Age, gender, body mass index, APACHE II and SOFA scores, and repositioning were not different between patients with and without PU. Conclusions: Pressure ulcers are common complications in an adult mixed intensive care unit, with potentially severe consequences. The incidence of pressure ulcers was related to state of consciousness, sedation, mechanical ventilatory support and presence of generalized edema. ...significant compared to the eICU cost borne by the unit of INR 2.4 million. Conclusions: eICU has been judged a cost centre by most administrators, but the above case study makes clear that in a developing country like India it is not only cost-effective but also produced a significant decrease in ALOS along with mortality and nosocomial infection. This has a direct financial impact on patients and the hospital. By achieving shorter turnovers we can help treat more patients without escalating cost by building new beds, which would never have adequate manpower and resources. Introduction: Critical Care (CC) services are in increasing demand, but published guidance for triaging admissions may no longer reflect current practice [1] [2].
Exercise tolerance and clinical frailty assessment may have a role in assessing patients (pts) referred to CC [3]. We aimed to establish the impact of frailty and other factors on this decision-making process. Methods: Referrals to CC were prospectively enrolled in a review cohort. Data included patient demographics, a measure of acute physiological derangement (early warning score, EWS), prior hospital length of stay (LOS), exercise tolerance (ET), functional and dependence status, Canadian Clinical Frailty Scale (CCFS) and comorbidities. Logistic regression analysis was used to assess factors influencing admission, using STATA 14.1. Results are expressed as median (interquartile range) and odds ratios (OR) for admission with 95% confidence intervals (CI). ...tration reporting it was often unclear if they were actually continued, and therefore whether harm was due to chronic effects, ongoing use, or drug withdrawal. Conclusions: Overall there were inadequate data to draw firm conclusions. We feel there is sufficient uncertainty regarding the safety and utility of SSRI/SNRIs in critical care to warrant a randomized interventional study of continuing versus withholding them on critical care admission. Measuring adaptive coping of hospitalized patients with a severe medical condition: the Sickness Insight in Coping Questionnaire (SICQ) Introduction: The literature lacks a brief, specific, and validated instrument for measuring and monitoring adaptive coping of hospitalized patients with a severe medical condition. Hence, we introduce the Sickness Insight in Coping Questionnaire (SICQ) and examine its validity and patient-proxy agreement. Methods: Study 1 (n = 103 hospitalized patients) addressed the internal consistency, initial factor structure, and construct validity of the SICQ. Coping subscales of the BRIEF COPE, Illness Cognition Questionnaire, and Utrecht Coping List were used as comparator measures in testing the construct validity of the SICQ subscales (fighting spirit, toughness, redefinition, positivism, non-acceptance). Study 2 (n = 100 ICU patients and close family members of ICU patients as proxies) addressed the structural validity of the SICQ with confirmatory factor analyses, and examined patient-proxy agreement with correlation and Bland-Altman plot analyses. Introduction: In Portugal, training in intensive care is undertaken after the completion of a primary specialty. This is about to change with a proposal to create a primary intensive care medicine (ICM) specialty. A national survey was conducted to determine intensive care practitioners' opinions on how to train in ICM. Methods: Questionnaires were sent to all intensive care practitioners. Models of intensive care training were described as: primary specialty, with direct entry following primary medical qualification; subspecialty of another discipline; or supra-specialty permitting multidisciplinary access from a range of base specialties. Results: The team sent 391 questionnaires and obtained 240 answers (61%). Responders were 46 ± 10 years old, 52% were male and they had worked in intensive care for 10 (4-19) years; 86% worked full time, and of those working a shared schedule (n = 33) only 16 dedicated < 50 % of their time to ICM. The primary specialty of the responders was: Internal Medicine (67%), Anaesthesiology (24%) and other (9%). Most of them (73%) had previous experience in their primary specialty before initiating intensive care practice, for 2 (1-5) years.
When questioned about the ideal model, 45% (n = 106) chose "primary specialty"; of those, 69% think that training should last 5 years; 90% consider a common core important, with Internal Medicine (92%), Anaesthesiology (92%), Emergency Medicine (89%), Cardiology (83%) and others (7.4%); for 81% the remaining training should be entirely in ICM, and of these 95% think that training in a dedicated ICU should be considered, mainly in a neuro-ICU (93%) and RED-ICU (61%). For those who support ICM as a supra-specialty (51%, n = 121) or subspecialty (28%, n = 65), the main reasons given were: a previous specialty facilitates acquisition of ICM knowledge (97%), increases clinical autonomy (97%), provides a higher capacity for clinical judgment and medical decisions (90%) and allows firmer bioethics knowledge (80%); and it is an advantage to have a multidisciplinary team (95%). Regarding the training program: 77% know the CoBaTrICE project, 89% know PACT and 95% find it useful. Technical competencies considered desirable in ICM training were: echocardiogram (97%), bronchoscopy (97%), FAST echo (97%) and others (30%). Undergraduate teaching was reported as desirable by 83% (n = 193). Conclusions: There was a good response rate; less than half of the responders think that ICM should be a primary specialty, and the vast majority think that an intensive care team should be composed of consultants from different specialties. Introduction: Work-related stress, with the accompanying emotions provoked specifically in the Intensive Care Unit (ICU), has been well documented over previous years. The high-stakes, high-stress environment that ICU health care professionals (HCP) practice in is demanding intellectually, physically, and emotionally. Work engagement has been operationalized as a positive work-related state of mind, characterized by vigor, dedication, and absorption. The aim of this study was to explore the influence of personal resources, e.g. empathic ability, on work engagement among ICU HCP. Methods: The design of the study was a cross-sectional survey among ICU HCP (i.e. intensivists and nurses) of Erasmus MC, a university hospital in the Netherlands. A digital link to the questionnaire was sent in October 2015 to the e-mail addresses of 163 nurses in a mixed ICU, 45 nurses in a thoracic ICU, 54 nurses in a cardiac ICU and 53 intensivists. Two reminders were sent. Work engagement was measured with the Utrecht Work Engagement Scale, which includes 17 five-point Likert items about how a person feels at work. Additionally, 14 items based on the Jefferson Scale of Physician Empathy measured empathic ability. Results: The overall response rate was 66%, with a male-female ratio of .47. The mean age of the respondents was 43.4 years, mean working hours/week were 33.6, and 54% had finished A-levels in education. ICU HCP scored the same on vigor, higher on dedication (p < .05), and lower on absorption (p < .05) compared to the average Dutch employee. They reported no more stress-related symptoms, such as mental distance and sleeping disorders. Although only 2.8% of the ICU HCP assessed workload as high, compared to 3.6%, the emotional burden scored higher (3.1 and 1.8, respectively; p < .05). General HCP acted as the benchmark for empathic ability. ICU HCP showed an equal score on the cognitive component (3.9 compared to 4.0) and a lower score on the emotional component (3.0 compared to 3.8, p < .05).
Mean cognitive empathy was positively correlated with vigor (r = .21), although neither cognitive nor emotional empathy correlated with overall work engagement. ...tients PALbefICU and 150 (84%) PALaftICU. When comparing these two groups, there were no differences in MPMII0 score, organ dysfunctions or use of artificial life support at the moment of the ICU request. PALbefICU patients had been hospitalized for fewer days before the PC consultation request (8 vs 29 days, p < 0.05), were more likely to have no diagnosis at the moment of consultation (10% vs 0%, p < 0.05) and presented less frequently with a palliative performance scale (PPS) lower than 30 (56% vs 85%, p < 0.05). More patients had PC consultations discontinued in the PALbefICU group (41% vs 20%, p < 0.05), and the main reason for discontinuation was the absence of an indication for exclusive palliative care (31% vs 6%, p < 0.05). However, PALbefICU patients were less likely to be admitted to the ICU (31% vs 53%, p < 0.05) and hospital mortality was similar in both groups (88% vs 82%, p < 0.05), both adjusted for severity of illness. Conclusions: Co-occurrence of an ICU admission request and PC consultation in the same hospitalization was frequent, and PC consultation before the ICU request was associated with a lower chance of ICU admission. These patients had extremely high hospital mortality, despite differences in timing of consultation, diagnosis, PPS and definition of life support limitation. These data seem to support the notion that clinical deterioration during hospitalization may be seen as an opportunity to discuss end-of-life goals. Introduction: End-of-life (EOL) decision making in acute care is complex, involving difficult decisions such as whether to initiate or discontinue life support, place a feeding tube or a tracheostomy, or initiate cardiopulmonary resuscitation (CPR) in the event of a cardiac arrest. Methods: Our critical care nurses suffer when they think they are performing procedures that are harmful or of low efficacy and when their advocacy is ineffective. Prolonged, unrecognized suffering can be detrimental and can lead to disengagement and silence. Habitual silence, or silencing in the face of perceived wrongs, can result in permanent, deleterious changes in ethical values (1). In recent studies, researchers have suggested that false optimism is associated with medical activism or a strong need for control over death, which is prevalent in the western world, or with ignorance or a degree of emotional anesthesia in ourselves. Results: Narratives enable those involved in the ICU to negotiate the meaning of critical care for the patient. Narrative functions as an interpretive procedure that allows diverse persons (we nurses, the patient, and the patient's family) to make sense of critical care, and supplies a theory and instrument for negotiating meaning throughout the process that respects the realities and limitations of such care. Conclusions: The effective practice of medicine requires "narrative competence: the ability to acknowledge, absorb, interpret, and act on the stories and plights of others". The nurses working with us every day are often caught in the middle, an ethically untenable position, as they attempt to comply with our medical directives and simultaneously protect and advocate for their patients. Introduction: Organ donation after euthanasia has been performed at least 30 times in Belgium and the Netherlands, often in an ICU.
Thousands undergo euthanasia each year, but it is unclear what the potential of this procedure could be for the donor pool. Methods: It was not possible to obtain detailed euthanasia data from the Dutch authorities. All Belgian euthanasia reports of 2013 were analyzed. The primary outcome was the number of patients who could donate at least 1 organ. Subgroup analysis was performed to evaluate the number of potential donors per organ. Exclusion was based on the contraindications for donation within the Eurotransplant region. Cancer was an exclusion criterion, as the majority of malignancies are a contraindication to organ donation. Patients with "multiple pathologies" were excluded, because they would be unlikely to be accepted as organ donors. Patients with "other pathologies" were excluded, because many mentioned contraindications to donation or details about the pathology were lacking. (Fig. 107, Abstract P450: skin procurement.) For subgroup analysis, patients with renal disease were excluded as potential kidney donors and patients with lung disease as potential lung donors. Within the Eurotransplant region, 75 years is the maximum age for donation of kidneys, lungs and pancreas islet isolation in Maastricht category 3 deceased donors. For potential liver donors a maximum age of 60 years was used, and for whole pancreas donors 50 years. Results: Of 1811 obtained Belgian reports, a total of 186 potential donors with at least 1 suitable organ for donation remained. 6 patients with a history of kidney disease were excluded, leading to 360 potential kidney donations. For lung donation 31 patients were excluded, leaving a potential of 155 lung donors. 113 patients aged 60-79 years were excluded, leaving 73 potential liver donors. For whole pancreas donation, the age group of 50-79 years was excluded, resulting in 30 potential pancreas donors. Conclusions: In this first analysis of Belgian euthanasia data exploring the potential of organ donation after euthanasia, 10.3% of all euthanasia cases were potentially suitable patients for donation. Whether they would be willing to donate is unclear and should be studied. If only a small percentage of these patients were to consider donation, this could mean an increase in donation in Belgium and the Netherlands. Introduction: The ability to communicate well with patients and their relatives is a fundamental clinical skill in intensive care medicine and central to good medical practice. Communication at the bedside, even when the patient is unconscious or sedated, may often be recalled by critical care survivors and can impact upon long-term psychological outcomes [1]. The Intensive Care Society released guidelines for the provision of intensive care services; one area highlighted was the patient and relative perspective, the importance of effective communication with relatives and patients, and valuable time spent talking to the patient [2]. The aim of this audit was to assess the communication skills on the daily ward rounds between the team leader, normally the on-call Consultant, and the patient at the bedside. Methods: An observational study comprising 32 daily ward rounds on the 14-bed Critical Care Unit (CCU) at Whiston Hospital between October and November 2015. An average of 11 patients per ward round was recorded, including non-ventilated and ventilated patients.
A data collection tool was used which assessed whether the Consultant leading the ward round introduced themselves, explained what their role was and introduced the rest of the team. Results: Of the 348 patients who were on the CCU within this time period, the majority were not ventilated. Even so, only 53% of Consultants introduced themselves, 52% explained their role and 45% introduced the rest of the team. There was a marked difference in communication between Consultants and ventilated patients: only 16% introduced themselves, 14% explained their role and 11% introduced the rest of the team. Conclusions: The communication between patients and doctors within the CCU calls for vast improvement. Given the evidence mentioned above, it is imperative that strategies are put in place to help improve these results. This would include a simple adjunct next to the bedside to help remind the Consultant of the importance of effective communication. It is worthwhile re-auditing once this implementation has been put in place to see whether communication has improved. Introduction: Communication with patients and families in critical care medicine (CCM) can be complex and challenging. Physicians benefit from dedicated communication teaching, although the ideal model has not been established. Our purpose was to develop and assess the impact of a communication skills curriculum for CCM fellows. Methods: Surveys including multiple-choice and free-text questions were sent to all CCM fellows, staff physicians, nurses and social workers at our institution. The results informed the design of a longitudinal communication skills curriculum for CCM fellows. The effectiveness of the curriculum is being assessed through trends in clinician feedback. Results: The survey response rate was 7/7 fellows, 15/29 staff physicians, 56/404 nurses, and 1/5 social workers. More than 50% of non-fellow respondents identified that fellows ranked below expectations in counseling about the emotional impact of emergency situations; fellows reported they had the least amount of training in this area and were least comfortable addressing patient and family emotional issues. Staff physicians were least comfortable teaching and assessing fellows' capacity to address emotion, and had received the least amount of training in this area themselves. All non-trainee groups described fellows' focus on giving information over building rapport; fellows indicated their challenges may be related to the absence of a prior relationship with patients and families, and their own discomfort in talking about death and dying. Despite challenges in the relational aspects of communication, the topics of greatest interest to fellows included organ donation, adverse events, conflict, and family meetings. Preferred learning methods included simulation and feedback in clinical practice. These data guided the development of a communication skills curriculum involving 5 formal sessions over a year, and structured feedback during CCM rotations. Each formal session consisted of a didactic presentation and simulated practice. One formal session was dedicated to the basic principles of communication, which were incorporated into each of the topic-based sessions. A form to guide multidisciplinary preceptors in providing feedback to fellows in clinical practice was also developed and implemented.
Preliminary data indicate that fellows value the curriculum and feedback; more objective curricular evaluation is ongoing. Conclusions: Kern's model has been valuable in developing a blended communication skills curriculum tailored to CCM fellows' needs. The curriculum has been received favorably. Introduction: Conflicts between healthcare professionals and patients' relatives are rather common in the intensive care unit (ICU). As a result of societies' increased ethnic diversity, these conflicts more often involve actors with different ethno-cultural backgrounds. However, little is known about the specific nature of staff-family conflicts in a multi-ethnic ICU environment. In this study, we give an overview of some characteristics of conflicts between family members from ethnic-minority groups and staff members, and compare them to the characteristics of staff-family conflict in general. Methods: Ethnographic fieldwork was done in 1 ICU of a multi-ethnic urban hospital in Belgium. During 6 months, data were collected through negotiated interactive observation, in-depth interviews with staff, from patients' medical records, and by making notes in a logbook. Data were analyzed via grounded theory and compared to the literature on staff-family conflict in general. Results: It is known that, in general, physicians identify a situation as conflict-loaded less often than relatives do, suggesting the presence of hidden conflicts during critical care [1] [2]. However, in the multi-ethnic ICU we studied, conflicts were found to be explicitly recognized by relatives as well as healthcare professionals, and to be very visible and audible on the ward. Moreover, where relative-staff conflicts in general tend to be centered around crisis moments (end-of-life decision making, patients' death) [3], we found that in a multi-ethnic ICU conflicts tend to be present during various care phases (curative phase, end-of-life decision making, non-curative phase, patients' death) and concern a broad spectrum of care aspects (e.g. bedside care activities, seeking a second opinion), easily assaulting the core of actors' identity. Consequently, end-of-life decisions, often a difficult assignment in ICUs, might be even more problematic in a multi-ethnic context than in general, as tensions in the phase before end-of-life decision making might worsen conflicts in the decision-making phase itself. Conclusions: In ICUs, staff-family conflicts tend to be more severe and overtly present in a multi-ethnic context than in general. We therefore urge the development of specific and effective staff-family conflict prevention strategies in multi-ethnic ICUs. Introduction: We sought to evaluate whether the succinct Critical Care Family Needs Inventory (CCFNI) [1], a validated survey, may serve as a suitable alternative to the FS-ICU. Undertaking relative experience surveys annually was outlined as a quality indicator by the Scottish Intensive Care Society Quality Improvement Group in 2012. Over the past 5 years we have completed the FS-ICU annually. This is a lengthy survey covering aspects of care and decision making. The FS-ICU has been previously validated in large US/Canadian studies [2]. More recently, the Family Reported Experiences Evaluation (FREE) study validated the FS-ICU in the UK, employing the survey across 20 different ICUs with approximately 7000 forms completed. Methods: The audit was conducted across both hospitals in the trust.
Over 4 weeks we distributed the CCFNI to relatives of patients on ITU/HDU once a decision to step down to a ward had been made. We included a section for open comments at the end of the survey. Results: Over the 4 weeks a total of 44 surveys were completed. Selected results are detailed below, with comparison to results from similar questions in the FS-ICU survey. Quality of care: CCFNI: 77% of relatives felt that the best possible care was being given to the patient almost all the time. FS-ICU: 72% felt the care received from doctors was excellent and 71% felt the care received from nurses was excellent. Communication: CCFNI: 73% felt that explanations about the patient's condition were given in terms that they could understand almost all of the time, and the remaining 27% felt explanations were understandable most of the time. FS-ICU: 69% felt staff provided understandable explanations excellently; 31% described this information as very good. Empathy: CCFNI: 84% stated that staff members showed an interest in how relatives were feeling almost all or most of the time. FS-ICU: 72% stated that ICU staff were excellent in showing interest and consideration for relatives' needs, and 28% said staff were very good. Conclusions: The results suggest that relatives are satisfied with the care that their relatives received at the trust. The results appear to correlate well with those from the FS-ICU in several domains, such as care, communication and empathy. We therefore propose that the CCFNI can be used as an alternative annual survey to measure relatives' experience of ICU. ...sought to identify the support required by volunteers from the healthcare professionals involved in the project. Methods: Six in-depth semi-structured interviews were undertaken by an assistant psychologist with volunteers (both patients and family members) involved in the InS:PIRE clinic. A predetermined topic guide was used to guide the interviews. Interviews were audio-recorded and transcribed verbatim. Interpretative Phenomenological Analysis was used to analyse the transcripts (3). Peer review was undertaken to ensure credibility of the findings. Results: Six key themes were identified from these interviews: the social impact of volunteering; shared experiences; supporting others; personal boundaries; support needs; and personal gain. The importance of peer support and having a shared understanding of participants' needs were key themes for the volunteers. Volunteers described the need for further support in areas such as confidentiality, listening skills and understanding boundaries. Conclusions: The use of peer volunteers in this ICU rehabilitation service has been successful within this local context. Further, larger-scale research studies exploring the impact of volunteering for ICU survivors are required. Introduction: The intensive care unit is one of the most stressful places in a hospital. Relatives are exposed to many factors that may cause symptoms of anxiety and depression. The aim of this study was to determine the prevalence of anxiety and depression symptoms 72 hours and 90 days after ICU admission, and to identify factors associated with these symptoms. Methods: Relatives of patients admitted to the National Burn Center in Montevideo between February and October 2015 were invited to participate in this study. Seventy-two hours and ninety days after ICU admission, family members completed a survey that included the Hospital Anxiety and Depression Scale.
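Several of the comparisons reported in the Results below (e.g. anxiety at 90 days versus ability to return to work, or years lived with the patient) are categorical or non-parametric group contrasts of the kind computed with Fisher's exact and Mann-Whitney U tests, which recur throughout the abstracts in this section. A minimal sketch with scipy follows; the counts and values are hypothetical, not the study's data.

from scipy.stats import fisher_exact, mannwhitneyu

# 2x2 table: rows = anxious / not anxious; cols = not returned / returned to work.
table = [[8, 2],
         [10, 25]]
odds_ratio, p_value = fisher_exact(table)
print(f"Fisher's exact: OR = {odds_ratio:.2f}, p = {p_value:.3f}")

# A continuous factor (e.g. years lived with the patient) would use Mann-Whitney U.
anxious_years = [18, 3, 26, 15, 20]
not_anxious_years = [2, 0, 23, 5, 1, 4]
u_stat, p_mw = mannwhitneyu(anxious_years, not_anxious_years)
print(f"Mann-Whitney U: U = {u_stat}, p = {p_mw:.3f}")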
Results: 95 and 45 relatives responded to the survey at 72 hours and 90 days after ICU admission, respectively. The prevalence of anxiety symptoms was 60% at 72 hours and 28% at 90 days after ICU admission. Symptoms of depression were present in 47% of relatives on the third day and in 18% at 90 days after ICU admission. At 72 hours after ICU admission, a longer time lived with the patient was associated with anxiety, 18 (3-26) years vs 2 (0-23) years (p = 0.02), and with depression, 18 (7-26) years vs 4 (0-23) years (p = 0.02). At this time point, a higher total burn area, 27% (18-37) vs 19% (10-27) (p = 0.02), was associated with depression. Factors associated with anxiety 90 days after ICU admission were: feeling that treatment information was incomplete (80% vs 22%, p = 0.02) and not being able to return to work (80% vs 22%, p = 0.02). The only factor associated with depression symptoms 90 days after ICU admission was not being able to return to work (38% vs 11%, p = 0.04). Conclusions: The prevalence of anxiety and depression among relatives was very high 72 hours after ICU admission, and remained elevated 3 months later. This study identifies factors associated with the appearance of anxiety and depression. Introduction: St Elisabeth Hospital has a 30-bed ICU for critically ill adult patients. We facilitate family-centred care, but in earlier years children were not allowed to visit family members in our ICU. The objective of our project was to facilitate paediatric visitation and to provide support for the distress and needs of children with a relative in our ICU. Methods: Practical aids were developed for our caregivers. An information leaflet with practical information for parents was produced, subdivided by developmental stage. Furthermore, we created a book in which a child visits his father in our ICU. Pictures show what children can expect, which helps prepare the child for visiting. An instruction box with ICU materials is included (Fig. 108). These materials are used to give children a more tactile experience of the ICU. Guidance materials for when a patient may die were also developed. For example, there are little boxes that can be decorated by the children to hold a memory or a token of grief. Pedagogical staff are available to support parents, children and staff. If there are more profound problems, the child is referred to our children's psychologist. Our hospital photographers can be called upon to commemorate the last moments (Fig. 109). We instructed our nurses and doctors on how to use these materials and how to guide children. We held a survey among our staff to inquire whether this met their needs. Results: There is increasing awareness of the needs of children among our staff. Children and parents are welcomed and guided by our staff. Our survey among staff showed great appreciation for the materials we developed. When a child's relative dies, the child can be included in the farewell visit. Conclusions: When a child is confronted with a sick or dying parent, loss and sorrow are inevitable. We, as caregivers, can be more than bystanders and can help parents guide their children in difficult times.
