P1 Comparison of carbamylated versus recombinant erythropoietin during spinal cord ischemia/reperfusion injury

P2 Sodium 4-phenylbutyrate protects against myocardial ischemia-reperfusion injury by reducing unfolded protein response-mediated apoptosis in mice

P3 Time-dependent effects of intravenous H2S during long-term, resuscitated porcine hemorrhagic shock

Abstract We previously showed that erythropoietin (EPO) attenuates the morphological signs of spinal cord ischemia/reperfusion (I/R) injury in swine [1] without, however, improving neurological function. The clinical use of EPO has recently been cautioned against due to serious safety concerns arising from increased mortality in acute stroke patients treated with EPO while simultaneously receiving systemic thrombolysis [2]. Carbamylated EPO (cEPO) is an EPO derivative without erythropoietic activity and devoid of the EPO side effects, but with apparently well-maintained cytoprotective qualities [3]. We therefore tested the hypothesis that cEPO is as efficient as EPO in reducing both the morphological and the functional signs of aortic occlusion-induced spinal cord I/R injury.

Methods In a randomized, blinded trial, pigs received vehicle (control), EPO or cEPO (n = 9 each; 5,000 IU/kg over 30 minutes before occlusion and during the first 4 hours of reperfusion). Animals underwent 30 minutes of thoracic aortic balloon occlusion with catheters placed immediately downstream of the A. subclavia and upstream of the aortic trifurcation. Spinal cord function was assessed by motor evoked potentials (MEP, as percentage of the amplitude before aortic occlusion) and lower limb reflexes (assessed as the subjective strength of response) for a period of 10 hours after reperfusion. Tissue damage was evaluated using Nissl staining.

Results Both EPO-treated and cEPO-treated animals presented with attenuated spinal cord injury on Nissl staining (median (quartiles) percentage of damaged neurons in the thoracic segments: control 27 (25, 44), cEPO 8 (4, 10), and EPO 5 (5, 7), P <0.001 vs control group; in the lumbar segments: control 26 (19, 32), cEPO 7 (5, 13), EPO 8 (5, 10), P <0.001 vs control group). However, while only cEPO treatment was associated with recovery of the MEP amplitude to pre-occlusion values when compared with the control group (P <0.05), lower limb reflex responses were comparably restored in both treatment groups (P <0.05 vs control).

Conclusions In a clinically relevant porcine model mimicking aortic crossclamping during vascular surgical repair of thoracic aortic aneurysm, cEPO protected spinal cord function and integrity as effectively as EPO when applied at equipotent doses.

Acknowledgements Supported by the Deutsche Forschungsgemeinschaft (SCHE 899/2-2).

Figure 1 (abstract P2). Phenylbutyrate reduced the unfolded protein response.

Introduction In awake, spontaneously breathing mice, inhaling hydrogen sulfide (H2S) induced a hibernation-like metabolic state characterised by reduced energy expenditure and hypothermia [1], which protected against otherwise lethal hypoxia [2] and hemorrhage [3] as a result of impaired cellular energy metabolism [4]. Therefore, we investigated the metabolic effects of inhaled H2S in our model of resuscitated murine septic shock.

Methods Sixteen hours after induction of sepsis by cecal ligation and puncture (CLP) or sham operation, anesthetized and mechanically ventilated mice received 100 ppm H2S or vehicle over 5 hours at body temperatures of 38 and 27°C, respectively.
During the observation period, hyperdynamic hemodynamics were maintained by colloid resuscitation and noradrenaline infusion [5]. Endogenous glucose production was calculated from the rate of appearance of stable, non-radioactively labeled 1,2,3,4,5,6-13C6-glucose, derived from the blood 13C6-glucose isotope enrichment during continuous isotope infusion [6]. The whole-body glucose oxidation rate was derived from the total CO2 production rate, the mixed expiratory 13CO2/12CO2 isotope ratio and the 13C6-glucose infusion rate after steady state was achieved.

Results While endogenous glucose production was not affected by hypothermia, it was significantly higher in the septic animals when compared with the corresponding sham-operated groups, most likely due to the ongoing noradrenaline infusion. In contrast, despite the catecholamine infusion and higher glucose release, whole-body glucose oxidation was significantly reduced in normothermic septic animals. During hypothermia, H2S shifted substrate use towards preferential glucose utilisation, but this effect disappeared in the septic mice.

Conclusions H2S inhalation alone does not influence glucose metabolism once temperature is maintained at normothermic levels in anesthetised and mechanically ventilated mice. The H2S-related shift of energy metabolism towards preferential carbohydrate oxidation present during hypothermia is blunted during sepsis, possibly as a result of the ongoing catecholamine treatment.

Introduction Unfolded protein response (UPR)-mediated apoptosis plays a pivotal role in ischemia-reperfusion injury. Sodium 4-phenylbutyrate (PBA) has been reported to act as a chemical chaperone inhibiting UPR-mediated apoptosis triggered by ischemia in various organs other than the heart. We therefore investigated whether PBA reduces UPR-mediated apoptosis and protects against myocardial ischemia-reperfusion injury in mice.

Methods C57BL/6 mice were subjected to 30 minutes of LAD ischemia followed by reperfusion. PBA (100 mg/kg) or PBS (control) was administered intraperitoneally just before ischemia. Apoptosis, infarct size and tissue protein levels of Grp78 and caspase-12 (UPR-mediated apoptosis-associated proteins) were evaluated by TUNEL, TTC staining and western blot analyses, respectively, at 48 hours after ischemia (n = 5 for each group). Echocardiography was performed at 3 weeks after ischemia and survival was observed (n = 9 for each group).

Results Compared with PBS, PBA reduced apoptotic cells (30.8 ± 0.2% vs 20.5 ± 0.5%, P <0.05) and infarct size (32.0 ± 3.8% vs 13.0 ± 2.1%, P <0.01) after ischemia-reperfusion. Grp78 and caspase-12 were increased in mice given PBS, but PBA attenuated the increase in Grp78 (P <0.05) and caspase-12 (P <0.05). PBA inhibited the deterioration of cardiac parameters including LVEDD (3.35 ± 0.08 mm vs 2.74 ± 0.11 mm, P <0.01), LVESD (2.30 ± 0.08 mm vs 1.54 ± 0.12 mm, P <0.01) and %FS (31.3 ± 2.2% vs 39.4 ± 2.2%, P <0.05). All mice given PBA survived, but 33% of the animals given PBS died.

Conclusions PBA maintained cardiac function and improved survival after myocardial ischemia-reperfusion by reducing UPR-mediated apoptosis in mice.

Introduction In awake, spontaneously breathing mice, inhaling hydrogen sulfide (H2S) induced a hibernation-like metabolic state characterised by reduced energy expenditure and hypothermia [1], which protected against otherwise lethal hypoxia [2] and hemorrhage [3].
In contrast, other authors reported that inhibition of endogenous H2S synthesis attenuated posthemorrhage organ dysfunction [4,5]. All these data originate, however, from unresuscitated models using a pre-treatment design. We therefore investigated the time-dependent effect of intravenous H2S in a clinically relevant, long-term model of porcine hemorrhage and resuscitation.

Methods After surgical instrumentation, pigs were subjected to 4 hours of hemorrhagic shock induced by removal of 40% of the calculated blood volume and thereafter by additional removal or retransfusion of blood boluses as needed to maintain MAP = 30 mmHg. Animals randomly received vehicle (control, n = 14) or the intravenous H2S donor Na2S started 2 hours before hemorrhage (pre-treatment, n = 11), at the beginning of blood removal (early post-treatment, n = 10) or at the beginning of resuscitation (late post-treatment, n = 10). In all groups the Na2S infusion was continued over the first 10 hours of reperfusion. Resuscitation comprised retransfusion of shed blood, colloid volume expansion, and noradrenaline titrated to keep MAP at pre-shock levels. Systemic, renal and liver perfusion, O2 exchange, and organ function were assessed before and at the end of hemorrhage as well as at 10 and 22 hours of resuscitation.

Results Survival (71% in the control group vs 100, 91, and 90% in the pre-treatment, early post-treatment and late post-treatment groups, respectively) was significantly improved in all treatment groups. The noradrenaline infusion rate required to maintain hemodynamic targets was significantly reduced in the early post-treatment group only, which coincided with a progressive drop in core temperature and attenuated kidney dysfunction (blood creatinine levels, creatinine clearance) in these animals.

Conclusions Na2S application improved survival regardless of the timing of drug administration. The less beneficial effect of pre-treatment on organ function may be due to the higher total amount of drug infused, possibly suggesting some toxicity at these doses.

Translation of previous animal studies into human ICU clinical trials has frequently produced negative results. Most of these animal studies have had high baseline mortality and have not employed the standardised management of sepsis usually provided in an ICU. The aim of this study was to develop a large animal model of septic shock receiving standardised intensive care management, thus replicating the management of septic shock in humans.

Methods Eleven Merino ewes (weight 60 to 70 kg, hemiazygous vein ligated) were anaesthetised and had radiologically guided catheters inserted into the iliac, renal and hepatic veins, the coronary sinus, and the pulmonary and carotid arteries. Tracheostomy tubes were inserted and the animals were mechanically ventilated while supported in a sling. Six sheep were administered intravenous E. coli (ATCC 25922) 1.0 x 10^8 organisms/kg over 1 hour (septic sheep); five received placebo (nonseptic sheep). For 24 hours, animals were monitored and received sedation (midazolam + ketamine), ventilation, fluids and inotropes according to a protocol. The primary end-point was the noradrenaline (NA) dose required to maintain a mean arterial pressure (MAP) of 75 mmHg. Secondary end-points included haemodynamic variables; respiratory, hepatic and renal function; haematology; acid-base status; and global, hind-limb, renal, hepatic and coronary oxygen extraction ratios (OER).

Results Sheep were successfully instrumented, monitored and supported for 24 hours.
Septic sheep required NA (mean dose 0.28 μg/kg/min vs 0.00, P <0.001), and developed a higher cardiac index (6.6 l/min/m2 vs 4.3, P <0.05) and lower SVRI (769 dyn·s/cm5/m2 vs 1,804, P <0.05). At 24 hours, septic sheep had renal impairment (creatinine 286 μmol/l vs 76, P <0.05; urea 12 mmol/l vs 7, P <0.05), metabolic acidosis (pH 7.21 vs 7.39, P <0.05; lactate 10.9 mmol/l vs 1.2, P <0.01; pCO2 32 vs 31, P = 0.63) and coagulopathy (INR 5.9 vs 1.9, P <0.05; fibrinogen 0.9 g/l vs 2.7, P <0.05), but preserved respiratory and hepatic function. Global OER was lower in septic sheep (0.16 vs 0.29, P <0.05), as was coronary OER (0.36 vs 0.68, P <0.05). OER did not change with sepsis in the kidney (0.09 vs 0.11, P = 0.52), liver (0.24 vs 0.31, P = 0.48) or hind-limb (0.31 vs 0.42, P = 0.23).

Conclusions We have developed a large animal model of septic shock that receives intensive care support and standardised management. This model replicates much of the pathophysiology and management that occur in human septic shock. It allows a large range of physiological parameters to be assessed when investigating new therapies for sepsis.

Introduction Sepsis-induced lymphocyte apoptosis plays a fundamental role in the pathophysiology of sepsis. Recent animal models of sepsis have identified anomalies in the extrinsic apoptotic pathway, a key pathological occurrence in sepsis [1]. Specifically, apoptosis markers such as caspases 1, 3, 8 and 9 and FADD have been shown to be significant in animal models of infection [2]. We investigated mRNA transcription of these markers in a human model of severe sepsis. We hypothesized that ICU mortality from severe sepsis is associated with distinctive gene expression of extrinsic apoptosis markers.

Methods A prospective observational study of patients with severe sepsis was performed. Mononuclear cells were isolated from 48 patients with severe sepsis. Total RNA was extracted from samples on day 1 of admission and again on day 7. FADD and caspase 1, 3, 8 and 9 mRNA was quantified with quantitative real-time polymerase chain reaction (qRT-PCR). Standard demographic and outcome data were recorded. Between-group comparisons were performed by Wilcoxon rank sum test. All values are stated as median and interquartile range.

Results Sixteen of the 48 patients died in the ICU. Caspase 9 mRNA copy numbers were significantly increased on day 7 in the survivor group (5.4 x 10^6; 7.4 x 10^6 to 8.9 x 10^6) compared with the group who died in the ICU (1.9 x 10^6; 3.0 x 10^6 to 1.2 x 10^6), P = 0.001. FADD and caspase 1, 3 and 8 mRNA copy numbers were not significantly different between patients who died and those discharged from the ICU on either day 1 or day 7 of admission.

Conclusions Caspase 9 may be an important regulator of apoptotic mechanisms in humans with late sepsis. Pro-apoptotic mechanisms may have a role in the resolution of severe sepsis.

Conclusions Individual variability occurs in both faecal and zymosan peritonitis models, as shown by heterogeneous clinical responses and local immune cell numbers after the same dose in similar animals. The cellular immune response in both models is consistent with current understanding of infection-induced inflammation. Neutrophils, but not macrophages, rose in proportion to worsening clinical severity. The significance of F4/80+/GR-1+ cells in the FS model requires further evaluation.

Introduction T lymphocytes are crucial immune cells.
We analysed T-cell subset phenotypes and tested, at the single-cell level, their ability to produce key cytokines in early human sepsis.

Methods Whole blood was collected from septic patients on ICU admission. Peripheral blood mononuclear cells (PBMC) were isolated and T-cell subsets analysed. To study cytokine production, PBMC were cultured in the presence of PMA/ionomycin (50/750 ng/ml) in supplemented RPMI 1640 for 5 hours. The intracellular cytokines IL-4, IL-10, IL-17 and IFNγ were stained in CD3+/CD4+ and CD3+/CD8+ cells and analysed by flow cytometry. The number of cytokine-producing cells was compared with age/sex-matched healthy human volunteers. Data are expressed as mean ± SEM.

Results There were 12 patients (age 66 years, six males, APACHE II 24, eight survivors) and nine volunteers. We found a relative increase in the frequency of Treg cells, while the proportion of CD4+ cells remained unchanged in septic patients. PMA/ionomycin leads to maximal T-cell stimulation, testing the ability of individual cell subsets to produce cytokines. Septic patients displayed a reduced number of IFNγ-producing CD4+ cells (10.5 ± 0.8% vs 14.7 ± 1.9%, P <0.01) and a tendency towards a higher number of IL-10-producing CD4+ cells (1.7 ± 0.3% vs 0.5 ± 0.1%, P = 0.10), while the proportion of IFNγ-positive CD8+ cells increased (42.8 ± 5.8% vs 28.1 ± 4.9%, P = 0.03). However, the overall CD8+ T-cell population was reduced (14.29 ± 1.6% vs 25 ± 1.2%) following ex vivo activation in patients. The numbers of IL-4-staining and IL-17-staining cells were unchanged (Figure 1).

Conclusions Our results confirm a relative increase of Treg [1] and a skew towards the Th2 lineage in the CD4+ cells. The highly activated CD8+ cells appear to be more susceptible to activation-induced cell death.

Introduction It is increasingly understood that most agents modulating the host response in sepsis have failed because they act to restrain an over-exaggerated immune response, whereas immunoparalysis is present at the time of their administration. The effect of IFNγ on the immunoparalysis of monocytes in sepsis was assessed.

Introduction Nitric oxide (NO) plays a central role in the pathogenesis of sepsis. Recently, we demonstrated that endothelial NOS (eNOS) contributes to endogenous NO production and modulates inflammation, associated with preserved cardiac function resulting in prolonged survival of eNOS-/- compared with wild-type (WT) mice. The role of neuronal NOS (nNOS) in septic cardiomyopathy remains unclear. The aim of this study was to elucidate the influence of nNOS, in the presence or absence of eNOS, on cardiac function and survival in a clinically relevant model of sepsis.

Methods Inhibition of nNOS was achieved via continuous application of the selective nNOS inhibitor Vinyl-L-NIO (VL-NIO) (0.02 mg/kg BW/hour) using an osmotic mini-pump. C57BL/6 WT and eNOS-/- mice were rendered septic by cecum ligation and puncture (CLP). After 12 hours, heart function was analyzed using a pressure-volume catheter placed in the left ventricle. For catecholamine responsiveness, norepinephrine was applied (0.4 μg/kg BW/minute, intraperitoneally). Plasma NOx levels were measured using high-performance liquid chromatography.

Results Inhibition of nNOS via VL-NIO application resulted in significantly reduced plasma nitrate levels and prolonged survival (WT CLP + VL-NIO = 38 hours vs WT CLP = 29 hours).
However, cardiac function and norepinephrine responsiveness were not improved compared with untreated septic WT mice. In contrast to WT, application of VL-NIO in eNOS-/- mice had no influence on plasma nitrate levels, while cardiac function and survival were significantly impaired compared with untreated septic eNOS-/- mice. Impaired cardiac function was accompanied by decreased survival time (eNOS-/- CLP + VL-NIO = 29.5 hours vs eNOS-/- CLP = 69.5 hours).

Conclusions Pharmacologic inhibition of nNOS resulted in a significant reduction of plasma nitrate levels and prolonged survival compared with untreated septic WT mice, despite unimproved septic cardiomyopathy. In contrast, the significant survival benefit of septic eNOS-/- compared with septic WT mice was abrogated by pharmacologic nNOS inhibition. Furthermore, the latter developed severe septic cardiomyopathy. Whether eNOS acts as a modulator of nNOS in this setting remains to be clarified by further studies.

Introduction High mobility group box protein-1 (HMGB) is released by activated monocytes/macrophages during sepsis. Levels correlated with mortality in patients, and anti-HMGB antibodies prevented endotoxin-induced death in animal models [1]. In marked contrast to these findings are reports of cytoprotective effects: HMGB protected the liver from ischemia/reperfusion (IR) injury [2] and HMGB-transgenic mice were resistant to myocardial infarction [3]. Despite these reports, little is known of the role or mechanism of HMGB. We aimed to further evaluate HMGB in rat hearts and in pulmonary and renal cell cultures.

Methods Isolated hearts received HMGB/vehicle before 30 minutes of ischemia and 120 minutes of reperfusion. In separate studies, rats were rendered septic by caecal ligation and puncture (CLP) prior to IR. Left ventricular (LV) function was measured with a latex LV balloon. Infarct size was measured by TTC staining. For human cell culture studies, A549 alveolar cells or HK-2 renal tubular cells were treated with HMGB/vehicle before incubation in normoxia or hypoxia. Fluorimetric caspase-3 activity was used as a measure of apoptosis. Cytochrome C was measured by ELISA. For HK-2 cells, the MTT assay was used to test viability.

Results HMGB given prior to cardiac IR preserved developed left ventricular pressure (dLVP) at 120 minutes of reperfusion compared with controls (44 mmHg vs 27 mmHg, P <0.05), and infarct size, expressed as percentage of ventricular weight ± SEM, was reduced (31 ± 5.4 vs 44 ± 4.2, P <0.05), n = 6/group. Hearts from CLP rats had reduced baseline dLVP (59 ± 5.1 mmHg vs 97 ± 5.3, P <0.05), but infarct sizes after IR were significantly reduced (27 ± 8.8 vs 44 ± 6.2, P <0.05). In A549 cells, HMGB inhibited apoptosis (49% reduction in caspase-3 activity; 8.6 vs 17.1 a.u., P <0.01). This was associated with reduced cytochrome C release (250 vs 389 pg/ml, P <0.05). In the MTT assay, HMGB protected against hypoxia-induced cell death vs controls (viability 75% vs 61%, P <0.05).

Conclusions HMGB and sepsis similarly precondition the heart against IR injury. HMGB has anti-apoptotic effects that protect against hypoxic renal and alveolar cell injury. We conclude that HMGB has potent anti-ischemic effects in multiple organs and may function as an innate cytoprotective mediator in sepsis.

The purpose of this study was the evaluation of B-type natriuretic peptide (BNP) as a predictor of septic complications and ICU mortality in patients with a new onset of fever during the first 3 days of hospitalization in the ICU.
Methods Thirty-one ICU patients (21 males and 10 females) with new onset of fever and leukocytosis within the first 3 days of ICU admission were prospectively included in the study. Exclusion criteria were heart or renal failure, chronic obstructive pulmonary disease and head trauma. Serial plasma samples were taken on days 1, 2 and 4 after the onset of fever for BNP level measurement. BNP values were correlated with severity scores (Acute Physiology and Chronic Health Evaluation (APACHE II) and Sequential Organ Failure Assessment (SOFA)), progression to septic shock and the final outcome.

Results According to the clinical and laboratory findings within the first 3 days of hospitalization, the patients included in the study were divided into three groups: Group A, systemic inflammatory response syndrome (SIRS) (seven patients); Group B, sepsis (14 patients); and Group C, septic shock (10 patients). The BNP value on days 1 and 2 was significantly associated with the SOFA Max value (P <0.001). The BNP value on day 4 was significantly associated with ICU mortality (P = 0.006). The optimal cut-off BNP value for differentiating between nonsurvivors and survivors was estimated to be 203.55 pg/ml (sensitivity = 100%, specificity = 61.1%). In Group B patients, the BNP value on day 2 was significantly higher in patients who finally progressed to septic shock (P = 0.001). The optimal cut-off BNP value for identifying these patients was estimated to be 212.45 pg/ml (sensitivity = 85.7%, specificity = 64.3%).

Conclusions In ICU patients with new onset of fever during the first 3 days of ICU hospitalization, the BNP value on day 4 seems to be a good predictor of ICU mortality. In patients with sepsis, a cut-off BNP value of 212.45 pg/ml on day 2 could be a predictor of progression to septic shock. Due to the small number of patients included in our study, further studies are needed to confirm these findings.

Introduction CD64 is the high-affinity receptor for IgG. It is upregulated on neutrophils by inflammatory cytokines, and its upregulation is linked with PMN activation in SIRS and sepsis. Our aim was to verify these correlations.

Methods We enrolled 48 critically ill ICU patients, divided into three groups: SIRS- (17), SIRS+ (13) and sepsis (17). We evaluated CD64 X-mean values in the three groups, as shown in Figure 1. We used a t test to compare rising CD64 between the groups, and the correlation coefficient r to test the relationship of CD64 with SOFA score (Figure 2), WBC, neutrophil count and patients' age.

Results We found higher CD64 levels in sepsis compared with SIRS- (t = 8.095; P <0.001) and with SIRS+ (t = 4.938; P <0.001). There was a high positive linear correlation between CD64 and SOFA score (r = 0.806; …).

Introduction The incidence of sepsis is reported to be around 37% in European ICUs [1]. The mortality rate depends on the severity of organ failure, reaching up to 65% if four or more organs are involved. Multiple organ failure (MOF) is due to microcirculatory dysfunction with microthrombosis resulting from coagulation disorders, including platelet activation. An early diagnosis should identify the microcirculatory dysfunction before MOF becomes clinically evident. The diagnosis of sepsis is commonly based on clinical criteria, pathogen identification and the use of infection-associated markers like procalcitonin (PCT) and C-reactive protein (CRP).
The aim of our study was to evaluate whether routine measurement of the immature platelet fraction (IPF), considered an early marker of platelet production, is associated with sepsis and its severity, and/or whether it could be used as a predictive marker of sepsis.

Methods We enrolled 66 consecutive patients admitted to the ICU, divided into two groups: septic (n = 44) and non-septic (n = 22). The severity of sepsis was evaluated. The exclusion criterion was a platelet count <150,000/mm3. Blood count, coagulation tests, CRP, PCT and IPF were collected every day.

Results The IPF values of septic (4.6 ± 3.1) and non-septic patients (3.3 ± 1.5) did not differ (P = 0.16). No correlation was found between IPF values and the severity of the septic condition (no sepsis 11.7 ± 10.1; sepsis 14.3 ± 10.5; severe sepsis 10.5 ± 9.1; septic shock 19.5 ± 12.4; P = 0.3). When we considered only subjects who did not have sepsis at ICU admission, we found that patients who developed sepsis during their stay had higher IPF values than patients who did not develop sepsis (Table 1).

Conclusions From our results, IPF cannot be considered a marker of sepsis. Conversely, it could be used as a predictive index of sepsis because it can identify patients who will develop sepsis.

Introduction The role of matrix metalloproteinases (MMPs) and tissue inhibitors of matrix metalloproteinases (TIMPs) in sepsis remains unclear.

Introduction Serum C-reactive protein (CRP) is synthesized in response to inflammation. Elevated CRP levels are associated with an increased risk of multiorgan failure in ICU patients. Preoperative elevation of serum CRP is a prognostic indicator in gastric and colorectal surgery. The aim of this study was to evaluate serum CRP as a prognostic variable in patients undergoing esophagectomy with gastric tube reconstruction, in contrast to other ICU-admitted patients.

Methods Data were collected retrospectively for a total of 208 patients admitted to the ICU following elective surgery from October 2007 to December 2008. Patients included underwent esophagectomy with gastric tube reconstruction, liver transplantation, hemihepatectomy, neurosurgery or abdominal aneurysm surgery. Postoperative serum markers were measured, and the relationships between the course of postoperative serum CRP, the development of complications and the prognosis of the patients were investigated.

Results Postoperative serum CRP was significantly higher at T24 in patients undergoing esophagectomy with gastric tube reconstruction compared with all other patients. Higher serum CRP levels correlated with the occurrence of complications in the heterogeneous ICU population, but especially in the esophagectomy patients. Within this group, serum CRP levels at T24 and T48 were significantly higher in patients with a postoperative pneumonia, which in itself was associated with increased 1-year mortality.

Conclusions Postoperative serum CRP levels can easily be monitored in the ICU in order to identify patients at risk of developing postoperative complications. Especially in esophagectomy patients, the occurrence of postoperative complications is associated with reduced survival.

Introduction We examined whether we can safely shorten the duration of antibiotic administration in septic patients using procalcitonin (PCT) measurements, compared with no PCT measurements.
Methods The participants were septic patients who were (1) admitted to our ICU from February 2009 to November 2009, (2) given antibiotics for 4 days or more, and (3) taken off antibiotics during their ICU stay. We treated the patients from February to June without PCT measurements (Group A), and the patients from July to November with reference to the serum PCT level (Group B). In Group B, when …

Figure legend. Example 1 (left): sepsis secondary to hydronephrosis. The graph clearly shows the rise in procalcitonin (PCT) during the early onset of sepsis and its fall during antibiotic treatment; antibiotics were discontinued when the PCT level fell to 80% of its peak value on day 8. Example 2 (right): laparoscopic cholecystectomy complicated by bile peritonitis. There was a good response to antibiotics initially, but PCT started to rise on day 12, which prompted us to change antibiotics. Serial PCT is useful to assess the response to treatment.

Introduction Bloodstream infection is a life-threatening condition with a high mortality rate, especially in intensive care and neutropenic patients. Standard diagnosis is based on blood culturing (BC). However, the limitations of BC include relatively low sensitivity and a long time-to-result for identification of the pathogen, generally 2 days or more. On the grounds of data from a multicentre study using a universal 16S rRNA gene PCR assay, SepsiTest™, molecular diagnosis is discussed as a rapid and sensitive tool for the detection and identification of pathogens, supportive of BC. This new commercial PCR test for the direct detection of bacteria in whole blood was compared with BC in terms of sensitivity, specificity, predictive values and time to positivity (TTP) for bacterial bloodstream infections in critically ill patients.

Methods The test, SepsiTest™ (Molzym, Bremen), comprises the extraction and 16S rRNA gene PCR detection of bacterial DNA in whole blood samples. Bacteria in positive samples were identified by sequence analysis of the amplicon. In a prospective multicentre study, 342 blood samples from 187 patients with systemic inflammatory response syndrome (SIRS), sepsis or neutropenic fever were included.

Results Compared with BC, the diagnostic sensitivity and specificity of PCR/sequencing were 87.0% and 85.8%, respectively. The positivity rate of PCR/sequencing (25.7%) was higher than that of BC (15.8%). Of 31 PCR/sequencing-positive, BC-negative patients, most of whom received antibiotics, the PCR results of 25 were judged as true or possible bacteraemia. Using a routine testing workflow, the time to positivity of the PCR test was on average decreased by 40 hours for anaerobic/fastidious infections and by 54 hours for yeast infections.

Table legend. Modest differences in circulating concentrations of coagulation-fibrinolysis markers on ED presentation (day 1) and over the first 7 days. P1: day 1 P value from ANOVA model adjusted for sex, race, co-morbidity and smoking status; Tobit models used for censored data. P2: day 1 to 7 P value from mixed-effects model adjusted for sex, race, co-morbidity and smoking status; Tobit models used for censored data.

Introduction Prompt initiation of appropriate antibiotic therapy improves outcome in critically ill patients [1]. In a tertiary-level ICU, we evaluated the appropriateness of, and adherence to, an antibiotic guideline (based on local bacterial epidemiology) in CDC-defined hospital-associated infections.
Methods We conducted a 6-month prospective observational study (April 2008 to October 2008) of consecutive ICU admissions of patients who satisfied investigator-adjudicated classification of ICU-acquired infections according to CDC criteria [2]. We assessed patient characteristics including severity of illness at admission, ICU length of stay (LOS), appropriateness of the initial antibiotic choice as judged by in vitro sensitivity results, and appropriateness of the current guideline. Results are presented as mean (SD) or median as appropriate.

Results During the study period there were 101 antibiotic starts in 65 patients with sepsis secondary to ICU-acquired infections. Medical patients formed 44% of the study cohort, whilst 23% of patients were general surgical and the remaining 33% were post cardiothoracic surgery. The age and admission APACHE II score of the study cohort were 61.8 (16.3) years and 18.4 (5.6), respectively. The median LOS and ICU mortality of the cohort were 24 days and 27.6%. The most common CDC-reportable diagnosis was clinically or microbiologically confirmed pneumonia (PNU1/PNU2/LRI) (n = 57), followed by intra-abdominal infection (SSI-GIT) (n = 10) and urinary tract infection (SUTI) (n = 8). The culture positivity rate was 71.2%. The appropriateness of the ICU antibiotic guideline is summarised in Table 1. Monotherapy was used in 52.5% of episodes. The median length of antibiotic treatment was 7 days for episodes with positive cultures, and 5 days for culture-negative episodes. In sepsis episodes with negative cultures, antibiotics were stopped within 3 days in 17% of the episodes.

Introduction Secondary peritonitis is the most frequent form of peritonitis, characterized by a high disease burden and a high mortality rate. The choice of adequate antibiotics is an independent factor for survival. The aim of this study was to compare treatment of secondary peritonitis with tigecycline (TG) against standard regimens (SR) from an economic standpoint.

Methods After ethics committee approval, the study was performed as a prospective, non-interventional cohort trial in 23 medical centers in Germany. Patients could be included if suffering from severe secondary peritonitis treated in an ICU. Pregnant patients, patients aged below 18 years and those with milder forms of disease (APACHE II score <15) were not eligible. In order to compare treatment with TG against SR, the following data were documented: demographic data, disease severity scores, causative microorganisms, laboratory parameters and length of stay (LOS). Patients were analysed according to initial antibiotic choice, except for perioperative prophylaxis. In order to balance for differences in co-morbidities and severity of disease, a matched-pairs analysis was performed based on logistic regression analysis.

Results A total of 178 patients were enrolled (49 TG/129 SR). After logistic regression analysis and matching for gender, age ± 3 years, APACHE II score ± 1 and the (non)existence of liver cirrhosis, arterial sclerosis and coronary heart disease, 15 matched pairs were built. Compared with the SR group, the TG group showed a tendency towards higher creatinine, urea and glucose levels, a higher number of co-morbidities (3.3 vs 3.0, NS) and a higher number of pathogens isolated at initial surgery (2.2 vs 1.6, NS). There was a higher number of discharges in the TG group (9/15 vs 7/15 SR, NS) and 6/15 TG patients died (vs 4/15 SR, NS).
Considering these factors, there was a trend towards shorter LOS in patients treated with TG (11 days vs 18 days and 8 days vs 16 days for survivors and nonsurvivors, respectively, NS), and the total costs of ICU stay were significantly lower in the TG group (€8,832 vs €15,482, P = 0.023).

Conclusions In our non-interventional study, tigecycline tended to be used in patients with more severe co-morbidities. In spite of this, there was a trend towards shorter LOS, and treatment costs were significantly lower, which makes tigecycline an attractive treatment, also from a pharmacoeconomic standpoint.

Introduction The purpose of our study was to compare the effectiveness of colistin monotherapy and colistin combination therapy in the treatment of nosocomial infections with multiresistant organisms.

Methods Retrospective study of 63 patients, conducted over 3 years from January 2006 to December 2008 in the medical ICU of University Hospital Ibn Rochd, Casablanca, Morocco. The study included only patients suffering from nosocomial infection with multiresistant organisms; all infection sites were considered. The patients were divided into two groups: group 1, comprising 30 patients treated with colistin alone, and group 2, comprising 33 patients treated with the colistin-rifampicin combination. Colistin was administered intravenously, by nebulization and/or intrathecally according to the infection site. The primary endpoint was the mortality rate in the resuscitation unit; the secondary endpoints were ventilator weaning, the introduction of vasoactive drugs and the occurrence of side effects.

Results Sixty-three patients judged appropriate were included. The mean age of the patients was 43.62 ± 17.34 years and the APACHE II score at admission was 15 ± 5.69. Total mortality caused by infection was 41.27%. The basic characteristics of the two groups were similar. Mortality was 36.66% in group 1 and 69.69% in group 2 (P = 0.001); the rate of introduction of vasoactive drugs was 23.33% in group 1 versus 48.48% in group 2 (P = 0.03). In group 1, 6.66% of the patients developed renal failure, against 12.12% of the patients in group 2 (P = 0.46). With rifampicin, 27.27% of the patients in group 2 presented with cytolysis.

Conclusions This study suggests that colistin represents a good therapeutic alternative for the treatment of nosocomial infection with multiresistant organisms. However, our study is not without limits: it is retrospective, with no randomization and no control group.

This study compared the effect on renal function of flucloxacillin and vancomycin antibiotic prophylaxis for elective first-time coronary artery bypass grafting (CABG) surgery, using both direct biochemical markers and indirect clinical outcome measures. Recent evidence has suggested that vancomycin may be nephrotoxic in patients undergoing cardiac surgery.

Methods A retrospective observational study of patients undergoing elective first-time CABG was performed, covering a 13-month period. All patients received prophylactic antibiotics: flucloxacillin 1 g pre-operatively and three 1 g doses post-operatively. Patients who were MRSA-positive, of unknown MRSA status or penicillin allergic received an alternative regimen: vancomycin 1 g pre-operatively and 1 g post-operatively.
Exclusion criteria were preoperative creatinine >133 μmol/l, any antibiotics other than prophylaxis, and haemodynamic support other than dopamine <5 μg/kg/hour.

Results Of 1,413 patients in the study period, 415 met the study criteria: 360 patients received flucloxacillin and 55 patients received vancomycin. There were no significant differences between the two groups in sex, age, BMI, euroSCORE, diabetes status, ejection fraction, pre-operative creatinine, eGFR, sodium or potassium. Comparing the change in renal function from pre-operatively to post-operatively, there were no significant group differences in the change in creatinine (μmol/l; VAN median 0 (IQR 11); FLU -2 (19); P = 0.22), eGFR (ml/min; VAN 0 (14); FLU 2.4 (19.3); P = 0.22) or sodium (mmol/l; VAN 1 (4); FLU 1 (4); P = 0.28). The change in potassium differed significantly (mmol/l; VAN 0.7 (0.9); FLU 0.5 (0.7); P <0.05). In clinical outcome measures, the groups were similar. Most patients in both groups stayed in the ITU for 1 day, and there was no significant difference in the number of patients staying for longer than 1 day (VAN 7/55 (13%); FLU 29/360 (8%); P = 0.30). There was no difference in hospital length of stay (days; VAN 7 (4); FLU 6 (3); P = 0.19).

Conclusions In elective first-time CABG patients, there is no significant difference in the change in renal function between those given vancomycin antibiotic prophylaxis and those given flucloxacillin prophylaxis, as assessed by creatinine, eGFR and sodium levels and by indirect clinical outcome measures. Potassium increased more in the vancomycin group, but the clinical significance of this is unclear. Our data suggest that prophylactic vancomycin does not impair renal function relative to flucloxacillin.

Introduction Antibiotic dosing recommendations are usually based on plasma or cerebrospinal fluid pharmacokinetic (PK) studies. However, as infections mainly occur in extracellular tissue fluid (ECF), the corresponding unbound ECF antibiotic concentrations are responsible for the antimicrobial effect. Because of the blood-brain barrier, cerebral antibiotic distribution is thought to be reduced compared with tissues without any physiological barrier. This study aimed to determine unbound meropenem (MPM) concentrations in the brain and compare them with MPM concentrations in plasma, to explore the cerebral distribution of MPM in patients with acute brain injury.

Methods After local ethics approval and written informed consent, two brain-injured patients who were sedated, mechanically ventilated, receiving MPM for an infection and monitored by cerebral microdialysis (CMA 71, membrane length 10 mm, membrane diameter 0.6 mm, molecular cut-off 100 kDa; CMA, Stockholm, Sweden) were enrolled. The PK study followed administration of 1 g meropenem over 30 minutes, and brain dialysates and blood samples were collected over 420 minutes. Probe recoveries were evaluated individually by retrodialysis. MPM was assayed by HPLC coupled with tandem mass spectrometry (LC-MS/MS).

Results For each of the two patients, the MPM brain AUC was much lower than the plasma AUC; accordingly, the brain-to-serum AUC ratios were 0.73 for Patient 1 (P1) and 0.14 for Patient 2 (P2), respectively. The MPM concentration versus time curves in brain were delayed (time-to-peak 100 minutes in P1, 80 minutes in P2) and presented a smooth peak compared with the corresponding curves in plasma (Figure 1). Mean probe recoveries were 19 ± 7% for P1 and 29 ± 7% for P2, respectively.
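As a rough sketch of the arithmetic behind these results, the dialysate concentrations can be corrected for the individually determined probe recovery and the brain-to-plasma AUC ratio formed with the trapezoidal rule. The following Python fragment illustrates this under those assumptions; all sampling times and concentration values are hypothetical, not the patients' data.

    import numpy as np

    def correct_for_recovery(dialysate_conc, recovery):
        # Microdialysis reports diluted ECF levels; dividing by the relative
        # recovery (determined by retrodialysis, as above) estimates the true
        # unbound extracellular concentration.
        return np.asarray(dialysate_conc, dtype=float) / recovery

    def auc(times, concs):
        # Area under the concentration-time curve, trapezoidal rule.
        return np.trapz(concs, times)

    # Hypothetical sampling times (min) and concentrations (mg/l).
    t = np.array([0, 30, 60, 100, 180, 300, 420])
    plasma = np.array([0.0, 25.0, 18.0, 12.0, 6.0, 2.5, 1.0])
    dialysate = np.array([0.0, 0.5, 1.5, 2.0, 1.4, 0.7, 0.3])

    brain = correct_for_recovery(dialysate, recovery=0.19)  # e.g. 19% recovery
    print(f"brain/plasma AUC ratio: {auc(t, brain) / auc(t, plasma):.2f}")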
Conclusions The MPM brain AUC was much lower than the plasma AUC for the two patients enrolled, consistent with PK theory in the presence of tissue with efflux transporters. More patients are needed to better understand the characteristics of MPM brain distribution.

… [1]. The aim of this study was to explore the effect of changing this antibiotic prophylaxis protocol on the microbial flora in sputum.

Methods Retrospective study. All patients staying >10 days on the ICU after elective cardiovascular (CV) surgery between 1 June 2006 and 1 June 2007 (Group 1) were compared with the same group of patients between 1 July 2007 and 1 August 2008 (Group 2). Patients received single-dose prophylaxis (bypass surgery) or 2 days of prophylaxis (cardiac prosthetic surgery). Group 1 received cefuroxime and Group 2 received cefazolin. Patients did not receive selective digestive tract decontamination. Pathogens isolated from elective sputum cultures of all patients were registered and compared between groups after dividing the pathogens into six classes based on the profile of intrinsic antibiotic susceptibility. Comparative data between groups were analyzed with Pearson's χ2 test.

Results One hundred and fifty-eight patients had positive sputum cultures. Comparing total numbers of positive cultures, there was no significant difference in colonization with pathogenic micro-organisms between Group 1 (n = 77 patients) and Group 2 (n = 76 patients). In Group 1, more colonization with pathogens from class 4 (β-lactamase-producing Enterobacteriaceae and Pseudomonas aeruginosa) was observed compared with Group 2 (45% vs 34%, P = 0.079). This nonsignificant difference between groups was mainly attributable to a difference in colonization with Pseudomonas. In the other classes of pathogens, no differences were observed. No difference in postoperative wound infections was noted between groups.

Conclusions Prophylactic use of cefazolin instead of cefuroxime after CV surgery resulted in a trend towards reduced colonization of the respiratory tract with intrinsically β-lactam-resistant microbial flora in patients with prolonged ICU stay, without adverse effects on the incidence of postoperative wound infection.

Introduction We sought to study the characteristics and outcomes of ICU patients with carbapenem-resistant (CRKp) and carbapenem-sensitive (CSKp) K. pneumoniae infections.

Fifty-one (49%) were males. The mean APACHE II score was 17.9 ± 6.9. The median duration of hospital stay until the infection was 28 days. Forty-eight patients (46.2%) had bacteremia, 27 (30%) urinary tract infections, 15 (14.4%) pneumonia, seven (6.7%) peritonitis and seven (6.7%) skin and soft tissue infections. Fifty-eight (56.9%) and 39 (39%) patients had previous and concurrent infections, respectively. Seventy-six patients (73.1%) died. The univariate analysis showed that prior hospitalization (P = 0.049), dialysis (P = 0.034) and a history of urologic neoplasia (P = 0.041) were associated with the development of carbapenem-resistant infections. No independent risk factors were found in the multivariate analysis.
APACHE II score (P = 0.003), need for dialysis (P = 0.034), shock before and after the infection (P = 0.006 and P <0.001, respectively), respiratory distress before and after the infection (P = 0.021 and P = 0.032, respectively), multiorgan failure before and after the infection (P = 0.02 and P = 0.003, respectively), treatment failure (P <0.001) and acidosis after the development of infection (P = 0.003) were associated with death in the univariate analysis. Shock after the infection (P = 0.016) and treatment failure (P = 0.001) were independent predictors of mortality in the multivariate analysis. No difference in mortality was found between patients with CRKp and CSKp isolates.

Conclusions Infection due to K. pneumoniae in the ICU is associated with high mortality. Infection treatment and hemodynamic support of the patient may be important determinants of the clinical course in critically ill patients with such infections.

Introduction Multidrug-resistant (MDR) pathogens constitute an emerging threat with increasing incidence and uncertain outcome. On the other hand, patients undergoing open heart surgery represent a vulnerable population.

Methods Evaluation of the incidence and identification of MDR pathogens after cardiac surgery in 2,803 patients over a 2-year period; examination of the clinical features of patients with MDR infection; elucidation of the impact on outcome.

In 18 patients (0.64%), comprising 12 males and six females, at least one MDR pathogen was isolated. A Gram-positive pathogen was identified in five patients (27.8%) and a Gram-negative pathogen in 13 patients (72.2%). Specifically, four patients were infected with vancomycin-resistant enterococcus (VRE), six with Klebsiella pneumoniae, three with Acinetobacter spp., one with VRE and Acinetobacter spp., and four with Acinetobacter spp. and K. pneumoniae. Low output syndrome (CI <2.0 l/min/m2) was common to all these patients and contributed essentially to the deterioration of their clinical situation, with dependence on inotropic support, prolonged mechanical ventilation (>10 days), acute renal failure and need for haemodilution (66.6%). Accordingly, ICU and hospital stays were prolonged (>20 days and >30 days, respectively), with MDR infection arising after 20 days of ICU stay. Consequently, nine patients with MDR infection (50%) died; all were critically ill patients with multiple organ dysfunction syndrome, under broad-spectrum antimicrobials, with hospital-acquired bloodstream MDR bacteremia.

Conclusions Infection with MDR pathogens, while rare, constitutes a notable prognostic marker of increased mortality after cardiac surgery. It is worth noting that the higher mortality rate is mainly attributable to severe co-morbidity in haemodynamically compromised patients. Management must concentrate on the implementation of effective preventative strategies.

Introduction Prior antimicrobial therapy is one of the most important factors leading to the acquisition of MDR organisms. Formulating antibiotic policy and choosing empirical antibiotics would be helped by knowing the association of MDR Gram-negative organisms with previous exposure to a particular antibiotic.

Methods Prospective observational study from January 2008 to June 2009 in a 50-bed ICU in a tertiary care hospital. Specimens collected from 2 days after the start of an antibiotic to no more than 90 days after its stop date were included in the study.
Analyses were based on those specimens that resulted in detection of an MDR Gram-negative organism. The observed relative risk (RR) of an antibiotic class was computed with respect to an MDR infection: RR was computed as the ratio of the risk of the event (acquiring the infection) in the exposed group versus the nonexposed group. A logistic regression model was used where multiple antibiotics were applied.

Results A total of 1,072 specimens from 500 patients met the criteria specified above. Of these, 423 (39.4%) specimens resulted in detection of MDR bacteria, 186 (17.4%) resulted in detection of non-MDR bacteria, and no bacteria were detected in the remaining 463 (43.2%) specimens. Of the total 423 MDR acquisitions, 151 (35%) were ESBL-producing Enterobacteriaceae, 89 (21%) MDR Acinetobacter spp. and 58 (14%) MDR Pseudomonas spp. The risk of isolating ESBL-producing Enterobacteriaceae was highly significant with prior exposure to a third-generation cephalosporin (RR = 5.8, P <0.001). The risk of isolating MDR Acinetobacter spp. was highly significant with exposure to piperacillin-tazobactam (RR = 2.7, P <0.001). The risk of isolating MDR Pseudomonas spp. was significant with exposure to a Group 2 carbapenem (RR = 2.2, P <0.001). Group 1 carbapenems and aminoglycosides were not found to have a significant association with any individual MDR organism.

Conclusions Previous exposure to antibiotics leads to increased acquisition of MDR organisms. There is a significant association between isolating different MDR organisms and previous exposure to a particular class of antibiotic.

Introduction The emergence of multidrug-resistant Acinetobacter baumannii (MDRAB) poses a serious threat to patients on the ICU. The production of metallo-β-lactamase leaves colistin as the only therapeutic option. Outbreaks due to MDRAB can persist for months, and traditional decontamination methods fail to deal with this level of colonisation and contamination effectively. We tackled a recent outbreak of MDRAB effectively using gaseous ozone; to our knowledge this is the first time ozone has been used to control an outbreak of MDRAB.

Methods An external company (Hydrozone Environmental Ltd) was hired to perform the fumigation. The ICU was divided into three decontamination areas using heavy-duty polythene sheets. Patients were in turn relocated from contaminated to clean areas before each area was sealed and fumigated. Humidity levels within were raised to 70 to 80% using a humidifier. An Ozone Ultra Pro 16 g/hour ozone generator with ozone-destruct capability, operated remotely, delivered ozone to a target concentration of >2.0 ppm for 15 minutes. A fan was used to achieve even dispersal. For safety reasons, perimeter ozone concentrations were monitored with a UV photometer and kept below 0.05 ppm. The efficacy of the fumigation was measured by environmental microbiological sampling before and after fumigation.

Results All fumigated areas received ozone concentrations of 4.62 to 5.66 ppm for 21 to 32 minutes. Ozone was not detected outside the treatment areas. Prior to fumigation, 72 (38%) of 188 environmental samples were MDRAB-positive. Following fumigation, nine (5%) of 158 samples were positive. Most of these samples were from non-touch areas, for example the ceiling and above door frames, with significant dust collection and no daily cleaning.
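The exposure and safety targets quoted above (>2.0 ppm inside the sealed area for at least 15 minutes, <0.05 ppm at the perimeter) lend themselves to a simple automated check against logged sensor readings. A minimal Python sketch with entirely hypothetical log values follows; the study itself used a commercial fumigation service, not any such script.

    # Hypothetical ozone logs as (minutes elapsed, ppm) pairs -- not study data.
    room_log = [(0, 0.3), (5, 2.4), (10, 4.8), (20, 5.2), (30, 4.9), (35, 1.0)]
    perimeter_log = [(0, 0.01), (10, 0.02), (20, 0.03), (30, 0.02)]

    TARGET_PPM, TARGET_MINUTES = 2.0, 15   # in-room exposure target
    PERIMETER_LIMIT_PPM = 0.05             # safety limit outside treated areas

    def minutes_at_or_above(log, threshold):
        # Sum interval lengths whose readings at both ends meet the threshold.
        total = 0.0
        for (t0, c0), (t1, c1) in zip(log, log[1:]):
            if c0 >= threshold and c1 >= threshold:
                total += t1 - t0
        return total

    exposure_ok = minutes_at_or_above(room_log, TARGET_PPM) >= TARGET_MINUTES
    perimeter_ok = all(ppm < PERIMETER_LIMIT_PPM for _, ppm in perimeter_log)
    print(f"exposure target met: {exposure_ok}; perimeter safe: {perimeter_ok}")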
Considering that dust may impede ozone penetration, …

Introduction In the current pandemic it is likely that some patients will be admitted to hospital and require respiratory support including mechanical ventilation. These patients are likely to have a profound systemic inflammatory response syndrome (SIRS); consequently they may have multiorgan failure (MOF) requiring renal replacement therapy (RRT) with haemofiltration. Two questions then arise: what dose of oseltamivir (Tamiflu) should we give these patients to shorten the duration of the H1N1 infection, and how should we modify the dose in response to altered renal drug clearance and in those requiring RRT?

Methods A young adult female patient with H1N1 infection and MOF was given oseltamivir 75 mg BD nasogastrically. Failure to respond changed the risk:benefit ratio and justified doubling the dose despite uncertainties over an overall reduced clearance. Enteral absorption was uncertain, and we therefore sampled her blood to ensure adequate oseltamivir absorption and that activation of the pro-drug was not inhibited. We undertook serial sampling for blood concentration assay to determine the pharmacokinetic parameters in this difficult scenario. Blood samples were collected in plain serum tubes (without sodium fluoride), spun and refrigerated within half an hour, then batched and shipped to Bangkok for drug concentration measurement.

Results We report levels of both the parent oseltamivir phosphate (OP) and the active metabolite oseltamivir carboxylate (OC). OP levels were low at 10 to 77 ng/ml, but OC concentrations were high at 2,600 to 5,000 ng/ml.

Conclusions The population normal half-life of OP is 1 hour and that of OC is 3 to 5 hours. A single dose of 150 mg OP is expected to achieve an OP level of 50 to 150 ng/ml and an OC level of 1,000 to 1,500 ng/ml. Our slightly low OP levels are likely to be due to ex vivo hydrolysis in the collection tube in the absence of an esterase inhibitor. The high OC levels are most probably due to reduced renal elimination despite haemofiltration. Our concerns were focused on the possibility of viral mutation (subsequently shown to be negative) or poor enteral absorption/activation. What we found was that 150 mg BD produces more than adequate OC levels to treat H1N1 infection.

Introduction Invasive candidiasis (IC) is associated with increasing morbidity and mortality in critically ill patients. This, in conjunction with difficulties in diagnosis, underscores the need for novel treatment strategies based on the identification of significant risk factors for IC. The aim of the study was to evaluate the efficacy and safety of a protocol for pre-emptive antimycotic treatment.

Methods A randomized prospective controlled trial was carried out in a general ICU over 2 years. After application of the inclusion and exclusion criteria, patients underwent block randomization and were stratified on the basis of their initial SAPS II expanded score. We developed a protocol for pre-emptive antimycotic treatment: having reviewed the current literature, we combined the most significant risk factors for IC with three major clinical criteria for persistent nonbacterial sepsis, and took this algorithm as an indication for starting pre-emptive therapy. According to the protocol, antimycotic therapy was started on the day of inclusion in the treatment group, and only upon proven IC in the control group.
Initial data were gathered on the demographic characteristics of the patients, proven risk factors for IC-related mortality (malnutrition, non-albicans colonization, creatinine clearance) and the severity of inflammatory response and organ dysfunction. The dynamics of SIRS and SOFA scores, subsequent Candida isolates, ventilator-free days, length of ICU stay, outcome and any adverse reactions were followed.

Results A total of 110 patients (equally divided between the groups) were enrolled. No statistically significant differences were found in the basal characteristics of the patients, length of ICU stay or the number of ventilator-free days. The delta SOFA score was significantly lower in the treatment group (P = 0.019). In-hospital mortality was 38.2% in the treatment group vs 61.8% in the control group (P = 0.013). The relative risk associated with pre-emptive therapy was 0.62 (95% CI = 0.4 to 0.94). Significant differences between the Kaplan-Meier estimates of survival were found (log-rank test P = 0.007). A total of 15 (13.6%) adverse reactions were observed among treated patients in both groups, and these were not associated with a higher mortality risk.

Conclusions Implementation of the developed protocol reduced the degree of organ dysfunction severity and was associated with a significant survival benefit.

The 2009 IDSA Treatment Guidelines for Candidiasis favor an echinocandin for initial treatment of candidemia in patients with severe illness. In a prospective, randomized study, anidulafungin resulted in improved global response (GR) and a trend toward improved survival compared with fluconazole [1].

Methods Retrospective analysis in patients classified as severely ill at study entry: treatment initiated in an ICU (Group 1), APACHE II score ≥15 (Group 2) or presence of severe sepsis (Group 3). Within groups, anidulafungin was compared with fluconazole for GR rate at the end of intravenous therapy and for 14-day and 28-day mortality.

Results In Group 1 (n = 89), GR was 63.3% vs 45.0% (95% CI: -2.2 to 38.8); in Group 2 (n = 113), GR was 68.3% vs 46.0% (95% CI: 4.3 to 40.2); and in Group 3 (n = 118), GR was 67.7% vs 51.8% (95% CI: -1.6 to 33.5); in patients with MOD (n = 45), GR was 76.2% vs 29.2% (95% CI: 21.3 to 72.8), anidulafungin versus fluconazole, respectively. Across groups, an association between anidulafungin use and lower day-14 mortality was suggested (12.2% to 14.3% for patients receiving anidulafungin vs 19.6% to 28.0% for those receiving fluconazole) (P = NS). See Figure 1.

Conclusions In patients with severe illness, anidulafungin was associated with a greater GR than fluconazole, significantly so for those with an APACHE II score ≥15 or with MOD, supporting the IDSA Guidelines.

Introduction Direct hemoperfusion using a polymyxin B-immobilized fiber column (DHP-PMX; Toray Industries Inc., Tokyo, Japan) was first developed in 1994 and has since been used for the treatment of septic shock.

Methods A total of 47 patients with septic shock who received DHP-PMX for 2 to 6 hours were retrospectively reviewed to examine any improvement in sepsis-related factors after DHP-PMX and to analyze the relationship between any such improvement and the increase in SBP.

Introduction Direct hemoperfusion using a polymyxin B-immobilized fiber column (DHP-PMX) has been used for the treatment of septic shock.
As an alternative method of acute blood purification therapy, continuous venovenous hemodiafiltration (CVVHDF) has been reported to be an effective clinical treatment for critically ill patients; however, the optimal column for performing CVVHDF remains controversial. Recently, moreover, the endocannabinoid lipid mediators N-arachidonoylethanolamine (AEA) and 2-arachidonoyl glycerol (2-AG) have been reported to have a blood pressure-lowering effect.

Methods We investigated 14 polymethylmethacrylate (PMMA) membrane hemofilters and three polyacrylonitrile (PAN) membrane hemofilters after use in patients with septic shock. In the clinical study, we used CVVHDF after DHP-PMX to treat 32 patients with septic shock. To determine the optimal acute blood purification therapy, we divided the patients into two groups: group A underwent CVVHDF using a PMMA membrane hemofilter after undergoing DHP-PMX (n = 25); group B underwent CVVHDF using a PAN membrane hemofilter after undergoing DHP-PMX (n = 7). In addition, the levels of endocannabinoids (AEA, 2-AG) were measured. Severity scores and the improvement in endocannabinoid levels were compared between the two groups.

Results Endocannabinoids (AEA, 2-AG) were adsorbed more in the PMMA column (AEA: 506.3 ± 680.2 ng/column; 2-AG: 23.0 ± 38.5 μg/column) than in the PAN column (AEA: 1.5 ± 0.7 ng/column; 2-AG: 0.1 ± 0.1 μg/column). The average Acute Physiology and Chronic Health Evaluation (APACHE) II score and the average sepsis-related organ failure assessment (SOFA) score did not differ significantly between the two groups. Group A showed a better outcome than group B (P = 0.05). In addition, only group A showed a significant improvement in the blood AEA level on day 1 (P = 0.0185).

Conclusions Our study suggests that the PMMA column might be the better column for performing CVVHDF after DHP-PMX treatment, as suggested by the adsorption and blood purification of endocannabinoids.

Introduction Diagnosis and treatment of acute pneumonia (Pn) is a problem of high significance in modern medicine. The incidence of complicated and lethal acute Pn has increased [1]. Candidate genes are of great importance in the course of acute diseases and their complications; cytokine and xenobiotic detoxification genes are the most investigated. The aim of the investigation was to study genetic predisposition to acute Pn.

Methods Results of associative DNA polymorphism studies in 243 patients with acute community-acquired Pn are presented; 178 healthy individuals formed a control group. Genetic variability of the candidate loci was studied: the renin-angiotensin system ACE gene, the chemokine receptor CCR5 gene and four genes controlling xenobiotic detoxification (CYP1A1, GSTM1, GSTT1, GSTP1). Multiplex polymerase chain reaction was used for genotyping of the insertion-deletion polymorphisms at the ACE (287 nucleotide pairs) and CCR5 (deletion of 32 nucleotide pairs) loci. The odds ratio was used to describe the degree of association of the genotypes with the disease. Statistical analysis was done by means of Fisher's exact test and the online program SNPStats (http://bioinfo.iconcologia.net).

Results An increased predisposition to Pn development was registered in homozygotes for the deletion at the ACE locus (OR = 1.8; P = 0.013), positive genotypes of the GSTM1 locus (OR = 1.7; P = 0.010) and homozygotes for allele 606T of the CYP1A1 gene (OR = 1.6; P = 0.020).
Conclusions The combination of positive genotypes of the GSTM1 locus and homozygosity for allele 606T of the CYP1A1 gene (OR = 1.9, P = 0.006; incidence in controls >20%) showed the most effective prognostic power.

Introduction The progression of airway colonization (AC) to ventilator-associated tracheobronchitis (VAT) and from VAT to ventilator-associated pneumonia (VAP) has been analyzed but not fully elucidated. We endeavored to study the relationship between AC and the development of infections (VAT and VAP) of the LRT in ICU patients.

Methods Retrospective study of 400 consecutive ICU patients ventilated >48 hours in the past 4 years. Patient age, gender, APACHE II score, prior illness, cause of admission, length of stay (LOS), duration of mechanical ventilation (MV), outcome, and time of appearance of AC and of infection of the LRT were registered. MODS and CPIS were measured at onset of infection (VAT-VAP) and 3 days after; MODS was also calculated on the day AC appeared. Bronchial secretions were cultured at admission, at least once a week and whenever there was a change in the amount or quality of bronchial secretions or clinical infection was suspected. The Mann-Whitney test was used for statistical analysis and statistical significance was set at P <0.05.

Results Of the 400 studied patients, 68 (17%) were colonized and 153 (48.25%) developed infections: 54 (13.5%) VAT and 99 (24.7%) VAP. Colonization appeared after 3.1 ± 0.8 days of ICU admission, exclusively with Gram-negative microorganisms. Twenty-five (36.7%) of the colonized patients developed infections of the LRT (with the same pathogen): 8 (11.7%) developed VAT after 4.6 ± 2.7 days and 17 (25%) were diagnosed with VAP after 9.8 ± 1.8 days. Of the eight VAT patients, four (50%) developed VAP after 5.4 ± 0.4 days. None of the colonized patients died. The colonized patients who developed infections were older (P <0.05) and more severely ill (P <0.04), and at the time of infection diagnosis had a higher temperature (P <0.01) and more severe leukocytosis (P <0.02), but without statistically significant differences in organ dysfunction (P >0.3). The appearance of VAP caused more severe organ dysfunction (P <0.002), longer MV duration (P <0.01) and longer LOS (P <0.001) but did not influence mortality. On the day of AC detection MODS was 2.4 ± 0.5, on VAT detection 4.8 ± 0.9 and on VAP appearance 9.6 ± 0.7. CPIS at VAT detection was 3.9 ± 0.9 and at VAP detection 6.1 ± 0.5 (day I) and 7.2 ± 0.8 (day III).

Conclusions AC resulted in LRT infections in only one-third of our patients, and the majority of LRT infections were not preceded by AC.

Introduction In our murine model of influenza, significant weight loss occurs up to day 7 post-infection [1]. We sought to determine whether weight loss from influenza could be altered by rehydration and whether this affects pulmonary immune responses.

Methods Adult BALB/c mice were infected with X31 (H3N2) influenza (1:80) via the intranasal route and randomized to intraperitoneal rehydration with 20 ml/kg compound sodium lactate (CSL), normal saline (NS) or no rehydration (NR) starting on day 3 following infection and continued for 4 days (n = 5/group). On day 7, mice were challenged with 1 × 10⁶ Streptococcus pneumoniae (serotype 2). Two further cohorts of mice were challenged with different doses of influenza and rehydrated from day 3 to 7 to investigate pulmonary immune responses in the absence of bacteria.
Mice were infected with 1:80 (n = 10/group) influenza and rehydrated once daily, or with 1:60 (n = 5/group) influenza and rehydrated twice daily, with 20 ml/kg CSL or not. Daily weight, survival following secondary bacterial pneumonia, number of colony-forming units (48 hours after bacterial challenge) from peripheral blood, lung and nasal wash, and cellularity in lung compartments were measured.

Results Rehydration did not affect weight loss following 1:80 influenza infection (naïve mice (+0.3 ± 0.4 g), influenza plus NR (-1.58 ± 0.4 g), influenza plus CSL (-1.1 ± 0.7 g) and influenza with NS (-1.3 ± 0.4 g)). A repeat experiment with CSL once daily or twice daily did not alter weight loss compared with NR (P >0.05). Survival and CFU counts following bacterial pneumonia did not differ between the groups (P >0.05). The total number and activational status of bronchoalveolar and lung macrophages/monocytes and lymphocytes were not affected by rehydration following influenza infection or 48 hours following bacterial pneumonia (P >0.05; P <0.05 vs naïve mice).

Conclusions Rehydration does not affect immunity or pathophysiology in a murine influenza infection model. Assuming these results can be extrapolated to the clinical setting, our findings support the use of conservative fluid resuscitation strategies in patients with influenza.

Introduction Bacteremic pneumonia is associated with worse outcome, including higher mortality. The ATTAIN program compared telavancin (TLV), a lipoglycopeptide antibiotic, with vancomycin (VAN) for treatment of nosocomial pneumonia (NP) due to Gram-positive pathogens including MRSA. This subgroup analysis examined the baseline characteristics and clinical outcomes in bacteremic NP cases.

Methods ATTAIN 1 and 2 were methodologically identical, randomized, double-blind, phase 3 studies. Adult patients with NP due to presumed or confirmed Gram-positive pathogens were randomized (1:1) to TLV 10 mg/kg intravenously every 24 hours or VAN 1 g intravenously every 12 hours (adjusted per site-specific guidelines) for 7 to 21 days. The modified all-treated (MAT) population consisted of patients who received ≥1 dose of study medication and who had a respiratory pathogen recovered from baseline cultures. Bacteremic NP was defined by the identification of a pneumonia-causing pathogen in the blood, or of the same pathogen in lung and blood with identical susceptibility profiles. Clinical outcomes were assessed at test-of-cure (TOC), 7 to 14 days after the end of study treatment.

Results All MAT patients with bacteremic NP (n = 73) were included in this analysis. At baseline, more TLV patients than VAN patients were in the ICU (TLV 74%, VAN 62%) and had ventilator-associated pneumonia (TLV 59%, VAN 44%); APACHE II scores were similar between groups (mean ± SD: TLV 16 ± 6, VAN 17 ± 6). S. aureus was the most common pathogen (TLV 76%, VAN 69%) and included MRSA (TLV 41%, VAN 49%). Cure rates for TLV and VAN were 44% and 36%, respectively (difference TLV - VAN (95% CI) = 7.3% (-15.9%, 30.5%)). On-study mortality was similar, 41% in each treatment group. Incidences of adverse events (AE) were similar between groups, except for nausea (TLV 21%, VAN 3%) and vomiting (TLV 15%, VAN 0%). Proportions of patients who discontinued the study medication due to AEs were similar (TLV 12%, VAN 13%).

Conclusions TLV and VAN had similar cure rates in a subgroup of ATTAIN patients with bacteremic NP. The safety profiles of TLV and VAN were mostly comparable in these patients.
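The cure-rate comparison above (44% vs 36%; difference 7.3%, 95% CI -15.9% to 30.5%) can be reproduced in outline with a standard Wald interval for a difference of proportions. The sketch below (Python) is illustrative only: the abstract reports just the pooled n = 73, so the per-arm denominators used here are hypothetical.

```python
import math

def diff_of_proportions_ci(x1: int, n1: int, x2: int, n2: int, z: float = 1.96):
    """Difference in proportions with a Wald 95% CI."""
    p1, p2 = x1 / n1, x2 / n2
    diff = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return diff, diff - z * se, diff + z * se

# Hypothetical per-arm counts (not stated in the abstract):
d, lo, hi = diff_of_proportions_ci(x1=16, n1=36, x2=13, n2=37)
print(f"difference = {d:.1%}, 95% CI ({lo:.1%}, {hi:.1%})")
```

The width of such an interval at n = 73 illustrates why a 7 to 8 point difference in cure rate is compatible with no effect, which is what the authors' conclusion of similar cure rates reflects.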
Analyses were performed in the modified all-treated population (MAT; patients with baseline respiratory pathogen(s) who received ≥1 dose of study medication) and the microbiologically evaluable population (ME; protocol-adherent MAT patients with baseline Gram-positive pathogen(s)). Patients with mixed Gram-positive/Gram-negative infections were excluded from this analysis.

Results A total of 197 late VAP cases were analyzed. Baseline characteristics, including APACHE II scores, were balanced between the treatment groups (Figure 1). At least one adverse event (AE) was reported by 95% (106/112) and 93% (79/85) of MAT patients in the TLV and VAN groups, respectively, and 21% (23/112) of the TLV group and 22% (19/85) of the VAN group died during the study.

Conclusions In this exploratory subgroup analysis, numerically higher cure rates were observed for TLV than for VAN in patients with late VAP. Incidences of reported AEs and mortality rates were similar between the TLV and VAN groups.

Introduction VAP rates in Brazil are higher than those reported in Europe and the USA. The study objective was to examine the effect of the Institute for Healthcare Improvement's (IHI) ventilator bundle plus oral decontamination with chlorhexidine (ODC) on the incidence of VAP in an ICU.

Methods The study was conducted in a 20-bed, medical-surgical ICU. Criteria for nosocomial pneumonia were those of the CDC. The strategy was to implement the IHI's ventilator bundle plus ODC. The goals were ICU team adherence of 80%, achieved in the ninth month after bundle implementation, and 98% after 1 year of follow-up. These measures included five strategies to prevent ventilator-associated pneumonia: 30 to 45° elevation of the head of the bed, adequate sedation level (Ramsay 2 or 3), DVT/PE prevention, peptic ulcer prophylaxis and oral decontamination with chlorhexidine 0.12%. From February 2009 onwards, the ICU nursing staff and ICT performed a daily checklist of the five items; if any item was found to be inadequate it was promptly corrected.

Introduction This systematic review aims to evaluate evidence from randomised controlled trials (RCTs) for oral chlorhexidine in preventing nosocomial pneumonia in intubated, mechanically ventilated, critically ill adults. Use of oral chlorhexidine appeals since it should reduce bacterial aspiration from the oropharynx. A number of RCTs have recently been published on this topic.

Methods Search of Medline, Embase, the Cochrane Library, grey literature registers, conference proceedings and reference lists for RCTs comparing chlorhexidine with placebo or standard care for prevention of pneumonia in the critically ill. Outcomes: episode of nosocomial respiratory tract infection (RTI), mortality, duration of mechanical ventilation (MV) and length of ITU stay (ITU LOS). Review Manager 4.2 (Nordic Cochrane Centre) was used for data synthesis. Effect estimates (odds ratio for dichotomous and weighted mean difference for continuous data) were calculated using a random effects model.

Results Fourteen studies were identified, three involving patients undergoing cardiac surgery (1,841 patients) and 11 involving patients in noncardiothoracic ITUs (1,497 patients; see Table 1). Five studies (including two cardiac studies) found a significant reduction in episodes of nosocomial RTI in the chlorhexidine-treated group versus placebo or standard care.
Pooled data indicated a significant reduction in nosocomial RTI in the treatment group among all patients, and among the cardiac and noncardiac subgroups (odds ratio 0.57 (95% CI 0.42 to 0.77), 0.52 (0.37 to 0.75) and 0.60 (0.40 to 0.89), respectively). However, no significant differences in mortality, duration of mechanical ventilation or ITU stay were demonstrated. Significant heterogeneity (I² statistic >40%) was detected for all outcomes except mortality.

Conclusions Use of oral chlorhexidine is associated with a reduction in nosocomial respiratory tract infection in intubated, mechanically ventilated, critically ill adults.

Introduction We have shown that the combination of selective digestive decontamination with topical antibiotics (SDD) and a decontamination regimen using nasal mupirocin with chlorhexidine bodywashing (M/C) markedly reduced acquired infections (AI) in intubated patients as compared with SDD alone, M/C alone or neither [1]. We report the surveillance of AI in our ICU before and after the implementation of multiple site decontamination (MSD) as a routine prevention procedure.

Introduction The aim of this study was to analyze the correlation between antiviral therapy efficacy and the negative profile of RT-PCR performed on pharyngeal swab, subglottic aspiration and bronchoalveolar lavage in patients affected by ARDS caused by H1N1 infection.

Methods A prospective analysis was performed on 11 patients admitted to the ICU of a tertiary referral center (Careggi Teaching Hospital, Florence, Italy). All patients underwent daily RT-PCR monitoring on pharyngeal swab, subglottic aspiration and bronchoalveolar lavage. All patients were treated with oral oseltamivir (75 mg twice daily) and inhaled zanamivir (10 mg twice daily) from ICU admission. Six patients were treated with extracorporeal membrane oxygenation (ECMO) due to their critical respiratory condition. Two of them were co-infected with Legionella pneumophila.

Results As shown in Figure 1, RT-PCR from pharyngeal swab at ICU admission failed to demonstrate the viral infection in four patients, whereas RT-PCR from bronchoalveolar lavage had a sensitivity of 100%. Similarly, the time course showed that RT-PCR from pharyngeal swab became negative an average of 3 days after the start of therapy, while RT-PCR from bronchoalveolar lavage continued to permit infection monitoring and guidance of the therapeutic regimen. None of the RT-PCRs on subglottic aspiration samples was positive. All patients recovered and were discharged alive from the ICU breathing spontaneously.

Conclusions In our experience, the most reliable method to diagnose and monitor H1N1 infection was RT-PCR from bronchoalveolar lavage, since pharyngeal swabs do not offer sufficient sensitivity, either for antiviral therapy initiation or for antiviral therapy management. Samples from subglottic aspiration can be avoided owing to their low sensitivity.

Introduction Bedside lung ultrasound (LU) is able to identify most pulmonary pathological patterns with high sensitivity and specificity and is widely adopted in the daily management of critically ill patients. It is a feasible and reliable method for the identification of the lung pathological patterns caused by H1N1 influenza infection.

Methods The study took place in the ICU of a regional referral center, with ECMO availability, for respiratory failure (Careggi Teaching Hospital, Florence, Italy). Eight patients admitted for H1N1-induced ARDS (September to October 2009) underwent daily LU examination.
The examination was standardized with an ad hoc procedure in order to achieve complete and comparable reports for every patient. Patients were examined supine, taking lateral and anterior views and, if possible, on one side. Intercostal spaces were used as acoustic windows. Every examination and its findings were recorded in our ICU database.

Results Pleural thickening was present in 100% of cases, mostly related to small, multiple pleural consolidations. Pleural gliding was nonetheless present most of the time (87.5%), albeit with a visible decrease in its movement; it was replaced by the lung pulse only in the presence of important lung consolidation. Alveolar interstitial syndrome (AIS) was always present over the whole lung, ranging from moderate to severe (100%), with high positivity at the base, posteriorly (100%). White lung appeared in every patient, mostly at the lung base and in the middle fields, posteriorly. Consolidation was confirmed in 100% of the patients, associated with multiple satellite subpleural consolidations in 37.5% of patients. The basal lung was always involved (100%), followed by the middle (50%) and apical fields (25%). A bronchogram was present within the larger consolidations in 100% of patients; its aspect was aerial, turning into a fluid one only in one patient with severe consolidations. Anechoic pleural effusions were found in 37.5% of patients. No cases of pneumothorax were detected.

Conclusions In this group of patients, H1N1 infection showed several distinct lung patterns, of which the most frequent seemed to be severe basal posterior AIS, multiple subpleural lung consolidations, and multiple parenchymal consolidations with bronchogram. The presence of spared areas did not seem to belong to the H1N1 LU pattern.

Introduction Pneumocystis pneumonia (PCP) in HIV-negative patients frequently presents as fulminant respiratory failure and is associated with a high mortality rate when the patient requires mechanical ventilation (MV). The aims of this study were to evaluate the outcome and prognostic factors in patients with HIV-negative PCP requiring MV.

Methods We retrospectively reviewed the medical records of HIV-negative patients with microbiologically confirmed PCP who required MV in the ICU over a 10-year period in a tertiary care teaching hospital.

Results A total of 51 patients were identified. Mean age was 55.4 ± 15.0 years. Mean APACHE II score at ICU admission was 25.7 ± 5.8. The 28-day and in-hospital mortality were 45.8% and 66.7%, respectively. Between survivors and nonsurvivors there were no significant differences in baseline characteristics, APACHE II score, PaO2/FiO2 ratio, or absolute neutrophil count on the day of ICU admission. Mortality also did not differ in relation to the presence of barotrauma, application of noninvasive ventilation, timing of susceptible antibiotic administration, changing or not to salvage regimens, presence of cytomegalovirus coinfection, or even microbiologic persistence in follow-up specimens.
Based on the type and intensity of previous immunosuppressive therapy, we classified patients into three subgroups: patients receiving low-dose steroid maintenance ± other immunosuppressive agents (LS), representing previously stable organ transplants; patients receiving recent intensive chemotherapy (CTx); and patients receiving high-dose (defined as >2 weeks of at least 1 mg/kg) steroid therapy ± other immunosuppressive agents (HS). Significant differences in outcome were observed among the three groups (28-day mortality: LS = 22.2%, CTx = 29.4%, HS = 71.4%, P = 0.01; 60-day mortality: LS = 33.3%, CTx = 64.7%, HS = 81.0%, P = 0.04).

Methods BSA was derived using the Mosteller formula on the metric equivalents of simulated patients ranging from 5 ft to 7 ft and 100 lbs to 700 lbs. A priori, we defined normal CO = 4 to 8 l/minute, normal CI = 2.5 to 5.0 l/minute/m², normal SV = 60 to 100 ml/beat, and normal SI = 33 to 47 ml/beat/m². Algebraic analysis was used to determine BSA levels that would classify an SV or CO as abnormal.

Results Critical BSA thresholds (T) are presented in Table 1. For example, at SV = 100, a BSA higher than 3.03 (to the second decimal place) would classify the patient as having a low SI.

Conclusions Patients with extreme BSAs are increasingly encountered in the ICU, especially larger BSAs related to obesity. We provide threshold values at which extreme BSAs will classify high SV or CO values as low indexed values. The ranges considered normal for SI and CI may be inappropriate for patients with extreme BSAs, particularly the obese. We caution against relying solely on the SI and CI to assess hemodynamic performance; instead, the SV and CO, along with other physiological parameters, should also be considered before making therapeutic decisions.

Introduction It remains a great challenge to measure systemic blood flow in critically ill newborns, especially during the transitional period with intracardiac and extracardiac shunts. Owing to technical constraints, size limitations, the need to withdraw a relatively large amount of blood and possible indicator toxicity, many methods of cardiac output monitoring are not feasible; hence cardiac output is generally estimated from indirect parameters of systemic blood flow. In a former study we assessed the agreement for cardiac output between the ultrasound dilution method (UDCO) and ultrasound transit time-based measurement of main pulmonary blood flow in a juvenile piglet model without shunts [1]. In the present study we analyzed the influence of a left-to-right shunt on the agreement between UDCO and ultrasonic transit time pulmonary blood flow in a juvenile lamb model.

Methods In this prospective, experimental animal study, which was approved by the Ethical Committee on Animal Research of the Radboud University Nijmegen, we placed a Gore-Tex® shunt between the left pulmonary artery and the descending aorta in eight random-bred newborn lambs (3.5 to 8.3 kg). This aortopulmonary shunt was intermittently opened and closed while cardiac output was manipulated by creating hemorrhagic hypotension. Ultrasound dilution cardiac output (Q-UDCO), using repeated injection of 1.0 ml/kg isotonic saline at body temperature, was compared with pulmonary blood flow (Q-MPA) invasively measured by a perivascular flow probe around the main pulmonary artery.

Results (Figure 1): an increase in NICOM SV >8% showed a sensitivity of 76% and a specificity of 75% for predicting PulseCO changes >10%.
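The threshold algebra in the body-surface-area abstract above can be checked directly. A minimal sketch (Python) using the Mosteller formula and the normal ranges stated in that abstract; the example height and weight are our own illustrative values:

```python
import math

def mosteller_bsa(height_cm: float, weight_kg: float) -> float:
    """Body surface area (m^2) by the Mosteller formula."""
    return math.sqrt(height_cm * weight_kg / 3600.0)

SI_LOW, CI_LOW = 33.0, 2.5   # lower limits of normal SI and CI (per the abstract)

def bsa_threshold_low_si(sv_ml_per_beat: float) -> float:
    """BSA above which SI = SV/BSA drops below the normal range."""
    return sv_ml_per_beat / SI_LOW

def bsa_threshold_low_ci(co_l_per_min: float) -> float:
    """BSA above which CI = CO/BSA drops below the normal range."""
    return co_l_per_min / CI_LOW

print(round(bsa_threshold_low_si(100.0), 2))  # -> 3.03, the example in the abstract
print(round(bsa_threshold_low_ci(8.0), 2))    # -> 3.2 for a high-normal CO
print(round(mosteller_bsa(183.0, 180.0), 2))  # -> 3.02, e.g. a 6 ft, ~400 lb patient
```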
Conclusions NICOM demonstrated moderate agreement with LiDCO but showed excellent agreement with PulseCO in tracking CO changes following therapeutic interventions.

Introduction Arrhythmias are common among high-risk surgical and ICU patients. The PulseCO pressure waveform algorithm is used in both the LiDCO™plus and LiDCOrapid hemodynamic monitors, which are frequently used to estimate cardiac output (CO) in critically ill patients. Cardiac arrhythmias could increase the variation of both the lithium dilution (LiDCO) and the PulseCO measurement. At set-up, the algorithm is calibrated by comparing the PulseCO CO estimate, averaged over 30 seconds, with a known CO (normally LiDCO) to generate a calibration factor (CF) [1]. This study was designed to explore the effect of arrhythmias on the accuracy of CF generation in the PulseCO monitor.

Methods LiDCO™plus hemodynamic data files were obtained retrospectively from a university hospital medical/surgical ICU. Files were separated into records with and without arrhythmia, defined as heart rate variation (HRV) >5% during at least one additional CF determination after monitor set-up. Previous studies have established the coefficient of variation (CV) of a single LiDCO determination at 8% [2] and of the PulseCO measurement at 2.4% [3]. A combined CV, reflecting the effect of calibration, is estimated at 8.5%, resulting in an expected precision for the CF of 17%. Data were analysed for variation in CF against HRV using linear regression and Student's t test.

Results Twenty-eight records were collected and analysed. Twenty-one records contained 32 post-set-up calibration events. Of these, 17 occurred with HRV ≤5% (median 1%, range 0 to 5%) and 15 occurred with HRV >5% (median 19%, range 7 to 26%). The average variation in CF was 5.4 ± 4.0% of the initial value during low HRV and 8.9 ± 8.1% during high HRV. The t test indicated no difference in the mean (P = 0.162) or median variation of CF. There was no correlation between HRV and CF variation (r² = 0.002). Ninety-one per cent (29/32) of the observed CF variations were less than 17% of the initial CF value.

Conclusions CF determinations are not significantly affected by HRV.

Introduction We assessed the validity of an arterial waveform-based device for measuring cardiac output (CO) without the need for invasive calibration. The method is based on dilution-curve analysis (… × COe × MTt, where S1 and S2 are, respectively, the maximum upslopes and downslopes of the dilution curve). One hundred and thirty-seven measurements were performed during inotropic stimulation (dobutamine), during hypovolemia (bleeding), during hypervolemia (fluid overload), and after induction of acute lung injury (oleic acid).

Introduction As the need for recalibration and the best time to recalibrate have been matters of debate since the introduction of the PiCCO monitor (Pulsion, Germany), we set out to analyze the performance of its pulse contour (PC) analysis without recalibration over a 24-hour period.

Methods We studied the cardiac index (CI) over a 24-hour period in eight nonoperative patients admitted to our ICU. Seven CI measurements (median number; every 4 hours) by thermodilution (TD) were performed in triplicate in each patient; PC-derived data were recorded continuously. We used a special PiCCOplus monitor with the auto-recalibration feature disabled; that is, TD did not lead to an automatic calibration of the PC analysis. Calibration was performed manually only once, at the start of the analysis for each patient. Later TD measurements were recorded but had no effect on the CIPC.
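The expected 17% calibration-factor precision cited in the PulseCO arrhythmia abstract above follows from combining the two quoted coefficients of variation in quadrature (assuming independent errors) and doubling for approximate 95% limits; a one-line check:

```python
import math

cv_lidco, cv_pulseco = 8.0, 2.4                 # % CVs cited in the abstract
cv_combined = math.hypot(cv_lidco, cv_pulseco)  # sqrt(8.0**2 + 2.4**2)
print(f"combined CV ~ {cv_combined:.1f}%, CF precision ~ {2 * cv_combined:.0f}%")
# -> combined CV ~ 8.4% (quoted as 8.5%), expected CF precision ~ 17%
```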
An additional comparison was performed using the FloTrac/Vigileo system (Edwards, USA), which does not need manual recalibration and instead recalibrates itself every 60 seconds based on the arterial waveform. The FloTrac/Vigileo monitor used was a second-generation device (software version 1.14). The pressure transducers of the PiCCO and the FloTrac were connected in series to the same femoral artery catheter.

Introduction Cardiac output (CO) monitoring is one of the key points in the hemodynamic evaluation of critically ill patients, and can be useful in various settings of high-risk surgery. There is a lack of evidence that the extensive use of invasive devices in hemodynamic monitoring has a good impact in terms of outcome [1], and less invasive systems have been proposed. Our aim was to compare the CO estimated by Vigileo/FloTrac with the blood flow in the thoracic aorta as measured by transoesophageal Doppler in patients undergoing open abdominal aortic aneurysm (AAA) repair, during the aortic cross-clamping (AoX) phase. We also measured the augmentation index (AI), a parameter related to vascular stiffness, using applanation tonometry, in order to better understand the effect of AoX on blood pressure waves.

Methods We enrolled 10 consecutive patients (10 men; age 66 ± 6 years) undergoing elective open AAA repair (ASA II to III) under general anesthesia. Radial arterial access was used for semi-invasive determination of blood pressures and CO (APCO) with the Vigileo. An esophageal Doppler probe was positioned after clinical stabilization. Applanation tonometry was performed just before and after aortic clamping.

Results We found a significant (P <0.05) increase in the CO reported by the Vigileo/FloTrac system in the post-clamping phase compared with the pre-clamping and basal phases, while the blood flow in the thoracic aorta decreased, in accordance with the theory of redistribution of fluids into the splanchnic venous vasculature [2]. There was an important contribution of wave reflection to the aortic pulse pressure wave after AoX, as expressed by a significant increase in the AI.

Conclusions The Vigileo/FloTrac system appears to overestimate CO after AoX when compared with measured blood flow in the thoracic aorta, and this result could be influenced by the pulse pressure wave reflection occurring after clamping. In high-risk surgical settings, other situations of rapid change in systemic vascular resistance could be similarly misread, suggesting the need for a more tailored Vigileo algorithm.

Introduction Postoperative hemodynamic optimization has been proved to reduce morbidity in high-risk patients [1]. Nowadays stroke volume index (SVI) monitoring is available with different less invasive techniques that have shown different levels of agreement and precision with the pulmonary artery catheter [2]. The aim of this study was to evaluate agreement and precision between SVI obtained with a calibrated (LiDCO™plus; LiDCO Ltd, Cambridge, UK) and an uncalibrated pulse contour analysis device (FloTrac/Vigileo; Edwards Lifesciences, Irvine, CA, USA) in patients undergoing postoperative hemodynamic optimization.

Methods Patients underwent hemodynamic optimization with protocolized care according to a previously published trial [1] to reach an oxygen delivery target.

Methods Prospective study in the respiratory critical care unit of a university hospital. Eleven patients with spontaneous breathing activity were considered for volume expansion.
An increase in stroke volume index (SVi) of 15% or more after volume expansion defined a responder. We measured the response of the bioreactance stroke volume to passive leg raising (PLR) and to saline infusion (500 ml over 15 minutes).

Results The proportional changes in NICOM-SVi induced by PLR correlated with the proportional changes in NICOM-SVi induced by volume expansion (r = 0.67, P = 0.02). The proportional changes in NICOM-cardiac index (CI) induced by PLR also correlated with the proportional changes in NICOM-CI induced by volume expansion (r = 0.63, P = 0.03). A PLR-induced increase in stroke volume of 9% or more predicted an increase in stroke volume of 15% or more after volume expansion with a sensitivity of 100% and a specificity of 80%.

Conclusions The response of the NICOM stroke volume to PLR was a good predictor of volume responsiveness. In our hemodynamically unstable patients with spontaneous breathing activity, fluid responsiveness could be assessed entirely non-invasively with a bioreactance device.

Introduction Transpulmonary thermodilution (TPTD)-derived volumetric parameters such as the global end-diastolic volume index (GEDI) and the extravascular lung water index (ELWI) have been established as hemodynamic cornerstones for the assessment of preload (GEDI) and pulmonary hydration (ELWI). Normal values of GEDI were established more than a decade ago based on studies in pre-selected patients. It was therefore the aim of our prospective study to investigate the correlation of GEDI with cardiac index (CI) in clinical routine.

Methods Over a 6-month period, all 1,574 routine TPTD measurements in 78 consecutive patients (APACHE II: 23.5 ± 8.6) of an internal medicine ICU with a PiCCO catheter were prospectively documented and analysed (Spearman correlation and multiple regression analysis; SPSS 17.0).

Results Including all 1,574 measurements, CI was univariately correlated with GEDI (r = 0.251; P <0.001), dPmax (r = 0.221; P <0.001) and heart rate (r = 0.102; P <0.001), but not with CVP (r = 0.001; P = 0.962). The correlation of GEDI, dPmax and heart rate with CI was confirmed in multivariate analysis (P <0.001 for all three variables). Changes in CI (Delta-CI) were univariately correlated with changes in GEDI (r = 0.414), dPmax (r = 0.240) and ELWI (r = 0.152; P <0.001 for all comparisons). In a multivariate analysis of all measurements, Delta-CI was independently associated with changes in GEDI (P <0.001), dPmax (P <0.001) and CVP (P = 0.017). Subgroup analysis of all measurements with GEDI below the lower limit of normal of 680 ml/m² demonstrated an independent association of CI with GEDI (P <0.001), dPmax (P <0.001) and ELWI (P = 0.041), but not with CVP. Similarly, Delta-CI was independently associated with changes in GEDI and dPmax (P <0.001). Similar results were found for measurements with GEDI within the normal range (680 to 800 ml/m²): significant and independent correlation of CI with GEDI (P <0.017) and dPmax (P <0.001), with changes in CI independently correlated with changes in GEDI (P <0.001), dPmax (P <0.001) and CVP (P = 0.035). Interestingly, even in measurements with GEDI >800 ml/m², CI was independently correlated with GEDI (P = 0.009) and dPmax (P <0.001), and changes in CI in this group were independently associated with changes in dPmax and GEDI (P <0.001). In the subgroup of measurements with GEDI >1,000 ml/m² no parameter correlated with CI; however, changes in CI were independently correlated with changes in GEDI (P <0.001) and dPmax (P = 0.003).
Conclusions GEDI and dPmax, and their changes, have an independent and positive correlation with CI and its changes, even in patients with increased GEDI.

Relationship of stroke volume variation, pulse pressure variation and global end-diastolic volume in patients undergoing brain surgery
A Rieß¹, S Wolf², C Lumenta², L Schürer², P Friederich¹

Introduction Monitoring intravascular volume in patients with intracranial pathology is often mandatory for maintaining hemodynamic stability [1,2]. Cyclic changes in cardiac stroke volume [1,3] and pulse pressure induced by positive pressure ventilation, as well as target values of the global end-diastolic volume index (GEDVI) (ml/m²) [2], allow guidance of volume therapy. The relationship between stroke volume variation (SVV) (%) and pulse pressure variation (PPV) (%), as well as between SVV or PPV and values of GEDVI, has not been established in patients with intracranial pathology.

Methods In this prospective investigation the correlation between dynamic and static hemodynamic parameters of 38 patients undergoing brain surgery was studied. Measurements were performed using the PiCCO technology. For statistical analysis, nonparametric correlation analysis and hypothesis testing were applied.

Results SVV correlated significantly with PPV (r² = 0.87, P <0.001). Neither SVV (r² = 0.14, P = 0.13) nor PPV (r² = 0.07, P = 0.81) correlated with GEDVI. Threshold values for SVV (9.5%, 11.6%) as well as for PPV (12.5%) allowed discrimination between groups with significantly different values of stroke volume index, while failing to discriminate between groups with significantly different values of GEDVI. Dichotomizing the patients into groups with GEDVI ≤680 ml/m² and >680 ml/m² likewise resulted in groups with significantly different values of stroke volume index, while failing to discriminate between groups with significantly different values of SVV and PPV.

Conclusions Static (GEDVI) and dynamic (SVV, PPV) parameters of cardiac preload may reflect different properties of the cardiovascular system. The combination of SVV, PPV and GEDVI may offer more precise information on the cardiovascular system than either parameter alone.

Introduction The predictive value of pulse pressure variation (ΔPP) in patients ventilated with low tidal volume (VT) is not well studied. A ΔPP of 12 to 13% has been validated as a predictor of volume response in several studies, but in patients ventilated with VT >8 ml/kg; one study has shown that a ΔPP of 12 to 13% does not predict volume response in patients ventilated with a low VT. We hypothesized that a lower cut-off value for ΔPP can predict volume response in patients with low VT.

Methods Thirty-seven adult patients mechanically ventilated with a tidal volume <8 ml/kg (PBW), without cardiac arrhythmias, with a pulmonary artery catheter and a peripheral arterial catheter were included. An increase in cardiac index (thermodilution) >15% after a fluid challenge (crystalloid 1,000 ml or colloid 500 ml) was considered a positive response.

Results Seventeen patients were responders. The ROC curve showed that the best cut-off value for ΔPP was 10% (ROC area = 0.74, 95% CI: 0.51 to 0.9; sensitivity 53%, specificity 95%, positive likelihood ratio 9.4, negative likelihood ratio 0.34). Twelve patients formed a heterogeneous group (liver transplant, acute pancreatitis, aortic surgery). Among the 25 septic shock patients, a ΔPP >10% showed a ROC area of 0.84 (sensitivity 78%, specificity 93%).
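For reference, pulse pressure variation in studies of this kind is conventionally computed per respiratory cycle as the maximal minus the minimal pulse pressure, normalized to their mean. That standard definition is assumed in the sketch below (the abstract does not restate it), with illustrative beat values:

```python
def delta_pp_percent(pp_max_mmhg: float, pp_min_mmhg: float) -> float:
    """Pulse pressure variation (%) over one respiratory cycle."""
    mean_pp = (pp_max_mmhg + pp_min_mmhg) / 2.0
    return 100.0 * (pp_max_mmhg - pp_min_mmhg) / mean_pp

print(round(delta_pp_percent(pp_max_mmhg=46.0, pp_min_mmhg=41.0), 1))
# -> 11.5, above the 10% cut-off proposed for low-VT ventilation
```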
In any case, the greater the ΔPP, the greater the fluid response. ΔPP >10% was a better predictor than CVP or PAOP.

Conclusions ΔPP has limited value in patients ventilated with low VT. However, a ΔPP >10% may help identify septic shock patients who will respond to a fluid challenge.

Introduction We have previously demonstrated that in mechanically ventilated patients, the respiratory change in the pre-ejection period (ΔPEP) is a reliable dynamic index for predicting an increase in cardiac output after volume infusion [1]. However, in an animal study, Kubitz and colleagues showed that the pre-ejection period is not sensitive to changes in intravascular volume status [2].

Methods This study investigated the influence of changes in intravascular volume status on ΔPEP. In 17 pigs, ECG, arterial pressure and cardiac output derived from a Swan-Ganz catheter were recorded. Measurements were performed during normovolaemic conditions, after haemorrhage (25 ml/kg) and following re-transfusion (25 ml/kg) with constant tidal volume (10 ml/kg) and respiratory rate (15/minute).

Introduction After the early phase of sepsis, excessive fluid administration may worsen pulmonary edema and prolong mechanical ventilation [1]. Accurately predicting fluid responsiveness obviates unnecessary fluid loading and helps to detect patients who may benefit from volume expansion. Pulse pressure variation (DPP) is a reliable predictor of fluid responsiveness in mechanically ventilated patients only when tidal volume is at least 8 ml/kg [2]. The aim of this study was to evaluate the predictive value of DPP for fluid responsiveness after a maneuver changing tidal volume to 8 ml/kg in patients ventilated with 6 ml/kg.

Methods Prospective clinical study in 40 patients ventilated with 6 ml/kg after the resuscitation phase of severe sepsis and septic shock. Fluid challenge was indicated by the attending physician (7 ml/kg of 6% hydroxyethyl starch 130/0.4). Complete hemodynamic measurements including DPP (DPP 6 ml/kg) were obtained at baseline. The tidal volume was then changed to 8 ml/kg and the DPP (DPP 8 ml/kg) was measured after 5 minutes. The ventilatory settings were returned to 6 ml/kg before fluid challenge. Patients whose cardiac output (CO) increased by ≥15% were considered fluid responders. Receiver operating characteristic (ROC) curve analysis was used to evaluate the predictive value of DPP.

Results In 19 patients (responders), CO increased by >15% after fluid infusion. Fluid responsiveness was better predicted with DPP 6 ml/kg (ROC curve area 0.92 ± 0.05) than with pulmonary artery occlusion pressure (0.56 ± 0.09) or right atrial pressure (0.74 ± 0.08). Increasing tidal volume to 8 ml/kg did not improve prediction, as the ROC curve area with DPP 8 ml/kg was 0.94 ± 0.03. The best cut-off values defined by ROC curve analysis were 6.5% and 10.5% for DPP 6 ml/kg and DPP 8 ml/kg, respectively.

Conclusions The maneuver of changing tidal volume to 8 ml/kg in patients ventilated with a protective ventilatory strategy, with the aim of better predicting fluid responsiveness, is not useful. Fluid responsiveness can be correctly predicted in patients ventilated with a tidal volume of 6 ml/kg.

Introduction Major head and neck surgery involving reconstructive free flaps for oropharyngeal cancers comprises complex and prolonged operations during which appropriate fluid management can be difficult.
Potential adverse effects of fluid mismanagement in this group of patients include flap hypoperfusion related to hypovolaemia, or flap oedema and deterioration in alveolar-arterial gradients related to excessive fluid administration. This study looked at ongoing standard practice to determine whether the use of a cardiac output monitor could improve fluid management in this subset of patients.

Methods Single-blinded, prospective observational study conducted on consecutive adult patients undergoing major head and neck reconstructive free flap surgery. All patients were anaesthetised by the same individual, using a standardised technique. Volume-controlled positive pressure ventilation was initiated in all patients. Patients received maintenance crystalloid fluids at a rate of 5 ml/kg/hour. Additional fluid challenges (250 ml crystalloid boluses given over 5 minutes) were administered at the discretion of the anaesthetist. An a priori criterion of fluid responsiveness was defined as an increase in stroke volume (SV) >10%, as measured with the LiDCOrapid, after a fluid challenge. The anaesthetist was blinded to the LiDCOrapid data and observations were made by an independent investigator. Data were collected on heart rate (HR), mean arterial pressure (mAP) and central venous pressure (CVP).

Results Forty-seven fluid boluses were assessed. The median age of the patients was 72 years (range 68 to 81 years) and their median weight was 71.2 kg (range 63 to 88 kg). The ventilatory set-up used a median tidal volume of 7 ml/kg (range 450 to 600 ml). Fifteen of the 47 fluid challenges (32%) were positive when assessed against the change in stroke volume. When comparing the fluid responders with the nonresponders, there were no differences in HR (62 ± 10 bpm vs 62 ± 10 bpm), mAP (63 ± 6 mmHg vs 63 ± 7 mmHg) or CVP (8 ± 3 mmHg vs 10 ± 3 mmHg).

Conclusions This preliminary report suggests that only 32% of fluid challenges given in theatre were effective in increasing the SV by >10%, and that potentially two-thirds of the fluid challenges may have been detrimental. This can be seen only if SV is monitored continuously. If further data confirm this, the LiDCOrapid may be useful to guide fluid administration in order to optimise the SV, thereby minimizing the risk of fluid overload.

Introduction Patients undergoing elective abdominal aortic surgery (EAAS) are at risk of developing complications due to preoperative co-morbidity, surgical trauma, blood loss and inflammatory injury [1]. Individualized goal-directed therapy (IGDT) has been proposed to improve outcome in patients undergoing high-risk surgery [2]. The aim of this study was to investigate whether IGDT, targeting stroke volume (SV) and oxygen delivery (DO2), can be performed safely in EAAS.

Methods Sixty-three EAAS patients were randomized to IGDT or conventional therapy. The LiDCO™plus system was used for SV and DO2 monitoring. SV was optimized by 250 ml fluid challenges intraoperatively and for the first 6 hours postoperatively. DO2 was optimized for 6 hours postoperatively towards a DO2I target of 600 ml/min/m², by infusion of dobutamine if necessary. Hemodynamic data were collected at baseline (t0), preoperatively (t1), before aortic cross-clamping (t2), at the end of surgery (t3), and over the first 6 hours postoperatively (p1 to p6). All patients were monitored with five-lead ECG during dobutamine infusion, and the dosage was reduced at signs of ischemia or a heart rate >20% above baseline.
Dobutamine dosage was limited to a maximum of 10 μg/kg/minute.

Results The mean SVI was 19.4% higher at p4 to p6 in the IGDT group than in the control group (P = 0.02), and 12.1% higher over the entire intervention period (t1 to p6) (P = 0.07). The mean DO2I was 18.0% higher at p4 to p6 (P = 0.01) and 12.9% higher over the entire intervention period (t1 to p6) in the IGDT group (P = 0.03). Mean arterial pressure and heart rate did not differ significantly (P = 0.12 and P = 0.21). There was no difference in the frequency of postoperative cardiac complications between the groups.

Conclusions The results of this study demonstrate that IGDT targeting SV and DO2 can be performed safely in patients undergoing EAAS. Whether this intervention is beneficial is being evaluated in the ongoing study.

Introduction Studies have suggested that tissue oxygenation (StO2) measured on the thenar muscle is not sensitive enough to track acute changes in hemodynamics due to reductions in central blood volume (CBV). We aimed to investigate the feasibility of StO2 measurements in the mouth as a quantitative indicator, compared with StO2 measurements obtained from the thenar eminence, during changes in CBV.

Methods We performed a head-up tilt (HUT) test in 10 healthy volunteers as an experimental model of reduced CBV. StO2 was continuously measured using two devices (InSpectra model 650): a multiple-depth optical probe placed over the thenar eminence (15 and 25 mm) and a 1-mm probe placed in the mouth. Subjects were placed on an electrically driven tilt table with a footboard. After 5 minutes of baseline measurements in the supine position, the table was tilted up to 70° and returned to the supine position after 10 minutes. StO2 readings were analyzed at the lowest stroke volume value.

Results All subjects (mean age 23 ± 6; six males) tolerated the supine and head-up positions well. Cardiac output significantly decreased in the HUT position; a simultaneous decrease in StO2 was observed in the mouth, but not in the thenar. The general results of the HUT test are shown in Tables 1 and 2.

Introduction Hemodynamic optimisation based on flow variables allows early detection and correction of possible occult organ hypoperfusion in patients undergoing major surgery. Shoemaker described a markedly decreased cardiac index (CI) in nonsurvivors, which remained significantly below the values of survivors during surgery. The aim of this study was to evaluate the length of ICU stay, overall in-hospital stay and postoperative outcome in a group of patients undergoing major urological surgery in whom the CI was maintained within the normal range during the intraoperative period.

Methods Patients were randomised into groups the day before surgery: a conventional management group (decisions about fluid therapy and vasoactive support were based on internal guidelines to preserve normal macrohemodynamic variables) and a protocol group. Each patient in the protocol group received an oesophageal Doppler probe (TED) (Hemosonic 100; Arrow International, USA) after the start of general anaesthesia, and hemodynamic optimisation (fluid management and vasoactive drugs) was then performed according to TED variables to keep the CI between 2.6 and 3.8 l/minute/m².

Results We enrolled 230 patients: control group n = 115, protocol group n = 115. High-risk surgery criteria were fulfilled in 43% of patients in the protocol group and 45% in the control group.
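The DO2I target of 600 ml/min/m² in the goal-directed therapy abstract above combines cardiac index with arterial oxygen content. The standard formula is assumed in this sketch (the abstract does not spell it out), and the inputs are purely illustrative:

```python
def do2i(ci_l_min_m2: float, hb_g_dl: float, sao2_frac: float, pao2_mmhg: float) -> float:
    """Oxygen delivery index (ml O2/min/m^2): DO2I = CI x CaO2 x 10,
    with CaO2 = 1.34 x Hb x SaO2 + 0.003 x PaO2 (ml O2/dl)."""
    cao2 = 1.34 * hb_g_dl * sao2_frac + 0.003 * pao2_mmhg
    return ci_l_min_m2 * cao2 * 10.0

print(round(do2i(ci_l_min_m2=3.2, hb_g_dl=11.0, sao2_frac=0.98, pao2_mmhg=90.0)))
# -> ~471 ml/min/m^2: below a 600 target, so SV optimization or dobutamine
#    would be the next step under such a protocol
```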
There were no significant differences in baseline variables between the groups (age, gender, length of surgical procedure, estimated blood loss, or intraoperative values of MAP and CVP). In the protocol group a high frequency (75%) of CI <2.6 l/minute/m² was observed after induction of anesthesia, with fast recovery of the CI.

Introduction Fluid therapy in ICU patients serves to maintain tissue perfusion and is directed at increasing cardiac stroke volume (SV) through an increase in preload: fluid responsiveness. Although there is increasing evidence that regional blood flow cannot be predicted from global hemodynamic measurements, the relation between SV and parameters of peripheral perfusion is not clear. The aim of our study was to evaluate the effect of an increase in preload on commonly used parameters of peripheral perfusion.

Methods Hemodynamically unstable patients with clinically suspected fluid responsiveness underwent a passive leg raising (PLR) test, which consisted of 5 minutes of rest in a semirecumbent position of 30°, followed by 5 minutes of PLR (lower limbs elevated at 30° and trunk in supine position). SV was measured continuously by pulse contour analysis using PiCCO (Pulsion). Peripheral perfusion was measured continuously with sidestream dark field imaging (sublingual area) and laser Doppler flowmetry (LDF) (finger).

Results Sixteen patients (age 63 years (55 to 72), APACHE II 25 (20 to 28), SOFA 10 (7 to 13)) were included in our study. Of these 16 patients, six (38%) increased their SV by >10% in response to PLR. Flow indices (LDF and sublingual microcirculatory flow) did not change; however, there was a trend towards an increase in functional capillary density in the responders (see Figure 1).

Conclusions These data suggest that increasing SV in hemodynamically unstable patients might improve peripheral perfusion, although only in the sublingual area and not in all patients. There was no relation between the systemic circulation and peripheral perfusion; it remains to be investigated whether optimizing SV actually results in improved tissue perfusion.

Results Among all patients, at baseline, the median CI and dIVC% were 2.6 l/minute/m² and 29%, respectively. Volume expansion significantly increased the median CI from 2.6 (2 to 3.3) to 3 (2.1 to 4) l/minute/m² (P = 0.005) and decreased dIVC% from 29.4% to 12.6% (P = 0.003). The median dIVC% in responders (R) was higher than in nonresponders (NR): 31.3% vs 17% (P <0.05). Fluid therapy decreased dIVC% more in R than in NR: R 31% to 12% (P = 0.03), NR 17% to 12% (P = 0.04). dIVC% showed a similar trend in both septic shock (SS) and trauma shock (TS) patients before and after fluid therapy: 27% in SS and 24% in TS before fluid therapy; 15% in SS and 11% in TS after therapy.

Conclusions Our data suggest that dIVC% is a sensitive index of fluid responsiveness in septic and trauma patients in shock. Limitation: few patients.

Introduction Myocardial depression occurs in 40% of patients presenting with sepsis. In critically ill patients, the peak first derivative of aortic pressure (Ao_dP/dtmax) derived from a fluid-filled catheter has been commonly used by clinicians for decades to assess directional change in left ventricular (LV) contractility. However, this parameter remains questionable because of its preload sensitivity. The aim of this study was to test whether Ao_dP/dtmax represents an accurate method for assessing LV contractility when preload independence, based on dynamic indices, is achieved.
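The distensibility index dIVC% in the inferior vena cava abstract above is commonly defined as the respiratory change in IVC diameter referenced to its minimal diameter; that common definition is assumed here, as the abstract fragment does not define it:

```python
def divc_percent(d_max_mm: float, d_min_mm: float) -> float:
    """IVC distensibility index (%): respiratory swing in IVC diameter
    referenced to the minimal diameter."""
    return 100.0 * (d_max_mm - d_min_mm) / d_min_mm

print(round(divc_percent(d_max_mm=23.0, d_min_mm=18.0), 1))  # -> 27.8
```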
Methods LV pressure-volume data obtained with a conductance catheter and invasive aortic pressure obtained with a fluid-filled catheter were continuously recorded in six anaesthetized and mechanically ventilated pigs. After a stabilization period, endotoxin was infused to induce septic shock. Fluid administration was continuously titrated to preload responsiveness by holding pulse pressure variation (PPV) <13%. Catecholamines were transiently administered during shock. Ao_dP/dtmax was compared with end-systolic elastance (Ees), the gold standard method for assessing LV contractility.

Results Endotoxin-induced septic shock and catecholamine infusion led to significant variations in LV contractility. The best correlation (r² = 0.76) and agreement between Ao_dP/dtmax and Ees were obtained when PPV was <11% (Figure 1).

Conclusions Ao_dP/dtmax is a minimally invasive and accurate method for assessing LV contractility when effective preload independence, defined as PPV <11%, is achieved.

Introduction The pressure recording analytical method (PRAM) is the only pulse contour method that does not need any calibration, since it estimates the impedance of the cardiovascular system in vivo, beat to beat [1-3]. Cardiac cycle efficiency (CCE) is a novel parameter that is directly related to cardiovascular impedance. Since the aortic valve contributes to impedance, we hypothesized that during aortic valvuloplasty performed for severe aortic stenosis the cardiovascular impedance may decrease with the reduction of the transvalvular gradient.

Methods In a cath lab, five consecutive patients undergoing aortic valvuloplasty for severe aortic stenosis were monitored by means of PRAM during the procedure. Systolic (SAP), diastolic, and mean (MAP) arterial pressures, stroke volume (SV), heart rate (HR), cardiac output (SV x HR), CCE, and dP/dtmax were continuously collected and analyzed afterwards. The stroke work (SW = SV x MAP) and minute work (MW = SBP x CO) were also measured.

Results In all cases of NOMI, the diagnosis was made at laparotomy. The mean age was 69.7 years (50 to 83 years) and the male:female ratio was 1:1. One patient underwent off-pump coronary artery bypass surgery, and five underwent thoracic aortic surgery. Hemodialysis was initiated in one of the six patients before operation, while continuous hemodiafiltration was initiated in all patients postoperatively. After operation, high-dose catecholamines were necessary in five of six patients for long periods because of severe hypotension. In four of six patients, abdominal pain was the presenting symptom; the other two patients had a nonspecific presentation because they were ventilated and sedated. All patients presented abdominal distension and their abdominal X-rays showed features of paralytic ileus. The serum values of AST, LDH, CK and lactate were slightly elevated in most patients. Five of the six patients died of septic shock and multiple organ failure; the mortality rate of patients with NOMI was 83%. Potential risk factors contributing to the occurrence of NOMI, and sensitive markers, might be the following: continuous hemodiafiltration (6/6), hypotension (5/6), high-dose catecholamines (5/6), dehydration (6/6), abdominal pain (4/6) and paralytic ileus patterns on abdominal X-rays (6/6).

Conclusions The increased incidence of NOMI following cardiothoracic surgery might be related to continuous hemodiafiltration, hypotension, dehydration, and the use of high-dose catecholamines.
Identification of patients at risk of NOMI, prevention of hypovolemic hypotension and use of vasodilators may help to reduce the incidence of NOMI.

Introduction The prevalence of CHF is increased in the elderly. The aim of the present study was to investigate the major causes, comorbidities and in-hospital mortality of patients with CHF.

Methods A retrospective study was performed in 6,960 patients (4,352 males, 2,608 females) with a validated primary discharge diagnosis of CHF hospitalized from 1 January 1993 through 31 December 2007 at the Chinese PLA General Hospital in Beijing. The patients were divided into five groups based on the number of etiologies and comorbidities, from one to five or more. A comparative analysis was performed to explore the major causes, comorbidities and in-hospital mortality among the groups.

Results The mean (± SD) age of patients was 53 ± 17 years in the one-comorbidity group, 60 ± 16 years in the two-comorbidity group, 65 ± 14 years in the three-comorbidity group, 70 ± 13 years in the four-comorbidity group and 72 ± 11 years in the five-comorbidity group.

Introduction Sildenafil is a phosphodiesterase type 5 inhibitor that selectively decreases pulmonary artery pressure. So far, the mechanism underlying sildenafil's effects on pulmonary vascular remodeling and potassium channel activity in pulmonary artery smooth muscle cells (PASMCs) has not been clearly addressed in pulmonary hypertension secondary to increased pulmonary blood flow.

Methods A total of 27 male SD rats were randomly divided into a sham group (n = 9), a shunt group (n = 9) and a shunt + sildenafil group (n = 9). A left-to-right shunt was established by creating an abdominal aorta to inferior vena cava fistula in both the shunt group and the shunt + sildenafil group. Rats in the shunt + sildenafil group received oral sildenafil 10 mg/kg/day, whereas the rats in the sham group and the shunt group were fed normal saline of the same volume. Eleven weeks later, mean pulmonary artery pressure (mPAP) was measured. Meanwhile, the ratio of right ventricular mass to left ventricular plus septal mass (RV/(LV+S)) was determined as a marker of the degree of right ventricular hypertrophy. The relative medial thickness (RMT) of middle and small pulmonary muscularized arteries was calculated as a sign of pathological changes of the pulmonary vasculature. Voltage-gated potassium channel Kv1.5 mRNA expression in the pulmonary vasculature was measured using real-time PCR.

Results Eleven weeks later, the rats in the shunt group had developed pulmonary hypertension, as evidenced by significantly increased mPAP and RV/(LV+S), as well as higher RMT of middle and small pulmonary muscularized arteries (all P = 0.01). In addition, the rats in the shunt group had decreased Kv1.5 mRNA expression in the pulmonary vasculature (P = 0.01). The rats in the shunt + sildenafil group had significantly decreased mPAP and RV/(LV+S) ratio and lower RMT (all P = 0.01), whereas Kv1.5 mRNA expression was significantly upregulated (P = 0.01). Furthermore, there were no statistically significant differences in mPAP, RV/(LV+S) ratio, RMT of middle and small pulmonary muscularized arteries or Kv1.5 mRNA expression between the shunt + sildenafil group and the sham group (all P = NS).

Conclusions Oral sildenafil attenuated pulmonary vascular remodeling and upregulated Kv1.5 mRNA expression in rats with pulmonary hypertension secondary to a left-to-right shunt.
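Using the work definitions stated in the PRAM abstract above (SW = SV x MAP; MW = SBP x CO), a short sketch with illustrative values makes the arithmetic and units explicit:

```python
def stroke_work(sv_ml: float, map_mmhg: float) -> float:
    """Stroke work as defined in the abstract: SV x MAP (mmHg x ml)."""
    return sv_ml * map_mmhg

def minute_work(sbp_mmhg: float, co_l_min: float) -> float:
    """Minute work as defined in the abstract: SBP x CO (mmHg x l/min)."""
    return sbp_mmhg * co_l_min

sv, hr, mean_ap, sbp = 70.0, 80.0, 90.0, 130.0   # illustrative values
co = sv * hr / 1000.0                            # cardiac output (l/min)
print(stroke_work(sv, mean_ap), round(minute_work(sbp, co), 1))  # -> 6300.0 728.0
```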
Methods Eleven anesthetized pigs were instrumented for the measurement of arterial blood pressure, central venous pressure, and RV and pulmonary pressures. An ultrasonic flow probe (MA14PAX; Transonic) was positioned on the main pulmonary artery to obtain pulmonary flow. Distal to the flow probe, a balloon occluder was positioned, facilitating gradual constriction of the pulmonary artery. To obtain a stepwise pressure difference increment over the banding of 10 mmHg at each measurement, we gradually inflated the balloon occluder. After 10 minutes, all invasive hemodynamic data were registered and an epicardial echocardiography was performed to obtain tricuspid flow velocities and the isovolumetric contraction and relaxation times. To calculate the TEI index during one heartbeat, echocardiographic measurements were synchronized with flow probe measurements to obtain ejection time (a worked sketch of these derived quantities follows the next abstract). The E/E' ratio was obtained with tissue Doppler echocardiography of the lateral tricuspid annulus. All echocardiographic measurements were performed in triplicate and averaged. The ejection period, mean pulmonary acceleration and cardiac output were calculated from the pulmonary flow curve derived from the ultrasonic flow probe. Resistance over the pulmonary banding was calculated as the pressure gradient divided by cardiac output. The pathobiology of persistent right ventricular (RV) failure observed after an acute increase in pulmonary artery pressure (Ppa) remains incompletely understood. We hypothesized that these severe complications might be related to an activation of apoptotic pathways. Methods Fourteen anesthetized dogs were randomised to a transient 90-minute pulmonary artery constriction or to a SHAM operation, followed 30 minutes later by hemodynamic measurements, including effective arterial elastance (Ea) to estimate RV afterload and end-systolic elastance (Ees) to estimate RV contractility, and sampling of cardiac tissue to assess apoptosis by real-time quantitative polymerase chain reaction, enzyme-linked immunosorbent assay and immunohistochemistry. Results Transient increase in Ppa persistently increased Ea from 0.75 ± 0.08 to 1.37 ± 0.18 mmHg/ml, and decreased Ees from 1.06 ± 0.09 to 0.49 ± 0.09 mmHg/ml, Ees/Ea from 1.44 ± 0.06 to 0.34 ± 0.03 and cardiac output from 3.78 ± 0.16 to 1.46 ± 0.10 l/minute, indicating RV failure. As compared with the SHAM-operated group, and with left ventricular tissue in animals with persistent RV failure, there was decreased gene expression of RV and septal Bcl-2, with no changes in the gene expression of Bax and Bak, and an increase in the Bax/Bcl-2 ratio. RV and septal Bcl-XL and RV Bcl-w gene expression were decreased as compared with the SHAM-operated group. There was activation of RV caspases 8 and 9, and of RV and septal caspase 3. Diffuse RV and septal apoptosis was confirmed by TUNEL staining. There were also increased RV and septal protein expressions of TNFα. Conclusions Acute afterload-induced persistent RV failure appears to be related to an early activation of apoptotic pathways, and to a myocardial increase of TNFα.
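The two derived quantities referenced in the pulmonary banding methods above can be made concrete. A hedged sketch, assuming the usual definition of the TEI (myocardial performance) index, (isovolumetric contraction time + isovolumetric relaxation time) / ejection time, alongside the banding resistance that the abstract defines explicitly as pressure gradient divided by cardiac output:

    # TEI index and resistance over the banding; example values are illustrative.
    def tei_index(ict_ms, irt_ms, et_ms):
        return (ict_ms + irt_ms) / et_ms

    def banding_resistance(gradient_mmhg, co_l_min):
        return gradient_mmhg / co_l_min   # mmHg per l/minute

    print(tei_index(ict_ms=45, irt_ms=60, et_ms=280))           # 0.375
    print(banding_resistance(gradient_mmhg=20, co_l_min=4.0))   # 5.0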
Currently approved treatments such as nicardipine (N) and dihydralazine have drawbacks. Since hypertension in PE is associated with increased sympathetic activity, urapidil (U), a peripheral α1 antagonist, has potential for BP control in PE, but no controlled comparison of U with N is available. This preliminary randomized controlled trial aims to compare the efficacy and safety of U and N in reducing BP in severe PE. Methods After IRB approval and signed informed consent, 18 women with severe PE without previous antihypertensive treatment were randomized to U or N groups. The therapeutic goal was to achieve a mean BP (MBP) between 105 and 125 mmHg. The U patients first received U 6.25 mg boluses every 5 minutes until the diastolic BP dropped below 105 mmHg, followed by a 4 mg/hour infusion adjusted as needed. In the N group, patients first received a N 1 γ/kg/minute infusion until a 15% reduction in mean BP was achieved, followed by a N 0.75 γ/kg/minute infusion adjusted as needed. Non-invasive BP was assessed every 5 minutes during treatment titration and then every 15 minutes. The time needed to reach the therapeutic goal was registered. The main endpoint was the achievement of the BP goal in 2 hours or less. Tolerance was assessed by the number of episodes of hypotension (HO) (defined as MBP below 100 mmHg) and side effects. Severe HO, defined as MBP below 80 mmHg or two episodes of HO, was considered treatment failure and led to exclusion. Further assessment was limited to safety, the amount of oxytocics used and neonatal evaluation by ICU paediatricians until discharge from the ICU. Results were compared using analysis of variance. Side effects were compared with the chi-square test (a worked sketch of this comparison follows the next abstract). Results One U patient was excluded from the efficacy assessment due to a protocol violation. The main endpoint was reached in all 17 patients, after 50 minutes in both groups. During the first 2 hours, the median number of needed treatment adjustments was 1 (0 to 10) in the U group and 1 (0 to 13) in the N group. Side effects attributable to the study treatment were observed in six of the nine cases in the N group and in one of the nine cases in the U group (P <0.02). There were no severe side effects or neonatal side effects. Conclusions No difference in efficacy could be shown in this preliminary series. Both treatments were easy to titrate. Fewer side effects were recorded in the U group. Further studies are needed in order to compare U and N. Introduction Patients frequently present to the emergency department with hypertension and related disorders. If hypertensive crisis is not diagnosed in these patients, urgent treatment is not necessary. However, taking patient satisfaction into consideration, emergency physicians usually discharge these patients after lowering the blood pressure with various medications [1]. This prospective, randomized, placebo-controlled study was designed to compare the effects of captopril, furosemide and lorazepam on lowering blood pressure and increasing patient satisfaction. Methods One hundred patients with uncomplicated hypertension were included in the study. All were randomized into four groups: (1) captopril group, (2) furosemide group, (3) lorazepam group, (4) placebo group. Blood pressure was measured at baseline and at the 30th, 60th and 90th minute. Patient satisfaction was assessed with a visual analog scale (VAS) at baseline and at the 90th minute. Results Captopril (23.64 mmHg), lorazepam (24.90 mmHg) and furosemide (24.10 mmHg) were similarly effective in lowering blood pressure between baseline and the 90th minute, and all three drugs were superior to placebo (15.94 mmHg) (P <0.05) (Table 1). When patient satisfaction was assessed with the VAS, captopril (30.12 mm), furosemide (28.04 mm) and lorazepam (32.88 mm) were statistically similar and all three drugs were superior to placebo (22.76 mm). Conclusions All three drugs can be used in subjects presenting to the emergency department with uncomplicated hypertension. They are similarly effective in both lowering the blood pressure and increasing patient satisfaction.
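The side-effect comparison in the urapidil/nicardipine abstract above (six of nine versus one of nine patients) was made with a chi-square test. A minimal sketch using the counts from the abstract; Yates' correction is switched off to mirror the plain chi-square approach named in the text, and with counts this small an exact test (scipy.stats.fisher_exact) would be a common alternative:

    from scipy.stats import chi2_contingency

    table = [[6, 3],   # nicardipine: side effects present / absent
             [1, 8]]   # urapidil:    side effects present / absent
    chi2, p, dof, expected = chi2_contingency(table, correction=False)
    print(f"chi2 = {chi2:.2f}, P = {p:.3f}")   # P < 0.02, as reported in the abstract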
Introduction Hypertensive crises (HC) are common among patients admitted to emergency rooms [1]. However, data are lacking about their prevalence in the critically ill admitted to the ICU. The aim of the study was to assess the rate of HC in a cohort of patients admitted to a medical-surgical ICU, to look for risk factors for HC and to assess the outcome of patients with HC. Patients were divided into two groups: the first group included patients who underwent electrical cardioversion (CVE group, n = 10) and were discharged in sinus rhythm; the second included patients who did not undergo electrical cardioversion and were discharged with chronic atrial fibrillation (CAF group, n = 40). In both groups, mortality, antiarrhythmic drug therapy and oral anticoagulation regimen were monitored. The first group's follow-up evaluated maintenance of sinus rhythm. The number of DC shocks required for restoration of sinus rhythm in patients of the CVE group was also collected. The main outcome parameter considered was 28-day mortality. Results Patients of both groups had similar demographic and clinical parameters. In patients included in the CVE group, sinus rhythm was achieved with one DC shock in 66.6%, two DC shocks in 22.2% and three DC shocks in 11.1% of cases. The mortality rate at 24 hours was 0%, while at 28 days it was 11.1%, whereas patients in the CAF group had a 28-day mortality rate of 35%. See Figure 1. Conclusions Our pilot study indicates that the improvement of ventricular performance with the contribution of atrial systole might improve critically ill patients' outcome. Considering the small number of cases, a prospective study based on these preliminary results is ongoing. The incidence of acute cardiogenic pulmonary edema (ACPE) is increasing, and it is now one of the leading causes of morbidity and mortality in our society. As the population of European countries is getting older, the impact of arterial stiffness is becoming more and more important in the pathophysiology of ACPE, besides coronary vascular disease and valvular disease. The aim of the study was to evaluate whether acute modifications in elastic artery distensibility can be involved in the pathogenesis of ACPE. Methods Six consecutive patients (four men and two women; age 76 ± 7.1 years) were admitted to our ICU with the clinical diagnosis of ACPE. All patients were studied with transthoracic echocardiography (TTE) and by evaluation of pulse wave analysis (PWA) and pulse wave velocity (PWV) with the applanation tonometry method, performed at admission and after clinical stabilization. Results In all patients, left ventricular systolic function was not significantly reduced when evaluated with transthoracic echocardiography, while diastolic dysfunction was always demonstrated. We recorded a significant (P <0.05) decrease in blood pressure values after clinical stabilization. The estimate of the central aortic pressure waveform, obtained by mathematical transformation of the radial tonometry pressure, was similarly reduced.
Other tonometric parameters, such as the augmentation index, which represents the contribution of wave reflection to the global pulse pressure wave, and PWV, which is inversely correlated with arterial compliance, were also significantly decreased when compared with the values at admission. The subendocardial viability ratio (SEVR), an indirect index of myocardial perfusion relative to cardiac workload, increased after treatment with vasodilators. Conclusions Our data confirm the determining role of increased arterial stiffness in the pathogenesis of ACPE, and show that a therapeutic strategy able to ameliorate this target can be associated with clinical improvement for the patient. Applanation tonometry could be an interesting method for evaluating these patients in the ICU setting. Introduction Several studies using angiographic or echocardiographic methods have shown that in septic shock (SSh) or severe sepsis (SeS), cardiac abnormalities, which are often documented [1], seem to have a relevant prognostic significance [2, 3]. Conclusions This study reinforces the concept that changes in systolic and diastolic function are common in septic shock and severe sepsis patients, and that the development of these changes (as an adaptation mechanism) seems to correlate inversely with the acute mortality rate during the ICU stay [2, 4]. Introduction While advantages of pulsatile perfusion (PP) during cardiopulmonary bypass (CPB) in terms of clinical outcome remain the subject of debate, possible benefits are generally thought to occur via improvements in microvascular flow [1]. However, this is currently not supported by human clinical data. We therefore used real-time human microvascular imaging to test our hypothesis that pulsatile perfusion would enhance microvascular perfusion. Methods We used sidestream dark field imaging to record video clips of the human microcirculation in 16 patients undergoing routine CPB for cardiac surgery. Following administration of cardioplegia, CPB was continued in either pulsatile (PP, n = 8) or nonpulsatile (NP) mode. After 10 minutes, microvascular recordings were made. The perfusion mode was then switched from PP to NP or vice versa. Ten minutes later, a second series of microvascular video recordings was obtained. Global hemodynamic and laboratory data were recorded, and the energy equivalent pressure (EEP) and pulse pressure (both mean ± SD) were calculated to quantify the surplus energy generated by PP. Microvascular analysis was performed for both smaller and larger microvessels, with a diameter cut-off of 20 μm. Assessments included perfused vessel density (PVD) (mean ± SD, 95% confidence intervals of the difference between NP and PP (95% CID)) and the Microvascular Flow Index (MFI) (mean ± SD and interquartile range). Results Pulsatile perfusion resulted in higher pulse pressure (27 ± 6 vs 7 ± 2 mmHg, P <0.0001) and CPB circuit EEP (184 ± 33 vs 150 ± 27 mmHg, P <0.0001) as compared with NP, while MAP was similar between the perfusion modes (52 ± 11 vs 56 ± 13 mmHg, P = 0.09). For both small and larger microvessels, we found no differences in indices of microvascular perfusion between PP and NP. Small microvessel PVD was similar between PP and NP (6.65 ± 1.39 vs 6.83 ± 1.23/mm; 95% CID -0.50 to 0.87/mm, P = 0.58). Conclusions In our study, a complete echocardiographic assessment was possible using TTE in 97% of patients. Two-thirds of our population had echocardiographic abnormalities, 62.1% of which were previously unknown.
The presence of LV asynergies and reduced RV systolic function were associated with a worse prognosis. Detection of cardiac abnormalities increased prediction of ICU mortality based on logistic regression. Introduction B-type natriuretic peptide (BNP) and N-terminal proBNP (NT-proBNP) are routine diagnostic and monitoring markers in patients with heart failure. LV function and prognosis of these patients have been shown to be reflected by BNP and NT-proBNP levels. However, in patients with cardiogenic shock after myocardial infarction, the relationship and relevance of these markers have not been elucidated. Methods The IABP Shock trial was a monocentric, randomised and prospective clinical trial to determine the role of therapeutic intra-aortic balloon pump (IABP) counterpulsation after primary percutaneous transluminal coronary angioplasty (PCI) in acute myocardial infarction (AMI) complicated by cardiogenic shock. Cardiac catheterization was performed in 40 patients within 12 hours of onset of hemodynamic instability. Creatinine, hemodynamic parameters and survival were determined. Furthermore, BNP and NT-proBNP levels were measured on admission as well as 24, 48 and 72 hours afterwards. Results BNP levels detected differences in treatment regarding LV unloading under IABP (632 ± 194 pg/ml vs 1,370 ± 475 pg/ml, P <0.05). However, there was no significant difference between the groups treated with and without IABP with regard to NT-proBNP levels. Interestingly, NT-proBNP levels clearly differentiated between survivors and nonsurvivors (4,590 ± 1,230 pg/ml vs 14,370 ± 4,886 pg/ml, P <0.05), whereas there was no significant difference between survivors and nonsurvivors with regard to BNP levels (NS). Elevated levels of NT-proBNP in patients with cardiogenic shock might be more dependent on impaired renal function, which might reflect additional organ dysfunction (creatinine >200 μmol/l vs creatinine ≤200 μmol/l: 24,965 ± 9,567 pg/ml vs 7,246 ± 2,650 pg/ml, P <0.05). Conclusions In myocardial infarction complicated by cardiogenic shock, levels of BNP and NT-proBNP both provide valuable additional information. BNP seems to closely reflect the cardiac status and effects of therapy, while NT-proBNP, through its dependency on organ dysfunction, seems a good indicator of prognosis in patients with cardiogenic shock. Introduction We have previously shown how a data fusion index, based on the integration of vital signs continuously monitored in a stepdown unit (SDU), can identify the cardiorespiratory instability that often precedes adverse events [1]. Here we show how the correlations between vital signs, captured in a data fusion index based on a probabilistic model of normality [2], change during times of cardiorespiratory instability. Methods An observational study was carried out in a 24-bed SDU, in which vital signs from 326 patients were continuously recorded [1]. The existing standard of care used single-channel Medical Emergency Team (MET) activation criteria to determine when individual vital signs became abnormal. Retrospective evaluation of the continuous vital-sign data identified that MET criteria were exceeded on 238 nonartefactual occasions. One hundred and eleven of these indicated sufficient cardiorespiratory instability that they should have triggered an MET call (critical events). Vital-sign dynamics during these events were compared with data from stable periods, using order statistics and covariance analysis. Results We found the probability distributions of the most extreme and the median data for each parameter over 1-minute intervals. The bivariate Gaussian distributions of best fit for data from critical events (abnormal respiratory rate (RR) or heart rate (HR)) show significant differences in covariance when compared with those calculated for data intervals from stable patients (see Figure 1, in which covariance is indicated by ellipse orientation). Under normal conditions (black ellipses), RR is correlated with HR. During critical events, tachycardia occurs with little or no variation in RR (red ellipses); tachypnoea occurs with little or no variation in HR (blue ellipses). Conclusions Dynamics and correlations that exist in vital signs during periods of stability change significantly during periods of abnormality. A statistical approach that integrates vital signs and captures the correlations between them will be sensitive to cardiorespiratory deterioration.
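The covariance analysis described in the data-fusion abstract above amounts to fitting a bivariate Gaussian to paired heart rate / respiratory rate samples: the mean vector centres the ellipse and the covariance matrix sets its orientation. A minimal sketch with illustrative arrays standing in for the 1-minute vital-sign intervals:

    import numpy as np

    hr = np.array([78, 82, 85, 90, 76, 88, 95, 80])   # heart rate, beats/minute
    rr = np.array([14, 15, 16, 18, 13, 17, 19, 15])   # respiratory rate, breaths/minute

    mean = np.array([hr.mean(), rr.mean()])            # centre of the fitted Gaussian
    cov = np.cov(np.vstack([hr, rr]))                  # 2x2 covariance matrix
    corr = cov[0, 1] / np.sqrt(cov[0, 0] * cov[1, 1])  # HR-RR correlation
    print(mean, corr)

During stable periods HR and RR move together (a tilted ellipse, strong off-diagonal covariance); during the critical events described above one variable moves with little change in the other, flattening the off-diagonal term and reorienting the ellipse.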
Introduction In vivo microdialysis (MD) is a bedside sampling method that permits continuous analysis of a patient's extracellular tissue chemistry without consuming blood. MD is performed by implanting a commercially available catheter that mimics a blood capillary at the site of interest. Methods A total of 35 (20 men) mechanically ventilated septic patients with a median age of 60 years were studied. All patients met the ACCP/SCCM consensus criteria for sepsis. Upon sepsis onset, a microdialysis catheter (CMA 60; CMA, Solna, Sweden) was inserted into the subcutaneous adipose tissue of the upper thigh. The dialysate samples were collected in microvials and analyzed immediately for glucose, pyruvate, lactate, and glycerol using a mobile CMA ISCUS analyzer. The lactate/pyruvate (L/P) ratio was automatically calculated. Measurements were performed six times per day during the first 6 days from sepsis onset. The daily mean values of MD measurements were calculated for each patient. Results Sepsis severity of the study group was graded as follows: sepsis (n = 5), severe sepsis (n = 2) and septic shock (n = 28). APACHE and SOFA scores at study entry were 23 ± 4 and 8 ± 3, respectively. Overall, 18 patients died, yielding an ICU mortality rate of 51%. MD revealed that at study entry patients with septic shock had higher lactate (3.5 ± 1.6 vs 1.8 ± 0.9, P = 0.01) and higher pyruvate (204 ± 137 vs 95 ± 68, P = 0.04) levels compared with septic patients without shock. In contrast, the two groups had similar values for glucose (5.9 ± 2.7 vs 4.3 ± 2.9, P = 0.18), glycerol (369 ± 225 vs 252 ± 171, P = 0.21), and the L/P ratio (83 ± 289 vs 75 ± 150, P = 0.92). Septic shock correlated with SOFA on day 1 (r = 0.55, P = 0.001), APACHE II (r = 0.42, P = 0.01), lactate on day 1 (r = 0.48, P = 0.004), lactate on day 2 (r = 0.50, P = 0.002), pyruvate on day 1 (r = 0.040, P = 0.018), and pyruvate on day 2 (r = 0.35, P = 0.04). Nonsurvivors had higher glycerol at study entry (426 ± 236 vs 253 ± 157, P = 0.01) and on day 1 (437 ± 260 vs 282 ± 169, P = 0.04) compared with survivors. Nonsurvivors also had higher pyruvate levels on day 1 (170 ± 99 vs 104 ± 65, P = 0.03) and on day 2 (150 ± 81 vs 97 ± 61, P = 0.04). Logistic regression analysis showed that APACHE II (OR = 1.312, P = 0.06) and glycerol on day 1 (OR = 1.005, P = 0.04) independently predicted patient outcome. Conclusions MD seems to be a safe and promising tool for grading sepsis severity and predicting early mortality in critically ill patients.
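The lactate/pyruvate ratio reported by the microdialysis analyzer above is a simple quotient, but the units matter. A hedged sketch, assuming the common microdialysis reporting convention of lactate in mmol/l and pyruvate in μmol/l; the cut-off of 25 flagged here is a commonly cited microdialysis threshold for anaerobic metabolism, not a value taken from this abstract:

    def lp_ratio(lactate_mmol_l, pyruvate_umol_l):
        # convert lactate to umol/l so numerator and denominator share units
        return (lactate_mmol_l * 1000.0) / pyruvate_umol_l

    ratio = lp_ratio(lactate_mmol_l=3.5, pyruvate_umol_l=150.0)
    print(f"L/P ratio = {ratio:.1f}", "elevated" if ratio > 25 else "normal")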
Introduction The muscle-specific inflammatory response after strenuous exercise has similarities with that observed in sepsis and has been used as a mirror of the sepsis-specific inflammatory response. Near-infrared spectroscopy (NIRS) has been used to quantify sepsis-induced metabolic alterations by measuring variations of tissue oxygen saturation (StO2) during a vascular occlusion test (VOT). We aimed to investigate whether NIRS dynamic variables could reflect the magnitude of the inflammatory process that follows strenuous exercise in marathon runners. Methods VOT-NIRS (InSpectra Model 650) measurement was performed before the marathon, at the finish line and on the day after the marathon in 13 runners (10 males/three females). VOT-derived NIRS traces were analyzed for the StO2 downslope (RdecStO2) and the StO2 upslope (RincStO2). Blood samples were collected to measure C-reactive protein (CRP), procalcitonin (PCT), white blood cells (WBC), creatine kinase (CK), and lactate. Data are mean ± SD (in Table 1, median (IQR)). P <0.05 was considered statistically significant. Results The results for the NIRS dynamic and hemodynamic variables are shown in Table 1. RdecStO2 was significantly increased on the day after the marathon. Significant increases in lactate (2.7 ± 0.5 vs 1.2 ± 0.3) and WBC (14 × 10³ ± 4 vs 5 × 10³ ± 2) were observed after completion of the marathon. Serum CRP, PCT and CK were significantly increased on the day after the marathon (0.6 ± 0.6 vs 12 ± 8; 0.09 ± 0.2 vs 0.7 ± 0.9; 398 ± 198 vs 1,932 ± 1,620), indicating a muscle-specific inflammatory response as a result of muscle damage. Conclusions A delayed increase in thenar VO2 is associated with the muscle-specific inflammatory response after strenuous exercise as a result of muscle damage. Introduction Current clinical parameters on the efficiency of resuscitation are often insufficient and unable to reliably assess tissue hypoxia. Considerable interest has been directed toward monitoring of tissue oxygen tension (ptO2). However, no gold standard has yet been found. Animal experimental models have shown interesting results in hypovolemic and septic shock and have led to some interesting clinical studies. The purpose of this review is to evaluate the current value of ptO2 monitoring in animal experimental studies under various pathophysiological conditions. Methods An electronic literature search of the PubMed and Embase databases was conducted to find relevant articles on tissue oxygenation and hemorrhage, trauma and endotoxemia. An initial 7,876 articles were retrieved. After applying inclusion and exclusion criteria and critical appraisal, 48 articles were ranked on their level of evidence. Results After screening the 48 articles, 19 were found suitable to address our goal. Ten articles discussed tissue oxygenation in a hemorrhagic model and nine in a model of endotoxemia. No relevant articles concerning ptO2 and experimental trauma were found. Eight articles compared splanchnic ptO2 measurements with ptO2 measured at several peripheral sites, such as subcutaneous and skeletal muscle tissue. A positive correlation between these locations was found in seven articles. A decrease in both peripheral and central tissue oxygen pressure (ptO2) was found during hemorrhage. Importantly, peripheral ptO2 changed during hemorrhage before any hemodynamic parameter did. In contrast, only one article found an earlier change in hemodynamic parameters than in peripheral ptO2.
Nine articles discussed ptO2 in a model of septic shock. Only two studies compared peripheral ptO2 measurements with splanchnic ptO2 measurements. Both sites had decreased ptO2 values in endotoxemia, and both were found to change before any changes in hemodynamic parameters. Conclusions A significant relation between intestinal ptO2 and peripheral ptO2 was found in studies on experimental models of hemorrhagic and septic shock. This suggests that peripheral sites such as the subcutaneous tissue or skeletal muscle tissue are reliable locations for measurement of ptO2. Methods In all patients, cardiac output (CO), stroke volume (SV), pulse pressure variation (PPV), and standard hemodynamic parameters were continuously recorded. Fluid responsiveness was defined as an increase in SV >15% after a 500 ml colloid (HES 130/0.4) infusion over 10 minutes. StO2 measurements and VOT (sphygmomanometer inflated to >50 mmHg above systolic pressure and kept inflated until StO2 decreased to 40%) were performed after anaesthesia induction (baseline), and before (PPV >13%, therefore defining hypovolemia) and after fluid challenge. Results At baseline, no patient was preload dependent (mean PPV 7.9 ± 2%), and none required vasopressors during the procedure. Baseline mean StO2 was 85 ± 6%, while CO and SV were 4.9 ± 0.9 l/minute and 68 ± 21 ml, respectively. During hypovolemia, compared with baseline, there was no difference in either StO2 (85 ± 6 to 84 ± 9%, P = 0.76) or the StO2 desaturation rate (-9.6 ± 1.6 to -11 ± 2.7%/minute, P = 0.10), while the StO2 recovery slope was lowered (5.1 ± 1.6 to 3.8 ± 1.5%/second, P = 0.04). All patients were responders to fluid challenge. After volume expansion, SV (84 ± 20 vs 62 ± 12 ml, P <0.01) and CO (6.5 ± 2 vs 4.8 ± 1.2 l/minute, P <0.01) were higher, with no significant changes in either StO2 or the StO2 desaturation rate. Fluid challenge induced a 46% increase in the StO2 recovery slope (P <0.01; Figure 1), which was comparable with baseline. Conclusions Whilst redistribution of blood flow may occur during circulatory failure (vasoconstriction of lesser vital organs), indicators of microcirculatory impairment and tissue hypoperfusion are of particular importance. Both hypovolemia and intravascular volume expansion are associated with significant modifications of the dynamic recovery slope of StO2. Future studies are needed to better clarify its potential clinical applications. Introduction Although the adverse effects of mechanical ventilation on the circulation and intestinal organ perfusion are well known, the influence of assisted breathing versus controlled ventilation has not been investigated. We hypothesized that pressure support ventilation causes less decrease in gut perfusion and less neutrophil adhesion than controlled ventilation. Methods Male SD rats (n = 40) were anesthetized and tracheotomized, and 10 animals each were ventilated in pressure support (PSV) or pressure controlled (PCV) mode. PCV animals were paralyzed. After 5 hours of ventilation, intravital microscopy was performed. Endotoxin (15 mg/kg lipopolysaccharide) was given to 20 rats; 20 animals served as controls. Results There were no differences in baseline hemodynamics or gas exchange. Animals were stable over the whole observation period, with minimal changes in blood pressure and oxygenation. In PCV, the percentage of functional capillaries in the longitudinal mucosal layer was lower (57 ± 39%) than in PSV (87 ± 33%).
In sepsis, there was a further decrease in PCV and a major decrease in PSV. Leukocyte adhesion in the intestinal submucosal venules (for example, V1 venules) was higher in PCV (228 ± 119) than in PSV (93 ± 65) in controls, but not in septic animals. Conclusions In septic rats with preserved macrohemodynamic stability, intestinal perfusion was impaired after LPS infusion. Pressure-controlled ventilation had a significant influence on microcirculation and neutrophil adhesion in controls. In septic animals, pressure support ventilation did not attenuate these adverse effects as compared with PCV. Introduction Central venous oxygen saturation (ScvO2) has been shown to be a useful therapeutic target in septic shock and high-risk surgery. The central venous-to-arterial carbon dioxide difference (Pcv-aCO2) has been proposed as a complementary tool for goal-directed therapy (GDT) in septic shock. We tested the hypothesis that both ScvO2 and Pcv-aCO2 could be used as complementary tools for GDT during high-risk surgery. Methods Seventy adult patients, ASA I to III, undergoing major abdominal surgery were randomly assigned to 6 ml/kg/hour (R-GDT group, n = 36) or 12 ml/kg/hour (C-GDT group, n = 34) of crystalloids. Additional boluses of HES (130/0.4) were given to maintain the respiratory variation in peak aortic flow velocity (ΔPV) below 13%. In both groups, ScvO2, cardiac output (CO), oxygen delivery (DO2i), Pcv-aCO2 and postoperative complications were blindly recorded. Results At baseline, there were no differences in hemodynamic variables, ScvO2 (79 ± 7 vs 80 ± 6, P = 0.37) or Pcv-aCO2 (6 ± 3 vs 6 ± 2, P = 0.95). The total volume of fluid infused was larger in the C-GDT group than in the R-GDT group (P <0.01). The two groups showed no differences in intraoperative blood loss, blood transfusion, mean CO or mean DO2i values. Overall, postoperative complications were increased in the R-GDT group (P <0.01); in particular, postoperative sepsis occurred more often (P = 0.0064). Minimal ScvO2 (minScvO2) was higher in the C-GDT group (72 ± 6 vs 69 ± 6%, P = 0.04). In patients with complications, minScvO2 was significantly reduced (72 ± 6 vs 67 ± 6%, P = 0.0017). minScvO2 <70% was independently associated with sepsis (OR 4.2 (95% CI 1.1 to 14.4), P = 0.025). Intraoperative mean Pcv-aCO2 was higher in the R-GDT group (7 ± 3 vs 5 ± 2 mmHg, P <0.01). In patients who developed sepsis, Pcv-aCO2 was higher than in patients who did not (8 ± 2 vs 5 ± 2 mmHg, P <0.01). In patients with ScvO2 >70% who developed sepsis, Pcv-aCO2 was also significantly higher (P <0.01). The area under the ROC curve was 0.758 (95% CI 0.71 to 0.81) for discrimination of patients with ScvO2 >70% who did and did not develop sepsis, with 5 mmHg as the best threshold value. Conclusions ScvO2 reflects important changes in oxygen delivery in relation to oxygen needs during the perioperative period and might help guide GDT better than ΔPV alone. Pcv-aCO2 appears to be a useful tool to identify persistent hypoperfusion when GDT is associated with ScvO2 >70%.
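The venous-to-arterial CO2 gap used in the abstract above is the central venous PCO2 minus the arterial PCO2. A minimal sketch; the 5 mmHg cut-off is the best threshold the authors report for identifying patients with ScvO2 >70% who nevertheless developed sepsis, and the input values are illustrative:

    def pcv_a_co2(central_venous_pco2, arterial_pco2):
        return central_venous_pco2 - arterial_pco2   # mmHg

    gap = pcv_a_co2(central_venous_pco2=46.0, arterial_pco2=39.0)
    print(f"Pcv-aCO2 = {gap:.0f} mmHg:", "suspect hypoperfusion" if gap > 5 else "adequate flow")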
Introduction The central venous-to-arterial carbon dioxide difference (ΔCO2) is a marker of the adequacy of venous flow to remove the total CO2 produced by peripheral tissues [1]. Substitution of a central for a mixed venous-arterial pCO2 difference is acceptable [2]. The objective was to assess the relationship between ΔCO2 and SOFA score variation in patients with severe sepsis and septic shock. Lactate and ScvO2 were also evaluated. Methods Prospective observational study in patients with severe sepsis and septic shock admitted to an adult tertiary ICU. The management of sepsis was carried out as proposed by the Surviving Sepsis Campaign guidelines. The earliest simultaneous measurements of lactate, ScvO2, and ΔCO2 were obtained when clinically indicated (T0) and after 8 to 12 hours (T8-12). The SOFA score was determined at T0 and after 24 hours (T24), and ΔSOFA (T24-T0) was calculated. Conclusions Patients with hyperlactatemia at the beginning of admission that persists with increased levels of lactate evolve to a worsening of the SOFA score at 24 hours. ΔCO2 and ScvO2 and their trends are unrelated to the SOFA score at 24 hours. Introduction The measurement of cardiac output and its adequacy to tissue needs are essential for hemodynamic evaluation. In a recent review it was stated that the venous-to-arterial carbon dioxide difference (dPCO2) could be considered a marker of the adequacy of venous blood flow to remove the total CO2 produced by the peripheral tissues. Several studies have already shown that dPCO2 and the cardiac index (CI) are inversely correlated in critically ill patients. However, dPCO2 can be influenced not only by CI but also by other factors governing CO2 production and CO2 elimination. The aim of this work was to study the behaviour of dPCO2, measured from central venous blood, and its evolution during the early stage of septic shock. Methods Forty-six patients with septic shock were prospectively included. dPCO2 was calculated as the difference between the PCO2 from central venous blood and the arterial PCO2. A value of dPCO2 >6 mmHg was considered high. The CI was measured by transpulmonary thermodilution. ScvO2 and serum lactate were obtained. Patients were separated into a normal dPCO2 group and a high dPCO2 group. The results were compared by Student's t test or the Mann-Whitney test. P <0.05 was chosen as significant. Results At inclusion 24 patients (52%) had a high dPCO2. These patients had a lower CI (3.5 ± 1.1 vs 4.2 ± 0.99 l/min/m², P = 0.04) and lower ScvO2 (57 ± 17 vs 71 ± 8%, P <0.001) than patients with normal dPCO2. No difference was found in PaCO2 or PaO2/FiO2 levels between the groups, suggesting similar CO2 elimination. Thirteen patients had a decrease in dPCO2 from above to below 6 mmHg (from 8 ± 2 to 5 ± 1 mmHg, P = 0.001), associated with an increase in CI (from 3.7 ± 1.3 to 4.2 ± 1.3 l/min/m², P = 0.003). Conversely, 11 patients had an increase in dPCO2 from below to above 6 mmHg (from 3.8 ± 2.4 to 7.7 ± 2.6 mmHg, P = 0.002). In these patients, CI decreased correspondingly (from 4.3 ± 1.3 to 3.5 ± 1 l/min/m², P = 0.013). A negative correlation was established between dPCO2 and CI (r² = 0.40, P <0.001) and between changes in dPCO2 and changes in CI (r² = 0.16, P = 0.008) during the study. The lactate level was similar in both groups. ScvO2 was correlated with CI (r² = 0.44, P <0.001). Conclusions In patients with septic shock, dPCO2 seems to be related principally to CI. dPCO2 might be a marker of the adequacy of tissue perfusion relative to the patient's metabolism and could be a resuscitation target for the management of septic shock patients. Introduction High lactate (LCT) identifies critically ill patients, predicts risk of mortality and guides resuscitation. Low central venous oxygen saturation (SvO2) is associated with mortality and is used as a resuscitation target. A change in LCT, but not in SvO2, has been shown to relate to outcome.
It is physiologically plausible that those with low SvO2 and high LCT do worse than those with only one of SvO2 or LCT abnormal, or with both normal. Early changes in these combinations may also determine subsequent changes in organ performance. We hypothesised that changes in combinations of SvO2 and LCT from day 1 to day 2 predicted changes in total SOFA between day 2 and day 4. Methods We used a retrospective cohort methodology using data obtained from the electronic clinical information system. All included patients underwent LCT and SvO2 measurement (from the internal jugular or subclavian vein) in the first 24 hours after ICU admission. Baseline (demographic, physiological), daily follow-up (physiological, SOFA) and 30-day mortality data were recorded. Worst admission values were used to combine SvO2/LCT into four groups (SL groups: 0 to 3), dichotomised by mean (SvO2) and median (LCT): 0 = normal SvO2/normal LCT; 1 = low SvO2/normal LCT; 2 = normal SvO2/high LCT; 3 = low SvO2/high LCT. Variables individually associated (P <0.20) with a change in total SOFA between day 2 and day 4 were included in a multivariate linear regression model (with change in total SOFA as the dependent variable) using forward stepwise inclusion. Variables with adjusted P <0.05 remained in the final model. Results A total of 1,544 patients were included. Complete data on all change variables were available for 675 patients. Mean (SD) SvO2 was 63.6 (11.3) and median (IQR) LCT 3.85 (4.3). Mean (SD) APACHE II and age were 22.4 (6.2) and 64.9 (15.9), respectively, and 61.9% were male. The mean (SD) fall in total SOFA from day 2 to day 4 was 1.05 (2.78). APACHE II (adj. P <0.001), day 1 SL group (adj. P = 0.019), day 1 SOFA (adj. P <0.001), and C-reactive protein >209 mg/l (adj. P = 0.006) were independently associated with Δ-SOFA. Changes in SL group were not associated with improvements in organ function (adj. P = 0.12). Conclusions APACHE II, SL group and total SOFA (both on day 1) and C-reactive protein >209 mg/l were associated with worsening organ function between day 2 and day 4. Improvement in combinations of SvO2 and LCT was not associated with changes in organ performance. Introduction Alternative routes of lactate production nowadays challenge our comprehension of oxygen delivery and consumption. Recent studies have shown the prognostic value of oxygen and lactate gradients measured from the central venous to the pulmonary artery circulation. Our data represent one sample of our 4-year experimental research, trying to build a conceptual framework for oxygen and lactate dynamics in different hypoxic states. Methods Sixteen large white pigs (35 kg) and seven shams were studied. After anesthesia and full monitoring (EKG, MAP, etCO2, continuous gas analysis, Doppler portal flow, small bowel tonometry, liver ptO2 and urine output), abdominal sepsis was induced by fecal peritonitis. Conclusions Our preliminary data show high oxygen extraction rates in the coronary circulation and suggest apparent lactate consumption by the heart and lactate production by the lung during experimental sepsis. Introduction The aim of the present study was to clarify whether metformin can increase cellular lactate production by inhibiting mitochondrial function in tissues other than the liver. Methods Platelet-rich plasma (PRP) from healthy volunteers was incubated for 72 hours with different doses of metformin. The plasma lactate concentration was then measured.
The proportion of normally polarized versus abnormally depolarized platelet mitochondria (FL2/FL1) was assessed by flow cytometry after staining with JC-1. Platelet respiratory chain complex I activity and citrate synthase (CS) activity, a marker of mitochondrial content, were measured by spectrophotometry. Data are presented as means ± SD. Analysis was performed with one-way ANOVA. Results Main results are reported in Table 1. Conclusions Metformin can cause hyperlactatemia by impairing platelet mitochondrial function. Introduction A patient presented with severe acidosis and a point-of-care (POC) lactate of 42 mmol/l. This led to suspicion of mesenteric ischemia and a potential need for laparotomy. However, plasma lactates were <4 mmol/l, and ethylene glycol (EG) ingestion was subsequently diagnosed. We therefore wished to determine why discrepant lactates occur and whether this lactate gap (that is, the difference between lactate measured using two common methods) could be clinically useful. Lactate is an important substrate for intermediary metabolism and allows movement of carbon and reducing power between cells [1]. Its use as a measure of end-organ perfusion and as a marker of tissue hypoxia in the critically ill is well established. It has been suggested that a rise in serum lactate is a sensitive indicator of poor prognosis in this group [2]. We wished to identify whether or not this was the case in our patient population. Methods A retrospective case note review of 504 consecutive admissions to Glasgow Royal Infirmary ICU was undertaken over an 18-month period. Serum lactate on initial blood gas analysis following admission to intensive care was retrieved from the clinical information system. Demographic, APACHE II score and outcome data were retrieved from the Ward Watcher system in the ICU. Results A total of 1,544 patients were included. Complete data to calculate day 1 and day 3 parameters were available in 1,070 and 656 patients, respectively. Overall 30-day mortality was 21.3%. Mean (SD) age and APACHE II score were 63.0 (16.4) and 21.6 (6.9), respectively. For survivors versus nonsurvivors there were significant differences in baseline SIDa (P = 0.03), iSIDa, SIDe, lactate and base excess (all P <0.001), but not in SIG (P = 0.1). An increased SIDa (P = 0.015) and SIDe (P <0.001), and decreased iSIDa and lactate (all P <0.001), but not SIG (P = 0.09), were predictive of mortality. For day 1 to day 3, only Δ-SIG was weakly predictive of mortality (OR 1.03 per 1 unit increase, P = 0.04, 95% CI = 1.00 to 1.05, n = 656), but there was no difference between survival groups (P = 0.051). Conclusions This study confirms recent data suggesting baseline differences (but not changes over 3 days) in the relationships between strong and weak ions and pCO2 (reflected in SIDa and lactate, and their impacts on mortality). The different effects observed for SIG and BE imply a significant unmeasured acid load influencing mortality. Bias and confounding may affect these findings and they should therefore be confirmed prospectively.
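The lactate gap defined in the ethylene glycol abstract above is simply the point-of-care value minus the laboratory plasma value. A hedged sketch: the interpretation comment reflects the case described (glycolate can interfere with some oxidase-based POC lactate assays), and the cut-off is illustrative rather than taken from the abstract:

    def lactate_gap(poc_lactate_mmol_l, plasma_lactate_mmol_l):
        return poc_lactate_mmol_l - plasma_lactate_mmol_l

    gap = lactate_gap(poc_lactate_mmol_l=42.0, plasma_lactate_mmol_l=3.5)
    if gap > 5:   # illustrative cut-off only
        print(f"Lactate gap of {gap:.1f} mmol/l: consider assay interference, e.g. ethylene glycol metabolites")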
Introduction Intubation and mechanical ventilation impair secretion clearance and can lead to lung collapse, consolidation and ventilator-associated pneumonia [1]. There is, however, no valid diagnosis of secretion retention in the intubated and ventilated patient. Vibration response imaging (VRI) is a commercially developed acoustic lung imaging system that displays breath sound distribution [2]. VRI should be able to identify relationships between specific breath sound distributions and secretion retention. This preliminary study investigated the changes in VRI measurements before and after chest physiotherapy in adult intubated and mechanically ventilated patients. Methods Intubated and ventilated adult patients who were receiving chest physiotherapy were investigated. Lung sound amplitude at peak inspiration was measured immediately before and after chest physiotherapy using two arrays of sensors attached to the patient's back in a supine position. Chest physiotherapy included combinations of closed airway suctioning, saline lavage, postural drainage, manual techniques and/or lung hyperinflation, dependent upon clinical indications. Means were compared with the Wilcoxon matched-pairs signed-ranks test. Results A total of 16 patients were included in the study (12 males, four females, age 65 ± 14). Patients were predominantly ventilated with continuous positive airway pressure and pressure support. Following physiotherapy, total lung sound amplitude at peak inspiration decreased twofold, from 37 ± 58 × 10⁶ to 18 ± 23 × 10⁶ arbitrary units (AU), with a significant reduction in the left lung (P = 0.03). Furthermore, the difference in sound amplitude between the right and left lungs significantly decreased post-treatment compared with pretreatment (P = 0.03). Conclusions Computerized lung sound monitoring may be useful to assess secretion retention and the effectiveness of secretion removal in mechanically ventilated patients. Further investigation is, however, necessary in order to distinguish between secretion-related effects and changes due to other factors such as airflow rate and pattern. Introduction Posterior auscultation of mechanically ventilated patients is challenging. Environmental ICU noises make the detection of subtle auscultatory features difficult. Digitized pulmonary acoustic monitoring has recently been introduced in the ICU. We explored the possibility of listening to sound waves recorded by the VRI system. Methods Lung sounds were recorded for 20 seconds using two arrays of 17 piezoelectric contact sensors attached to the patient's back in a supine position. Sound files were archived in WAV format for offline analysis. Three physicians and one trained respiratory therapist, blinded to the patient's clinical status, provided their findings while listening to sound waves recorded by 12 sensors distributed from apex to base, six from each lung. Breath sounds were categorized as normal versus abnormal, and adventitious lung sounds were characterized as wheezes, rhonchi or crackles. Findings were compared with anterior stethoscope auscultation performed at the time of the recording by one of the physicians. Results Eighteen critically ill mechanically ventilated patients (age 65 ± 17) were enrolled in this study. Chest radiography (CXR) findings included consolidation in 13 patients. There was agreement among at least three clinicians in the normal/abnormal classification in 15 of the 18 patients. Furthermore, in 10 of these 15 patients (67%), the assessment based on sound wave analysis was in agreement with anterior stethoscope auscultation. Finally, the physician who performed both sound wave analysis and stethoscope auscultation reported crackles/rhonchi in five of the 13 patients with consolidation (38%) during sound wave analysis but in only one patient during auscultation (8%).
Conclusions The level of interobserver variability for sound wave characterization was comparable with levels usually reported for stethoscope auscultation. However, in patients with consolidation, a higher number of crackles/rhonchi was reported during posterior sound wave analysis than during anterior stethoscope auscultation. This discrepancy may be due either to the anterior/posterior orientation or to the higher sensitivity of the offline sound wave analysis. Introduction Knowledge of the micromechanical properties of lung parenchyma is essential for understanding macroscopic lung mechanics. The first results of in vivo and in situ measurements using an endomicroscopic device [1] are reported. The degree of local lung deformation in dependence of locally applied pressure is shown. Methods An endoscopic system including two concentric trocars was constructed to apply a defined local pressure within the field of view (Pfov). The endoscopic system was placed between the ribs of mechanically ventilated rats, with the tip placed on the pleura. Pfov was applied to a circular area of the lung surface and led to a local deformation. By adjusting the flow rate of the flushing fluid from the outer trocar to the inner trocar, Pfov was varied. The resulting local deformation of lung parenchyma was optically measured by estimating the diameter of ceramic particles fixed on the pleura for that purpose. Deformation was measured at different levels of airway pressure (Paw), with Pfov ramped from 0 to -40 mbar. Results The healthy lung parenchyma (Figure 1A) showed less deformation caused by fluidic pressure compared with the lavaged lung (Figure 1B). In contrast to the lavaged lung, the deformation of the healthy lung was less when higher Paw was present. Conclusions Micromechanical properties of lung parenchyma can be analyzed in vivo at an alveolar level. The healthy lung parenchyma appears to be stiffer (less deformation) at higher Paw. The stronger deformation and lower dependence on airway pressure in the lavaged lung support the hypothesis that the small lung compliance of lavaged lungs might not be explained by stiff lung parenchyma, but rather by regional collapse. At every step we studied the changes in FRC, Crs and the PaO2/FiO2 ratio, and performed transthoracic echocardiography (Agilent 5500; Hewlett-Packard) to evaluate the velocity-time integral of the left ventricular outflow tract (LVOT VTI). All data are reported as mean ± SD. ANOVA was used to compare changes over time. Results Table 1 presents the main results of the study. Best PEEP was set at 10 cmH2O, the level at which the decrease of FRC and the improvement of Crs indicate the start of derecruitment and the end of overdistension. Introduction Electrical impedance tomography (EIT) is a promising new tool for bedside monitoring of regional lung ventilation. Several studies have focused on the ventilation distribution and its relationship with regional lung volume at a lower, caudal lung level. However, no information is available at a higher, cranial lung level. Methods EIT (EIT Evaluation Kit 2; Dräger, Lübeck, Germany) was measured at cranial and caudal lung levels in 10 patients after cardiothoracic surgery. Patients were fully sedated and mechanically ventilated, and a PEEP trial was performed at four PEEP levels (15, 10, 5 and 0 cmH2O). Results The center of gravity index decreased after lowering the PEEP level at both the caudal and cranial lung levels (Figure 1, right), whereas the tidal impedance variation divided by tidal volume increased at the cranial lung level and decreased at the caudal lung level during the stepwise reductions in PEEP (Figure 1). Conclusions During decremental PEEP steps, the ventilation distribution shifts not only from the dorsal to the ventral direction, but also from the caudal to the cranial direction.
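One plausible reading of the center of gravity index named in the EIT abstract above is the impedance-weighted mean position of ventilation along the ventro-dorsal axis, so that a falling index marks a ventral shift of ventilation; the exact formula the authors used is not given in the abstract. A hedged sketch under that assumption:

    import numpy as np

    def centre_of_gravity(tidal_impedance_rows):
        # rows ordered ventral -> dorsal; returns 0 (fully ventral) to 1 (fully dorsal)
        z = np.asarray(tidal_impedance_rows, dtype=float)
        positions = np.linspace(0.0, 1.0, len(z))
        return float(np.sum(positions * z) / np.sum(z))

    print(centre_of_gravity([10, 20, 30, 40]))   # 0.667: ventilation weighted dorsally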
Introduction Electrical impedance tomography (EIT) is a promising bedside tool for monitoring regional lung ventilation during mechanical ventilation [1]. The purpose of this study was to assess regional lung ventilation by EIT in a pig model of acid-induced acute lung injury (ALI) during pressure-controlled (PCV) and volume-controlled ventilation (VCV). Methods Ten anesthetized and tracheotomized female pigs (25 to 30 kg) were randomized into two groups: mechanical ventilation by PCV (pressure adjusted to achieve and maintain 8 ml/kg tidal volume, n = 5) or VCV (8 ml/kg, n = 5) at FiO2 50%. ALI was induced in both groups by intratracheal instillation of 4 ml/kg hydrochloric acid (HCl) 0.1 N. Impedance changes (Dräger, Germany) were recorded by a 16-electrode belt placed at the level of the sixth intercostal space, and four horizontal, equally sized regions of interest (ROIs) were defined for offline data analysis. Measurements were performed before (T-BL) and after lung injury (T-ALI). Lung tidal volume (Vt), pulmonary static compliance (Cstat), mean pulmonary artery pressure (mPAP), peak pressure (Pp), respiratory rate (RR) and the PaO2/FiO2 ratio were also recorded. Statistical analysis was based on two-way ANOVA followed by the Tukey test for analysis of data within and between groups, and a t test was used for analysis of impedance changes induced by ALI. Introduction Electrical impedance tomography (EIT) is a non-invasive bedside imaging tool with the potential to assess regional lung ventilation reliably [1]. The purpose of this study was to monitor changes in lung impedance in a pig model of severe shock followed by fluid resuscitation. Methods Twelve anesthetized, mechanically ventilated (8 ml/kg, PEEP 5 cmH2O), supine pigs were submitted to acute hemorrhagic shock with infusion of endotoxin. Animals were allocated to a positive control group (PC, n = 6) or a treatment group given lactated Ringer's solution to achieve and maintain a pulse pressure variation of 13% and a mean arterial pressure of 65 mmHg (PPV, n = 6). Ventilatory and hemodynamic parameters were recorded at baseline, 1 hour after hemorrhagic shock (Tshock), and hourly for 3 hours (T1 to T3). A 16-electrode belt was placed at the level of the sixth intercostal space for EIT measurements (Dräger, Germany). Offline analysis was based on four horizontal regions of interest (ROIs) over the ventrodorsal lung area. Statistical analysis was based on two-way ANOVA followed by the Tukey test (P <0.05). Results At Tshock there was hemodynamic compromise, a statistically significant decrease in lung compliance (Cstat) and significant increases in the pulmonary vascular resistance index, mean pulmonary artery pressure and peak pressure (Ppeak), with no statistical difference between groups. Endpoints and hemodynamic stability were achieved in the PPV group in 117 ± 28 minutes. Cstat continued to deteriorate and Ppeak continued to rise in both groups from T1 to T3. When compared with PC, the PPV group had significant impedance increases in ROIs 1 and 2 at T2 and T3; at T3 the increase in ROI 1 was also statistically greater than at Tshock within the group.
Statistically significant decreases in the percentage tidal distribution in ROI 3 and increases in ROI 2 in the PPV group, relative to the PC group, were also noted. Conclusions Despite re-establishment of hemodynamic adequacy in the PPV group, and although ventilatory parameters were similar in both groups over time, resuscitation as performed in the study induced significant changes in tidal impedance toward nondependent lung regions, implying greater lung impairment in treated animals. Introduction At the bedside, but even in most research work, the analysis of respiratory system mechanics is limited to quasi-static conditions, excluding any insight into what happens during the breath. The new gliding-SLICE method helps to look into the breath. It is a further development of the SLICE method for calculating compliance and resistance of successive intratidal volume ranges (slices) of the pressure-volume (PV) loop by multiple linear regression analysis in a continuous way (a sketch of this regression step follows the next abstract). This allows intratidal compliance and resistance nonlinearity to be detected during ongoing ventilation. Our objective was to determine whether the nonlinear intratidal compliance profile indicates at what level to set PEEP and tidal volume (VT) to make lung ventilation protective. Methods In 12 piglets, atelectasis was induced by application of negative pressure. The PV relationship and the ECG signal were recorded during mechanical ventilation at different levels of end-expiratory pressure (PEEP: 0, 5, 10 and 15 cmH2O) and a VT of 12 ml/kg BW. Using the gliding-SLICE method [1], intratidal compliance profiles were calculated and compared with the conventional quasi-static compliance. Results In contrast to quasi-static compliance, the gliding-SLICE method revealed pronounced intratidal nonlinearity of the compliance profile under ongoing ventilation (Figure 1). At low levels of PEEP, intratidal compliance increased in the low volume range, remained at a high level while further volume was delivered, and finally decreased at volumes >6 ml/kg BW. With higher levels of PEEP, intratidal compliance decreased from the onset of inspiration. Conclusions The gliding-SLICE method gives detailed insights into the intratidal course of compliance during uninterrupted ventilation. Introduction Heartbeats transfer mechanical energy to the lungs, causing flow and pressure disturbances that appear as cardiogenic oscillations (COS) at the airway opening. Here we adopt a new approach for analyzing respiratory system mechanics. We consider the beating heart as a natural intrathoracic mechanical oscillator transferring mechanical energy to the lungs that travels across the lung parenchyma. COS therefore convey information on the mechanical conditions of the lung parenchyma that they cross. Methods In 25 piglets with either healthy or atelectatic lungs, the pressure-volume relationship and the ECG signal were recorded during mechanical ventilation at different levels of end-expiratory pressure (PEEP: 0, 5, 10 and 15 cmH2O). The heartbeat-related disturbance of the PV loop was quantified as the maximal compliance following an R-wave in the ECG signal (CCOS), as determined by the gliding-SLICE method. Atelectasis was assessed by CT. Results The intratidal pattern of heartbeat-induced CCOS changed with PEEP and atelectasis in a characteristic way. At PEEP and tidal volume levels assumed to be lung protective, CCOS was high with little intratidal change, whereas atelectasis and overdistension were signaled by low CCOS that either increased (atelectasis) or decreased (overdistension) intratidally. The systolic pressure variations did not parallel the CCOS pattern, hinting at a negligible impact of hemodynamics on the inspiratory CCOS pattern. Conclusions Heartbeats induce fluctuations in the PV loop and, as a consequence, peaks in compliance, which show characteristic patterns depending on the presence of atelectasis or overdistension. The gliding-SLICE method has the potential to detect those intratidal nonlinearities without requiring additional technical equipment, making use of the ECG signal and the pressure and flow signals already required for controlling the ventilator.
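The regression step behind the SLICE family of methods described above fits airway pressure within each intratidal volume window to the single-compartment equation of motion, Paw = V/C + R·flow + P0, so each slice yields one compliance and one resistance. A simplified sketch (windowing and gliding details omitted; the synthetic data are illustrative):

    import numpy as np

    def slice_mechanics(paw, volume, flow):
        # least-squares fit of Paw = (1/C)*V + R*flow + P0 over one volume slice
        A = np.column_stack([volume, flow, np.ones_like(volume)])
        (inv_c, r, p0), *_ = np.linalg.lstsq(A, paw, rcond=None)
        return 1.0 / inv_c, r, p0

    # synthetic slice: C = 30 ml/cmH2O, R = 0.02 cmH2O/(ml/s), offset 5 cmH2O
    v = np.linspace(100.0, 200.0, 50)                      # ml
    q = 400.0 + 50.0 * np.sin(np.linspace(0.0, 3.0, 50))   # ml/s
    p = v / 30.0 + 0.02 * q + 5.0
    c, r, p0 = slice_mechanics(p, v, q)
    print(f"C = {c:.1f} ml/cmH2O, R = {r:.3f}, P0 = {p0:.1f}")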
Introduction Setting the optimal level of positive end-expiratory pressure (PEEP) in the ICU is still a matter of debate. Talmor and colleagues used the transpulmonary pressure calculated from an oesophageal balloon to set PEEP in a recent randomized controlled study. This strategy aims at preventing alveolar collapse by counterbalancing the gravitational force of the lung with an equal or higher PEEP. We evaluated the relation between the ventilation distribution measured by EIT and the transpulmonary pressure during a PEEP trial in porcine ALI. Methods Eight pigs (30 kg) were studied during a PEEP trial before and after the induction of acute lung injury (ALI) with oleic acid. Global lung parameters, regional compliance, and oesophageal pressure were recorded at the end of each PEEP step. Regional compliance was calculated by dividing the tidal impedance variation (EIT Evaluation Kit 2; Dräger, Lübeck, Germany) by the applied driving pressure (a sketch of this calculation follows this abstract). Results Transpulmonary pressures were negative at 0 cmH2O PEEP and became positive during the stepwise increase of PEEP, at 5 cmH2O before and at 10 cmH2O PEEP after the induction of ALI (Figure 1). Optimum regional compliance differed between the ventral (nondependent) and dorsal (dependent) regions of interest (ROI). In the healthy lung, optimum PEEP was 10 in the dorsal ROI and 5 in the ventral ROI, whereas after ALI it was 15 in the dorsal ROI and 5 in the ventral ROI. Conclusions If EIT is measured at a caudal lung level, optimal EIT PEEP in the dependent lung exceeds the PEEP required for positive transpulmonary pressures as used in the Talmor study, whereas in the nondependent lung optimal EIT PEEP is equal before ALI and lower after ALI. We speculate that this is probably influenced by the location of the EIT slice in the cranial to caudal direction.
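The regional compliance stated in the Methods above is the tidal impedance variation divided by the applied driving pressure, per region of interest and per PEEP step. A minimal sketch with illustrative numbers; impedance is in arbitrary units (AU), so the resulting value (AU/cmH2O) is only comparable within one region across PEEP steps, with the regional optimum at the step giving the highest value:

    def regional_compliance(tidal_impedance_variation_au, plateau_cmh2o, peep_cmh2o):
        return tidal_impedance_variation_au / (plateau_cmh2o - peep_cmh2o)

    # e.g. one dorsal ROI across a PEEP trial (driving pressure 14 cmH2O throughout)
    for peep, dz in [(0, 120.0), (5, 180.0), (10, 210.0), (15, 170.0)]:
        print(peep, regional_compliance(dz, plateau_cmh2o=peep + 14.0, peep_cmh2o=peep))
    # the highest value (here at PEEP 10) marks the regional optimum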
A prospective pilot study of the effect of neutrophil elastase on the pulmonary vascular permeability in patients with pneumonia (T Tagami) Introduction Some cases of pneumonia may lead to hypoxemia with acute respiratory failure, acute lung injury (ALI) or acute respiratory distress syndrome (ARDS), which may require intensive management. Neutrophil elastase is thought to be one of the causes of ALI/ARDS, because it raises the pulmonary capillary permeability, lyses pulmonary connective tissue proteins, and produces leukocyte chemotactic factors. However, a causal relationship between the plasma neutrophil elastase level and changes in the pulmonary capillary permeability has not been established in patients with pneumonia, one of the most serious diseases underlying ALI/ARDS. Therefore, the objective of this study was to determine whether an increase in plasma neutrophil elastase is related to elevation of the pulmonary capillary permeability in patients with pneumonia. Methods Patients with pneumonia who were hospitalized from November 2008 to April 2009 and had PaO2 ≤60 mmHg in room air with no need for mechanical ventilation were prospectively enrolled in the study. Plasma neutrophil elastase levels were measured in blood samples at baseline and 1, 3 and 7 days after the start of the study. Of those enrolled, patients with PaO2/FiO2 ≤150 also had their extravascular lung water index (EVLWi) and pulmonary vascular permeability index (PVPI) monitored using the PiCCO system (Pulsion, Munich, Germany) when required by their attending physician. Statistical analysis was performed using the Spearman correlation coefficient (Rs), with P ≤0.05 considered significant. Results Fourteen patients were enrolled in the study. In six of these patients, the EVLWi and PVPI were measured simultaneously. At baseline, the elastase level and the PVPI showed a strong and significant correlation (Rs = 1.000, n = 6, P <0.05). Across all plotted data from the six patients, the elastase level correlated strongly with both the EVLWi (Rs = 0.750, n = 25, P <0.01) and the PVPI (Rs = 0.811, n = 25, P <0.01). Conclusions The plasma neutrophil elastase level and the PVPI measured by PiCCO were strongly correlated in patients with pneumonia. This suggests that a rise in the blood level of elastase may elevate the PVPI, resulting in an elevation of the pulmonary capillary permeability.
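The correlation analysis reported above is straightforward to reproduce; the snippet below shows the Spearman computation with scipy on invented paired values, not the study's data.

```python
import numpy as np
from scipy.stats import spearmanr

# Hypothetical paired measurements (n = 6 patients at baseline): plasma
# neutrophil elastase (ng/ml) and PVPI from transpulmonary thermodilution.
elastase = np.array([120, 180, 250, 310, 420, 560])
pvpi     = np.array([1.8, 2.1, 2.6, 2.5, 3.6, 4.2])

rs, p = spearmanr(elastase, pvpi)
print(f"Spearman Rs = {rs:.3f}, P = {p:.4f}")
# A perfectly monotonic pairing would give Rs = 1.000, matching the
# baseline finding reported in the abstract (n = 6, P < 0.05).
```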
Introduction Acute lung injury (ALI), and its more severe subset acute respiratory distress syndrome (ARDS), are a major cause of mortality in the ICU [1]. Mechanical ventilation, a supportive therapy necessary to sustain life in many cases, may contribute to and worsen ALI, termed ventilator-induced lung injury (VILI). Fibroproliferation is an early response to lung injury [2]; indeed, dysregulated repair resulting in pulmonary fibrosis may be at the heart of ventilator dependence in ARDS. Characterising the role of excessive lung stretch in contributing to aberrant repair mechanisms would assist in developing strategies to hasten recovery from ARDS. Methods Male Sprague-Dawley rats were anaesthetized, orotracheally intubated and subjected to injurious ventilation until a defined worsening of compliance was noted. The rats were then recovered and extubated. The level of ongoing injury/repair was characterised at 6, 24 and 48 hours and at 4, 7 and 14 days. Systemic oxygenation, lung compliance, wet/dry ratio, BAL total protein, cytokines and cell count, and histological analysis were assessed at each time point. Results The results demonstrated a time-course-dependent improvement in compliance and oxygenation, together with clearance of neutrophilic infiltration at 96 hours. TNFα, IL-1β, IL-6 and IL-10 were significantly elevated in BAL fluid early post injury. Although total lung collagen remained similar at all time points, evidence of an early fibroproliferative response was present in the form of transforming growth factor-β activation and pro-collagen I and III peptide mRNA levels. Matrix metalloproteinase 3 and 9 zymography demonstrated increased levels of these matrikines. Histologic assessment of injury revealed increased alveolar tissue fraction up to and including 96 hours post injury. Myofibroblasts were present in α-smooth muscle actin stained sections in significantly increased numbers post injury. Conclusions This rat model of repair of VILI demonstrates some of the mechanisms by which excessive lung stretch can contribute to fibroproliferation in ARDS, and will serve to improve our knowledge of aberrant lung tissue remodelling as well as provide a useful paradigm for testing strategies to hasten recovery in ALI.

Introduction The effects of hemorrhagic shock on respiratory system mechanics have rarely been investigated and published data are controversial. Pulmonary compliance depends in part on intrapulmonary blood and interstitial fluid volume. When compliance is severely reduced, small modifications of these components may have important effects. The present analysis explored the effect of hemorrhagic shock on respiratory system mechanics and oxygenation parameters in a pig model of ARDS. Methods We evaluated the dynamic respiratory system compliance (Crs = VT / (inspiratory airway pressure - PEEP)) of 14 domestic pigs. Animals were mechanically ventilated: tidal volume (VT) set at 10 ml/kg, respiratory rate at 15 bpm, PEEP at 0 cmH2O. Animals were separated into a control group (n = 9) and an ARDS group (n = 5). ARDS was induced by lung lavage with NaCl 0.9%. During hemorrhage 40% of the total blood volume was removed. The blood was then infused during the re-transfusion phase. Results In the control group, Crs (ml/cmH2O) did not change during hemorrhage or re-transfusion (Figure 1). In the ARDS group, Crs decreased with lung lavage (31.2 ± 5.7 (baseline) to 16.4 ± 3.0; *P <0.01). After hemorrhage Crs increased (21.5 ± 2.9; **P <0.001 compared with lavage) and then decreased again after re-transfusion (18.7 ± 2.7; ***P <0.05). In the same group PaO2/FiO2 (mmHg) decreased after ARDS induction (469 ± 50 (baseline) to 105 ± 38; P <0.001), increased during hemorrhage (218 ± 105; P <0.05) and did not change after re-transfusion (207 ± 125; P = 0.82). The shunt fraction (%) decreased during hemorrhage in the ARDS group (26.2 ± 14.9 (lavage) to 6.4 ± 6.6; P <0.05) but did not change significantly after re-transfusion (13.9 ± 17.0; P = 0.3). Conclusions Acute reduction of blood volume is associated with an increase in respiratory system compliance and oxygenation parameters. Reduction of intrapulmonary blood and interstitial fluid volume, or of thoracic cage compliance, could be responsible for this effect.
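The dynamic compliance definition in the Methods above translates directly into code; the sketch below back-calculates illustrative Crs values from assumed inspiratory pressures (the VT and the pressures are our assumptions, chosen only to roughly mirror the reported means).

```python
def dynamic_crs(vt_ml, p_insp, peep):
    """Dynamic respiratory system compliance, Crs = VT / (Pinsp - PEEP),
    as defined in the Methods above (ml/cmH2O)."""
    return vt_ml / (p_insp - peep)

# Illustrative numbers only (VT 10 ml/kg in a ~30 kg pig, PEEP 0 cmH2O):
for phase, p_insp in [("baseline", 9.6), ("post-lavage", 18.3),
                      ("hemorrhage", 14.0), ("re-transfusion", 16.0)]:
    print(f"{phase:15s} Crs = {dynamic_crs(300, p_insp, 0):5.1f} ml/cmH2O")
```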
Introduction Trials comparing higher versus lower levels of positive end-expiratory pressure (PEEP) in adults with acute lung injury or acute respiratory distress syndrome (ARDS) were underpowered to detect small but important effects on mortality, overall or in any subgroup. Methods We searched MEDLINE, Embase, and the Cochrane Central Register for trials randomly assigning adults with acute lung injury or ARDS to higher versus lower levels of PEEP (minimal difference of 3 cmH2O over the first 3 days) while using low tidal volume ventilation, and comparing mortality. Data from 2,299 individual patients in three trials were analyzed using uniform outcome definitions. We tested prespecified effect modifiers using multivariable hierarchical regression, adjusting for important prognostic factors and clustering effects. Results Overall, there were 374 hospital deaths (32.9%) in the higher PEEP group and 409 (35.2%) in the lower PEEP group (adjusted relative risk, 0.94; 95% confidence interval (CI), 0.86 to 1.04; P = 0.25). Treatment effects varied with the presence or absence of ARDS, defined by a ratio of partial pressure of oxygen to fractional inspired oxygen concentration equal to or less than 200 mmHg (interaction P = 0.02). The relative risks of hospital mortality for patients with and without ARDS were 0.90 (95% CI, 0.81 to 1.00; P = 0.049) and 1.37 (95% CI, 0.98 to 1.92; P = 0.065), respectively. Patients with ARDS were more likely to achieve unassisted breathing earlier (hazard ratio, 1.16; 95% CI, 1.03 to 1.30; P = 0.01), whereas the hazard ratio for time to unassisted breathing was 0.79 (95% CI, 0.62 to 0.99; P = 0.04) in patients without ARDS at baseline. Rates of pneumothorax and the use of neuromuscular blockers, vasopressors and corticosteroids were similar. Conclusions Higher levels of PEEP are likely to improve survival for patients with ARDS, but not for patients with less severe acute lung injury.
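As a check on the headline result, the unadjusted relative risk can be reconstructed from the reported counts; the group sizes below are back-calculated from the percentages, and the Katz log-RR interval is a standard approximation, so small differences from the adjusted estimate in the abstract are expected.

```python
import math

# Hospital deaths / group sizes back-calculated from the abstract
# (374 deaths = 32.9% of ~1,137; 409 deaths = 35.2% of ~1,162; total 2,299).
a, n1 = 374, 1137   # higher PEEP
b, n2 = 409, 1162   # lower PEEP

rr = (a / n1) / (b / n2)
se_log_rr = math.sqrt(1/a - 1/n1 + 1/b - 1/n2)     # Katz log method
lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
hi = math.exp(math.log(rr) + 1.96 * se_log_rr)
print(f"unadjusted RR = {rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")
# ~0.93 (0.83 to 1.05): close to the adjusted RR of 0.94 (0.86 to 1.04)
# reported above, which additionally accounts for prognostic factors and
# trial clustering.
```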
Introduction Experimental and clinical studies have shown beneficial effects of recruitment maneuvers (RMs) (sustained inflation (SI) or SIGH) on ventilatory and gas exchange parameters. In this study we investigated the effect of different RMs on bacterial translocation from lung to blood. Methods Thirty-two rats were anesthetized; after tracheotomy, ventilation was started in pressure-controlled ventilation (PCV) mode with 10 cmH2O Paw, 0 cmH2O PEEP, 60 breaths/minute, I/E 1/2. After cannulation of the carotid artery, a baseline blood gas sample was taken. Subsequently 0.5 ml of 10^5 cfu/ml Pseudomonas aeruginosa was inoculated through the tracheotomy tube, PEEP was increased to 3 cmH2O, and the animals were ventilated for 30 minutes before randomization into four groups. In G1, SI was performed as 40 cmH2O PEEP and 0 Paw for 20 seconds, four times in an hour (15-minute intervals); in G2, SI was performed as 20 cmH2O PEEP and 0 Paw for 40 seconds, four times in an hour (15-minute intervals); in G3, SIGH was performed four times in 1 hour (15-minute intervals) as 40 cmH2O Paw with 3 cmH2O PEEP for 60 seconds; G4 was the control group, ventilated with Paw 10 cmH2O and PEEP 3 cmH2O throughout the study period. The product of applied pressure and application time was equal across the study groups. Blood cultures were taken at baseline, 15 minutes after randomization, after each RM during the first hour, and 60 minutes after the fourth RM. The rats were then sacrificed with intra-arterial sodium thiopental and the lungs were extirpated; the left lung was taken for measurement of the wet weight/dry weight ratio (WW/DW). Results There were no differences in baseline pH, PaO2, PaCO2, MAP or HR among the groups. PaO2 decreased in groups G1, G2 and G3, but the decrease was statistically significant only in G3 compared with baseline values. The WW/DW ratio was higher in G3 than in G1 and G2, but this difference was not significant. The number of positive blood cultures was higher in G3 in the early study period. Conclusions SIGH as a recruitment maneuver carries a high probability of bacterial translocation from the lung to the bloodstream.

Introduction Alveolar recruitment/de-recruitment (R/D) seems to play an important role in the development of VILI [1]. Many clinicians base their determination of PEEP settings during mechanical ventilation of ARDS/ALI patients on an estimate of alveolar recruitability [2]. This project aims to establish an online tool that provides estimates of R/D in patients at the bedside. Methods In volume-controlled ventilated piglets serving as ARDS models, the airway pressure Paw (SI-Special Instruments, Nördlingen, Germany) and the flow rate Q (F + G GmbH, Hechingen, Germany) were continuously recorded at 200 Hz. The pressure curve shows marked nonlinearity, suggesting recruitment effects during inspiration and a relaxation process during the end-inspiratory pause. Based on the obtained data, the parameters of the linear viscoelastic model [3] are calculated by a least-squares (LSE) fitting process. As the parameter C1 represents a static, constant compliance of the lung, this model is not capable of reproducing the nonlinear effects during inspiration. To improve on this, the constant compliance C1 is replaced by a nonlinear pressure-dependent compliance describing recruitment and dilatation, as proposed by Hickling [1]. Results Since the nonlinear model has far more variable parameters to be optimized in the fitting process than the linear model, an approach via fitting the linear model first is helpful: the estimated parameters of the linear model fit can be used as starting values for fitting the nonlinear model, where the focus can be put on the recruitment phenomena. With the new nonlinear model, using the estimated values of R1, R2 and C2 from the linear model (Figure 1, left), it is now possible to reproduce the nonlinear characteristics (Figure 1, right). Conclusions Using this new model it is possible to fit nonlinear behavior due to alveolar recruitment separately from viscoelastic effects with minimized error.
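A minimal sketch of the two-stage fitting strategy described above: a linear single-compartment fit supplies starting values for a model in which the constant compliance is replaced by a volume-dependent (recruitment) compliance. The logistic form for C(V), the synthetic breath and all parameter names are our assumptions; the authors' Hickling-type model may differ.

```python
import numpy as np
from scipy.optimize import curve_fit

np.random.seed(0)

# Stage 1: linear single-compartment model, P = V/C1 + R1*Q + P0, supplies
# starting values; stage 2 swaps the constant C1 for a volume-dependent
# compliance describing recruitment (logistic form: our assumption).
def linear_model(x, c1, r1, p_off):
    v, q = x
    return v / c1 + r1 * q + p_off

def nonlinear_model(x, c_min, c_max, v_half, k, r1, p_off):
    v, q = x
    c = c_min + (c_max - c_min) / (1 + np.exp(-(v - v_half) / k))
    return v / c + r1 * q + p_off

# Synthetic inspiration with decelerating flow and recruiting compliance.
t = np.linspace(0, 1, 300)
q = 500 * (1.2 - t)                               # ml/s
v = np.cumsum(q) * (t[1] - t[0])                  # ml
c_true = 15 + 20 / (1 + np.exp(-(v - 120) / 30))  # ml/cmH2O
p = v / c_true + 0.02 * q + 4 + np.random.normal(0, 0.2, t.size)

(c1, r1, p_off), _ = curve_fit(linear_model, (v, q), p, p0=[30, 0.05, 5])
popt, _ = curve_fit(nonlinear_model, (v, q), p,
                    p0=[c1 / 2, c1 * 2, v.mean(), 30, r1, p_off], maxfev=20000)
print(f"linear fit: C1 = {c1:.1f} ml/cmH2O")
print(f"nonlinear fit: C from {popt[0]:.1f} to {popt[1]:.1f} ml/cmH2O")
```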
to 10 cmH2O). Images were obtained at end expiration and inspiration. Global and regional quantitative CT analyses from each PEEP step were compared [1]. Results Analyses of 12 patients showed that MRS significantly reduced the global amount of nonaerated tissue (54 ± 8% to 7 ± 6%, P <0.01) and tidal recruitment (4 ± 4% to 1 ± 1%, P = 0.029) (Figure 1). Tidal recruitment in the most dependent regions increased significantly from PEEP 10 to 20 cmH2O (2 ± 3% to 11 ± 7%, P <0.01), but decreased significantly after MRS (11 ± 7% to 2 ± 2%, P <0.01). High PEEP (25 cmH2O) was necessary to sustain recruitment. Increasing PEEP without full recruitment may exacerbate lung injury in the severe ARDS population.

Introduction In patients with acute respiratory distress syndrome (ARDS), tidal ventilation is always inhomogeneous, with atelectasis in dependent lung and overdistention in nondependent lung. The aim of our study was to develop a flow pattern that simultaneously produces locally different alveolar pressures, in order to recruit collapsed alveoli and at the same time prevent hyperinflation of already open alveoli. Methods We modified Horsfield's model of the canine lung [1] to predict alveolar pressure differences between collapsed and opened alveoli. The morphometric model has an asymmetrical branching airway system with 47 orders, whereby order 47 corresponds to the trachea. Each branch is terminated with a viscoelastic tissue unit, and there are in total 150,077 acini in the whole model. Using this model, atelectasis was simulated by reducing the diameter of small airways (orders 1, 2 and 3) by up to 90% of their original sizes. The tissue damping factor G was 10^4 times higher in the collapsed alveoli than in normal alveoli, and the shunt gas compression compliance Cg of the alveolus was 10^6 times smaller. Different percentages of atelectatic area, from 0 to 90%, were simulated. Flow distribution and airway input impedance were calculated at frequencies from 0.1 to 5 Hz. Alveolar pressure differences were obtained by analyzing the products of flow distribution and alveolar input impedance. Results The alveolar pressure Palv decreased as the frequency increased (for both collapsed and open alveoli); compared with the situation at 0.2 Hz, Palv was reduced to 9.1 ± 1.0% at 4 Hz. On the other hand, Palv increased as the percentage of atelectatic area increased (at all frequencies); Palv increased to 267 ± 6% when 70% of alveoli collapsed, compared with 10% collapse. The pressure differences between collapsed and open alveoli increased as the frequency increased: at 0.2 Hz the differences were <3%, while at 4 Hz the differences were >20% (relative to the lower value).

Introduction Low tidal volumes (Vt) are thought to protect the lung by avoiding overdistension. We recently showed that Vt may also influence repeated opening and closing (O/C) [1]. The goal of this study was to determine whether decreasing Vt from 6 to 4 ml/kg was effective in reducing O/C, and whether it was possible to maintain alveolar ventilation at such low Vt. Methods Cross-over study at two Vt levels: 6 versus 4 ml/kg IBW. We included ALI/ARDS patients, ventilated <48 hours, who were scheduled for a chest computed tomography (CT) scan. For the 4 ml/kg arm we replaced the heat and moisture exchange filter by a heated humidifier, and the respiratory rate was increased to keep minute ventilation constant. The protocol had two parts: one at the bedside and the other in the CT room. Both Vts were applied in random order. For the bedside protocol, each Vt arm was applied for 30 minutes, with data on lung mechanics and gas exchange taken at baseline and 30 minutes. For the CT protocol, each Vt arm was applied for 5 minutes and then a dynamic CT (4 images/second for 8 seconds) was taken at each Vt at a fixed transverse region in the lower third of the lungs. Afterwards, CT images were analyzed by software (MALUNA) and repeated O/C was determined as the nonaerated tissue variation between inspiration and expiration, expressed as a percentage of lung tissue weight. Results We analyzed nine patients (six male), with a median age of 39 (21 to 72) years, APACHE II score 14 (5 to 23) and SOFA score 9 (6 to 15). All patients had a pulmonary origin of their ARDS and were on their first day of ventilation. At baseline patients had a PaO2/FiO2 ratio of 141 (71 to 280), compliance of 32 (17 to 43) ml/cmH2O, and PEEP of 12 (10 to 16) cmH2O. In the 4 and 6 ml/kg arms, Vts were 260 (210 to 300) and 350 (310 to 400) ml, respectively (P <0.01), respiratory rates were 37 (31 to 42) and 25 (21 to 28) breaths per minute (P <0.01), and PaO2 levels were 84 (54 to 148) and 83 (61 to 162) mmHg (P = 0.3). PEEP and FiO2 were kept constant. PaCO2 did not significantly increase with Vt 4 ml/kg, but repeated O/C (delta nonaerated tissue) consistently decreased (Figure 1).
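To make the CT endpoint above concrete, the sketch below quantifies nonaerated tissue mass from Hounsfield units and derives repeated opening/closing as the inspiration-expiration difference; the HU threshold (> -100 HU for nonaerated) and the tissue-mass convention are common quantitative-CT assumptions, not details taken from the study or its MALUNA software.

```python
import numpy as np

def nonaerated_tissue_fraction(hu, voxel_ml):
    """Percentage of lung tissue weight that is nonaerated (> -100 HU).
    Tissue mass per voxel follows the usual quantitative-CT convention:
    tissue fraction = 1 + HU/1000 for HU in [-1000, 0], ~1 g/ml tissue."""
    hu = np.clip(hu, -1000, 100)
    tissue_g = (1 + np.minimum(hu, 0) / 1000.0) * voxel_ml
    return 100 * tissue_g[hu > -100].sum() / tissue_g.sum()

rng = np.random.default_rng(1)
insp = rng.normal(-650, 250, 50_000)            # fake HU histogram, inspiration
exp_ = insp + rng.normal(120, 80, insp.size)    # derecruitment at expiration

oc = nonaerated_tissue_fraction(exp_, 0.5) - nonaerated_tissue_fraction(insp, 0.5)
print(f"repeated opening/closing = {oc:.1f} % of lung tissue weight")
```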
Introduction Acute lung injury (ALI) and acute respiratory distress syndrome (ARDS) are associated with significant morbidity and mortality. Mechanical ventilation is the cornerstone of supportive therapy; however, the optimal strategy of ventilation and adjunctive therapies are still evolving. There is evidence to support the use of volume-limited and pressure-limited lung-protective ventilation, but practice variability in clinical management is still a concern, mainly in sicker patients. Studies relating the severity of hypoxemia to hospital outcome have not found a consistent association. We hypothesized that the severity of hypoxemia on admission, after optimal ventilation, may predict hospital outcome if ARDS patients are categorised based on the primary etiology: pulmonary ARDS (ARDSp) and extrapulmonary ARDS (ARDSexp). Our aim was therefore to ascertain the relationship between hospital outcome and the severity of hypoxemia in patients with early ARDS (days 1 to 3 following admission) after categorising based on etiology. Methods We used a prospective cohort study design and enrolled 151 consecutive patients with a primary diagnosis of ARDS on the day of admission, admitted over a 2-year period to our adult general ICU. Patients enrolled in other clinical interventional trials were excluded from the study. Protocol-based management of mechanical ventilation was used to achieve optimal ventilation. Two authors independently designated patients as ARDSp or ARDSexp and a third reviewed any conflicts (only four cases). Patients were then subcategorised by severity of hypoxaemia based on PaO2/FiO2 ratio (P/F ratio) intervals. All other clinical interventions were at the discretion of the treating clinician. Results The hospital mortality among all patients included in the study (n = 151) was 44.1%. Patients classified as ARDSp had a higher hospital mortality (50.6%) than those with ARDSexp (36.4%), but the difference was not statistically significant (P = 0.12). Nonsurvivors with ARDSp had a significantly lower P/F ratio on day 1 of ARDS diagnosis compared with survivors (12.05 ± 4.46 vs 15.39 ± 4.97 kPa; P = 0.002), a relationship not observed with ARDSexp. This association between mortality and hypoxaemia in early ARDSp persisted at even more severe levels of hypoxaemia (<15 kPa, <12.5 kPa and <10 kPa; P = 0.01, P = 0.003 and P = 0.002, respectively), while in ARDSexp no effect of hypoxaemia on mortality was observed. Conclusions Our findings indicate that the hypoxaemia burden, as assessed by P/F ratio intervals despite optimal ventilatory support on the day of ICU admission, predicts increased risk of death in ARDSp but not in ARDSexp. Interventional trials need to account for the influence of etiology and hypoxaemia burden on outcome before concluding that an intervention is negative.
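A small helper making the hypoxaemia stratification used above explicit; the cutoffs are those reported in the abstract, while the function names are ours.

```python
def pf_ratio_kpa(pao2_kpa, fio2):
    """PaO2/FiO2 ratio in kPa, as used in the abstract above."""
    return pao2_kpa / fio2

def hypoxaemia_stratum(pf_kpa):
    """Severity strata matching the cutoffs tested above (<15, <12.5, <10 kPa)."""
    for cutoff in (10, 12.5, 15):
        if pf_kpa < cutoff:
            return f"P/F < {cutoff} kPa"
    return "P/F >= 15 kPa"

# Example: PaO2 9.6 kPa on FiO2 0.8 gives P/F = 12.0 -> 'P/F < 12.5 kPa'.
print(hypoxaemia_stratum(pf_ratio_kpa(pao2_kpa=9.6, fio2=0.8)))
```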
Introduction The pathological organ dysfunction that occurs in patients with SIRS or sepsis is believed to be related to vascular endothelial damage. We investigated the role of vascular endothelial damage in the occurrence of acute lung injury in severe TBI patients with SIRS. Methods The subjects were 20 severe TBI patients with SIRS. The P/F ratio was calculated from arterial blood gas analysis data obtained for seven consecutive days from the time of admission. Peripheral blood samples were collected four times to measure the serum levels of IL-6 and IL-8, and the levels of ICAM-1 and granulocyte elastase (GE) as markers of vascular endothelial damage. Results The P/F ratio decreased with time and was below 300 from day 4 onwards. However, the survivors maintained a P/F ratio of 300 or more. In the patients with a fatal outcome, the ratio continued to decline and these patients developed acute lung injury on day 3. From day 5 onward, they showed a significant decrease of the ratio, with values of 200 or less and symptoms of adult respiratory distress syndrome. The levels of IL-6, IL-8, ICAM-1 and GE increased after admission, but then decreased again in the survivors. In the patients who died, these levels continued to rise, and there was a significant increase of IL-6, IL-8 and ICAM-1 after 1 week. The correlations between the blood level of IL-6 and the levels of IL-8, ICAM-1 and GE were strong, as were those between the IL-8 level and the ICAM-1 and GE levels, while the correlation between the ICAM-1 and GE levels was weaker. In contrast, the correlations between the P/F ratio and the blood levels of IL-6, IL-8, ICAM-1 and GE were moderate and negative; that is, inverse correlations were noted between the P/F ratio and all these parameters. Conclusions By determining the changes in humoral mediators, we demonstrated that vascular endothelial damage was involved in the occurrence of the pathological state of acute lung injury in severe TBI patients with SIRS.

Introduction With adaptive support ventilation (ASV), a microprocessor-controlled mode of mechanical ventilation, the ventilator adapts tidal volume (VT) size based on the Otis least work of breathing formula. In recent studies in patients with ALI/ARDS, ASV applied a VT of 7.3 (6.7 to 8.8) ml/kg ideal body weight (IBW). It is unclear whether an open-lung approach was used in these studies. Lung recruitment improves lung compliance and, as a consequence, may allow the ventilator to apply too large a VT with ASV. Methods Ten consecutive patients with ALI/ARDS, ventilated in accordance with our local protocol dictating frequent recruitment maneuvers, were observed while the ventilator was switched from pressure control ventilation (PC) to ASV. Thereafter, all patients were subjected to an additional standard recruitment procedure. The primary endpoint was VT before and after the switch of the ventilator, and after standard recruitment. Results Four patients suffered from ALI, six patients from ARDS. Seven patients had an extrapulmonary cause of ALI/ARDS. VT increased from 6.5 ± 0.8 ml/kg IBW to 9.0 ± 1.6 ml/kg IBW (P <0.01) after the switch from PC to ASV. Additional recruitment after the switch did not affect VT size (9.3 ± 1.4 ml/kg IBW, P >0.05). In seven patients ASV applied a VT >8 ml/kg IBW; in one patient VT even increased to >12 ml/kg IBW. Conclusions Patients with ALI/ARDS may be ventilated with too large a VT when subjected to ASV. Our results contrast with findings of previous studies on ASV in patients with ALI/ARDS, probably because we frequently use recruitment maneuvers.
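The ml/kg IBW figures above presuppose an ideal body weight formula; the abstract does not state which one was used, so the sketch below adopts the common ARDSNet predicted body weight convention as an assumption.

```python
def predicted_body_weight_kg(height_cm, male):
    """ARDSNet predicted (ideal) body weight; one common convention, since
    the abstract does not state which formula was used."""
    return (50.0 if male else 45.5) + 0.91 * (height_cm - 152.4)

def vt_per_kg_ibw(vt_ml, height_cm, male):
    return vt_ml / predicted_body_weight_kg(height_cm, male)

# Example: a 175 cm male receiving 590 ml after the switch to ASV
# corresponds to ~8.4 ml/kg IBW, well above a 6 ml/kg target.
print(f"{vt_per_kg_ibw(590, 175, male=True):.1f} ml/kg IBW")
```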
Conference criteria). We measured plasma RAGE levels twice in the first 2 days from intubation, then every 3 days for the first month and then once a week, until ICU discharge or death (n = 188). We also measured RAGE levels in BALf obtained by means of a standardized technique when clinically indicated (n = 22). At each sampling time we recorded data on ventilator settings, gas exchange, organ function and blood cell counts. Results Day 1 plasma RAGE levels (normal values <170 pg/ml) were high (median 1,588 pg/ml, IQR 780 to 2,398 pg/ml) and then decreased over time. When all samples were considered, plasma RAGE levels were significantly higher in patients with blood platelet count <100 ×10^3/μl, with plasma creatinine level ≥2 mg/dl and with SOFA score ≥5 (P = 0.029, P = 0.001 and P = 0.045, respectively). Plasma RAGE also increased with the number of organ failures (P = 0.004). Moreover, circulating RAGE was significantly higher in patients with PaO2/FiO2 <150 and with PEEP ≥10 cmH2O (P = 0.045 and P <0.001, respectively). RAGE was present in BALf (median 174 pg/ml, IQR 46 to 1,476 pg/ml). Interestingly, plasma and BALf RAGE levels were not correlated (P = 0.2). BALf RAGE was significantly higher in patients with PaO2/FiO2 <150 mmHg and with an administered tidal volume:ideal body weight ratio >6 ml/kg (P = 0.021 and P = 0.009, respectively). Patients with culture-positive BALf had higher BALf RAGE levels than those with culture-negative BALf (P = 0.014). Conclusions In ALI/ARDS patients, circulating RAGE is high on day 1, decreases over time and may be related to systemic and lung injury severity. BALf RAGE obtained from these patients may be associated with lung dysfunction and infection.

Introduction In animal models of ventilator-induced lung injury (VILI), mild hypothermia was found to be protective by reducing pulmonary inflammation and possibly by reducing mechanical strain through lower respiratory rates. However, existing models are hampered by severe alkalosis or an ex vivo design. In a physiological model of VILI, we investigated whether hypothermia protects from VILI by reducing respiratory rates or by reducing inflammation. Methods In rats, VILI was induced using a peak inspiratory pressure (PIP) of 23 cmH2O and zero PEEP. Controls were ventilated with a PIP of 12 cmH2O and PEEP of 5 cmH2O. Hypothermia (32°C) was induced by external cooling; controls were maintained at 37°C. Normo-pH (7.3 to 7.4) or strict normocapnia (4.5 to 5.0 kPa) was achieved by adjusting the respiratory rate according to blood gases drawn every 30 minutes. After 4 hours of ventilation, bronchoalveolar lavage (BAL) was done. Statistics include Kruskal-Wallis and Mann-Whitney U tests. Results A physiological model of VILI was established. In the normo-pH group, hypothermia decreased pulmonary IL-6 and neutrophil influx and tended to decrease pulmonary protein leak (Figure 1). In the normocapnia group, hypothermia allowed lower respiratory rates compared with normo-pH (11 ± 1 vs 17 ± 2 breaths/minute) (Figure 2). However, this did not further reduce parameters of lung injury. Conclusions Hypothermia was protective in a physiological model of VILI by reduction of inflammation, but not by reducing the repetitive strain of respiratory cycles.

Introduction Therapeutic hypothermia is applied to reduce hypoxia-induced organ injury. In the past decades, the use of hypothermia has increased in critically ill patients who are mechanically ventilated (MV). Data on the effect of hypothermia on gas exchange and lung mechanics in these patients are limited. In this retrospective study, we describe the effect of induced hypothermia and rewarming on respiratory parameters in patients after cardiac arrest. Methods Patients with a Glasgow Coma Scale score <8 after resuscitation for cardiac arrest in whom hypothermia was applied (32 to 34°C) were enrolled. Patients with a PaO2/FiO2 ratio <200 or MV with positive end-expiratory pressure (PEEP) >15 cmH2O were excluded. Ventilator settings and arterial blood gases were retrieved from the electronic patient database during hypothermia and at every °C increase during rewarming. Statistics include a paired t test.
Results From a cohort of 98 patients, 35 patients were excluded, leaving 62 patients for analysis. During hypothermia, arterial pCO2 decreased, while end-tidal (et)CO2 was low at unchanged minute volume ventilation (Figure 1). Hypothermia increased the P/F ratio from 255 ± 55 to 283 ± 12 at unchanged PEEP (P <0.05), while fluid balances were positive in all patients (2.5 ± 1.6 l). After rewarming, arterial pCO2 was unchanged while etCO2 increased. The P/F ratio after rewarming was unchanged compared with the start of hypothermia, while lower PEEP levels were applied (7.0 ± 0.4 cmH2O vs 6.1 ± 0.3 cmH2O, P <0.05). Conclusions Induced hypothermia improved ventilation and oxygenation in critically ill patients. Hypothermia may be considered in patients with acute lung injury in whom low minute ventilation results in severe hypercapnia.
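Interpreting blood gases during hypothermia raises the alpha-stat versus pH-stat question: gas tensions measured at 37°C can be corrected to the patient's temperature. The sketch below uses the commonly quoted correction factor for PCO2 (about 4.4%/°C); both the factor and whether to correct at all are conventions, not details specified in the abstract.

```python
def pco2_at_patient_temp(pco2_37, temp_c):
    """Correct a PCO2 measured at 37°C to patient temperature using the
    commonly quoted factor 10**(0.019*(T-37)) (~4.4%/°C); applying it
    (pH-stat) versus not (alpha-stat) is a management choice."""
    return pco2_37 * 10 ** (0.019 * (temp_c - 37))

# Example: 5.3 kPa measured at 37°C corresponds to ~4.4 kPa at 33°C,
# consistent with the lower arterial pCO2 seen during cooling above.
print(f"{pco2_at_patient_temp(5.3, 33):.2f} kPa")
```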
Extracorporeal life support service in a regional referral center: the median duration of ACLS maneuvers before ECLS start was 62.5 minutes. Of the cases of cardiogenic shock undergoing ECLS, two out of four patients were discharged from our ICU. Conclusions Complete competence in ECLS management makes this system a safe and feasible technique. Guaranteeing safe treatment must involve different specialists and properly trained nurses. We found a well-timed start of ECLS to be important.

Introduction Severe acute respiratory distress syndrome (ARDS) patient transportation is an extremely high-risk procedure. We report our experience in transferring these patients to our centre while on extracorporeal membrane oxygenation (ECMO). Methods After telephone referral and bed availability confirmation, patients matching entry criteria for ECMO are evaluated for transportation to our centre. A skilled crew consisting of two expert plus one training physician, one expert plus one training ICU nurse and one ECMO specialist reaches the referral hospital for re-evaluation. If eligible, cannulation, ECLS circuit set-up and ECLS start are accomplished. Ground transport is performed with a specially equipped ambulance with enlarged oxygen, fuel and energy supplies. The ambulance is loaded with all of the items required for cannulation and ECMO circuit set-up, additional oxygen and a nitric oxide tank. Entry criteria are: potentially reversible respiratory failure, Murray score ≥3 or respiratory acidosis with pH <7.2, no intracranial bleeding and no absolute contraindication to heparinization. Results Between 2004 and 2009 our crew evaluated 15 ARDS patients (10 males) for transfer on ECMO: age 38 ± 15 years, BMI 28 ± 7, APACHE II score 26 ± 9, SOFA score 9 ± 4, Oxygenation Index 39 ± 17. The average distance was 133 ± 124 km. Two patients improved after an NO trial and were transferred without ECMO. All of the other patients underwent venovenous ECMO: 11 with cannulation of the femoral veins, one with femoral-jugular cannulation and one with a double-lumen cannula in the jugular vein. ECMO settings were (mean ± SD) BF 2.9 ± 0.8, GF 3.6 ± 1.6, GF FiO2 1. Data were recorded 30 minutes before and 1 hour after ECLS began: vv-ECMO granted a better clearance of pCO2 (75 ± 20.5 vs 49.7 ± 7.9 mmHg, P <0.01), thus improving the pH (7.279 ± 0.10 vs 7.41 ± 0.06, P <0.01) and mean pulmonary arterial pressure (41 ± 11 vs 31 ± 5 mmHg, P <0.05), and allowing a reduction in respiratory rate (28 ± 11 vs 9 ± 4, P <0.01), minute ventilation (10.2 ± 4.6 vs 3.3 ± 1.7 l/min, P <0.01) and mean airway pressure (26 ± 6 vs 22 ± 5 cmH2O, P <0.01). Arterial pO2, mean blood pressure and heart rate did not show significant variations. After ECMO began, vasoconstrictor therapy (being administered to five patients) was quickly tapered. Neither clinical nor technical major complications were reported. Conclusions ECMO deployment by a referral center enabled long-distance, high-risk ground transportation.

Introduction Controversy exists over the possible beneficial effects of hypertonic saline (HS) on the pulmonary inflammatory response, particularly in neutrophil immunomodulation [1,2]. This study was designed to investigate possible benefits of HS in the treatment of pigs submitted to experimental acute lung injury. Methods Twelve anesthetized, tracheotomized pigs (25 to 30 kg) were mechanically ventilated by pressure, adjusted to 8 ml/kg tidal volume with FiO2 50%, and submitted to intratracheal instillation of 4 ml/kg hydrochloric acid (HCl) 0.1 N. They were then randomized to the ALI group (n = 7) or the ALI + HS group (n = 5); animals of the latter group received 4 ml/kg intravenous hypertonic saline 15 minutes after injury. Hemodynamic parameters, pulmonary compliance (Cstat), peak pressure (Ppeak), plateau pressure (Pplat) and tidal volume (Vt) were analyzed at baseline (TBL), 15 minutes after HCl instillation (TALI) and hourly thereafter for 4 hours (T0 to T3). Bronchoalveolar lavage was performed at the end of the observation period for flow cytometry analysis of neutrophil burst activity. Postmortem histopathology of the right diaphragmatic lung was also performed in all animals. Results After TALI, animals of both groups presented significant increases in Ppeak and Pplat, and a decrease in Cstat, at all time points compared with TBL. Vt was preserved in both groups over time. There were significant differences between the ALI and ALI + HS groups, respectively, in central venous pressure (T0, T1 and T2), pulmonary artery occlusion pressure (T1 and T2) and pulmonary vascular resistance index (TALI). In the ALI group, significant differences relative to TBL were found in mean arterial pressure (T1), mean pulmonary artery pressure (TALI, T0, T1, T2 and T3) and pulmonary vascular resistance index (TALI, T0, T3). In the ALI + HS group, there were significant differences relative to TBL in mean arterial pressure (T0, T1, T2), mean pulmonary artery pressure (TALI, T0, T1, T2, T3), cardiac index (T0, T1, T2) and pulmonary vascular resistance index (T1, T2, T3). No differences were found between groups regarding histopathology and flow cytometry analyses. Conclusions HS produced no significant benefit in the studied pulmonary parameters in the proposed model of ALI.
Introduction Conventional mechanical ventilation (MV) may cause additional lung injury in ALI/ARDS due to overdistention of aerated lung regions (high Vt) and cyclic lung reopening (low PEEP level). Hyperproduction of inflammatory mediators is one of the side effects in these cases; this factor could delay or prevent resolution of respiratory failure [1,2]. However, it is not clear whether conventional mechanical ventilation damages intact lungs. The aim of this study was to evaluate the effects of conventional and protective mechanical ventilation on intact lungs in patients with severe trauma. Methods A prospective, randomized controlled trial in trauma patients with mechanical ventilation for extrapulmonary indications. The protocol was approved by the local ethics committee. Seventy-eight patients were randomized to conventional (Vt 10 to 12 ml/kg IBW, PEEP 5 cmH2O; n = 39) or protective (Vt 5 to 6 ml/kg IBW, PEEP 10 cmH2O; n = 39) mechanical ventilation. TNFα, IL-1β and IL-6 levels in plasma and BAL fluid were measured on days 1, 2, 3, 5 and 7 of MV. The frequencies of ALI (AECC criteria) and VAP were evaluated. The endpoints of this study were the length of MV, LOS in the ICU and outcome at 28 days. Results In the first 3 days, ALI was revealed in 26 patients (66.6%) in the conventional and 10 patients (26.5%) in the protective MV group (P = 0.001; OR 4.375, 95% CI 2.227 to 8.189). ARDS occurred in four patients (10.2%) of the conventional MV group (LIS >2) and in none of the protective MV group (P <0.0001). Levels of TNFα, IL-1β and IL-6 in BAL fluid were significantly higher in the conventional MV group from day 1 to day 7, with a maximal increase on day 3 (542 ± 44 vs 91 ± 11; 315 ± 35 vs 86 ± 10; 1,092 ± 160 vs 111 ± 18; P <0.0001). No differences were found in the levels of TNFα, IL-1β and IL-6 in plasma samples. VAP occurred in 31 patients (83.7%) of the conventional and nine patients (23%) of the protective MV group (P = 0.0001; OR 17.2, 95% CI 5.5 to 54.3). The length of MV was 17.4 ± 6 vs 12.8 ± 3 days (P = 0.0001; OR 4.2, 95% CI 1.5 to 11.5), and LOS in the ICU was 21.9 ± 5.6 vs 15.75 ± 2.9 days (P = 0.0002; OR 2.0, 95% CI 0.18 to 23.6). The 28-day mortality was not significantly different between the groups. Conclusions Conventional MV for more than 72 hours in patients with severe trauma and intact lungs can cause lung injury and increase the duration of MV and LOS in the ICU.
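For orientation, a crude odds ratio with a Woolf confidence interval can be recomputed from the 2x2 counts above; it does not reproduce the abstract's (presumably adjusted) estimate exactly, which is expected.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with Woolf (log) confidence interval for a 2x2 table:
    a/b = events/non-events in group 1, c/d = events/non-events in group 2."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)
    return or_, math.exp(math.log(or_) - z*se), math.exp(math.log(or_) + z*se)

# ALI within 3 days: 26/39 conventional vs 10/39 protective (from the abstract).
or_, lo, hi = odds_ratio_ci(26, 39 - 26, 10, 39 - 10)
print(f"crude OR = {or_:.2f} (95% CI {lo:.2f} to {hi:.2f})")
# The abstract reports OR 4.375 (2.227 to 8.189); the crude recomputation
# gives ~5.8, so the published estimate presumably comes from an adjusted model.
```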
Introduction Administration of 80 mg/kg methylprednisolone has been shown to prevent controlled mechanical ventilation (CMV)-induced diaphragm dysfunction in rats, partly by inhibiting the calpain system [1]. The current experiments determined whether lower doses of corticosteroids also provide protection against ventilator-induced diaphragm dysfunction. Methods Rats were assigned to a control group or to 24 hours of CMV, receiving a single injection of saline, 5 mg/kg (low MP) or 30 mg/kg (high MP) methylprednisolone. Results Diaphragm force production was decreased after CMV, but significantly more so in the low MP group, while it was similar to controls in the high MP group. Atrophy of the type IIa fibers was present only in the low MP group. Atrophy of the type IIx/b fibers was more severe in the low MP group than in the CMV group, while no atrophy was observed in the high MP group. Diaphragm calpain activity was increased after CMV (+93%, P <0.05 vs C) and in the low MP group (+83%, P <0.05 vs C), while it was similar to controls in the high MP group. Expression of calpastatin was decreased in the CMV and low MP groups (-18%, P <0.05 vs C) but was preserved at control levels in the high MP group. Analysis of the caspase-3-mediated cleavage of αII-spectrin revealed that CMV induced a significant rise in caspase-3 activity compared with C (+194%, P <0.001). Caspase-3 activity was similarly increased in the MP-5 and MP-30 groups (+96% and +78% respectively, P <0.05 vs C), but this increase was significantly less than that of CMV. Significant negative correlations were found between calpain activity and diaphragm force (-0.50 < r < -0.41, P <0.05) as well as with the CSA of the type IIx/b fibers (r = -0.57, P <0.02). Significant positive correlations were observed between calpastatin and diaphragm force (0.43 < r < 0.54, P <0.05) and between calpastatin and the CSA of the type IIx/b fibers (r = 0.57, P <0.02). Conclusions The ability of corticosteroids to protect against CMV-induced diaphragmatic contractile dysfunction and atrophy is dose dependent, with only high doses of corticosteroids providing protection.

Introduction Controlled mechanical ventilation (CMV) results in diaphragmatic dysfunction. Oxidative stress is an important contributor to ventilator-induced diaphragm dysfunction, since 18 hours of CMV lead to increased protein oxidation and increased lipid peroxidation. We hypothesized that administration of an antioxidant, N-acetylcysteine (NAC), would restore the redox balance in the diaphragm and prevent the deleterious effects of CMV. Methods Anesthetized rats were submitted for 24 hours to either spontaneous breathing while receiving 150 mg/kg NAC (SBNAC) or saline (SBSAL), or to CMV while receiving 150 mg/kg NAC (MVNAC) or saline (MVSAL). Results After 24 hours, diaphragm forces were significantly lower in MVSAL compared with all other groups. Administration of NAC completely abolished this decrease, such that forces produced in the MVNAC group were comparable with those of both SB groups. Protein oxidation was significantly increased in MVSAL (+53%, P <0.01) and was restored in MVNAC. Diaphragm caspase-3 activity was significantly increased in MVSAL compared with SBSAL (+279%, P <0.001). Caspase-3 activity was also increased in the MVNAC group (+158.5%, P <0.01), but to a significantly lesser extent than in MVSAL. Calpain activity was significantly increased after CMV (+137%, P <0.001 vs SBSAL), while it was similar to the SB groups in the MVNAC group. A significant negative correlation was found between calpain activity and diaphragm tetanic force (r = -0.48, P = 0.02). Conclusions These data show that the administration of NAC was able to preserve the diaphragm from the deleterious effects of CMV. NAC inhibits the increase in oxidative stress and proteolysis and attenuates the decrease in force-generating capacity of the diaphragm.
In vitro muscle contraction force measurements on isolated and entire rat diaphragms Introduction Inactivity of the diaphragmatic muscles during mechanical ventilation leads to atrophy and contractile dysfunction. Up to now, in vitro force measurements were performed only on single diaphragmatic muscle strips. Our intention was to find out how mechanical and electrical stimulation influences the condition of the diaphragm as a whole organ. To determine the status of the diaphragm, muscle contraction forces were measured on entire rat diaphragms. Methods We used an earlier described bioreactor [1] as the cultivation and measurement device for the whole rat diaphragm. The bioreactor consists of a pressure chamber and a supply chamber that are separated by a very flexible and soft membrane [2], on which the sample diaphragm is placed. By application of defined gas volumes (0 to 1.5 ml) in the pressure chamber, diaphragms are deflected to various levels of pretension. Diaphragms were electrically stimulated at each deflection level 10 times (750 ms duty cycle, 100 ms stimulation time, 5 ms pulse width, 200 Hz frequency). Pressure changes caused by muscle contraction were recorded inside the pressure chamber and muscle contraction forces were calculated. After initial force measurements, diaphragms were exposed for 6 hours to one of four different treatments: nonstimulated storage (control), cyclic mechanical deflection, electrical stimulation every 20 minutes, or a combination of cyclic deflection and electrical stimulation. After 6 hours another force measurement was performed. Supernatants were collected after 6 hours and investigated for IL-6 activity. Results Depending on the level of deflection of the diaphragms, muscle contraction force increased from 0.1 N (volume 0.6 ml) to 0.7 N (volume 1.5 ml): a larger pretension of the diaphragm resulted in a larger muscle contraction force. After treatment, muscle contraction force decreased in all groups. Muscle contraction force was smallest in the passive control group (0.05 N), larger and similar in the electrically stimulated (0.1 N) and combination (0.09 N) groups, and largest in the mechanically deflected group (0.15 N). IL-6 activity increased after 6 hours of treatment. Conclusions We conclude that it is possible to perform force measurements on whole rat diaphragms in our in vitro model. Additionally, the diaphragms can be kept alive for >6 hours to apply different stimulation treatments.
Introduction Neurally adjusted ventilatory assist (NAVA) is a new mode of assisted mechanical ventilation that uses the signal obtained from the electromyographic activity of the diaphragm (EAdi) to control the mechanical ventilator, delivering assist in proportion to the patient's respiratory drive. This study evaluated Edi monitoring of the response on a conventional mode, during the switch to pressure support, and during the change to NAVA, as a means of predicting weaning from ventilation. Methods Sixteen adult patients ventilated with a SERVO-i (Maquet) on pressure control or Bivent were randomized; we placed the specialized naso/orogastric tube (Edi catheter) and observed the Edi signal. PSV was set to obtain a Vt of 6 to 8 ml/kg with active inspiration. Peak airway pressure (Paw) and breathing pattern were studied. The Edi minimum and Edi maximum were registered for determining PEEP and peak pressure, respectively, together with Vt, respiratory rate, FiO2 and serial blood gases. We observed synchrony between the Edi signal and ventilator breaths on pressure control or Bivent, and when patients started spontaneous breaths (T signal) we moved to pressure support. Results A low Edi minimum (0 to 1) was associated with overdistension of the diaphragm and required decreasing the PEEP; conversely, a high Edi minimum was associated with higher tonic activity and dictated raising the PEEP. During NAVA, the pressure delivered was proportional to the Edi. The NAVA level was continuously readjusted in proportion to the inspiratory effort predicted from the Edi signal. At the highest assist level, we found lower Vt/kg (6.1 ± 3 ml/kg vs 8.1 ± 1.8, P <0.001), and higher breathing frequency (19 ± 6.9 vs 11 ± 7, P <0.001) and peak EAdi (11.8 ± 8.5 vs 8.1 ± 7.7, P <0.002) in NAVA than in PSV. Asynchrony during mechanical ventilation increases the cost of care, the number of days on mechanical ventilation, and other morbidities associated with increased ICU stay. Edi monitoring helps in understanding other modes of ventilation and has been a great tool for deciding the timing of a switch to NAVA and predicting early weaning. Systematically increasing the NAVA level reduces the respiratory drive, unloads the respiratory muscles, and offers a method to determine an assist level that results in sustained unloading, low Vt, and stable respiratory function when implemented for 3 hours. NAVA gives us the opportunity to augment these patients' own drive to breathe enough to recover more quickly.

Introduction Sepsis and muscle inactivity may lead to neuromuscular dysfunction. Neurally adjusted ventilatory assist (NAVA) converts diaphragm electrical activity (EAdi) into airway pressure (Paw) and preserves respiratory muscle activity [1]. We hypothesised that respiratory neuromuscular function would be better preserved with NAVA compared with volume-cycled ventilation (VCV) in early sepsis. Methods Twenty-eight pigs (40.0 (37.6; 41.8) kg; median (quartiles)) were randomized (n = 7 per group) to fecal peritonitis or nonseptic controls, ventilated for 24 hours with either NAVA or VCV (nonparalyzed, Vt 6 to 8 ml/kg, rate adjusted to suppress EAdi). Fluids and norepinephrine (NE) were used based on protocols to keep mean arterial pressure (MAP) >50 mmHg. Before sepsis and after 24 hours, transcutaneous supramaximal stimulations of the cervical phrenic nerve (tPNS) and expiratory occlusion manoeuvres (eOM) were performed. Latency and amplitude of diaphragm compound muscle action potentials (CMAP) were measured during tPNS; maximal deflections of airway (ΔPaw), esophageal (ΔPes) and transdiaphragmatic (ΔPdi) pressures were measured during repetitive tPNS (40 Hz, 100 stimuli) and eOM. Results Hemodynamic parameters were not different among groups before sepsis and remained stable in controls. MAP did not change in septic NAVA animals but decreased in septic VCV animals, from 91 (74; 95) mmHg before sepsis to 58 (57; 68) mmHg at 24 hours (P <0.001, MANOVA time-group interaction). The heart rate increased (P <0.001) and stroke volume decreased (P = 0.034) in both septic groups. Fluid balance was not different between the septic groups. Three septic NAVA and six septic VCV animals received NE. CMAP latency and amplitude, as well as ΔPaw, ΔPes and ΔPdi, were not different among all groups during repetitive tPNS and eOM before sepsis and after 24 hours (P = NS). Conclusions Respiratory neuromuscular function is not affected by the mode of ventilation during the first 24 hours of abdominal sepsis. Early effects of sepsis on neuromuscular function are not reflected in respiratory muscle strength. NAVA may reduce the need for hemodynamic support in early sepsis.
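Both NAVA abstracts above rest on the same principle: delivered pressure is proportional to the diaphragm's electrical activity. Below is a minimal sketch of that proportionality; the gain (NAVA level, in cmH2O/μV), the synthetic EAdi waveform and the handling of tonic activity are all illustrative assumptions rather than the ventilator's exact algorithm.

```python
import numpy as np

def nava_pressure(eadi_uv, nava_level, peep, eadi_min):
    """Pressure delivered under NAVA: assist proportional to the diaphragm
    electrical activity above its tonic minimum, on top of PEEP.
    nava_level is the proportionality gain in cmH2O per microvolt."""
    return peep + nava_level * np.maximum(eadi_uv - eadi_min, 0.0)

t = np.linspace(0, 3, 300)                                 # s, ~1 breath/s
eadi = 2 + 10 * np.clip(np.sin(2 * np.pi * t), 0, None)    # uV, tonic + phasic
paw = nava_pressure(eadi, nava_level=1.0, peep=5, eadi_min=2)
print(f"peak Paw = {paw.max():.1f} cmH2O at peak EAdi = {eadi.max():.1f} uV")
```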
Results (continued) VC resulted in significant diaphragm unloading and deactivation, whereas both were maintained during high and low NAVA. Complete diaphragm deactivation in more than 80% of breaths was observed during VC in five of the patients.

Introduction A novel methodology is proposed to adapt decision-making strategies into our fuzzy-based expert system, AUTOPILOT-BT [1]. The special features of this approach are: knowledge from clinical experts can be extracted in a setup simulating daily ICU routine; an automated process serves to obtain the required information and to create new fuzzy sets; and the fuzzy controller from AUTOPILOT-BT employs the newly derived fuzzy rules. Thus, the knowledge base can easily be modified and the resulting mechanical ventilation therapies may be adapted to the individual preferences of the clinician. Methods The methodology consists of: (i) acquisition of decision-making strategies from single or groups of anesthesiologists, done either with a questionnaire or with a PC-based program simulating the doctor's everyday diagnostic situation; (ii) definition of fuzzy membership functions based on the acquired knowledge (fuzzification of the input); and (iii) construction of fuzzy inference rules and defuzzification, or calculation of the change in ventilator settings (controller action). This approach allows implementing clinician-dependent decision-making that reflects individual preferences. Thus, guidelines from EBM can be used, but ICU specifics can be realized as well, for example different levels of acceptable hypercapnia in patients with acute respiratory distress syndrome (ARDS). Results As an example, Figure 1 shows, for healthy and ARDS lungs, the difference between two fuzzy sets for our PaCO2 controller obtained from two clinicians (C10 and C51) with different expertise in mechanical ventilation. With the newly designed fuzzy sets, our AUTOPILOT-BT reacts according to the clinicians' preferences, but still minimizes the time in which the patient is not ventilated within the specified limits. Conclusions The system automatically implements the know-how of medical experts in ventilation management if the clinicians are willing to interact with the query system. The resulting strategy is mainly influenced by the expertise, experience and demands of the clinician. Thus the AUTOPILOT-BT has the potential to select established guidelines or to adapt the system to modified ventilation therapies.
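In the spirit of the fuzzification/defuzzification steps described above, here is a toy fuzzy controller mapping PaCO2 to a respiratory-rate change; the membership functions, rule base and output values are invented for illustration and are not the AUTOPILOT-BT rule base.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function on points a <= b <= c."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def rr_change(paco2_kpa):
    """Toy Mamdani-style controller: map PaCO2 to a respiratory-rate change
    via three fuzzy sets and centroid defuzzification (illustration only)."""
    low  = tri(paco2_kpa, 2.5, 4.0, 5.0)    # hypocapnia -> slow down
    ok   = tri(paco2_kpa, 4.0, 5.3, 6.5)    # target range -> no change
    high = tri(paco2_kpa, 5.5, 8.0, 12.0)   # hypercapnia -> speed up
    w = np.array([low, ok, high])
    actions = np.array([-4.0, 0.0, +4.0])   # breaths/min per rule
    return float((w * actions).sum() / max(w.sum(), 1e-9))

for paco2 in (3.5, 5.3, 7.5):
    print(f"PaCO2 {paco2} kPa -> change RR by {rr_change(paco2):+.1f} /min")
```

Different clinicians' preferences (for example, permissive hypercapnia in ARDS) would be encoded by reshaping these membership functions, which is precisely the adaptation the abstract describes.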
Introduction Respiratory muscle weakness is an important risk factor for prolonged mechanical ventilation, and may be part of critical illness-related polyneuropathy and myopathy. Animal data also strongly point to atrophy and weakness of the diaphragm due to mechanical ventilation itself, called ventilator-induced diaphragmatic dysfunction. Recently, measurement of transdiaphragmatic pressure following magnetic stimulation (TwPdi BAMPS) was introduced in the ICU to evaluate diaphragm function [1,2]. We aimed to evaluate the reproducibility of TwPdi BAMPS in critically ill, mechanically ventilated patients, and to describe the relationship between TwPdi and the duration of mechanical ventilation. Methods Prospective observational study in the medical ICU of a university hospital. TwPdi BAMPS was measured in critically ill, mechanically ventilated patients. Briefly, the phrenic nerves were stimulated bilaterally from the anterior approach, at the posterior border of the sternocleidomastoid muscle at the level of the cricoid, using two figure-of-eight 45 mm magnetic coils (Magstim, Dyfed, Wales) and a Bistim module (Magstim, Dyfed, Wales). A custom-built occlusion valve was used to create isometric conditions during stimulation. Oesophageal and abdominal pressure changes were measured using balloon catheters (UK Medical, Sheffield, UK) inserted through the nose after local anaesthesia. Results Nineteen measurements were made in a total of 10 patients at various intervals after starting mechanical ventilation. In seven patients, measurements were made on at least two occasions with a minimal interval of 24 hours. The between-occasion coefficient of variation of TwPdi was 9.7%, comparable with data from healthy volunteers. Increasing duration of mechanical ventilation was associated with a logarithmic decline in TwPdi (R = 0.69, P = 0.038). This association was also found when cumulative time on pressure control ventilation (R = 0.71, P = 0.03) and pressure support ventilation (R = 0.66, P = 0.05) were considered separately, as well as for the cumulative doses of propofol (R = 0.66, P = 0.05) and piritramide (R = 0.79, P = 0.01). Conclusions Increased duration of mechanical ventilation is associated with a logarithmic decline in diaphragmatic force. These findings are compatible with the concept of ventilator-induced diaphragmatic dysfunction. The observed decline may also be due to the cumulative dose of sedatives/analgesics or other co-factors, such as sepsis.
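The logarithmic decline reported above corresponds to a linear fit of TwPdi against log(duration); the sketch below shows that fit on invented observations, not the study's measurements.

```python
import numpy as np

# Hypothetical paired observations: days of mechanical ventilation vs TwPdi
# (cmH2O), illustrating the logarithmic-decline fit reported above.
days  = np.array([1, 2, 3, 5, 8, 12, 20])
twpdi = np.array([11.5, 10.2, 9.6, 8.1, 7.4, 6.3, 5.5])

slope, intercept = np.polyfit(np.log(days), twpdi, 1)
pred = intercept + slope * np.log(days)
r = np.corrcoef(pred, twpdi)[0, 1]
print(f"TwPdi ~ {intercept:.1f} {slope:+.2f}*ln(days), R = {r:.2f}")
```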
Introduction The aim of this investigation was to study the parameters of haemodynamics, gas exchange and volaemic status in patients with severe thermal injury. Methods The trial enrolled 30 injured patients aged 21 to 60 years with 25 to 78% skin burns and PaO2/FiO2 <300. Patients were randomized into two groups: the first group (n = 13, PaO2/FiO2 <300) was studied 24 to 72 hours after the burn; the second group (n = 17, PaO2/FiO2 <300), 4 to 11 days after the burn. Patients underwent evaluation of cardiac index (CI), intrathoracic blood volume index (ITBI), extravascular lung water index (ELWI) and pulmonary vascular permeability index (PVPI) by single transpulmonary thermodilution (PiCCOplus; Pulsion Medical Systems, Germany), with calculation of oxygen delivery (DO2) and the colloid osmotic pressure of plasma. The grade of pulmonary damage was evaluated by the Murray score, and the severity of organ dysfunction by SOFA. Correlation analysis was performed using Pearson and Spearman criteria (r; P), with differences significant at P <0.05. Results Hypoxaemia (PaO2/FiO2 = 259 ± 19) in the injured of the first group developed against a background of reduced CI = 3.2 ± 0.5, ITBI = 751 ± 114 and DO2 = 596 ± 124 (r = -0.92; P = 0.02). The ELWI level was normal (7.3 ± 0.8). In the second group, the reduction of PaO2/FiO2 developed against a background of burn sepsis (average SOFA score = 7.0 ± 1.9, Murray score = 1.6 ± 0.4), was caused by an increase of ELWI up to 9.0 ± 1.4 (r = -0.66; P = 0.002) and correlated with PVPI (r = 0.57; P = 0.01). There was no statistically significant correlation of ELWI with ITBI or with the colloid osmotic pressure of plasma (r = 0.13, r = -0.42). Conclusions Gas exchange disorders in patients of the first group were caused by lack of perfusion and an imbalance between oxygen delivery and demand. The reduction of the oxygenation index in patients of the second group was accompanied by increased ELWI against a background of altered vascular permeability and sepsis.

Introduction Prior non-invasive ventilation (NIV) is associated with increased mortality in patients with haematological malignancies and acute respiratory failure treated by invasive mechanical ventilation (IMV). Methods We assessed whether NIV failure is an independent prognostic factor for hospital discharge in a general cancer population treated by IMV. One hundred and six patients with solid tumours and 58 patients with haematological malignancies were eligible for this retrospective study; 41 were treated by NIV before IMV. The main indications for mechanical ventilation were sepsis/shock (35%), acute respiratory failure (33%), cardiopulmonary resuscitation (16%) and neurologic disease (10%). Results Respectively, 35%, 28% and 24% of the patients were extubated, discharged from the ICU and discharged from the hospital. For patients treated with NIV prior to IMV, the rates were 22%, 17% and 10%, respectively. In multivariate analysis, three variables were independently associated with a decreased probability of being discharged from the hospital: NIV use before IMV (OR = 0.30, 95% CI 0.09 to 0.95; P = 0.04); leucopenia (OR = 0.21, 95% CI 0.06 to 0.77; P = 0.02); and serum bilirubin >1.1 mg/dl (OR = 0.38, 95% CI 0.16 to 0.94; P = 0.04). Conclusions NIV failure before IMV is an independent poor prognostic factor in cancer patients treated by IMV.
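A sketch of the kind of multivariate logistic model behind the odds ratios above, using statsmodels on synthetic data; the simulated effect sizes and the variable coding are assumptions, and exponentiated coefficients are read as ORs.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 164                                   # cohort size from the abstract
X = np.column_stack([
    rng.integers(0, 2, n),                # NIV before IMV
    rng.integers(0, 2, n),                # leucopenia
    rng.integers(0, 2, n),                # bilirubin > 1.1 mg/dl
])
# Assumed true effects for the simulation (log-odds of hospital discharge).
logit = -0.5 - 1.2 * X[:, 0] - 1.5 * X[:, 1] - 1.0 * X[:, 2]
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

model = sm.Logit(y, sm.add_constant(X)).fit(disp=False)
for name, coef in zip(["const", "NIV", "leucopenia", "bilirubin"], model.params):
    print(f"{name:11s} OR = {np.exp(coef):.2f}")
```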
Poor function is related to bulb dysfunction as frequently as to battery fatigue. Institutions should consider quality control and maintenance programs, or more advanced laryngoscopic lighting (for example, LED or halogen bulbs). Results We did not find significant differences in respiratory and hemodynamic data between the groups (Table 1); the RM did not induce cardiovascular instability: mean cardiac output was 5.2 ± 2.2 vs 4.9 ± 1.5 l/min (RM vs no-RM, respectively). Conclusions An RM with CPAP before PDT could be performed safely to prevent alveolar derecruitment due to PDT. Introduction In neurosurgical patients requiring ventilation on the ICU, a tracheostomy is frequently formed to facilitate airway protection and weaning from the mechanical ventilator. However, the issue of when to form a tracheostomy remains contentious. In order to better inform our decision-making processes, we audited practice within our own institution. Methods A retrospective study was conducted in which the ICU charts of all neurosurgical patients admitted to a tertiary referral ICU during the calendar year 2007 were reviewed. Patients who did not require mechanical ventilatory support or who died within 7 days of admission were excluded. Demographic data, diagnosis, duration of mechanical ventilation, the ICU day on which tracheostomy was formed, and ICU length of stay were recorded and the data analysed accordingly. Results A total of 106 patients were included, 65 male and 41 female; the mean age was 49 years. Sixty-three patients could be separated from the mechanical ventilator within 7 days of commencement of ventilation via a cuffed oral endotracheal tube. Of the remaining 43 patients, 34 (79%) went on to undergo tracheostomy formation as determined by the attending intensivist. In this group the median time to tracheostomy formation was 13 days, and the median time from tracheostomy formation to separation from the mechanical ventilator was 3 days. Of those patients who could not be extubated within 7 days of ventilatory support, nine (14%) were successfully separated from the mechanical ventilator without the need for a tracheostomy. There were no significant differences in age or diagnosis between the two groups. Conclusions Our data suggest that failure of a neurosurgical patient to separate from the mechanical ventilator within 7 days is predictive of the eventual requirement for tracheostomy formation. In light of this we intend to expand our sample size over a 5-year period and subject the data to multivariate regression. Introduction Percutaneous tracheostomy (PT) is a common procedure in many trauma ICUs. Concern about the safety of performing PT in patients without cervical spine clearance or with cervical spine injury has limited its adoption by some surgeons. Results The total number of tracheotomies during the study period was 220, of which 125 (56%) were percutaneous (PT) and 95 (44%) open tracheostomies (OT). Both groups were similar in age, sex and ISS distribution. In the OT group, 60 (63%) were done in patients without cervical spine clearance or with cervical spine injury; no immediate complications were reported in this group. The PT group had 63 cases (50.4%) done without preoperative cervical spine clearance or with cervical spine injury. The PT group underwent the procedure without bronchoscopy assistance in 95% of the cases.
Two cases (1.5%) in the PT group had postoperative bleeding from the tracheostomy site that did not require intervention. Both were PTs done without bronchoscopy assistance and without preoperative cervical spine clearance. No other immediate complications were reported. Conclusions The results of this study suggest that PT is safe in trauma patients without preoperative cervical spine clearance or with cervical injuries, as compared with the OT group. Most of the PT cases were done without bronchoscopy assistance (95%), a finding that suggests the need for further study to clarify the role of bronchoscopy assistance in PT. Introduction Mechanically ventilated (MV) patients are prone to develop ventilator-associated pneumonia. One of the major risk factors is microaspiration of supraglottic secretions past the endotracheal tube cuff (usually polyvinyl (PV)). A novel polyurethane (PUE) cuff was designed to minimize these leakages. We therefore compared the sealing capacities of the two tubes in MV patients. Methods Twenty-nine consecutive MV patients (mean age ± SD: 68 ± 13 years, 21 males) were randomly allocated to receive either a PV (HI-LO Evac, Mallinckrodt) or a PUE (SEALGUARD Evac, Mallinckrodt) cuffed endotracheal tube (size 9 for men; 8 or 8.5 for women, as a rule). We excluded patients with emergency intubation, unstable haemodynamics, severe respiratory failure or a history of tracheal/laryngeal disease. In each patient, cuff pressure was maintained at 30 cmH2O and the ventilator was set to a plateau pressure ≤30 cmH2O; patients were fasting and placed in a strict semirecumbent position (45°). Radioactivity of tracheal aspirates was assessed sequentially (hourly samplings from T0 to T6 hours, then at T8 and T12 hours) after injection of 74 MBq 99mTc-DTPA, diluted in 5 ml of 0.9% NaCl, just above the cuff via the aspiration channel of the tube. Additionally, the kinetics of respiratory tract contamination were followed by simultaneous pulmonary imaging using a scintillation camera. Data were analysed blindly by nuclear physicians. The study was approved by the hospital ethics committee and informed consent was obtained from relatives. Results Sixteen PUE and 13 PV cuffed tubes were compared. The study was performed 3.2 ± 2.8 days after intubation and 8.3 ± 9.6 days after ICU admission (mean ± SD). Ventilator settings were volume control or pressure support (except one patient on a T-tube); FiO2 was 0.43 ± 0.14, PEEP 6 ± 2 cmH2O. Leakage was observed in 11/29 patients (38%), with similar rates of aspiration in the PUE (5/16) and PV (6/13) groups (P = NS). Leakage was more frequently observed in female (7/8) than in male (4/21) patients (P <0.001). There was a trend towards a decreased frequency of aspiration with larger tubes (size 9 vs 8.5: P = 0.062). Conclusions Both PUE and PV cuffed endotracheal tubes are poorly effective in preventing microaspiration in MV patients. Tube size and/or gender may be more important than cuff composition in the prevention of aspiration during MV.
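The abstract reports the PUE (5/16) versus PV (6/13) aspiration rates only as "P = NS" without naming the test. Fisher's exact test is a natural choice for a 2 × 2 table this small, so the sketch below is an assumption rather than the authors' actual analysis.

from scipy.stats import fisher_exact

#        leak  no leak
table = [[5, 11],   # PUE group (5/16)
         [6, 7]]    # PV group (6/13)
odds_ratio, p_value = fisher_exact(table)
print(f"OR = {odds_ratio:.2f}, P = {p_value:.2f}")  # P well above 0.05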
Introduction For decades it was assumed that cuffed endotracheal tubes prevented aspiration. However, currently used cuffed tubes do not prevent the development of post-intubation pulmonary complications caused by microaspiration [1,2]. A new taper-shaped cuffed tube (TaperGuard (TG)) has recently been introduced. The aim of this study was to compare this tube with a traditional tube (Hi-Lo (HL)) in an aspiration pig model. Methods Fourteen pigs between 65 and 75 kg were studied. The pigs were randomly intubated with either TaperGuard (tapered cuff) or Hi-Lo (barrel cuff) tubes. The tube size was either 7.0 or 8.5 depending on weight. Cuff pressures were maintained between 24 and 27 cmH2O. After intubation, 0.3 ml/kg of acidic blue dye (pH 2.5) was placed on top of the cuff. The animals were ventilated for 3 hours, then sacrificed, and the tracheobronchial tree and lungs were examined. Aspiration was characterized as follows: dye leak, ulceration/erosion, hemorrhagic pneumonia, bronchitis/bronchiolitis. Results Of the 14 pigs, one had to be excluded due to accidental cuff deflation. Tube sizes were evenly distributed between the groups. Cuff pressures were similar: TG 23.7, HL 25.2 (P <0.2). As seen in Table 1, the incidence of microaspiration was significantly lower for TG in the blue-dye and bronchitis categories. Introduction Laryngeal edema may cause upper airway obstruction following extubation. However, the true incidence of laryngeal edema in postoperative patients is not clear. We assessed the relationship between upper airway obstruction and cuff-leak pressure values in postoperative patients. Methods One hundred and fifty-eight postoperative patients (123 elective, 35 emergency) were included. After ventilator weaning was accomplished, we measured the airway pressure at which the sound of cuff leakage was audible using a cuff pressure monitor. In 28 cases, the cuff-leak pressure was measured during both the awake and sedated states. Results The cuff-leak pressure was 12.8 ± 10.1 (median 10) mmHg for elective cases and 12.6 ± 10.0 (median 10) mmHg for emergency cases (NS). Six patients (3.8%) were not extubated because of high leak pressure (40.5 ± 16.0, range 24 to 60, mmHg). One hundred and forty-three patients were extubated, and nine of those (6.3%) were diagnosed with laryngeal edema by laryngoscopy. Seven (5.0%) needed reintubation: one because of copious sputum, three for granuloma formation, and three (2.1%) for severe laryngeal edema. Patients who developed severe laryngeal edema had a higher leak pressure (27.2 ± 22.7 mmHg) than those who did not, and all such patients had a pressure above 20 mmHg. The sensitivity and specificity of the test using a threshold value of 20 mmHg for severe laryngeal edema were 97.2% and 40.0%, respectively. The occurrence of severe laryngeal edema was not associated with age, gender, perioperative weight gain, duration of translaryngeal intubation, inner diameter of the endotracheal tube, or serum albumin concentration. However, a cuff-leak pressure >20 mmHg was associated with gender (female, P = 0.02) and the inner diameter of the endotracheal tube (P = 0.0017) in multivariate regression. In the 28 patients measured twice, the cuff-leak pressure was 17.8 ± 10.6 mmHg in the awake state and 9.9 ± 4.6 mmHg under sedation. Because laryngeal tone is considered to be related to the cuff-leak pressure, it is useful to measure the cuff-leak pressure in a sedated state if the patient has a high value when awake. Conclusions A cuff-leak pressure <20 mmHg at any time is useful to rule out severe laryngeal edema. It may be useful to measure the pressure in a sedated state if the value in the awake state is high. Introduction The rapid shallow breathing index (RSBI) has been shown to predict successful weaning, with an RSBI of less than 105 being a predictor of success. We investigated whether the RSBI could be a useful parameter in our patient population.
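As a minimal illustration of the index just defined (RSBI = respiratory rate divided by tidal volume in litres, with values below 105 taken to predict weaning success), the following sketch uses invented patient values.

def rsbi(resp_rate_per_min, tidal_volume_ml):
    """Rapid shallow breathing index in breaths/min/l."""
    return resp_rate_per_min / (tidal_volume_ml / 1000.0)

for f, vt in [(18, 450.0), (32, 250.0)]:  # two hypothetical patients
    index = rsbi(f, vt)
    verdict = "predicts success" if index < 105 else "predicts failure"
    print(f"f = {f}/min, VT = {vt:.0f} ml -> RSBI = {index:.0f} ({verdict})")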
Methods From June 2008 until March 2009, all patients who were mechanically ventilated on our six-bed ICU were studied. Only patients who were invasively mechanically ventilated for longer than 6 hours, had no missing information and did not die whilst on the ventilator were included. The RSBI was measured in these patients at two different time points of weaning: first at the point of reduction in pressure support and then at the point of extubation. The RSBI was measured using the ventilator (Maquet Servo-i) by reducing the pressure support level to zero and adjusting the PEEP to 5 cmH2O. A significance level <0.05 was considered significant. The nonparametric Kruskal-Wallis test was used to analyse the collected data. Results One hundred and nineteen patients were ventilated over this period: 74 did not meet the inclusion criteria and 45 were included. At the time point of further reduction in pressure support, the RSBI differed significantly (P = 0.038) between the group that was ready for weaning and the group that was not: in the group with successful reduction of PS (n = 28) the RSBI was 83.54 ± 32.12, and in the group without successful reduction of PS (n = 2) it was 158.50 ± 38.89. At the point of extubation the RSBI was significant in predicting successful extubation (P <0.001): in the successfully extubated group (n = 24) the RSBI was 75.25 ± 21.62, and in the unsuccessfully extubated group (n = 5) it was 155.20 ± 25.23. Conclusions At the moment of extubation, the RSBI was significantly lower in the successful group; the same held at the time point of further reduction in the pressure support level. Is increased pressure support during vibrocompression useful during respiratory therapy? W Naue, A Guntzel, A Silva, R Condessa, R Oliveira, S Vieira. Methods After being placed supine in bed with the head elevated at 30°, patients were randomized to group 1 (G1), vibrocompression of the chest for 10 minutes, or group 2 (G2), vibrocompression plus an increase of 10 cmH2O in the inspiratory pressure (IP) in PSV for 10 minutes. Clinical variables and the APACHE II score were registered. Parameters analyzed at the beginning (1) and at the end (2) of the protocol were: variation of peak pressure (ΔPp = Pp2 - Pp1), variation of tidal volume (ΔVT = VT2 - VT1) and variation of dynamic compliance (ΔCdyn = Cdyn2 - Cdyn1). The amount of aspirated mucus secretion (ΔSa = Sa2 - Sa1) was also measured at the end. Results are shown as mean ± SD. Results Both groups (G1: n = 30; G2: n = 39) were similar in clinical characteristics and APACHE II score. The variation of the variables in G1 and G2, respectively, was: ΔPp (cmH2O) = 0.58 ± 1.41 and -0.10 ± 0.98, P = 0.02; ΔVT (ml) = 28.90 ± 119.40 and 64.26 ± 93.82, P = 0.96; ΔCdyn (ml/cmH2O) = 3.03 ± 9.56 and 4.26 ± 8.24, P = 0.89; ΔSa (g) = 0.29 ± 1.73 and 0.93 ± 2.28, P = 0.36. Conclusions We found no differences between the groups except in peak pressure (Pp2 > Pp1), which increased in G1; this could probably be due to decreased airway resistance. Introduction Failure to wean from mechanical ventilation (MV) is frequent (25 to 30%) and associated with high mortality. Indexes predicting success can be helpful clinically; however, their predictive capacity can be low, principally in patients with cardiac disease.
The goal of this study was to evaluate weaning predictor indexes in patients with cardiac disease during weaning from MV. Methods We included patients with and without cardiac disease, under MV for at least 48 hours, submitted to a spontaneous breathing trial (SBT) for 30 minutes, extubated according to clinical decision and followed for 48 hours. They were evaluated concerning age, sex, clinical characteristics, length of hospital and ICU stay, and duration of MV. At the first and 30th minutes of the SBT the following were analyzed: arterial blood gases, hemodynamic parameters, and respiratory parameters such as respiratory rate (f), tidal volume (VT), rapid shallow breathing index (f/VT), and maximal inspiratory and expiratory pressures. Comparisons were made between patients with and without cardiac disease and between success and failure, defining failure as return to MV within the first 48 hours. Results Four hundred and thirteen patients were studied, 81 with cardiac disease and 332 without. The overall mortality rate was 14% and return to MV occurred in 19.7%. The most important differences when comparing patients with cardiac disease with the control group were a lower mortality rate (21% vs 13.5%, P <0.006) and a shorter length of ICU stay (9 ± 3 vs 16 ± 13 days). The weaning failure rate did not differ between patients with and without heart disease (20% vs 20%, P <0.55). Comparing patients with successful weaning with those who failed, the f/VT at the 30th minute was lower (60 ± 38 vs 74 ± 47, P <0.012), as was the increase in f/VT (Δf/VT) during the test (4 ± 27 vs 11 ± 33, P <0.075). Conclusions In this group of patients a substantial number failed the weaning process, showing, as expected, a higher mortality rate. The parameters most related to failure in the literature are higher age, longer length of ICU stay, mortality and f/VT. In this study only the last parameter was sensitive, principally at the 30th minute, together with a higher increase in f/VT (Δf/VT) during the test, demonstrating that patients with cardiac disease do not fail the weaning process more than others, and confirming the efficiency of the test in predicting weaning success. Introduction Due to improvements in nearly all fields of acute care in the past decades, the burden of long-term dependency on mechanical ventilation (MV) increases incessantly. The aim of this study was to implement and test a standardised weaning protocol in patients undergoing long-term mechanical ventilation. Methods After approval by the local ethics committee and informed consent, 644 patients were enrolled in a prospective cohort study over 1 year. The mean time on the ventilator before inclusion was 39.4 (6 to 357) days. The reasons for long-term MV were cerebral (33.1%), cardiovascular (31.5%), pulmonary (28.7%) and neuromuscular (5.6%) diseases. The weaning protocol started with 6 × 5 minutes of spontaneous breathing (intermission of mechanical ventilation) on day 1 and was increased stepwise up to 24 hours of spontaneous breathing on day 22. If there was no improvement in weaning steps over 5 days, or there were more than three steps backwards, the patient was switched to an individual weaning approach (the switching rule is sketched after this abstract). The weaning protocol was carried out by previously trained ICU staff as well as by specialized physiotherapists. Results A total of 77.3% (n = 498) of the patients could be weaned off the ventilator at the first attempt. Of these, 85.9% were weaned by the protocol whereas 14.1% needed an individual approach. The mean weaning time was 17 (5 to 67) days using the protocol and 29 (1 to 88) days with the individual approach. In 12.6% both weaning procedures failed and these patients were discharged from the hospital into a home-care ventilation program. In total, 10.1% of the patients died on mechanical ventilation during their ICU stay. Patients who needed a second weaning attempt (n = 111) were weaned by the protocol in 36.4%, transferred to home-care ventilation (28.8%) or died (35.1%). Conclusions Using the standardised weaning protocol, more than three-quarters of these prolonged mechanically ventilated patients could be weaned off the ventilator in a mean of 17 days. Only 14.1% needed an individual strategy, which took a mean of 29 days until complete release from the ventilator. We therefore suggest this kind of weaning protocol as a useful approach.
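The protocol's switching rule, as described in the Methods above, can be made concrete in a short sketch. Interpreting "no improvement over 5 days" as the step reached on day i + 5 not exceeding that of day i is our assumption, and the daily step values are invented.

def should_switch_to_individual(daily_steps):
    """daily_steps: protocol step reached on each day (higher means more
    spontaneous breathing). True if the protocol should be abandoned in
    favour of an individual weaning approach."""
    backwards = sum(1 for a, b in zip(daily_steps, daily_steps[1:]) if b < a)
    if backwards > 3:  # more than three steps backwards
        return True
    # no improvement over any 5 consecutive days
    return any(daily_steps[i + 5] <= daily_steps[i]
               for i in range(len(daily_steps) - 5))

print(should_switch_to_individual([1, 2, 3, 3, 3, 3, 3, 3]))  # True: stalled
print(should_switch_to_individual([1, 2, 3, 4, 5, 6, 7, 8]))  # False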
Introduction This study compared the reliability and efficacy of three different non-invasive CPAP systems, Ventumask® low flow (LF), Ventumask® high flow (HF) and the Boussignac® CPAP valve (B), both in a lung model and in healthy subjects. Methods Lung model: a pneumatic simulator was first set at a respiratory rate of 16/minute with 400 ml tidal volume (L1), then at 40/minute with 800 ml tidal volume (L2). Human study: 10 healthy subjects were asked to make an expiratory pause (T1), five breaths at tidal volume (T2) and a short sequence of tachypnea (T3). The differences between set and effective PEEP and FiO2 were used as indexes of reliability. The minimum airway pressure below the PEEP level and the airway pressure swing around PEEP were used as indexes of efficacy. In the lung simulator protocol, the pressure-time product was measured and then correlated with the other efficacy indexes. Results Lung model: HF showed a tendency towards a more stable PEEP (P = 0.067). With B, a significant fall in the FiO2 value from L1 (96.9 (95.5; 97.6)%) to L2 (54.2 (50.4; 56.9)%) was observed (P <0.001). The airway pressure swing around PEEP was greater in LF and B compared with HF in L1 (P <0.001), while during L2 it was lower in B (P = 0.007). The pressure-time product was better described by the airway pressure swing around PEEP (r2 = 0.95) than by the minimum airway pressure below the PEEP level (r2 = 0.85). Human study: a smaller difference between set and effective FiO2 was registered with HF (P <0.001). The Ventumask® systems showed an overall higher minimum airway pressure below the PEEP level compared with B (P <0.001). During T3, HF (+10.1 (+7.4; +13.2) cmH2O) showed a smaller airway pressure swing around PEEP relative to LF (+11.9 (+10.1; +16.6) cmH2O, P <0.001) but not to B (+12.2 (+6.6; +15.1) cmH2O, P = 0.40). During the whole respiratory sequence, an increase in end-expiratory pressure was observed; this increase was greater with the B valve than with HF and LF (P <0.001). Conclusions According to these observations, HF seemed to be the most reliable device. In conditions of high flow demand, B reached FiO2 values lower than expected. The dynamic hyper-pressurization, higher with B, is probably due to the relationship between expiratory flow and the resistance offered by the expiratory valve. Although handy and lightweight, B is less reliable in terms of flow supply and PEEP stability.
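The two efficacy indexes used in this study, the airway pressure swing around PEEP and the pressure-time product, can be sketched from a sampled airway-pressure trace. The exact definitions and integration windows used by the authors are not given, so the version below (taking the pressure-time product as the area of the trace below PEEP) is an assumption, applied to a synthetic signal.

import numpy as np

t = np.linspace(0.0, 4.0, 400)                  # 4 s of breathing (s)
paw = 7.5 + 1.8 * np.sin(2 * np.pi * t)         # airway pressure (cmH2O)
peep = 7.5                                      # set PEEP (cmH2O)

swing = paw.max() - paw.min()                   # swing around PEEP
deficit = np.clip(peep - paw, 0.0, None)        # portion below PEEP only
ptp = float(np.sum(deficit[:-1] * np.diff(t)))  # area below PEEP (cmH2O*s)

print(f"swing = {swing:.1f} cmH2O, PTP = {ptp:.1f} cmH2O*s")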
Introduction Non-invasive mechanical ventilation (NIV) has been used in hypoxic postoperative cardiac patients, but more studies are necessary to clarify its respiratory and hemodynamic effects. Our objective was therefore to study its effects on the oxygenation index (PaO2/FiO2) and on hemodynamic variables in this group of patients. Methods This was a randomized trial in which all postoperative cardiac patients with a pulmonary artery catheter and a PaO2/FiO2 between 150 and 300 (with FiO2 0.31) 1 hour after extubation were included. The intervention group used NIV with bilevel positive airway pressure (an inspiratory pressure set to generate a tidal volume of 6 ml/kg and an expiratory pressure of 7 cmH2O) with FiO2 0.4 for 3 hours. The control group used oxygen by Venturi mask to maintain good oxygenation. In both groups, measurements were made at baseline (FiO2 0.31), in the first hour of treatment (FiO2 0.4) and 1 hour after the end of the intervention (FiO2 0.31). Variables studied included pH, PaO2, PaCO2, the PaO2/FiO2 ratio, heart rate (HR), mean arterial pressure (MAP), pulmonary capillary wedge pressure (PCWP), central venous pressure (CVP), mean pulmonary arterial pressure (MPAP) and cardiac output (CO). Results Forty-two patients were included in the study period. The mean age was 65.7 ± 10 years. Baseline variables were similar in the two groups. There was an increase in the PaO2/FiO2 ratio in the NIV group in the first hour (P <0.05, 95% CI 6.6 to 74.8) and 1 hour after stopping the treatment (P <0.05, 95% CI 10.8 to 90.7) compared with the control group. There were no significant changes in hemodynamic parameters, pH or PaCO2 during NIV compared with the control group. Conclusions There were no changes in hemodynamic variables during the NIV period. However, these results suggest that NIV improved oxygenation even 1 hour after stopping the treatment. Introduction Non-invasive positive pressure ventilation (NIPPV) has been used with success in hypercapnic and hypoxaemic acute respiratory failure. However, the outcome of NIPPV in burn patients is less well documented. The purpose of this study is to report our experience with NIPPV in a series of burn-injured patients. Methods The records of all burn patients from July 2008 to September 2009 in whom NIPPV was used in the intensive burn care unit were reviewed. The criteria for selecting patients for NIPPV included a combination of the following factors: acute respiratory failure, haemodynamic stability, and a conscious patient cooperative with treatment; there had to be no need for endotracheal intubation. Results Thirty-one patients were treated with NIPPV; 19 were female. The mean age was 44.22 years and the mean total body surface area (TBSA) burned was 38.37%. NIPPV was used to treat hypoxia in 21 patients, hypercapnia in four and both in six. The mean PaO2/FiO2 ratio before NIPPV was 188.53. NIPPV was used to treat ARDS in 12 patients, pneumonia in eight, atelectasis in six and cardiogenic oedema in five. The mean PaO2/FiO2 ratio after NIPPV was 256.43. Intubation was successfully avoided in 12 of the 31 (38.7%) patients, all of whom progressed to self-ventilation following NIPPV. Conclusions The role of NIPPV in burn-injured patients is, as yet, unclear because little work has been documented. In our experience, the use of NIPPV can avoid the need for endotracheal intubation and mechanical ventilation. Introduction The purpose of this prospective study was to describe the hemodynamic and cardiac variation during a trial of non-invasive ventilation (NIV) in the process of weaning from ventilatory support.
Continuous hemodynamic monitoring was performed with a pulse contour method, the MostCare (Vytech Health, Laboratoires Pharmaceutiques Vygon, Ecouen, France). This device could be used to identify early warning signs of cardiovascular dysfunction, which may contribute to unsuccessful weaning. Methods Fourteen patients admitted to our ICU between January and July 2009 were included in the study: six intubated for respiratory failure due to cardiogenic pulmonary edema (CPE; 52 ± 18 years, four male (M), two female (F)) and eight intubated for chronic obstructive pulmonary disease (COPD; 71 ± 16 years, four M, four F). The NIV trial was performed with a face mask: pressure support (PS) 5 to 10 cmH2O, positive end-expiratory pressure (PEEP) 5 to 7 cmH2O. Cardiovascular variables and gas exchange data were measured at three defined points in time: 1 hour before extubation (T1), continuously during the NIV trial (a mean value was calculated and expressed as T2), and at the end of the NIV trial with the patient breathing spontaneously (T3). Results The variation of cardiac index (CI) was: T1 2.53 ± 0.43 l/minute/m2 CPE vs 3.2 ± 0.64 l/minute/m2 COPD; T2 2.93 ± 0.93 l/minute/m2 CPE vs 2.9 ± 1.33 l/minute/m2 COPD; T3 1.96 ± 0.61 l/minute/m2 CPE vs 3.3 ± 1.37 l/minute/m2 COPD. The variation of CI depended on the variation of stroke volume (SV), while the heart rate (HR) did not change during the trial. We calculated oxygen delivery (DO2) from the CI and the gas exchange data: T1 767 ± 218 ml/minute CPE vs 839 ± 134.3 ml/minute COPD; T2 917 ± 417 ml/minute CPE vs 701 ± 349 ml/minute COPD; T3 760 ± 404 ml/minute CPE vs 753 ± 340 ml/minute COPD. The variation of CI and DO2 observed within and between the two groups was never significant. Conclusions Both groups of patients were successful in achieving spontaneous ventilation. In our opinion, continuous hemodynamic monitoring may provide helpful beat-to-beat information and might be used, combined with gas exchange and oxygen saturation monitoring, during the weaning process as a predictor of cardiovascular instability or respiratory failure. Moreover, continuous hemodynamic monitoring makes one aware of variations in systemic oxygen delivery, and these data could be used to evaluate critically ill patients during the weaning process. Introduction Obesity rates are increasing in the general population, and obesity is also prevalent in ICUs. These patients are sometimes admitted to the ICU for hypercapnic respiratory failure or cor pulmonale, but in general they are admitted for pneumonia, excessive daytime sleepiness, heart failure, COPD or asthma attacks, or pulmonary embolism, and hypercapnic respiratory failure is noticed during this period. Moreover, the optimal non-invasive mechanical ventilation strategy during their ICU treatment is not known. The aim of this study was to assess the differences in NIV strategies and outcomes between obese and nonobese patients with acute hypercapnic respiratory failure. Methods In this retrospective cohort study, 73 patients were studied, all ventilated with a face mask. Patients were divided into two groups, obese (BMI >35 kg/m2) and nonobese (BMI <35 kg/m2), and we investigated whether the necessary pressure, volume, mode, ventilator, and the time to reduce PaCO2 below 50 mmHg differed significantly between obese and nonobese patients. Results The mean age of the patients was 66 ± 14 years, the mean admission APACHE II score was 18 ± 4, and 41 (56%) were female.
Compared with the nonobese patients, ICU admission reasons for the obese patients were significantly more frequently pulmonary edema and less frequently pulmonary infections (P = 0.003 and P = 0.043, respectively). While there were no significant differences between the groups in ventilators, modes and inspiratory pressure levels, obese patients required higher end-expiratory pressure levels and more time to reduce the PaCO2 level below 50 mmHg than the nonobese group. Length of NIV and ICU stay, intubation and mortality rates were similar across the groups. Conclusions These results suggest that resolving hypercapnia in obese patients may require higher PEEP levels and longer times than in nonobese patients during an episode of acute hypercapnic respiratory failure. A simple predictive scoring system for prolonged mechanical ventilation in severe sepsis and septic shock. N Saito, Y Sakamoto, K Mashiko. Chiba Hokusou Hospital, Nippon Medical School, Chiba, Japan. Critical Care 2010, 14(Suppl 1):P243 (doi: 10.1186/cc8475). Introduction Prolonged mechanical ventilation (MV) is associated with high morbidity and mortality in septic patients. However, limited data are available on the prediction of prolonged MV. We conducted an observational cohort study aimed at developing a predictive scoring system for prolonged MV in severe sepsis and septic shock. Methods We retrospectively analyzed 120 consecutive patients with severe sepsis or septic shock who were ventilated for more than 72 hours between January 2005 and October 2009. Clinical features and physiologic parameters were examined as predictors of MV for more than 15 days. Patients were divided into two groups: group 1, those requiring MV for <15 days; group 2, those requiring MV for >15 days. Results The mean (± SD) age and SOFA score were 65.3 ± 16.7 years and 10.0 ± 3.9, respectively, and 35% of patients required prolonged MV. Univariate analysis indicated that the lengths of ICU and hospital stay, hospital mortality, the rate of transfusion, the incidences of ARDS, ventilator-associated pneumonia (VAP), other nosocomial infections and drug-resistant bacteria, the rates of steroid therapy and muscle relaxant use, and the mean PaO2/FiO2 ratio during the first 3 days after admission differed significantly between the two groups. The independent predictors of prolonged MV were ARDS (OR 5.24; P = 0.001; 95% CI: 1.9 to 14.1), VAP (OR 7.75; P <0.001; 95% CI: 2.7 to 22.0) and transfusion (OR 2.84; P = 0.036; 95% CI: 1.0 to 7.5). Using these results, we were able to develop a prolonged MV predictive scoring system; this simplified clinical risk assessment tool was developed from these independent predictors. Introduction Severity scoring is a powerful tool for quality control in the ICU. We introduced a new severity scoring system to our ICU. The aim of this report is to describe the database and present the results of the first 12 months. Methods We analysed our needs for a database and severity scoring. We chose on-admission scoring, with EUROSCORE for cardiac surgical patients and Mortality Prediction Model II time zero for other patients. Data were collected via highly simplified forms onto a spreadsheet. Demographic entry was by a ward clerk, on-admission severity scoring by resident medical staff, and risk-of-death calculation and data cleaning by senior attending medical staff. Individual patient risk of death was presented as a logit, and the combined risk of death for the whole cohort was calculated from the arithmetic mean of the individual logits.
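A minimal sketch of the cohort-risk method just described: individual predicted risks of death are converted to logits, averaged arithmetically, and transformed back to a combined probability. The example risks are invented.

import math

def logit(p):
    return math.log(p / (1.0 - p))

def inv_logit(x):
    return 1.0 / (1.0 + math.exp(-x))

risks = [0.02, 0.05, 0.10, 0.40]  # hypothetical per-patient risks of death
mean_logit = sum(logit(p) for p in risks) / len(risks)
print(f"combined cohort risk = {inv_logit(mean_logit):.3f}")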
We estimated the time to complete each step of data acquisition to calculate the total time spent per patient. Results There were 1,355 admissions; mean (SD) age 69.2 (15.9) years; 57.1% male; median (range) length of stay 23.1 (1.7 to 1,882.5) hours. Fifty-four per cent of patients were ventilated, for a median (range) of 7.5 (0.8 to 1,877) hours. Predicted mortality for cardiothoracic and noncardiothoracic patients combined was 8.64% and observed mortality was 2.8%. EUROSCORE-predicted mortality for the 535 cardiothoracic patients was 5.72% and observed mortality was 0.75% (four patients). MPM-predicted mortality for the 820 noncardiothoracic patients was 11.23% and observed mortality was 4% (34 patients). The estimated time to complete the severity scoring form was 30 seconds, to enter a new patient on the database 90 seconds, and to calculate the risk of death and check data integrity 90 seconds: 3.5 minutes in total per patient. Conclusions Data collection and analysis are essential for quality management in the ICU. Proprietary systems are expensive, and traditional scoring systems (APACHE II) are poorly calibrated to some case mixes. To overcome these problems, we devised a simple, inexpensive and highly valuable ICU database. The key features were well-calibrated on-admission severity scoring, highly simplified forms, a basic spreadsheet and collaborative staff involvement. Senior medical staff performed the final data checking. The project provided abundant high-quality data with a total input of 3.5 minutes per patient. We are trialling the database in a large ICU in China and would welcome input from other ICUs that would like to copy our methods. Validation of a computerised system to calculate the sequential organ failure assessment score. C Bourdeaux. Introduction The sequential organ failure assessment (SOFA) score was introduced with the aim of quantifying the severity of illness, based on organ dysfunction, serially over time. A previous study has suggested that the reliability and accuracy of SOFA scoring by intensive care physicians is good but that there may be room for improvement [1]. Manual calculation of the score can be time consuming. We developed a computer program that derives a SOFA score from the electronic patient record, the Innovian system (Draeger, Germany). To date, no automated method for SOFA score calculation has been validated. We validated the automated SOFA score collection method in order to assess its accuracy and reliability. We also measured the time for manual collection of SOFA scores in order to assess the resource-saving potential of the computer system. Methods Fifty patient records were selected from the database as a stratified random sample in order to represent the patient population of our teaching hospital adult ICU. Two ICU physicians calculated a total SOFA score and individual organ scores for each of the 50 patient records. A gold-standard SOFA score was then generated after discussion between the two physicians with the aid of a third, adjudicating ICU physician. SOFA scores generated by the computer were compared with the gold standard to assess accuracy, and reasons for inaccurate scoring were recorded. Results SOFA scores varied from 1 to 15 in this sample; the mean gold-standard SOFA score was 8.1 with a standard deviation of 2.9. The agreement between the different ratings was very good.
The computer score had better agreement with the gold-standard score (Pearson correlation coefficient 0.92) than the individual physician scores (Pearson correlation coefficients 0.890 and 0.895). The computer calculated the SOFA score correctly in 41 cases, whereas the physicians calculated it correctly in 32 cases. The average time to calculate a SOFA score manually was 4.91 minutes and was not significantly different between the physicians. Conclusions The results show that this computer system is highly accurate at calculating SOFA scores from the electronic patient record and is more accurate than physicians. The time saved is considerable. Introduction Cabré and colleagues [1] found that patients older than 60 years with a SOFA score higher than 9 for at least 5 days were unlikely to survive, suggesting that this could provide a basis for deciding whether to withhold or withdraw life support. We tested their hypothesis in our patient population. Methods We reviewed the data of patients older than 60 years with an LOS longer than 5 days admitted to our ICU in 2007. We calculated the daily SOFA score of the patients with an initial SOFA score higher than 9. IGSII, LOS and mortality were compared between patients with a SOFA score higher than 9 for at least 5 days (SOFA(+) group) and the other selected patients (SOFA(-) group). Results In 2007, 430 patients were admitted to our ICU; 140 were older than 60 years and remained more than 5 days in the ICU. Eleven of these patients had a SOFA score higher than 9 for 5 days or more. In the SOFA(+) group, LOS (22.6 ± 13.1 vs 10.8 ± 6.9 days) and mortality (55% vs 33%) were significantly higher (P <0.05). The mortality of 55% is in good agreement with the predicted IGSII mortality but far less than the 100% predicted by Cabré and colleagues. Conclusions A SOFA score higher than 9 for at least 5 days in patients older than 60 years appears unsuitable for defining futility in our patient population. It applied to fewer than 3% of the 430 patients admitted in 2007. The mortality rate of the SOFA(+) group was increased but remained far from values that would justify withholding or withdrawing intensive care. A weakness of using the sequential SOFA score to predict outcome is that some therapeutic options can influence its value. Introduction Prolonged ICU stay is associated with high morbidity, mortality and costs [1,2]. Prediction of prolonged stay would provide information for physicians and families and help with resource allocation. Available severity scoring systems (APACHE II, APACHE III, MPM, SAPS II and MODS) are widely accepted for evaluating outcomes in the ICU population, but these models might be inaccurate when applied to subpopulations and might not predict a prolonged length of stay. Conclusions All three scores are useful prognostic factors for mortality and for ICU therapy in the ED, where patients' severity of infection is usually lower than in the ICU. The ICU-validated APACHE II and SOFA scores were of similar prognostic value to the ED-specific MEDS score. Introduction Recently, Raum and colleagues [1] generated and validated a new trauma score (Emergency Trauma Score, EMTRAS) based on age, prehospital GCS, prothrombin time and base excess. Each parameter is subdivided into four classes, scored from 0 to 3 points. The class scores are summed to obtain the EMTRAS, ranging from 0 to 12.
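The EMTRAS construction described above (four parameters, each mapped to a 0 to 3 class, summed to 0 to 12) can be sketched as follows. The actual class boundaries are defined by Raum and colleagues [1] and are not given in this abstract, so the cutoffs below are placeholders only, not the published ones.

def classify(value, cutoffs):
    """Map a value to 0-3 points given three ascending cutoffs."""
    return sum(value >= c for c in cutoffs)

def emtras(age, gcs, pt_percent, base_excess):
    # placeholder cutoffs, not the published EMTRAS class boundaries
    return (classify(age, (40, 60, 75))                # older: more points
            + classify(-gcs, (-13, -9, -5))            # lower GCS: more points
            + classify(-pt_percent, (-80, -60, -40))   # lower PT: more points
            + classify(abs(base_excess), (3, 6, 10)))  # larger derangement

print(emtras(age=70, gcs=10, pt_percent=55, base_excess=-7.0))  # 0 (best) to 12 (worst)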
Here we present preliminary results of a study aiming to validate this new scoring system and to compare it with other commonly used illness scores. Methods One hundred and fifty trauma patients admitted to the regional referral trauma center (Careggi Teaching Hospital, Florence, Italy) were studied. The predictive value of the EMTRAS score was compared with the Injury Severity Score (ISS), Revised Trauma Score (RTS) and Trauma Injury Severity Score (TRISS). Introduction Severe sepsis is an important cause of morbidity and mortality following major surgery. Factors associated with an increased risk of sepsis following surgery include emergency surgery, patient co-morbidities, allogeneic blood transfusion and the degree of surgical insult [1]. Physiological track-and-trigger systems are widely used to identify deteriorating patients. The Modified Early Warning System (MEWS) is one such system, but it has not been studied with regard to predicting the development of sepsis after surgery. Although high MEWS scores are associated with increased hospital mortality, the sensitivity of MEWS and other physiological track-and-trigger scores for predicting death or admission to intensive care is low [2]. Methods We carried out a prospective cohort study of 101 patients undergoing elective major surgery in a large university teaching hospital. The patients were followed up for 10 days, and the incidence of sepsis and septic shock was documented. MEWS scores were recorded daily for each patient. Admissions to critical care were documented, along with critical care length of stay. Results Twenty-seven (27%) patients developed sepsis and nine (9%) developed septic shock. Factors associated with the development of sepsis were intraoperative blood transfusion (P = 0.013), duration of operation (P = 0.004) and a postoperative MEWS score greater than 3 (P = 0.0003). In multivariate logistic regression analysis, a MEWS score greater than 3 after surgery was the only factor that remained significantly associated with sepsis (odds ratio 4.89, P = 0.003). Although a high MEWS score was associated with sepsis after surgery, only five (19%) of the patients who developed sepsis had an abnormal MEWS score before sepsis was diagnosed (a mean of 4.6 days beforehand). In the post-CCO group, 10 of the 27 patients were admitted to the ICU, 17 were treated on the hematology ward, and nine of these received NIV (Figure 1). We defined six criteria for activation of the CCO team: radiological signs of pneumonia, organ … The rate of patients diagnosed with myocardial infarction (MI) increased by 116% (n = 65; type 1 MI: 54%; type 2 MI: 46%; P <0.05). Type 1 MI increased by 75%, which was reflected by a 57% decrease in patients formerly classified as unstable angina. Of note, the increase in MI diagnoses was mainly driven by a 200% increase in type 2 MI (for example, acute heart failure, tachycardia or hypertensive crisis). The number of patients with noncardiac chest pain was not significantly changed by the use of the new cutoff point. The AUCs of admission cTnT levels were 0.76 for fourth-generation cTnT and 0.96 for high-sensitivity cTnT (cTnThs) (P <0.05). At admission, the sensitivity and specificity of cTnThs for the detection of acute MI were 93% and 94%, while they were 32% and 96% using fourth-generation cTnT. Conclusions Using lower cutoff points for the definition of MI, as suggested by current recommendations, the rate of chest pain patients with acute MI doubled.
Because the increased rate of patients categorized as acute MI mainly reflects type 2 MI, the ED triage decision at lower cTnT cutoff levels should include patient history, physical examination, 12-lead ECG and cardiac imaging. Introduction The aim of this study was to determine the correlation between the B+ line score and blood gas analysis (BGA) parameters in patients with acute cardiogenic pulmonary edema (ACPE). The presence of B+ line artifacts on lung US correlates with increased extravascular lung water. In patients with acute decompensated heart failure, the resolution of B+ lines can be used to monitor the resolution of pulmonary congestion. A simple nine-point comet score has been proposed to quantify B+ lines and monitor ACPE treatment [1]. Methods Twenty-one patients with ACPE underwent lung US and BGA on presentation and at 2 and 24 hours after admission. Lung US was performed in nine thoracic areas, each scored one point if B+ lines were present. A score of 10 was given for the presence of multiple coalesced comets in all fields (white lung). BGA values were used to calculate PO2/FiO2 (P/F) and the A-a gradient. Results A total of 64 scans were performed with simultaneous BGA. Ninety-eight per cent of patients presenting with white lung were severely hypoxic (PO2 <50 mmHg; P/F <200). Figure 1 illustrates a strong linear correlation between the reduction in comet score and the improvement in P/F (r = -0.73; P <0.001). Figure 1 (abstract P267). Correlation between reduction in comet score and improvement in P/F. A decrease in comet score of 2 points corresponded to a minimum increase of 20 points in P/F (P <0.05). The correlation between comet score and A-a gradient was less striking (r = -0.51; P = 0.05), perhaps because patients with underlying lung disease were not excluded from this pilot study. Conclusions There is a strong negative correlation between the lung comet score and gas exchange in patients with ACPE. In these patients, serial thoracic US may reduce the need for repeated invasive BGA monitoring and further help tailor therapy. Introduction The conventional treatment for most distal radial fractures is closed reduction and cast immobilization. Conventionally, a hematoma block is used to facilitate reduction of these fractures, but it is performed blindly, and radiographic techniques have been essential and effective in monitoring these reductions [1]. Radiation-free ultrasonography, however, can provide both real-time and dynamic multiple-plane images with a small and simple-to-use transducer that can be operated with one hand. We therefore wanted to see whether the real-time and dynamic multiple-plane observation capabilities of ultrasonography would allow an ED physician to perform, under direct vision, a hematoma block and closed reduction without the multiple attempts of either that are frequently required when only conventional techniques are used. Methods Sonographically guided hematoma block and closed reduction were performed in three patients in the emergency department for acute distal radial fracture. The efficacy of this method was evaluated and compared with that of conventional techniques. Results The ultrasound images helped guide the hematoma block under direct vision and delineated the fractures as accurately as conventional radiographs. All parameters measured on the ultrasound images showed substantial restoration of anatomic alignment after reduction.
Under direct vision, the hematoma block required fewer attempts, was less painful and achieved greater patient satisfaction. Conclusions Sonography is an accurate, simple, time-saving, less painful and radiation-free tool. Reducing badly displaced or angulated forearm fractures in the emergency department can be difficult, and multiple attempts at reduction may be required, with repeated trips to the radiology department, before an adequate reduction is achieved [2]. Ultrasound-guided hematoma block and reduction of difficult forearm fractures therefore reduces the number of needle attempts and allows the physician to assess the adequacy of the reduction at the patient's bedside. Introduction Patients with acute cardiogenic pulmonary edema require rapid assessment and therapy to prevent progression to respiratory failure and cardiovascular collapse. The city of Lisbon has emergency medical teams that respond to situations in which the patient's life is at risk and whose goal is to begin treatment, if indicated, and to ensure transport to the hospital in the best possible conditions. We studied the intervention of one of these teams in patients with acute pulmonary edema. Introduction We designed our study to find a useful USG pattern for the diagnosis and assessment of dyspnea. We also compared the utility of supine chest radiography, thoracic CT and bedside USG. Methods In Selcuk University Meram Medical Faculty, during 2009, 60 patients with acute dyspnea were included in our study (30 patients with chest trauma and 30 patients with nontraumatic causes). The study was prospective. We tried to determine which pathologies produce a positive or negative Sliding Lung Sign (SLS) on USG examination, and we investigated its reliability in the differential diagnosis of acute dyspnea. Results First, we performed chest radiograms on all patients in the supine position and then CT scans; all results were interpreted by an independent radiologist. Bedside thoracic USG for the SLS was then performed by physicians in the ER. CT results were taken as the gold standard and compared with the chest radiogram and USG findings. SPSS 13.0 and the χ2 test were used for statistical analysis (P <0.05). According to our findings, the SLS on USG examination has a sensitivity of 84% and a specificity of 97%, with a 97% positive predictive value and an 83% negative predictive value. Conclusions Over the years, USG has been regarded as inappropriate for lung examination, but our study leads us to conclude that it is a necessity. Bedside sonographic examination allows rapid diagnosis, although every type of respiratory failure has its own sonographic pattern, so mistakes are inevitable. Nevertheless, we believe that if further in-depth studies like ours keep coming, and if more and more clinicians begin to use USG examination, the diagnostic value of the SLS will rise and it will become a routine element in the ER.
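The four diagnostic metrics reported for the sliding lung sign follow from a 2 × 2 table against the CT gold standard. A sketch, with invented counts (the abstract reports only the percentages):

def diagnostics(tp, fp, fn, tn):
    return {"sensitivity": tp / (tp + fn),
            "specificity": tn / (tn + fp),
            "PPV": tp / (tp + fp),
            "NPV": tn / (tn + fn)}

# invented counts: 27 true positives, 1 false positive, 5 false negatives
# and 28 true negatives against the CT gold standard
for name, value in diagnostics(tp=27, fp=1, fn=5, tn=28).items():
    print(f"{name}: {value:.0%}")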
Introduction Many air medical transport programs use pulse oximeters, end-tidal carbon dioxide monitors and other devices as indirect measures of respiratory and cardiac status. These methods do not, however, replace cardiac auscultation during flight, which may be needed to identify sudden critical change; and the ability to compare left-sided and right-sided breath sounds may be essential to confirm appropriate placement of endotracheal tubes as well as to diagnose pneumothorax. The ability to auscultate during air medical transport is compromised by high ambient noise levels. The aim of this study was to assess the capabilities of a traditional and an amplified stethoscope (the latter expected to reduce background and ambient noise) for assessing heart and breath sounds during medical transport in a Falcon 50 plane. Methods A prospective, double-blind, randomized study was performed. We tested one model of traditional stethoscope (Littman® Cardiology III) and one model of amplified stethoscope (Littman® 3100). The sound level was amplified in six of the eight increments of the amplified stethoscope. The practitioners on board were all experienced in air medical transport and had normal audiologic testing. We studied 18 heart and lung auscultations during real medical evacuations on board the Falcon 50 (medically configured; Dassault Aviation). For each, the quality of auscultation was described using a numeric rating scale (ranging from 0 to 10, 0 corresponding to 'I hear nothing', 10 to 'I hear perfectly'). Comparisons were made using a t test for paired values. Results The age of the patients was 42 ± 11 years; 78% were male. The body mass index was 29.5 ± 4.7. For cardiac auscultation, the rating scale value was 5.7 ± 1.4 and 6.5 ± 1.8, respectively, for the traditional and the amplified stethoscope (P = 0.027). For lung sounds, the quality of auscultation was estimated at 3.6 ± 2.3 for the traditional stethoscope and 3.9 ± 2.9 for the amplified stethoscope (P = 0.193). Conclusions We conclude that flight practitioners in the Falcon 50 are better able to hear cardiac sounds with an amplified than with a traditional stethoscope, whereas there is no significant difference for breath sound auscultation. These findings suggested to the investigators that Falcon 50 noise and human breath sounds might share a common or substantially overlapping frequency spectrum, with amplification of one necessarily amplifying the other. To assess this hypothesis, further studies are needed to evaluate the sound frequency spectrum in the medically configured Falcon 50. We studied the performance of two respirators employing an advanced turbine delivery system, the LTV 1000 and the T-bird VSO2, assessing their ability to deliver a set fraction of inspired oxygen (FiO2) to a normal lung model at different simulated cabin altitudes. Methods We used a decompression chamber to mimic the hypobaric environment at simulated cabin altitudes of 1,500, 2,000 and 3,000 meters (4,000, 5,333 and 8,000 feet). A model of a normal lung was used. For the T-bird VSO2, cabin altitude was entered into the device. The ventilators were tested with FiO2 set at 50% and Vt set at 700 ml. We noted the effective FiO2 assessed by the ventilators themselves (paramagnetic analysis) and measured the FiO2 actually delivered with a dedicated instrument of the French physiological laboratory of aviation and space medicine of the Air Force. Comparisons of preset and ventilator-assessed FiO2 with the actually measured FiO2 were made using a t test for each altitude. Results Figure 1 shows the data with FiO2 set at 50%. Conclusions On the one hand, both ventilators showed a moderate variation between set and delivered FiO2. On the other hand, the variations between delivered and ventilator-assessed FiO2 were high, suggesting inefficacy of the ventilators' paramagnetic analysis in hypobaric conditions.
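A simplified physical model, not the study's analysis, illustrates why an uncompensated paramagnetic O2 cell could misread FiO2 at altitude: the sensor responds to oxygen partial pressure, so a sea-level calibration scales the reading by the ratio of ambient to calibration pressure. Pressure at altitude is taken from the International Standard Atmosphere formula; all numbers are illustrative.

P0_HPA = 1013.25  # sea-level standard pressure (hPa)

def pressure_hpa(altitude_m):
    """International Standard Atmosphere pressure, valid up to ~11 km."""
    return P0_HPA * (1.0 - 2.25577e-5 * altitude_m) ** 5.25588

def apparent_fio2(true_fio2, altitude_m):
    """Reading of a sensor calibrated at sea level but used at altitude."""
    return true_fio2 * pressure_hpa(altitude_m) / P0_HPA

for alt in (1500, 2000, 3000):
    print(f"{alt} m: delivered 0.50 reads as {apparent_fio2(0.50, alt):.2f}")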
Conclusions Both teams evacuated red codes before yellow ones in similar times. T-AMP was shorter for global, yellow and green codes in the trained (T) group as opposed to the non-trained (NT) group. Global and green LOS was also shorter in the T group. Training seems to influence global exercise management, affecting red codes less but with an impact on yellow and green evacuation strategies. Introduction Exacerbation of asthma can occur during air transport, and severe patients not responding to conventional therapy can require ventilatory support. We evaluated the performance of two transport ventilators built with turbine technology, the T-bird VSO2 and the LTV 1000, for use during aeromedical evacuation of acute severe asthma. We assessed the ability of both ventilators to deliver a set tidal volume (Vt) to an acute severe asthma model at different simulated altitudes, by changing the ambient air pressure. Methods We used a decompression chamber to mimic the hypobaric environment at simulated cabin altitudes of 1,500, 2,500 and 3,000 meters (4,000, 6,670 and 8,000 feet). The ventilators were tested with realistic parameters: Vt was set at 700 ml and 400 ml in an acute severe asthma lung model. The protocol included three measurements at each simulated altitude. Comparisons of preset with actually measured values were made using a t test for each altitude. Results Figure 1 summarizes the data. With altitude, the T-bird VSO2 showed a decrease in delivered volume. Comparisons of the actually delivered and set Vt demonstrated a significant difference starting at 1,500 m for a Vt set at 700 ml and at 2,500 m for a Vt set at 400 ml, with a negative variation of more than 10% from the set tidal volume at 3,000 and 2,500 m, respectively. With decreasing barometric pressure, the LTV 1000 mostly showed an increase in delivered volume. Comparisons of the actually delivered and set Vt demonstrated a significant difference at 1,500 m for a Vt set at 700 ml and at 2,500 m for a Vt set at 400 ml; the delivered tidal volume remained within 10% of the set Vt. Conclusions The T-bird VSO2 delivered progressively lower volumes as barometric pressure decreased, whereas the LTV 1000 showed a moderate increase in delivered volume for the acute severe asthma lung model with increasing altitude, but maintained the delivered volume within 10% of the set Vt up to 3,000 m. Methods We tested inter-rater and intra-rater agreement, respectively. For inter-rater agreement, nine triage nurses sorted the same 100 patients; the assigned classifications were compared with the reference assignment and analyzed with Cohen's kappa coefficient. Intra-rater agreement was tested by asking 30 triage nurses to sort the same 10 patients at two different times (T1, early morning; T2, late night). We compared the classifications at T1 and T2, with the null hypothesis being no difference, and determined whether differences were nurse dependent. Results Statistical analysis of intra-rater agreement revealed that the mean difference between classification at T1 and at T2 did not significantly depend on the nurse (P = 0.3487). This result allowed pooling of the data and testing whether the mean differed from zero. Using the sign test, we found that classification at T1 did not differ from classification at T2 (P = 0.581). For inter-rater agreement, Cohen's kappa coefficient revealed an almost perfect agreement between the nurses' classifications and the reference.
Conclusions The need for a specific triage tool in our emergency department led us to develop a new French-language triage scale. The present study demonstrates that this scale is a valid triage tool with very good inter-rater and intra-rater agreement. These results now allow a future study to evaluate its efficiency. Introduction In 1996, a UK audit suggested suboptimal involvement of consultants in acute medical care [1]. Acute medicine has since evolved as a specialty to improve medical inpatient care in the first 48 hours [2]. Many hospitals now incorporate an acute short-stay unit through which up to 70% of medical admissions can be directed [3]; these units reduce the length of hospital inpatient stay [4]. Methods We conducted a review of the case notes of 100 consecutive medical admissions to the Emergency Short Stay ward (ESS). Data collected included diagnoses on admission and discharge, length of stay on the ESS and discharge destination. On reviewing each initial medical clerking, an assessment was made of the suitability of the admission for the ESS, standardized against local criteria. Results Data on 100 patients (age 17 to 89) were collected. Eighty-nine per cent were admitted from the medical admissions unit, most commonly … Introduction We analysed triage reliability among nursing students at an Italian university before and after a course on a new Triage Emergency Method, TEM v2, which showed good reliability in a previous study [1]. Few studies have compared triage reliability in nursing students and, to our knowledge, there are no Italian studies on this topic. Methods This is an observational study conducted at the University of Parma using a database of triage scenarios used in previous studies [1,2]. Fifty students attending the third year of a nursing course were selected to assign a triage level to 105 paper scenarios, first without triage protocols (before the course) and then with TEM v2 (1 month after a 2-hour course on triage and TEM v2). To prevent communication between participants, they assigned triage codes in different rooms and in the presence of two investigators. The triage scenarios were given to the participants in random order. We measured inter-rater reliability using weighted kappa statistics, complete disagreement (when nurses of the same group assigned triage codes to the same scenario that differed by more than two priority levels) and complete agreement (when all five nurses assigned the same triage code), before and after the course. Results Of the 105 patients included in the triage scenarios, 66 (63%) were women, the mean age was 43.7 years (SD ± 26.3), and 22 were under the age of 18 years. The most frequent presentation at triage was minor trauma (30%). There were 30 hospital admissions: 27 to non-intensive wards and three to ICUs. The mean age of the students was 24 ± 3 (SD). Few participants had attended triage training before (18%), and they declared scarce triage knowledge. Inter-rater reliability was k = 0.42 (95% CI: 0.37 to 0.46) before the course (without a triage protocol) and k = 0.61 (95% CI: 0.56 to 0.67) after the course (with TEM v2). Complete disagreement occurred in 98% of the scenarios evaluated before the course and in 64% after it. Complete agreement was always zero. Conclusions Our data suggest that TEM v2 improves triage reliability among nursing students; it seems to be easy to understand and to use.
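The agreement statistic used in this and the preceding triage study is Cohen's kappa. The weighting scheme (linear versus quadratic) is not stated, so the choice of linear weights below is an assumption, and the paired ratings are invented.

from sklearn.metrics import cohen_kappa_score

rater_a = [1, 2, 2, 3, 4, 1, 5, 3, 2, 4]  # hypothetical ordinal triage codes
rater_b = [1, 2, 3, 3, 4, 2, 4, 3, 2, 5]

kappa = cohen_kappa_score(rater_a, rater_b, weights="linear")
print(f"weighted kappa = {kappa:.2f}")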
International guidelines for trauma care still recommend the traditional advanced trauma life support investigations for the primary survey: plain radiographs of the chest, cervical spine and pelvis with FAST ultrasound. However, an increasing amount of data suggests that plain X-rays have excessively low sensitivity. Cervical X-ray may miss up to 35% of spine injuries, chest X-ray may miss as many as 50% of traumatic pneumothoraces, and a significant percentage of pelvic fractures may not be detected on the pelvic X-ray. CT scanning and extended emergency ultrasound (EUs) have a much higher sensitivity than plain X-rays. Moreover, EUs is even less time consuming. Therefore, since 2004 we have adopted EUs and/or total body CT scan as the first-line diagnostic tools for major trauma (MT) victims. In a benchmarking analysis of Italian trauma centers (TCs), our hospital ranked first for patient mortality and long-term disability [1]. These results were associated with the shortest diagnostic time. The aim of this study is to assess whether these results were associated with a real change in the diagnostic process. Methods In order to evaluate the diagnostic approach to MT, we retrospectively analyzed all MT cases (ISS >15) admitted to our TC over a 2-year time span. All investigations performed in the emergency department (ED) are entered electronically just before being performed. Data for all investigations requested within 2 hours of admission were analyzed. Results From 1 November 2007 to 31 October 2009, 743 MT patients were admitted to the ED. EUs was performed in 515 patients (69%) and total body CT scan in 679 (91%). Patients who did not undergo CT either died soon after admission or were rushed to the operating room on the basis of the EUs results. Thirty-eight patients (5.1%) had a chest X-ray taken in the ED, 11 (1.5%) a pelvic X-ray and only three a cervical spine X-ray. Conclusions Although many international guidelines for trauma care still recommend traditional plain radiographic investigations, our TC, like many other European institutions with a high volume of trauma patients, has adopted a different strategy based on the extended use of EUs and CT scanning. This may improve diagnostic accuracy in stabilized patients and reduce time to the operating room in highly unstable ones. We suggest that a change in the recommended guidelines should be considered. Results Of these patients, 317 blunt trauma patients (mean age, 43.9 ± 23.8 (SD) years) were examined by field FAST and enrolled in the study; all participants were also examined by FAST in the ED. The mean Injury Severity Score was 16.1 ± 15.4 (SD). Forty-seven patients (14.8%) were ultimately diagnosed with pericardial or intraperitoneal fluid. Field FAST detected 12 of these cases. The sensitivity of field FAST was significantly lower than that of FAST in the ED (P <0.01), whereas the specificities of FAST in the two settings were not significantly different. Emergency surgery was performed in nine of the 12 patients who were positive on field FAST. Introduction The aim of the investigation was to analyze the diagnostic and prognostic significance of serum proapoptotic and antiapoptotic markers in patients with severe trauma injury. Methods Thirty-three patients with severe trauma injury were enrolled in our investigation (we excluded patients with severe brain injury; the mean age was 39 ± 8, initial TRISS scale 7.8 ± 1.8, APACHE II 14 ± 5).
All of the patients demonstrated signs of a systemic inflammatory response; in 12 patients infectious pulmonary complications were revealed, while in 21 patients no infectious complications developed. Venous blood EDTA serum samples were collected daily for a week and stored at -25°C. We investigated antiapoptotic markers (soluble Fas-L, soluble APO-1/Fas, Cu/Zn superoxide dismutase), proapoptotic markers (protein p53, protein Bcl-2) and high-sensitivity CRP (ELISA, Bender Medsystem). We observed an increase in proapoptotic markers in patients with infectious complications from the first day in the ICU, while antiapoptotic markers (Cu/Zn superoxide dismutase) were lower with regard to reference points. The patients demonstrated a maximum of Fas-L, protein p53 and protein Bcl-2 (near 50% relative to the first day) on the second day. In patients without infectious complications we identified a decreased level of proapoptotic markers and an increased level of antiapoptotic markers (SOD, soluble Fas-L, soluble APO-1/Fas). Introduction Inadequate or delayed imaging of the unconscious patient with traumatic brain injury or of the unconscious polytrauma patient may lead to catastrophic consequences. The need for comprehensive imaging must be balanced against the patient's condition, the resources available and excessive radiation exposure. We have introduced regional evidence-based guidelines in the Southwest region of the UK. The aim of these guidelines was to standardise practice, minimise delayed or missed diagnosis of serious injuries, facilitate treatment of associated injuries (such as head injuries) and obviate the need for repeated imaging. Methods The notes of all unconscious polytrauma patients transferred from the emergency department or other hospitals, and of all patients transferred to our institution for ongoing neurosurgical care, were retrospectively reviewed over a 1-year period. Adherence to the imaging protocol was assessed and the need for any further radiological investigations within 48 hours was also documented. Results A total of 46 patients were identified who fulfilled the criteria for the introduced guidelines. Of these patients, 21% were transferred from other hospitals while the remainder were admitted from the onsite emergency department. Imaging adhered to the protocol in 83% of all eligible patients. Two patients (4%) required further radiological investigation in the 48 hours following admission. Both had isolated head injury and required further imaging to exclude cervical injuries following inadequate imaging that did not follow the protocol. One further patient (2%) required repeated imaging to exclude mesenteric injury. Conclusions Regional adoption of imaging guidelines aimed at obtaining early comprehensive imaging of both head-injured and polytrauma patients results in high compliance and a low rate of re-imaging. Introduction Blunt cerebrovascular injuries (BCVI) are more common than previously reported and, if not promptly recognized and treated, may have devastating sequelae. When these injuries are diagnosed before the onset of stroke and patients receive early antiplatelet or anticoagulation therapy, a substantial reduction in BCVI-related neurologic events has been demonstrated [1]. Methods The study setting was a level 1 trauma center with a catchment population of more than 2 million people, admitting 350 major traumas yearly.
To assess the incidence of asymptomatic BCVI in severely injured patients, we planned a prospective cohort study. According to the study design, all severely injured patients presenting with at least one of the following criteria [2] underwent screening 16-channel CT angiography within 24 hours of admission: diffuse axonal injury; fracture of the cervical vertebrae or of the skull base; Le Fort II or III or other severe facial fractures. Results During the first 5 months of the study, 24 major trauma patients (ISS >15) with at least one of the above listed risk factors underwent screening CT angiography for BCVI. All of the patients were sedated and artificially ventilated. None of them was symptomatic for stroke. Three patients (12.5%) in this high-risk group had an asymptomatic BCVI: pseudoaneurysm (one), traumatic stenosis (one) or dissection (one) of the carotid artery. They were immediately treated with antiplatelet therapy (clopidogrel + aspirin). They experienced no episodes of cerebral infarction and no cerebral haemorrhage. Conclusions There are limited data in the literature on traumatic BCVI. The available data, as well as the preliminary results of our prospective study, show that BCVI are more common than previously recognized. Aggressive screening and earlier therapy may significantly reduce complications and improve patient outcomes. Introduction The aim of this study was to determine the efficacy of 8.4% sodium bicarbonate for intracranial pressure (ICP) reduction in adult patients with severe traumatic brain injury (TBI) and intracranial hypertension. Methods The study examined 10 episodes of ICP >20 mmHg in seven patients with severe TBI. Eighty-five milliliters of 8.4% sodium bicarbonate was infused over 30 minutes when osmotherapy was indicated after standard care. ICP, mean arterial pressure (MAP) and cerebral perfusion pressure (CPP) were recorded at baseline and then continuously for 6 hours. Serum pH, pCO2, Na+ and Cl- were measured at baseline, 30 minutes, 60 minutes and then hourly for 6 hours. Serum osmolality was measured at baseline and at 6 hours in three patients. All other care followed the institutional protocol for the management of raised ICP. Results At the completion of the infusion, the mean ICP was reduced to 36.2% of baseline, from 28.5 mmHg (±2.62) to 10.33 mmHg (±1.89). Mean ICP remained below 20 mmHg for 6 hours (Figure 1). CPP increased after the infusion owing to the effect on ICP. MAP did not change. Mean pH was elevated at t = 30 minutes (from 7.45 ± 0.02 to 7.50 ± 0.02) and remained elevated for the duration of the study period. Serum Na+ increased (from 145.4 ± 1.9 mmol/l to 147.1 ± 1.9 mmol/l) at 30 minutes. pCO2 did not change. Osmolality was elevated. Conclusions Eighty-five milliliters of 8.4% sodium bicarbonate infused over 30 minutes is effective at reducing sustained ICP >20 mmHg to within accepted treatment targets (ICP <20 mmHg) for at least 6 hours. Introduction Water-electrolyte imbalance and endocrine disorders complicate the management of patients with severe traumatic brain injury (TBI). A plasma sodium level ≥160 mmol/l is associated with 75% mortality. The purpose of this investigation was to determine the relationship between hypernatraemia and the rate of unfavorable outcomes in children with TBI.
Methods A total of 77 children <18 years of age with TBI (admission GCS score <8) were divided retrospectively into three groups: Group A included children without hypernatraemia (n = 51), Group B children with hypernatraemia (n = 14) and Group C children with hypernatraemia and polyuria (n = 12). Group C was considered the group of patients with central diabetes insipidus (CDI). Hypernatraemia was defined as a plasma sodium level above 149 mmol/l on two measurements within 24 hours, while polyuria was defined as an hourly diuresis of more than 3 ml/kg/hour lasting at least 6 hours. The mean sodium level at admission was 140.1 ± 4.1 mmol/l. Hypernatraemia was detected in 26 patients (33.8%). The mean duration of the period of hypernatraemia in Group B was 4 days (3 to 6 days), while the mean sodium level during the period of hypernatraemia was 158.3 ± 3.3 mmol/l (max 176.8 mmol/l). The duration of the period of hypernatraemia in Group C was 4.5 days, with a maximum of 181.1 mmol/l and an average of 161 ± 4.7 mmol/l. Polyuria was diagnosed in 15.5% of the cases. The highest diuresis in this group was 4.1 ml/kg/hour, mean 3.7 ± 0.5 ml/kg/hour. Such changes were considered a manifestation of CDI. All 12 patients in Group C received desmopressin (DDAVP) for more than 48 hours (mean 56.8 ± 4.5 hours). The doses were 0.025 to 0.2 mg/day. In four of the 14 children in Group B (29%), an increase in hourly diuresis to 3 ml/kg/hour was considered the onset of CDI; thus, they were also prescribed DDAVP. Unfavorable outcomes (GOS score 1 to 3) at the 30-day assessment were observed only in Groups B and C. In a comparison of unfavorable outcomes between Groups B and C, the unfavorable outcome rate was higher in Group C (hypernatraemia and polyuria), with 10 children (84%), than in Group B, with four children (28%). The risk factor in the comparison between patients of Groups B and C was 0.3, P <0.05. Conclusions Our results demonstrate that hypernatraemia increases the rate of unfavorable outcomes in children with TBI. Thirty-day outcomes were worse in CDI patients. Presumably, the use of DDAVP prevents dehydration and the progression of CDI. Introduction Secondary ischemic insult after severe traumatic brain injury (TBI) is correlated with poor outcome. Transcranial Doppler sonography (TCD) permits a non-invasive measurement of cerebral blood flow. The purpose of this study is to determine the usefulness of TCD in patients with severe TBI. Methods TCD was performed on 73 patients with severe TBI, defined as a Glasgow Coma Scale score of 8 or less on admission. All patients were on mechanical ventilation. TCD was performed on hospital days 1, 2, 3 and 7. Hypoperfusion was defined by having two of the following three findings: mean velocity of the middle cerebral artery less than 35 cm/second, diastolic velocity of the middle cerebral artery less than 20 cm/second, and a pulsatility index greater than 1.4. Vasospasm was defined by the following: mean velocity of the middle cerebral artery greater than 120 cm/second and/or a Lindegaard index greater than 3. Results Thirty-four patients (47%) had normal measurements. Thirteen were discharged home, 16 were discharged to a long-term care facility and five died. Two of these patients were comatose and their families requested withdrawal of care. The other three died from brain death. Eighteen patients (25%) had hypoperfusion and all 18 progressed to brain death. Twenty-one patients (29%) had vasospasm.
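The hypoperfusion and vasospasm definitions above translate directly into a decision rule; a minimal sketch, in which the function name, argument names and example values are ours rather than the authors':

```python
# Sketch of the TCD criteria stated in the Methods above.
def classify_tcd(mca_mean, mca_diastolic, pulsatility_index, lindegaard_index):
    """Classify one TCD examination as hypoperfusion, vasospasm or normal."""
    # Hypoperfusion: two of three of mean MCA velocity < 35 cm/s,
    # diastolic MCA velocity < 20 cm/s, pulsatility index > 1.4.
    votes = sum([mca_mean < 35, mca_diastolic < 20, pulsatility_index > 1.4])
    if votes >= 2:
        return "hypoperfusion"
    # Vasospasm: mean MCA velocity > 120 cm/s and/or Lindegaard index > 3.
    if mca_mean > 120 or lindegaard_index > 3:
        return "vasospasm"
    return "normal"

print(classify_tcd(30, 15, 1.6, 2.0))   # hypoperfusion (invented values)
print(classify_tcd(130, 60, 0.9, 3.5))  # vasospasm (invented values)
```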
Four of these patients were discharged home, 11 to a long-term care facility and six died. The vasospasm was detected on hospital day 1 in three patients, hospital day 2 in seven patients, hospital day 3 in four patients and hospital day 7 in seven patients. Nimodipine was administered in six patients and all six were discharged to a long-term care facility. However, in one patient nimodipine caused hemodynamic instability and was discontinued. In 15 patients nimodipine was not given; six of these patients died from brain death. Twelve of 21 patients (57%) with subarachnoid hemorrhage on computed tomography had vasospasm. Conclusions Most patients with normal measurements can be expected to survive. Patients with hypoperfusion have a poor prognosis. In patients with vasospasm, the use of nimodipine should be considered; however, further studies are needed to determine safety and efficacy. TCD may be useful in determining early prognosis. Further studies are also needed to determine whether TCD can improve outcome in patients with severe TBI. ... is not yet confirmed. We suppose that HBO could make oxygen stay longer in the injured tissue, because of the poor circulation or perfusion of blood in the injured area. This mechanism may help to explain the prolonged effect of HBO after treatment. That is the purpose of our study. Methods Patients suffering from sub-acute traumatic brain injury who were to be treated with HBO were enrolled in the study. They were divided into two groups based on their imaging findings (CT or MRI): a frontal lobe lesion group and a nonfrontal lobe lesion group (n = 20). The regional cerebral oxygen saturation (rSO2) was measured by a Somanetics INVOS 5100 monitor before and after 90 minutes of treatment with HBO (15 minutes compression, 40 minutes breathing oxygen at FiO2 99% by mask, 10 minutes rest, 15 minutes decompression). Meanwhile, arterial blood was sampled for blood gas analysis and cytokine measurement. Results The parameter rSO2 showed no significant difference following HBO treatment in the two groups (t = 0.352, P >0.05); however, there were unexpected results in the blood gas analysis. Partial pressure of oxygen (PaO2) was significantly decreased after HBO treatment (P <0.05), although these changes had no effect on the clinical manifestation. The measurement of cytokines (TNF, IL-1) before and after HBO showed no difference (P >0.05) in any group. Conclusions HBO has no effect on brain oxygen saturation after HBO treatment, but PaO2 was significantly decreased following HBO. The mechanism needs to be studied further. Introduction This study evaluated relationships between CSF levels of the brain biomarkers glial fibrillary acidic protein (GFAP), ubiquitin C-terminal hydrolase (UCH-L1) and αII-spectrin breakdown product (SBDP145), partial pressure of brain tissue oxygen (ptiO2) and brain temperature (Licox system) during the first 24 hours and for up to 10 days following severe TBI. Methods We studied 27 severe TBI patients having CSF drainage and invasive monitoring of partial brain tissue oxygen tension (PbtO2) and brain temperature using the Licox probe (Integra Neurosciences, Plainsboro, NJ, USA). CSF SBDP145, UCH-L1 and GFAP levels were measured by quantitative ELISA on admission and every 6 hours thereafter for a maximum of 10 days. Using a double-lumen bolt, ptiO2 and temperature were measured with the Licox.
This study focused on the recordings of the first 24 hours following injury (27 patients), as well as preliminary data from four patients followed for 10 days. The total duration of monitoring was 1,512 hours. During the first 24 hours, biomarker levels decreased while levels of PbrO2 increased. All three biomarkers correlated with PbrO2 (P <0.0001, P = 0.016 and P = 0.023, respectively). After the first 24 hours, there were statistically significant changes in levels of the brain biomarkers (SBDP145, UCH-L1 and GFAP) as well as in levels of ptiO2 (P = 0.025, P <0.0001, P = 0.033 and P <0.0001, respectively). However, the correlation between biomarkers and brain tissue oxygenation was sustained, and for UCH-L1 improved (P <0.0001). No significant correlations between biomarker levels and brain temperature were found. There were no complications from the monitoring. Conclusions Our findings show that CSF levels of SBDP145, UCH-L1 and GFAP are related to brain tissue oxygenation in the acute and possibly the subacute (≤10 days post injury) phases of severe TBI. Future studies will more directly address relationships between changes in tissue oxygenation and biochemical markers of injury following severe TBI. CSF levels of biomarkers and brain tissue oxygenation could yield insights into pathophysiological events following severe TBI and aid in clinical assessments of severe TBI patients. Introduction After traumatic brain injury (TBI), structural lesions are heterogeneous, but the spatial heterogeneity of the consequences of insults such as hypoxia-hypotension (HH) and/or TBI has never been studied. The objective of this study was to compare the effect of standardized insults (HH, TBI and both) on brain energy metabolism in two different regions: frontal cortex and thalamus. Methods Twenty-eight Sprague-Dawley rats were randomized into four groups: Sham, TBI (impact acceleration alone, 450 g weight drop from 1.8 m), HH (blood depletion to a mean arterial pressure of 40 mmHg, FiO2 10%, 15 minutes) and TBI-HH (TBI followed by HH after a 45-minute delay). Cerebral perfusion pressure (CPP) was continuously and invasively measured. Brain microdialysis and PtiO2 probes were both inserted stereotaxically into the right thalamus and frontal cortex. Results Except during the HH phase, CPP was always greater than 60 mmHg. During the hour following the HH period, a significant increase in the cerebral lactate/pyruvate ratio (Figure 1), glycerol and glutamate was observed. This increase was higher in the cortex than in the thalamus in all groups subjected to HH (P <0.001). In the TBI-HH group, the increase in glycerol in the cortex was significantly higher compared with the HH group (P <0.001), as were thalamic and cortical glutamate. During the 15 minutes following the HH phase (after reinjection and reoxygenation), an increase in PtiO2 was observed in the cortex and thalamus, but with different profiles (a lower increase in the cortex) (Figure 1). Conclusions Different profiles of cerebral response to HH and TBI were observed, with higher sensitivity in the cortex than in the thalamus. Post-ischemic hyperemia seems to be altered in the traumatized cortex but conserved in the thalamus and the nontraumatized brain. Introduction Decompressive craniectomy is indicated for the treatment of severe intracranial hypertension. However, this procedure is invasive and potentially associated with complications. We present the preliminary results of a study comparing early and late decompressive craniectomy in severe traumatic brain injury.
Methods Patients studied were all admitted to the ICU of a tertiary referral center (Careggi Teaching Hospital, Florence, Italy) over 4 years (2005 to 2009). In total, data from 62 brain-injured patients who underwent decompressive craniectomy were retrospectively examined and assigned to two groups based on the timing of decompressive craniectomy: early decompressive craniectomy (performed within 24 hours after brain injury; group A, n = 41) and late decompressive craniectomy (later than 24 hours; group B, n = 21). For all patients, demographics, scores, clinical data, length of stay and final outcome were collected from the institutional database. Traumatic lesions were compared at admission using the Marshall score and 24 hours after decompressive craniectomy with CT scan. The Glasgow Outcome Scale (GOS) at 6 months was also collected. Results Demographic and clinical characteristics of the groups are shown in Figure 1 (data expressed as mean ± SD). Patients who underwent decompressive craniectomy within 24 hours after injury (group A) had a significantly worse Marshall score than group B (3.1 ± 0.7 vs 2.4 ± 0.8, respectively; P <0.05), and also showed significantly more frequent enlargement of contusions compared with group B (52.7% vs 16.6%; P <0.01). ICU/hospital length of stay and mortality were not significantly different between the groups. The GOS evaluated at 6 months showed a good recovery of surviving patients in both groups (3.7 ± 1.0 in group A and 3.2 ± 0.9 in group B). Conclusions Our data, limited by the retrospective nature of the study, do not encourage early decompressive treatment of severe intracranial hypertension. Decompressive craniectomy should be considered in case of lack of response to a medical, even intensive, approach. Introduction The purpose of this study is to verify the reliability of optic nerve ultrasound (ONU) for detecting intracranial hypertension (IH). IH is a frequent and potentially fatal complication of severe head injury. At present, intraventricular catheters represent the gold standard for ICP measurement. Unfortunately, they present serious complications and relative contraindications such as thrombocytopenia or coagulopathy. Computed tomography and transcranial Doppler (TCD) represent non-invasive methods to evaluate ICP, but they have drawbacks. ONU has recently been suggested as a non-invasive tool to diagnose IH. Its reliability has been shown in patients with severe head injury or with intracranial hemorrhage [1,2]. Methods The study was conducted on 10 patients admitted to the neurointensive care unit for moderate/severe head injury in the period January to June 2009 (Group 1). Ten healthy subjects were enrolled as the control group (Group 2). Estimated ICP (eICP) was calculated with TCD using the equation proposed by Czosnyka and colleagues [3]. It was measured twice a day for 3 consecutive days. In the same sonographic session, the optic nerve sheath diameter (ONSD) was measured in the sagittal and transverse planes 3 mm behind the papilla, in both eyes. Results No significant differences were found between the groups regarding age and sex. The ONSD distribution is shown in Figure 1. ONSD, eICP and ICP values were significantly higher in Group 1 than in Group 2. Linear regression analysis identified a significant relationship between ONSD and ICP (r = 0.588).
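The abstract cites the Czosnyka estimator without reproducing it. One widely cited form derives a non-invasive cerebral perfusion pressure from the mean arterial pressure and the diastolic-to-mean flow velocity ratio and takes eICP as the difference; that this exact variant was used here is our assumption. A minimal sketch:

```python
# Hedged sketch of a published TCD-based ICP estimate:
# nCPP = MAP * FVd / FVm + 14; eICP = MAP - nCPP.
def estimated_icp(map_mmhg, fv_diastolic, fv_mean):
    ncpp = map_mmhg * fv_diastolic / fv_mean + 14.0  # non-invasive CPP, mmHg
    return map_mmhg - ncpp                           # estimated ICP, mmHg

print(estimated_icp(90.0, 25.0, 55.0))  # illustrative values only
```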
By calculating the receiver operating characteristic curve, an ONSD value of 5.35 mm emerged as the optimal cut-off point, with a sensitivity of 95.1% and a specificity of 96.2%. Conclusions ONSD measurements correlated with invasive and non-invasive (TCD) measurements of ICP. ONU is a useful, non-invasive bedside tool to diagnose IH. It is safe, easy to perform and can rapidly give reproducible information on a patient's ICP. Introduction Traumatic brain injury (TBI) is a multiphasic disease. Following the initial impact, secondary injury, including oxidative stress and inflammation, is thought to contribute significantly to death and disability. This ongoing damage in the penumbra of the brain leads to the demise of neuronal populations and ultimately to decreased brain function. Identification of neural markers such as neuron-specific enolase (NSE) at various time points following TBI may help us better understand the magnitude of secondary injury following TBI, and may help predict outcome in these patients. Methods Early serial cerebrospinal fluid (CSF) samples were collected from patients with severe TBI (GCS 3 to 8) who required placement of a therapeutic ventriculostomy. Following ventriculostomy placement, CSF samples were obtained every 4 hours for the first 24 hours post injury, then every 8 hours for post-injury days 2 to 5. The levels of NSE were then measured by ELISA. Results Following sample analysis in 11 patients, we found that in TBI patients who had a good neurological outcome, the levels of NSE in the CSF rose early (between 25 and 60 ng/ml), within 12 to 16 hours following the initial injury. The NSE levels then rapidly decreased to normal levels (~5 ng/ml) at approximately 20 hours following injury, with these levels persisting until day 5 (the day of final sample collection). Patients with a poor outcome (inability to return to pre-injury activities, or death) showed significantly higher levels of NSE persisting out to 5 days post injury, with late levels ranging from 35 to 50 ng/ml. Conclusions In TBI patients with a good outcome, there was an increase in NSE levels in the CSF at early time points (~16 hours), which abated at approximately 20 hours after injury. In TBI patients with a poor clinical outcome, CSF levels of NSE were significantly elevated at later time points over the first 5 days post injury. Introduction The aim of the study was to determine whether serum levels of the biomarkers hyperphosphorylated neurofilament NF-H, S100β protein and NSE correlate with severity of brain injury and outcome in children with traumatic brain injury (TBI) [1]. The Glasgow Outcome Scale (GOS) was calculated at 12 months by the patient's general practitioner [2]. Unfavourable outcome was defined as DDS 3 or 4, or GOS 1, 2 or 3. The correlation between DDS and GOS was measured, and the ability of DDS to predict GOS at 12 months in survivors was assessed using logistic regression. The sensitivity and specificity of the dichotomised DDS for prediction of unfavourable outcome at 12 months were also calculated. Results Data were available on 1,227 patients. A highly correlated linear relationship was evident between DDS and GOS (Spearman correlation coefficient 0.77, P <0.0001). Unfavourable outcome measured by DDS showed a sensitivity of 73% (95% CI 69 to 76%) and a specificity of 56% (53 to 59%) for prediction of unfavourable outcome at 12 months defined by GOS.
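Both the ONSD cut-off reported above and the dichotomised DDS are judged by sensitivity and specificity, and ROC-based cut-off selection commonly maximises Youden's index (sensitivity + specificity - 1). A minimal sketch on synthetic data; nothing here reproduces either study's dataset:

```python
# Illustrative ROC cut-off selection via Youden's index.
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(0)
outcome = rng.integers(0, 2, 200)             # 1 = unfavourable outcome
score = outcome + rng.normal(0, 1.2, 200)     # continuous predictor

fpr, tpr, thresholds = roc_curve(outcome, score)
best = np.argmax(tpr - fpr)                   # maximise Youden's J
print(f"cut-off {thresholds[best]:.2f}: "
      f"sensitivity {tpr[best]:.2f}, specificity {1 - fpr[best]:.2f}")
```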
DDS showed a strong association with GOS at 12 months for survivors at discharge, with fully dependent patients having 15 times higher odds of an unfavourable outcome. Adjusting for known TBI prognostic factors attenuated the strength of the association; however, DDS remained a statistically significant, strong predictor of outcome. Conclusions DDS is strongly associated with 12-month GOS and could potentially be used to replace missing data or provide a surrogate outcome measure in TBI trials. A valid prediction score may also aid the clinician's ability to discuss patient prognosis at hospital discharge. ... to BT-CPA and PT-CPA treated with our strategy including emergency department thoracotomy (EDT). Methods This study is a population-based case series observational study. We have taken three approaches to these patients: our own aggressive treatment strategy (resuscitation for 30 minutes, aggressive infusion using a sheath introducer in the subclavian vein, and EDT); an in-hospital system supporting this aggressive resuscitation (logistic issues such as the close location of the ED to the room for catheter intervention and CT, a direct entrance to the OR by a dedicated lift, and common instruments interchangeable between the ED and OR, including the bed); and the prehospital EMS in our city (CPA patients are transferred in about 7 minutes to the nearest of 11 selected hospitals that can receive and treat CPA patients). The SB is a bag-valve-mask resuscitator (BVM) equipped with a valve limiting flow rate and peak airway pressure to decrease hyperventilation and gastric insufflation. We studied this device used by hospital staff without training, to evaluate the possibility of changing from the standard BVM to the SB in hospital wards where positive pressure ventilation is rarely needed and training is sporadic. Methods The participants (12 physicians, 38 nurses) were randomised to use the standard BVM or the SB to ventilate a resuscitation training manikin over 1 minute. The mechanism of the SB was briefly described to the SB group, but no hands-on training was provided. Participants were asked to ventilate the patients as they would if the patient had a pulse but was not breathing. Tidal volumes were registered on a computer connected to the manikin. Results The medians of minute ventilation were 6.1 (interquartile range 2.6 to 8.1) and 3.9 (IQR 1.7 to 5.4) litres per minute in the SB and control groups, respectively (Figure 1). Hyperventilation >10 l/minute occurred only in the control group (P = 0.23). Conclusions Using the SB without previous hands-on training is possible for the majority of nursing and medical staff and decreases hyperventilation in comparison with the standard BVM. In the current study, therefore, we investigated the efficacy of ECC in the dental chair in comparison with ECC on the floor. Methods Two dentists and two nurses with experience of ECC participated in this study (age 30 ± 5 years, height 160 ± 5 cm, weight 65 ± 7 kg). Before the study, they were educated about CPR and performed ECC for 5 minutes on a resuscitation manikin in two different settings: on the floor and in the dental chair. On separate days, they repeated these ECC procedures on the floor and in the dental chair again. The depth of compression and the percentage of adequate compressions were evaluated. In addition, each participant commented on the preferred setting in a questionnaire after each set of ECC.
Results The four dental personnel each performed ECC five times on the floor and five times in the dental chair, and commented on the preferred setting five times. The efficacy of ECC was evaluated by the average depth and the percentage of compressions of adequate depth: 39.8 ± 8.2 mm and 46.8 ± 48.8% on the floor, and 34.4 ± 6.9 mm and 41.7 ± 42.7% in the dental chair. The percentage of compressions of adequate depth was higher in the floor setting than in the dental chair setting, although the difference did not reach statistical significance (P = 0.079). Of the 20 questionnaires, three preferred the dental chair setting, two reported no difference between the settings and 15 preferred the floor setting. Conclusions ECC on the floor can be performed effectively and easily, and CPR can be started immediately after moving the patient onto the floor. Introduction Today, emergency medical service (EMS) systems are developing in every country. However, it is not known whether people are willing to perform CPR and whether they are prepared to perform CPR under telephone CPR advice. In this study, we examined how interested Japanese citizens are in the importance of immediate CPR and defibrillation, and how well they understand it. Methods Patients with out-of-hospital cardiac arrest of nontraumatic etiology treated in our center over the past 2 years were enrolled. Cardiac arrest after the scene was excluded. Patients' records from our emergency department were reviewed. In Japan, the ambulance service, dispatch service and emergency life-saving technicians (ELSTs) belong to the fire department. ELSTs not only perform advanced CPR for CPA patients on the job but also educate the call takers in the central operation center, under medical control, about the importance of recognizing CPA and advising immediate bystander CPR. Results A total of 747 patients were enrolled. Telephone CPR advice was given to bystanders of 336 (45%) patients, and 304 of these bystanders actually performed CPR (90%). Five percent (40) of all 747 cases received voluntary bystander CPR before telephone advice, 40% received bystander CPR following telephone advice, and 4% (32) did not receive CPR despite telephone CPR advice. Of the 344 patients with bystander CPR, 33% reached ROSC, 11% survived more than 24 hours and 3% were discharged; of the 391 patients without bystander CPR, the figures were 30%, 11% and 6%, respectively. Restricted to the 302 witnessed patients: of the 125 with bystander CPR, 52% reached ROSC, 16% survived more than 24 hours and 3% were discharged; of the 177 without bystander CPR, 44% reached ROSC, 14% survived more than 24 hours and 1% were discharged. Conclusions Most people are willing to perform CPR. Bystanders are prepared to perform CPR with telephone CPR advice to help them. However, bystander CPR is not always adequate, resulting in no distinct effect of CPR on the survival rate. We should educate citizens beforehand and guide bystanders with more proper and rapid advice by telephone. Introduction The purpose of the study was to analyze the effects of the type and nature of CPR on the prognosis of out-of-hospital cardiac arrests (OHCAs). Methods We analyzed 1,612 OHCAs, witnessed by citizens and handled by the dispatch system in Ishikawa, Japan, from 1 April 2003 to 31 March 2008. Bystander CPR was classified into four groups according to type (CC only or CC + MMV) and nature (on one's own initiative or with telephone-assisted instruction).
Results The presence of bystander CPR significantly improved the 1-month survival rate. However, there were no significant differences among the four groups of CPR. Multivariate logistic regression analysis identified three time factors, the collapse-to-call, call-to-first-CPR and call-to-patient-arrival intervals, as independent factors associated with 1-month survival. See Figures 1 and 2. Conclusions The significance of correctable time factors, rather than the type of CPR, should be considered in future guideline revisions. Conclusions As an alternative airway device recommended by the ERC, the LT may enable rapid and effective airway control. Additionally, use of the LT may allow a reduced no-flow time and a better outcome. The LT may be a good alternative airway device for providing and maintaining a patent airway during resuscitation. Introduction Besides the gold standard endotracheal tube, supraglottic airway devices are alternatives for emergency airway management [1]. The goal of the study was to identify airway devices that provide successful ventilation even 12 months after training on manikins. Conclusions One year after training, the time to successful ventilation for all devices was lower than 25 seconds. This is acceptable compared with the gold standard endotracheal tube, but given the rising gastric inflation rate and the high proportion of insufficient tidal volumes, shorter training intervals may be necessary. Introduction The purpose of the study was to clarify the influence of aging on attitudes toward the initial three links in the chain of survival. Methods We gave questionnaires to attendants of compulsory programs for basic life support (BLS) or driving technique at the beginning of courses in authorized driving schools. The questionnaires included the attendants' backgrounds. We studied their willingness in four hypothetical scenarios related to the initial three links: early emergency call, cardiopulmonary resuscitation (CPR) on one's own initiative, telephone-assisted chest compression and use of an AED. The respondents were divided into young (17 to 29 years, n = 6,122), middle-aged (29 to 59 years, n = 827) and older person (>59 years, n = 15,743) groups. Results There were significant differences in gender, occupation, residential area, experience of BLS training and knowledge of AED use between the three groups. The proportions of respondents willing to perform the desirable BLS actions were lowest in the older person group (Table 1). Multiple logistic regression analysis confirmed that aging is one of the independent factors relating to a negative attitude in all of the scenarios. Gender, occupation, residential area, experience of BLS training and knowledge of AED use were other independent factors relating to a negative attitude in some of the scenarios. Conclusions The aged population is more negative toward the chain of survival, although more are willing to follow telephone-assisted direction for chest compression. BLS training should be modified for this population to build confidence and awareness of the significance and benefit of an early call. Introduction The ECG tracings recorded during ventricular fibrillation (VF) using an automated external defibrillator (AED) contain useful information predictive of shock outcome. The focus is on VF waveform morphology. The amplitude and the spectral properties of VF may predict the likelihood of successful defibrillation [1-4].
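A minimal sketch of the two feature families named above, amplitude and spectral, computed from a synthetic segment; the authors' actual feature set and classifier are not reproduced here:

```python
# Amplitude and spectral features of a (synthetic) VF segment.
import numpy as np

def vf_features(ecg, fs):
    """ecg: 1-D signal in mV; fs: sampling rate in Hz."""
    mean_amplitude = np.mean(np.abs(ecg))          # amplitude feature
    power = np.abs(np.fft.rfft(ecg)) ** 2          # power spectrum
    freqs = np.fft.rfftfreq(len(ecg), d=1.0 / fs)
    # Median frequency: half the spectral power lies below it.
    median_freq = freqs[np.searchsorted(np.cumsum(power), power.sum() / 2)]
    return mean_amplitude, median_freq

fs = 250
t = np.arange(0, 4, 1.0 / fs)                      # 4 s stand-in segment
segment = 0.3 * np.sin(2 * np.pi * 5 * t) + 0.05 * np.random.randn(len(t))
print(vf_features(segment, fs))
```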
In almost all previous studies, the amplitude or the spectral properties of the ECG tracings have been used singly. However, these approaches have led to methods lacking sufficient predictive power. Methods Five hundred patients with out-of-hospital cardiac arrest on arrival in an emergency room were examined. The rhythm was identified as VF and confirmed by two trained investigators. ECG data were stored in modules in digitized form over a period of 20 minutes and were analyzed retrospectively. ECG traces containing CPR artefacts were removed by digital filtering. The times of collapse, dispatch, scene arrival, CPR and initial defibrillation were determined from dispatch records, recordings of arrest events, interviews with bystanders and hospital records. The preshock VF waveform morphology was studied and different parameters of the VF ECG signals were extracted. We then introduced a pattern classification machine that combines the amplitude and spectral features simultaneously. Results The pattern classification machine combining amplitude and spectral features of the VF ECG signals showed improved predictive power compared with other methods. Conclusions This technique could help to determine which patients should receive a shock first and which should receive a period of CPR prior to shock, thereby increasing the probability of survival. The potential impact of this research is high, pointing toward a new methodology able to increase the probability of survival after cardiac arrest. Introduction Recent studies report an increase of asystole and pulseless electrical activity (PEA) as the first monitored cardiac arrest rhythms after arrival of the Emergency Medical Services [1]. Asystolic patients presumably undergo ischaemia for a longer time and may benefit from treatment reducing hypoxic brain injury. Therapeutic hypothermia (TH) has expanded into prehospital care, to be initiated as soon as possible. Rapid cold crystalloid infusion is the most frequent method; however, severe haemodynamic instability is a contraindication. The aim of the study was to assess the adverse effects of prehospital volume expansion in patients with initial nonshockable rhythms when used in a setting with only restricted cardiovascular monitoring. Methods All patients who were deemed eligible for advanced cardiac life support (ACLS) were included as long as the arrest was witnessed and cardiopulmonary resuscitation (CPR) was initiated within 20 minutes of collapse. Patients were randomized to treatment or control groups. The trial was designed to determine the safety and effectiveness of early cooling initiated at the site of arrest. Survival and time to target temperature were documented. Results Data are presented as the mean ± SD or median (interquartile range (25%, 75%)). The mean age was 67.8 ± 14 years in the intervention group and 65.4 ± 13.9 years in the control group. On average, cooling therapy was started in 33 ± 12 minutes in the RhinoChill™ group and 170 ± 97 minutes in the control group. Temperatures at hospital admission were significantly lower in the RhinoChill™ group. Time to target tympanic temperature, reflecting brain temperature, was significantly shorter in the RhinoChill™ group (211 ± 124 minutes vs 424 ± 217 minutes; P <0.05). Adverse events occurred in 12 patients; none was related to the cooling therapy. In the intervention group five patients (20%) survived and three patients (12%) had a CPC of 1 to 2.
In the control group only four patients (12.5%) survived and one patient (3.1%) had a CPC of 1 to 2. Conclusions Using the intranasal cooling method, cooling was much faster and was started earlier in treated patients. Neurologically intact survival and discharge rates were higher in treated patients. Transnasal cooling for the induction of therapeutic hypothermia during prehospital resuscitation is feasible and highly effective in lowering brain temperature rapidly. The method offers the possibility of immediate introduction and realization of mild hypothermia in the field. ... [1]. We sought to investigate whether a hospital-wide approach to TH after CA would reduce delays and result in improved outcomes. Methods We conducted a retrospective analysis of all TH interventions from 2007 to 2008. Following this, we implemented a hospital-wide approach to TH and re-evaluated it in 2008 to 2009. The hospital-wide approach included an educational programme, TH guidelines in the resuscitation room of the ED, and a cooling pack stored in the ICU which could be taken immediately to the post-CA patient. Results CA-ROSC and CA-Hosp delays were similar between the groups. The hospital-wide approach significantly reduced the time to initiate TH and the time to achieve target temperature (see Table 1). The small sample size may have prohibited demonstration of outcome differences. ... [1]. The objectives of this study were: to demonstrate a difference in chest compression effectiveness with bed height, and provide a suggestion for an optimal and achievable bed height for effective chest compressions; and to demonstrate fatigue during chest compressions, and provide a suggestion for an upper time limit for effective chest compressions. Methods The exclusion criteria for this trial were no previous basic life support training in the past 4 years, or refusal of consent to participate. A modified Laerdal manikin was connected to a Dräger ventilator (to measure the intrathoracic pressures generated). The manikin was placed on a hospital trolley, and CPR was performed by candidates at three different bed heights: mid thigh, anterior superior iliac spine, and xiphisternal area. Chest compressions were continuous and asynchronous with ventilation, and were allowed to continue for 30 seconds before recordings were taken. Results One hundred and one subjects took part. The differences in intrathoracic pressures generated at the different bed heights were compared using one-way ANOVA for multiple groups and were statistically significant, with P <0.01 (Figure 1). We also found that the effectiveness of CPR decreased by 17% over a 2-minute period (Figure 2). Introduction Therapeutic hypothermia is defined as controlled cooling of body temperature for therapeutic purposes. At present, the use of mild-moderate hypothermia (32 to 34°C) in patients with neurological damage is increasing in the ICU. In this regard, recent studies suggest that it is able to improve the neurological outcome in patients with anoxic cerebral damage following sudden cardiac arrest (SCA) [1]. The aim of this study is to assess the role of therapeutic hypothermia on neurological outcome in patients who experienced SCA with ensuing return of spontaneous circulation (ROSC).
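The bed-height study above compares intrathoracic pressures across three groups with one-way ANOVA; a minimal sketch, with invented pressure values in place of the study's data:

```python
# Illustrative one-way ANOVA across three bed heights.
from scipy.stats import f_oneway

mid_thigh    = [22.1, 24.5, 23.0, 25.2, 21.8]   # pressures, invented values
iliac_spine  = [19.4, 20.1, 18.7, 21.0, 19.9]
xiphisternum = [15.2, 16.8, 14.9, 16.1, 15.5]

f_stat, p_value = f_oneway(mid_thigh, iliac_spine, xiphisternum)
print(f"F = {f_stat:.1f}, P = {p_value:.4f}")
```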
Methods Ninety adult patients, aged between 18 and 85, referred to our ICU after SCA due to cardiac disease with subsequent ROSC were randomly allocated to the following treatment groups: patients in group 1 were treated immediately after admission with therapeutic hypothermia plus standard treatment, while patients in group 2 received only standard treatment. All patients presented with a GCS of 3 at entry. Neurological outcome was assessed on discharge and after 6 months by means of the GOS scale (0 = dead, 1 = vegetative, 2 = severely disabled, 3 = moderately disabled, 4 = good recovery). We considered scores 0 to 1 as unfavourable outcome and scores 2 to 4 as favourable outcome. Introduction We conducted a retrospective study looking at infection rates in patients treated with therapeutic hypothermia (TH) following cardiac arrest who were admitted to the ICU of the Bristol Royal Infirmary since May 2007. TH is recommended in all patients presenting with persistent coma following cardiac arrest. One complication of TH is the risk of infection. Hypothermia suppresses the immune system by inhibiting the release of proinflammatory cytokines and by suppressing the chemotactic migration of leukocytes and phagocytosis [1]. In patients with traumatic brain injury, TH for more than 48 hours is associated with a higher risk of infection, but not if the period of hypothermia is less than 24 hours [2]. In patients following cardiac arrest, infection is common, reported in up to 73% of patients [3]. Methods Data were collected retrospectively, extracting ICU length of stay, whether a protected catheter (PC) specimen was taken, whether there were any positive cultures and whether the patient was treated with antibiotics. Infections were defined as >10^5 colony-forming units (cfu) grown from either PC specimens or peripheral blood cultures. Results We identified 82 patients treated with TH post cardiac arrest. Nineteen (29%) had proven infection on either PC or peripheral blood cultures. PC specimens were taken in 21 (25%) patients. Of the 21 patients who had PC specimens taken, 16 (76%) had proven infection. We also found that an increased ICU length of stay was associated with increased infection rates: 44% in patients with a length of stay of greater than 3 days and 55% in patients with a length of stay of greater than 4 days. Conclusions We have shown, in our ICU, that 29% of patients treated with TH following cardiac arrest had a proven infection. On PC sampling, there was a much higher rate of infection, with 76% of patients having positive cultures. This suggests that the risk of infection in patients treated with therapeutic hypothermia post cardiac arrest is higher than that for patients who are cooled post traumatic brain injury. Introduction In this presentation we review the process used for developing a therapeutic hypothermia program and our outcomes during the first year. Despite the fact that therapeutic hypothermia has been shown to improve survival and neurologic outcome after cardiac arrest [1], it is underused [2] and has been only slowly introduced in many institutions. One of the reasons that it is not more widely used may be the difficulty of implementing a program. Methods A multidisciplinary team was assembled in 2007 to explore the opportunity for therapeutic hypothermia in patients after cardiac arrest. Pertinent literature was reviewed, and a guideline and electronic orders were drafted.
The team investigated all available options for cooling and chose a method that was non-invasive and did not require a large financial investment by the institution. Nurses were sent to an international program to learn about hypothermia and to become advocates and educators. Education was provided for caregivers in the prehospital setting, emergency department, cardiac catheterization laboratory and adult ICUs. Advertising to the community and other hospitals was carried out through newspaper, radio and television. The team continued to meet regularly to assess and modify the program based on experience. Results Prior to the start of this program, therapeutic hypothermia was essentially not utilized at the institution. Twenty-one patients were cooled after cardiac arrest and resuscitation at our institution in the first year (2008), with few missed opportunities. Fifteen (71%) survived and 12 (57%) had a good neurologic outcome. (Post-cardiac arrest survival rates without hypothermia in the literature range from less than 5% to 35% [1].) Challenges included altering historic practice patterns, integrating sectors of the healthcare delivery system that had historically drifted apart, and interacting with extra-institutional regulations governing prehospital care. Conclusions The therapeutic hypothermia program at our institution has been successful in improving patient outcome after cardiac arrest. The elements that contributed to a successful program included: teamwork; the multidisciplinary nature of the team; promoting awareness of the program; buy-in by nurses who championed the program in the ICU; positive patient outcomes; and a low financial investment for the institution. Introduction Measurement of body temperature using the tympanic method (TTM) represents a standard non-invasive method which best approximates core temperature (CTM). However, published evidence increasingly casts doubt on the agreement between TTM and CTM. The aim of this study was to assess the agreement of these two methods of temperature measurement in critically ill patients. Methods We recruited 20 consecutive critically ill patients who required indwelling thermistor-tipped arterial catheters for haemodynamic monitoring. For each patient we simultaneously collected the core temperature, obtained from the arterial catheter, and tympanic temperatures in the left and right ears, in the supine position. Tympanic temperature was recorded using the ear setting (TTMe) and the core setting (TTMc) in accordance with the manufacturer's instructions (Genius™ 2; Kendall, Tyco Healthcare, MA, USA). Local ethics committee approval was waived and the study was conducted as a service improvement audit. Bland-Altman analysis was used to measure agreement between the two sets of measurements. Results From the 20 patients we made 102 paired measurements. Comparison of the TTM in the right and left ears showed high variability in the recorded temperature, with a percentage error of 34.4%. In view of this large variability, the mean TTM of the two ears was used for the subsequent comparison with core temperature. Comparison of core measurements against tympanic measurements using the ear setting (TTMe) showed a small positive mean bias (limits of agreement (LOA); percentage error) of 0.3°C (-0.4 to 1.1°C; 2.4%). However, when CTM and TTM were compared using the core temperature setting of the thermometer, the bias (LOA) increased to -0.97°C (-1.6 to -0.4°C), a larger bias than with the unadjusted TTM.
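The agreement analysis above follows Bland and Altman; a minimal sketch of the bias and 95% limits of agreement, computed on invented paired temperatures:

```python
# Bland-Altman bias and limits of agreement on invented data.
import numpy as np

core     = np.array([37.2, 38.1, 36.8, 37.9, 38.4, 37.0])
tympanic = np.array([36.9, 37.6, 36.5, 37.4, 38.0, 36.8])

diff = core - tympanic
bias = diff.mean()
half_width = 1.96 * diff.std(ddof=1)   # 95% limits of agreement
print(f"bias {bias:.2f} C, "
      f"LOA {bias - half_width:.2f} to {bias + half_width:.2f} C")
```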
Conclusions Tympanic temperature measurements showed large random variability compared with core temperatures, suggesting that bilaterally averaged tympanic measurements may be necessary. In our series, using the core setting increased the error and increased the bias between the two methods. Our findings raise the possibility that the diagnosis of fever, and the need for further investigation, may be affected by the choice of temperature measurement site and device setting. Can we predict neurological prognoses with computed tomography just after resuscitation? Introduction Technological improvements in ventilator design may confound the diagnosis of brain death. We describe a series of patients clinically perceived to be breathing spontaneously even though they satisfied the criteria for brain death. The misdiagnosis of cerebral function in these patients delayed the diagnosis of brain death and subsequent organ donation. Resource utilization, patient/family suffering and staff morale are all negatively impacted when this occurs. Methods The ventilator triggering mode was recorded in all patients. The authors identified patients with a high likelihood of brain death who were receiving pressure support ventilation. Patients without cranial nerve function underwent formal apnea testing. Results Seven patients were identified with cessation of cranial nerve function but thought to have spontaneous breathing activity. All patients were on pressure support ventilation with flow triggering. Formal apnea testing was performed (Table 1). Conclusions Improvements in ventilation triggering may confound the diagnosis of brain death. The unmet need for transplantation and the negative impact of not recognizing death on families, staff, resource utilization, and both patients and potential recipients of organs mandate a simple solution to eliminate these problems. We propose that ... Introduction Studies of patients presenting with coma are limited and little is known about the prognosis of these cases [1-3]. The aim of this study was to investigate the acute and long-term prognosis after an episode of nontraumatic coma. Methods Adults admitted consecutively to an emergency department in Stockholm, Sweden between February 2003 and May 2005 with a Glasgow Coma Scale (GCS) score of 10 or below were enrolled prospectively. All available data were used to explore the cause of the impaired consciousness on admission. Patients surviving hospitalization were followed up for 2 years regarding survival. Results The final study population of 865 patients had the following eight coma etiologies: poisoning (n = 329), stroke (n = 213), epilepsy (n = 113), circulatory failure (n = 60), infection (n = 56), metabolic disorder (n = 44), respiratory insufficiency (n = 33) and intracranial malignancy (n = 17). The hospital mortality rate among the 865 patients was 26.5%, varying from 0.9% for epilepsy to 71.7% for circulatory failure. The accumulated total 2-year mortality rate was 43.0%, varying from 13.7% for poisoning to 88.2% for malignancy. The level of consciousness on admission also influenced the prognosis: a GCS score of 3 to 6 ... Conclusions The prognosis in patients presenting with nontraumatic coma is serious and depends largely both on the level of consciousness on admission and on the etiology of the coma. Adding the suspected coma etiology to the routine coma grading of these emergencies may more accurately predict their prognosis.
Introduction Recently, near-infrared time-resolved spectroscopy (TRS) has been developed; it is quite effective for the quantitative monitoring of tissue oxygenation, because it offers an actual measurement of photon migration in the tissues and the photon mean path length is easily obtained from the center of gravity of the temporal profile. In this study, we investigated whether changes in cerebral oxygen saturation (T-SO2) obtained with TRS and in jugular venous oxygen saturation (SjvO2) predicted cognitive decline after cardiac surgery. Methods With institutional approval and informed consent, we studied 10 patients (68.7 ± 6.1 years) undergoing cardiac surgery under cardiopulmonary bypass (CPB). T-SO2 was continuously monitored using a TRS-10 (Hamamatsu Photonics KK, Hamamatsu, Japan). For measurement of SjvO2, a 5.5 Fr oximetry catheter was inserted by retrograde cannulation of the right internal jugular vein. The values of T-SO2 and SjvO2 were compared at each time point: before CPB, 5 minutes after the onset of CPB, before aortic clamping, after aortic clamping, rewarming, aortic declamping, end of rewarming and end of CPB. Cognitive decline was evaluated by Mini Mental State Examination before and 7 days after the operation. Statistical analysis was performed by repeated-measures ANOVA followed by Fisher's PLSD. P <0.05 was considered statistically significant. Results Four of the 10 cases showed postoperative cognitive decline. The mean intraoperative values of T-SO2 and SjvO2 in patients without postoperative cognitive decline (n = 6) were 63.9 ± 4.6% and 60.5 ± 9.7%, respectively, with no statistically significant difference between these values. However, the mean intraoperative values of T-SO2 and SjvO2 in patients with postoperative cognitive decline (n = 4) were 62.8 ± 5.6% (T-SO2) and 70.4 ± 14.9% (SjvO2), a significant difference between the two values (P = 0.0024); at the rewarming period, the values of SjvO2 were significantly higher than those of T-SO2 (87.9 ± 6.3% vs 65.0 ± 5.3%, P = 0.0014). Systemic hypertension and smoking showed a strong association with the rupture of intracranial aneurysms. The arteries of the anterior circulation were those with the highest incidence of aneurysms. More than one-half of the patients did not have complications during the procedure. Embolization of cerebral aneurysms proved to be a method with low lethality. Introduction The objective of this work was to assess the usefulness of monitoring regional brain oxygen saturation (rSO2) in the identification of intraoperative episodes of cerebral ischemia during the surgical clipping of brain aneurysms. Moreover, it aimed to verify whether this kind of monitoring affects the incidence of postoperative neurological deficits. Methods We monitored 50 patients undergoing cerebral aneurysm clipping with the use of a Somanetics INVOS Cerebral Oximeter 4100. The alarm threshold for rSO2 was defined as a drop of at least 20% from baseline, triggering intervention aimed at increasing rSO2 values. Results Eleven cases of postoperative deterioration of neurological status were noted, of which six were reversible. Additional intraoperative events, such as numerous surgical clip displacements, temporary clipping, aneurysm rupture and trapping, occurred in 16 cases. On the operated side, a rapid, lasting increase of rSO2 values was noted in nearly one-half of the cases, implying an artifact caused by the neurosurgical procedure.
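The 20%-drop alarm threshold defined in the Methods above reduces to a one-line check; a minimal sketch in which the names and values are ours:

```python
# Alarm rule: regional saturation at least 20% below its own baseline.
def rso2_alarm(baseline, current):
    return current <= 0.8 * baseline

print(rso2_alarm(70, 55))  # True: more than a 20% drop from baseline
print(rso2_alarm(70, 60))  # False
```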
On the nonoperated side, seven cases of rSO2 values dipping below the alarm threshold were noted. In all cases intervention led to an increase and normalization of rSO2 values in a relatively short time. Since the values registered in the frontal lobe on the nonoperated side were considered representative of the entire brain, the balance of cerebral oxygen supply and consumption was seen as intraoperatively preserved. The comparison between the two groups of patients, with and without neurological deficits developed in the postoperative period, revealed no disparities in cerebral oximetry values on the nonoperated side, nor in the range of other monitored physiological parameters. In 10 of the 16 cases involving additional intraoperative events, deterioration of neurological status was noted (62.5%), while in the remaining cases only one such occurrence was registered (2.95%). Conclusions Monitoring regional brain oxygen saturation on the operated side is a method hampered by a large percentage of false results, and thus is a monitoring tool of little prognostic value. Assuming that the frontal lobe on the nonoperated side is representative of the entire brain with regard to rSO2 measures, the maintenance of the oxygen supply and demand balance does not safeguard the brain against ischemic lesions in the supply area of the operated artery.
stiffness was assessed by sonographic measurements of the carotid to femoral pulse wave velocity (PWV). The above sonographic measurements, along with BNP measurements, were obtained upon admission, and repeated measurements were performed 20 ± 2 days following the acute phase of SAH. All patients were mechanically ventilated under the same conditions during all measurements. Results Patients with severe SAH showed depressed LV function and increased aortic stiffness. Both the Hunt-Hess and Fisher scales were associated with LVEF (r = -0.460, P = 0.004 and r = -0.512, P = 0.001, respectively) and PWV values (r = 0.359, P = 0.029 and r = 0.363, P = 0.027, respectively).
Introduction Anemia is frequently encountered in critically ill patients and adversely affects cerebral oxygen delivery and metabolic function. However, there is limited evidence to support the use of packed red blood cell (PRBC) transfusion to optimize brain homeostasis after subarachnoid hemorrhage. The objective of this study was to investigate the effect of PRBC transfusion on cerebral oxygenation and metabolism in patients with subarachnoid hemorrhage. Methods Prospective observational study in a neurological ICU of a university hospital. Nineteen PRBC transfusions were studied in 15 consecutive patients with subarachnoid hemorrhage who underwent multimodality monitoring (intracranial pressure, brain tissue oxygen and cerebral microdialysis). Data were collected at baseline and during 12 hours after transfusion. The relationship between Hb change and the lactate/pyruvate ratio (LPR) and brain tissue oxygen (PbtO2) was tested in univariate and multivariable analyses.
The other 14 were in the coiling group (35%). But HHH therapy and CSF drainage with lumbar puncture were started more liberally, in 57 and 35 patients respectively, based on at least two of the following findings or the clinician's decision: headache, agitation, elevated leukocyte level (without infection), new motor deficit and worsening in GCS. The severity of cases on admission was similar in the clipping and coiling groups.
Although mortality and morbidity rates were higher in the coiling group, the difference between the groups was not statistically significant (Table 2). GOS, Glasgow outcome score; H&H, Hunt and Hess. Conclusions This study demonstrated that although mortality, morbidity rates and DCI incidence were lower in the surgical group, the differences were not statistically significant.
Conclusions According to the outcome, we can conclude that Group 1 obtained greater protection against vasospasm than Group 2 but showed no difference in mortality. The P value was significant for vasospasm but not for mortality.
Introduction Vasospasm is the main cause of death and cognitive deficits in patients with subarachnoid hemorrhage after rupture of the aneurysm (aSAH). Some trials have shown that statins given in the acute phase of aSAH reduce the incidence, morbidity and mortality of cerebral vasospasm. Methods We performed a prospective, randomized, nonblinded study of 80 mg SVT (at night), started within the first 72 hours of the onset of bleeding and given for 21 days, against a control group that did not receive SVT, between January and December 2008. Informed consent was obtained for all patients. CT scans were performed as controls, with another CT scan in patients with altered neurological signs. In the presence of changes suggestive of vasospasm, or of correlating clinical and CT findings, the patients were taken for cerebral arteriography followed by an angioplasty procedure if necessary. Liver and renal function and LDL cholesterol were evaluated weekly, and total CK every 3 days. Exclusion criteria: liver and renal disease, pregnancy, elevation of serum transaminases (three times the normal value), creatinine ≥2.5, rhabdomyolysis or total CK ≥1,000 U/l. We excluded two patients with bleeding for more than 72 hours. There was no significant change in the levels of total CK or in renal or liver function. Results We included 20 patients, 11 in the SVT group and nine in the control group. Mortality was eight patients (38%): six in the control group and two in the SVT group. Vasospasm was confirmed by cerebral arteriography in four patients in the control group and one patient in the SVT group. All patients who died had Fisher scale IV. Conclusions SVT at a dose of 80 mg was effective in reducing mortality (18.1% against 66%) compared with the group that did not use SVT, and also decreased the incidence of cerebral vasospasm despite a higher APACHE II score in the group that used SVT (14.3
Conclusions We can conclude that the group of patients with SAH is predominantly female (74:29). The APACHE II score in Group I was 10.9, while in Group II it was 17.9. Regarding the criteria used to assess patients with SAH, the only criterion that showed statistical significance in the prediction of death was serum sodium (P = 0.002). The other criteria evaluated did not have statistical significance in predicting the prognosis of these patients.
Introduction The objective of this study was to investigate the relationship between the cardiac output response to a fluid challenge and changes in brain tissue oxygen pressure (PbtO2) in patients with severe brain injury. Methods Prospective observational study conducted in a neurological ICU in a university hospital.
Seventy-eight fluid challenges were administered to 17 consecutive comatose patients who underwent multimodality monitoring of cardiac output, intracranial pressure (ICP), and PbtO2. The relationship between cardiac output and PbtO2 was analyzed with logistic regression utilizing GEE with an exchangeable correlation structure. Results Of the 78 fluid boluses analyzed, 34 (44%) resulted in a ≥10% increase in cardiac output. Median absolute (+5.4 vs +0.7 mmHg) and percentage (20% vs 3%) changes in PbtO2 were greater in cardiac output responders than in nonresponders within 30 minutes after the end of the fluid bolus infusion. In a multivariable model, a cardiac output response was independently associated with a PbtO2 response (adjusted odds ratio 15.4, 95% CI 1.9 to 122.0, P = 0.01) after adjusting for mean arterial pressure, intracranial pressure and end-tidal CO2. Stroke volume variation showed a good ability to predict cardiac output response, with an area under the ROC curve of 0.85 and a best cutoff value of 8%. See Figures 1 and 2. Conclusions Bolus fluid resuscitation resulting in augmentation of cardiac output can improve cerebral oxygenation after severe brain injury.
Introduction ICM+ software encapsulates our 20 years' experience in brain monitoring. It collects data from a variety of bedside monitors and produces time trends of parameters defined using configurable mathematical formulae. To date it is being used in nearly 40 clinical research centres worldwide. We present its application for continuous monitoring of cerebral autoregulation using near-infrared spectroscopy (NIRS). Methods Data from multiple bedside monitors are processed by ICM+ in real time using a large selection of signal processing methods. These include various time and frequency domain analysis functions as well as fully customisable digital filters. The final results are displayed in a variety of ways, including simple time trends as well as time-window-based histograms, cross histograms, correlations, and so forth. All this allows complex information from bedside monitors to be summarized in a concise fashion and presented to medical and nursing staff in a simple way that alerts them to the development of various pathological processes. Conclusions ICM+ software is proving to be a very useful tool for enhancing the battery of available means for monitoring cerebral vasoreactivity and potentially facilitating autoregulation-guided therapy. The complexity of data analysis is also hidden inside loadable profiles, thus allowing investigators to take full advantage of validated protocols including advanced processing formulas.
dysbalances and their relationship to outcome in the neurologic-neurosurgical intensive care unit (NNICU) over a period of 5 years. Methods We prospectively evaluated patients with brain diseases who developed serum sodium below 135 mmol/l (hyponatremia) or above 150 mmol/l (hypernatremia). We compared the incidence of cerebral complications, Glasgow Outcome Scale upon discharge from the NNICU and mortality in the NNICU between these two groups. Results In the 5-year observation period, serum sodium dysbalances occurred in 378 (24%) patients. The majority of them had hyponatremia (245 patients, 65%); hypernatremia was less frequent, in 133 (35%) patients. Hypernatremic patients stayed in the NNICU longer (P = 0.035), and the onset of hypernatremia arose in patients with a significantly lower Glasgow Coma Scale (P = 0.001).
These patients had more cerebral complications (P <0.001), a worse Glasgow Outcome Scale upon discharge from the NNICU (P <0.001), higher mortality in the NNICU (P = 0.003) and a higher incidence of pulmonary edema (P = 0.021). They received more antiedematic therapy (P <0.001) and diuretics (P <0.001). On the other hand, hyponatremia was more frequent upon entry to the NNICU (P <0.001) and arose later after brain damage (P <0.001) in comparison with hypernatremia. Conclusions In neurointensive care, hypernatremia was a prognostically more serious and less frequent sodium dysbalance than hyponatremia.
19%), cerebral contusion in 17 patients (17%), brain oedema in 15 patients (15%), and subdural hematoma in two patients (2%). Glasgow Coma Scores ranged from 3 to 10; 60 patients were mechanically ventilated; 10% were diabetic and 22% were hypertensive. Hypernatremia was documented in 40 patients (40%) of the total TBI patients. The total in-hospital mortality was 36/100 (36%); 10 of these patients had normal sodium levels throughout their in-hospital course and 26 were hypernatremic. After adjustment for the baseline risk, the incidence of hypernatremia over the course of the ICU stay was significantly related to increased mortality (hazard ratio 3.2, P = 0.0001). There was also a positive correlation between serum sodium levels and duration of the ICU stay (Spearman correlation coefficient 0.5, P = 0.002). Conclusions Hypernatremia in patients with severe TBI is associated with an increased risk of death and a longer ICU stay. This association is independent of other outcome predictors including age and Glasgow Coma Score. Strategies to prevent hypernatremia in neurocritical ICUs should be encouraged.
In addition, chemical prophylaxis should be initiated as soon as it is determined to be safe to do so. Limited data exist describing the use of chemical prophylaxis and its timing in relation to IVCF placement in this patient population. We aim to describe such use and timing in relation to IVCF placement. Methods All trauma patients with a retrievable prophylactic IVCF inserted over 3.5 years, with an age ≥18 and an ISS ≥15, were enrolled into this descriptive study. Patients were identified using the local trauma registry, and data were collected from the registry and patient charts. Results One hundred and three patients with a prophylactic IVCF are described. Mean age was 42.8 ± 17.9 years, 68% were male and the mean ISS was 38.6 ± 12. A total of 36.9% had ≥3 injuries. A total of 60.2% had an injury with a maximum AIS score ≥5, and the type of injury associated with the maximal AIS was head (53.4%), pelvic (19.4%), thoracic (15.5%) and spinal cord injury (11.7%). Shock was present on admission in 8.7% of patients, 98.1% required ICU admission, 88.4% required intubation and 27.2% required ≥4 units of PRBC over the first 24 hours. The retrievable IVCF was inserted prophylactically after a mean of 1.7 ± 1.9 days (range 0 to 9 days) of hospital admission. Chemical prophylaxis was initiated 5.7 ± 5.1 days after IVCF placement in 77.7% of patients; dalteparin or enoxaparin was the initial agent of choice in 93.8%, while heparin was chosen in 6.2% of cases. The time from hospital admission to the start of chemical prophylaxis ranged from 6.3 to 8.5 days. Conclusions Chemical prophylaxis was initiated in the majority of trauma patients with an IVCF, albeit 22% still did not receive any. When administered, a low-molecular-weight heparin was usually selected.
Although the IVCF was rapidly inserted after admission, there was considerable delay between insertion of the IVCF and initiation of chemical prophylaxis.
For studying ICU lower limb DVT incidence, compressive ultrasonography (CUS) was performed by three trained ultrasonographers twice a week until discharge. SM and MR administration was registered daily; the drugs considered are presented in Figure 1. MR-treated patients were excluded from statistical analysis; the difference in SM administration between the No DVT group and the DVT-positive group was assessed by chi-squared test. Results Over the period of study, 380 patients were enrolled. Lower limb DVT was diagnosed in 23 of 380 patients, a DVT incidence of 6.05%. MRs were administered in 121 patients and were associated with DVT incidence (P <0.001). The results of the statistical analysis performed on the remaining 259 patients are shown in Figure 1: none of the SM considered was associated with DVT incidence, nor was sedation as a whole or the number of SM administered. Conclusions Our study confirms prior findings [1] on the role of MR as a DVT risk factor. Nevertheless, SM administration in patients not treated with MR is not associated with increased DVT risk. Figure 1 (abstract P362). Statistical analysis.
Introduction Despite the evidence of perioperative hypercoagulability in cancer patients, there are no consistent data evaluating the extent, duration, and specific contribution of platelets and procoagulatory proteins by in vitro testing. This study compared the efficacy of haemoviscoelastography versus thromboelastography for monitoring coagulation imbalance. Methods Two hundred and forty-one patients undergoing surgery for abdominal cancer were examined for the efficacy of a variety of coagulation tests. A complete coagulation screening, thromboelastography (TEG) and haemoviscoelastography (HVG) were performed before and at the end of surgery. We calculated the elastic shear modulus of standard MA (Gt) and HVG MA (Gh), which reflect the total clot strength and the procoagulatory protein component, respectively. The difference was an estimate of the platelet component (Gp). Results There was a 16% perioperative increase of standard MA, corresponding to a 51% increase of Gt (P <0.05) and a 79% to 87% contribution of the calculated Gp to Gt. We conclude that serial standard thromboelastography and the HVG viscoelastic test may reveal the independent contribution of platelets and procoagulatory proteins to clot strength. Using multiple linear regression, all coagulation, TEG and HVG variables were used to model postoperative hypercoagulation. Results showed that some components of the TEG failed to identify hypercoagulation (r <0.2, P >0.75). All components of the HVG test reflect postoperative coagulopathies. Conclusions Hypercoagulability is not reflected completely by standard coagulation monitoring and TEG, and seems to be predominantly caused by increased platelet reactivity. HVG provides a fast and easy-to-perform bedside test to quantify in vitro coagulation, and may be useful in determining the coagulation status of cancer patients perioperatively.
The aim was to study the incidence and etiology, and to assess the severity, of thrombocytopenia in surgical ICU (SICU) patients admitted to the central army hospital of Algiers (Algeria). Methods We conducted a retrospective study; all patients admitted during the year 2008 were enrolled (351 patients). All charts were reviewed and the different hemograms recorded from admission until discharge.
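Thrombocytopenia in the results below is graded against fixed platelet-count thresholds. A minimal sketch of such a classifier in Python (counts in 10⁹/l; the grade labels are illustrative, not the authors'):

```python
def grade_platelets(count: float) -> str:
    """Grade a platelet count (in 10^9/l) against common thresholds."""
    if count >= 150:
        return "normal"
    if count >= 50:
        return "thrombocytopenia"          # below the 150 x 10^9/l definition
    if count >= 20:
        return "severe thrombocytopenia"   # below 50 x 10^9/l
    return "profound thrombocytopenia"     # below 20 x 10^9/l

print(grade_platelets(140))  # thrombocytopenia
print(grade_platelets(45))   # severe thrombocytopenia
```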
Results One hundred and fifty-three cases of thrombocytopenia, defined as a platelet count (plt) <150 x 10⁹/l, were reported (43.6%). Among the 351 patients (227 survivors, 124 died), 153 developed thrombocytopenia: 96 males and 57 females. The onset of thrombocytopenia was, for the majority of patients (98%), at 24 to 72 hours after ICU admission. Only 26% were severe (plt <50 x 10⁹/l), and 3% were below 20 x 10⁹/l. The etiopathology is multifactorial and difficult to establish precisely: 77% of cases were hemodynamically unstable multitrauma patients (with or without ARDS and fat embolism) who necessitated volume and blood product administration through an intravascular device; sepsis was present in 47% of cases; disseminated intravascular coagulopathy in 33%; and five thrombocytopenia cases were related to drug administration, with late onset after the 5th day. Reported mortality was 57% (P <10⁻⁶) compared with the overall mortality (35.3%), related to the severity of the underlying diseases and the associated co-morbidities. Conclusions Thrombocytopenia is frequent among surgical patients, especially multitrauma patients with cardiovascular instability in whom aggressive volume and blood product administration is needed.
Introduction Thrombocytopenia is a common occurrence in the critically ill population [1]. We aimed to determine the incidence, severity and prognosis of thrombocytopenia in our ICU. Methods A retrospective laboratory result review of 330 consecutive admissions to the Victoria Infirmary ICU was undertaken. Demographics, APACHE II score and outcome data were retrieved from the Ward Watcher system in the ICU. We compared survivors' and nonsurvivors' platelet levels on admission to the ICU and the trough level in the ICU, and looked at correlation with length of stay and duration of mechanical ventilation. Data were analysed using Student's t test, the Pearson correlation coefficient and the chi-squared test where appropriate. Results Complete data were available for 274 patients. Population demographics were as follows: 61.3% male, mean age 56.3 ± 2.1 years, median APACHE II 20 (IQR 15 to 27), crude ICU mortality 25.6% and mean length of stay 5.2 ± 0.7 days. The incidence of thrombocytopenia (platelet count <150 x 10⁹/l) was 29.8% on admission to the ICU, increasing to 46.9% when considering the entire ICU stay. Comparing survivors and nonsurvivors, nonsurvivors had a lower trough platelet count (140 x 10⁹/l vs 181 x 10⁹/l, P = 0.005). Patients with platelet counts less than 50 x 10⁹/l had the highest mortality (45.7% vs 27.6%, P = 0.006). Platelet data were used to construct an ROC curve, demonstrating an area under the curve of 0.66, P <0.001. Platelet count correlated negatively with APACHE II (r = -0.20, P <0.001) but did not correlate significantly with length of stay or duration of mechanical ventilation. Conclusions Thrombocytopenia occurs in 47% of our ICU patients. It correlates significantly with severity of illness as measured by APACHE II, and a decreasing platelet count correlates with increasing mortality, with the highest mortality in those with a trough ICU platelet count of less than 50 x 10⁹/l. The ROC characteristics also demonstrate that the platelet count is a useful predictor of mortality.
Introduction Heparin-induced thrombocytopenia (HIT) is a life-threatening and limb-threatening, immune-mediated, prothrombotic disease resulting from an interaction between heparin and platelet factor 4 (PF4).
Due to the many causes of thrombocytopenia in critically ill patients, the diagnosis of HIT is difficult, requiring both a high clinical suspicion and confirmatory testing. The ELISA test is the most commonly performed method to detect anti-PF4 antibodies; however, ELISA results generally take one or more days to report. The particle immunofiltration assay (PIFA) test has the advantage of being performed rapidly, with results generally available within 1 hour. Methods Starting in July 2009, patients in our MICU were screened daily for thrombocytopenia, defined as either a platelet count that decreased by at least 30% from baseline or an absolute platelet count less than 150,000. Patients with thrombocytopenia underwent both PIFA and GTI ELISA testing for anti-PF4 antibodies. PIFA is a qualitative test reporting results as positive or negative. The GTI ELISA test reports an optical density (OD), with an OD greater than 0.40 considered positive. Patients were followed through hospitalization.
Introduction Heparin-induced thrombocytopenia (HIT) is a life-threatening and limb-threatening immune-mediated prothrombotic complication caused by heparinic drugs. The aim of this study was to evaluate the incidence of HIT in a mixed ICU population. Methods Patients admitted to the ICU of a regional referral center (Careggi Teaching Hospital, Florence, Italy) who underwent unfractionated heparin (UFH) or low-molecular-weight heparin (LMWH) administration were prospectively observed from October 2008. Exclusion criteria: haematological malignancy, platelet count <50,000/mm³ before anticoagulant treatment. After the start of anticoagulant administration, patients showing a fall in platelet count >50% or a nadir <150,000/mm³ were clinically evaluated with the 4T score (4Ts) [1]: cases with a 4Ts of at least 2 were investigated with the laboratory antigen assay, the ELISA. Positive tests underwent a functional confirmation test, the heparin-induced platelet activation (HIPA) test [2]. Results Preliminary results refer to the period from the start of the study to August 2009. There were 369 patients admitted to the ICU; 327 were enrolled in the study. Clinical evaluation with the 4Ts was performed for 31 patients; 23 of 31 (74%) had a value of at least 2, of whom eight were on UFH and the remaining 15 on LMWH. Potential etiologies of thrombocytopenia were considered: 21 patients were septic, three were on extracorporeal membrane oxygenation, and two were admitted to the ICU due to cardiogenic shock. The ELISA test was positive in three of 23 (13%) suspected HIT cases; none of the three was positive on the HIPA test, so HIT was excluded. The mean 4Ts of the 23 suspected HIT cases was 4.1 ± 1.7, and the 4Ts did not differ between patients with positive and negative ELISA tests. Conclusions HIT incidence in the ICU is extremely rare, with no cases shown in this study. HIT was suspected in 23 of 327 patients (7.03%), but the HIT incidence was 0.0%. ICU patients show numerous potential etiologies of thrombocytopenia, and in this population sepsis was the most common.
Introduction Many procedures in cardiac surgery require extracorporeal circulation (EC). During EC, blood is pumped from a venous line to an oxygenator, cooled and infused back via an arterial line. This procedure is known to compromise the coagulation system, especially under prolonged hypothermia. The aim of the present study was to investigate the effects of EC and hypothermia on the coagulation system in a porcine model.
The efficacy of substitution therapy with a prothrombin complex concentrate (PCC) regarding normalization of coagulation and decrease in hemorrhage was evaluated. Methods A total of 17 male anesthetized pigs were included in the study. EC was performed in 12 animals using a hollow-fiber oxygenator with a priming solution containing saline, HES and heparin. Five animals without EC served as controls. The coagulation system was characterized by thromboelastography (TEG), thrombin generation, coagulation factor levels, and platelet numbers and function (aggregation). A GoreTex patch was inserted into the carotid artery, and bleeding occurred from the stitch channels. The effects of intravenous substitution therapy with PCC (Beriplex P/N, 30 U/kg, n = 6) compared with placebo (n = 6) on bleeding from the stitch channels were investigated. Results EC and hypothermia compromised the coagulation system. Coagulation factor levels were decreased, and TEG and thrombin generation became pathologic. Platelet count and aggregation were decreased. Blood loss from the stitch channels of the GoreTex patch was doubled in the EC group when compared with animals without EC. After substitution therapy with PCC, blood loss decreased significantly, from 83.0 ± 48.4 ml in the EC + placebo group to 27.2 ± 35 ml in the EC + PCC group (P <0.025, Wilcoxon test; Figure 1). PCC normalized the impaired coagulation. Conclusions Hypothermia and EC led to a compromised coagulation system, resulting in increased hemorrhage. The deficit in coagulation could be overcome by substitution therapy with PCC. Elevated blood loss from stitch channels was decreased significantly after treatment with PCC. It was concluded that PCC should be beneficial in patients undergoing cardiovascular surgery with an EC-induced coagulopathy. Figure 1 (abstract P368). Blood loss from a GoreTex patch after EC.
Introduction Dilutional coagulopathy following massive bleeding is the result of clotting factor dilution and impaired fibrin polymerization after infusion of colloidal plasma expanders. Although fibrinogen has been used clinically to treat dilutional coagulopathy [1,2], the effect of fibrinogen dosage on normalizing coagulation function is unclear. This study investigated the effect of six different fibrinogen dosages (range: 37.5 to 600 mg/kg) on ROTEM® parameters and overall blood loss in a pig model of dilutional coagulopathy. Methods Forty-two pigs underwent a 60% hemodilution with Voluven® (HES 130/0.4). A standardized bone injury was performed after the completion of hemodilution. Animals were then randomized to receive 37.5, 75, 150, 300, 450 or 600 mg/kg fibrinogen (FGTW, LFB, France) or 500 ml saline. Four hours after fibrinogen administration a standardized liver injury was performed. Animals were then observed for 2 hours or until death. Blood loss was measured and tissue samples were collected at the end of the study. Hemodynamic and coagulation parameters were measured at baseline (BL), after hemodilution, at 15 minutes, 1, 2 and 4 hours after drug infusion, and 2 hours after liver injury or right before the animal's death. Statistical significance was set at P <0.05. Results Fibrinogen dosages of 150 mg/kg and higher completely reversed dilutional coagulopathy: the maximum clot firmness (MCF), which was decreased after hemodilution (36 ± 3 vs 65 ± 4 mm at BL, P <0.05), returned to BL levels after fibrinogen administration (69 ± 5 mm).
Blood loss from the bone and liver injuries decreased significantly with increasing fibrinogen dosages: 42 ± 19 (sham), 34 ± 14 (75 mg/kg), 29 ± 13 (150 mg/kg) and 28 ± 10 ml/kg BW (600 mg/kg). No thrombotic events occurred. Conclusions In a swine model of 60% hemodilution with bone and liver injury, fibrinogen administration (150 mg/kg and above) normalized MCF and decreased blood loss. Infusions of 12 times the dosage recommended in humans did not induce hypercoagulability.
Introduction Postpartum haemorrhage (PPH) is a leading cause of maternal death. Present guidelines for PPH management focus on uterotonic treatments, while the potential efficacy of antifibrinolytic drugs has been poorly explored in this indication. Methods After approval by the ethics committee, we randomly assigned women with PPH >800 ml after vaginal delivery to receive intravenous tranexamic acid (TXA) (4 g within 1 hour, followed by 1 g/hour for 6 hours) (TXA group) or no antifibrinolytic treatment (Control group). Monitoring and management were performed according to the guidelines. Additional procoagulant treatments (coagulation factor concentrates, fresh frozen plasma, platelets) were used only when PPH became intractable (>2,500 ml or >500 ml/30 minutes). Embolization or surgery was performed when considered clinically necessary. Blood loss, maternal morbidity and safety data were collected at four time points: inclusion (T1), T1 + 30 minutes (T2), T1 + 2 hours (T3), and T1 + 6 hours (T4). The study plan, based on the hypothesis that TXA would induce a 20% blood loss reduction with an α risk of 5% and a β risk of 10%, required the inclusion of 72 patients in each group. Statistical analysis used the SAS system. Results One hundred and forty-four patients were included, 72 in the TXA group and 72 in the control group. Management data were similar between the two groups. In the TXA group, as compared with the control group: blood loss was 46% lower between T2 and T3 (P = 0.026), and 49% lower between T2 and T4 (P = 0.0012); bleeding duration was shorter (31 ± 28 vs 65 ± 95 minutes, P = 0.004); and maternal morbidity was reduced (hemoglobin drop >4 g/dl: n = 15 vs 34, P <0.001; blood transfusion requirement: n = 14 vs 20, P = NS; overall number of transfused packed red blood cells: n = 42 vs 62, P <0.001; number of severe PPH (Charbit criteria [1]): n = 23 vs 36, P = 0.028). Invasive procedures were needed in four women in the TXA group and in seven women in the control group (P = NS). Transient visual and digestive side effects were observed more frequently in the TXA group than in the control group (17 vs 4, P = 0.028). Conclusions These results show, for the first time, that TXA, when administered early in the management of PPH, reduces blood loss and maternal morbidity, with only minor and transient side effects. As TXA is a readily available and cheap hemostatic agent, we strongly suggest that it should be further investigated as an adjuvant treatment in PPH.
In 1999 the first publication on the use of recombinant factor VIIa (rFVIIa) for uncontrolled haemorrhage appeared [1]. Since then there has been widespread international use of rFVIIa in this setting. Consensus guidelines have been published [2], but there is much variation in its administration [3]. Methods A review of the blood bank databases of all six acute hospitals in the southwest of England between January 2000 and February 2009 was undertaken to identify all prescriptions of rFVIIa.
Case notes were analysed and the administration of all blood products for 24 hours either side of rFVIIa administration was extracted from the blood bank databases. Results Eighty-two patients were identified who had received off-licence rFVIIa. Of these, full data were available for 67 patients. There was unequal use between hospitals: Plymouth accounted for 33% of use, compared with only 4% for Taunton. A total of 65.7% of patients were male and the mean age was 56.6 years. The mean APACHE II score was 18.4 (± 7.4) and 40.3% of patients died within 96 hours. The mean dosage administered was 80.9 μg/kg (± 21.3). The number of annual prescriptions shows a bimodal pattern, with peaks in 2005 and 2008. Usage was predominantly in general and vascular surgical patients (39%), with trauma (14.9%), obstetrics (13.4%) and cardiac surgery (13.4%) the next most frequent specialties. rFVIIa was given after a mean of 17.6 (± 11.2) units of packed red cells, 8.8 (± 5.9) units of fresh frozen plasma and 2.3 (± 1.9) pooled platelets. Following rFVIIa, transfusion of all blood products was markedly lower, and only three patients received a repeat dose of rFVIIa. The prothrombin time decreased from a mean of 18.9 seconds to 13.9 seconds following rFVIIa. Thirty-six per cent of patients had a platelet count <50 x 10⁹/l at administration, and 40% were acidotic with pH <7.2. Sixty-eight per cent of patients had a haematocrit <0.25 and 30% were hypothermic. Conclusions Off-label prescribing of rFVIIa varies widely across the southwest region. Its use as an adjunctive therapy for uncontrolled haemorrhage has often been unsuccessful. Our analysis of rFVIIa prescribing in this setting reflects the practice of the larger Australian series in terms of indications and dosage; however, that series' 28-day mortality of 32% is lower than in our series [3]. There is still heterogeneity in prescribing rFVIIa, and its use does not conform to European guidelines [2].
Introduction Bleeding after cardiac surgery is common and associated with adverse effects. Factor VIIa is a recombinant factor able to restore blood coagulation without significant adverse effects. We aimed to evaluate whether the early use of factor VIIa reduces bleeding and transfusion rates. Methods We studied 241 patients undergoing cardiac surgery who presented with bleeding after heparin reversal and a first replacement of clotting factors. Of these, 81 patients underwent valve procedures, 51 coronary artery bypass grafting (CABG), 60 valve + CABG, 35 ascending aortic surgery and 14 cardiac transplantation. In all, 131 received factor VIIa early (in the operating room) and 110 received the drug in the ICU. Results Both groups of patients presented a statistically significant reduction in blood transfusion after receiving factor VIIa (P <0.001). The early-use group had a lower rate of reoperations due to bleeding (P <0.01), and received fewer units of red blood cells (P <0.01), fresh frozen plasma (P <0.01), cryoprecipitate and platelets (P <0.01). We also detected a lower incidence of infection in the group that received the factor in the operating room. Conclusions Early use of factor VIIa in patients who bleed after cardiac surgery is associated with lower rates of reoperation, lower rates of transfusion and a lower incidence of infection. This suggests the drug should not be used as a last therapeutic tool and may be indicated earlier in the management of bleeding.
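The early-versus-ICU comparisons above come down to 2x2 contingency tables. A minimal chi-squared sketch in Python, with made-up counts (the abstract does not report the raw table):

```python
def chi2_2x2(a: int, b: int, c: int, d: int) -> float:
    """Chi-squared statistic for a 2x2 table [[a, b], [c, d]], no continuity correction."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical counts only: reoperation for bleeding (yes/no)
early_reop, early_no = 8, 123   # assumed split of the 131 early (intraoperative) patients
icu_reop, icu_no = 20, 90       # assumed split of the 110 ICU patients

chi2 = chi2_2x2(early_reop, early_no, icu_reop, icu_no)
# With 1 degree of freedom, chi2 > 6.635 corresponds to P < 0.01
print(f"chi2 = {chi2:.2f}, P < 0.01: {chi2 > 6.635}")
```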
Introduction Venous thromboembolism is one of the most common complications in cancer patients and may be due to the hypercoagulable state of malignancy and to its surgical treatment. Methods Patients undergoing planned curative open surgery for abdominal cancer received MEDNORD (Ukraine Co. analyzer) analysis (HVG), a viscoelastic test that measures clot formation and includes information on the cellular as well as the plasmatic coagulation system. We examined the efficacy of a variety of coagulation tests. A complete coagulation screen, activated clotting time, thromboelastography (TEG) and haemoviscoelastography (HVG) were performed before surgery, at the end of surgery, and on postoperative days 1, 2, 3, and 7; they were analyzed for the reaction time and the maximal amplitude (MA). We calculated the elastic shear modulus of standard MA (Gt) and HVG MA (Gh), which reflect the total clot strength and the procoagulatory protein component, respectively. The difference was an estimate of the platelet component (Gp). Results There was a 14% perioperative increase of standard MA, corresponding to a 48% increase of Gt (P <0.05) and an 80 to 86% contribution of the calculated Gp to Gt. We conclude that serial standard thromboelastography and the HVG viscoelastic test may reveal the independent contribution of platelets and procoagulatory proteins to clot strength. Using multiple linear regression, all coagulation, TEG and HVG variables were used to model postoperative hypercoagulation. However, three components of the routine coagulation assay, including bleeding time, prothrombin time and platelet count, could be modeled to show prolonged postoperative hypercoagulability (P <0.01). We conclude that all components of the HVG test reflect postoperative coagulopathies; these results suggest that it may be useful in determining the coagulation status of cancer patients perioperatively.
Introduction A number of studies have observed decreased survival associated with transfusion. Leukocyte-mediated immunosuppression may contribute to postoperative infectious complications. There is evidence for a benefit with the use of leukocyte-reduced transfusion, but it is unclear among septic shock patients. In Japan, leukocyte-depleted blood products have been used since 2007. We assessed the effect of transfusion, and the efficacy of the use of leukocyte-depleted blood products, on the new onset of septic shock and on mortality among patients with septic shock. Methods A total of 101 septic shock patients at a single university general ICU were enrolled in the study. A target hemoglobin (Hb) concentration of 10 g/dl (Ht 30%) was used for allogeneic red cell transfusion (to maintain central venous saturation >70%) during the early phase of severe sepsis, and Hb was kept in the range of 7 to 9 g/dl in the stable phase, except for patients with particular diseases (acute coronary syndrome, and so forth). Fresh frozen plasma was used to keep the PT-INR <1.67, and platelet transfusion was given according to the patient's condition (for example, postoperative, and so forth). Results Eighty-six patients (85%) received transfusion (group T: age 68 ± 12, APACHE II 25.9 ± 9.5) and 15 received no blood product (group NT: age 63 ± 14, APACHE II 27.4 ± 3.2). Frequent sites of infection were the lung (46%), peritoneum (17%), mediastinum (12%), and gastrointestinal tract (7%) (NS between groups). Overall mortality for group T vs group NT was 27/86 (31%) vs 3/15 (20%) (P = 0.54).
Onset of new septic shock for group T vs group NT was 27/86 (31%) vs 4/15 (27%) (P = 0.71). Survivors received 26.6 ± 31.2 Japanese units of total blood product, and patients who died received 44.6 ± 35.6 units (1 American unit is almost equivalent to 2 Japanese units) (P = 0.013). Within group T, 56 patients received leukocyte-reduced blood product (LRB) and 29 received ordinary product (Cont.). Overall mortality for LRB vs Cont. was 40/56 (71%) vs 18/29 (62%) (P = 0.38). Conclusions The total amount of blood products received was highly associated with increased mortality, but there was no obvious adverse effect of transfusion on the onset of new septic shock. No association between overall mortality and the use of leukocyte-depleted blood was identified in patients with septic shock.
There are no prospective studies comparing outcomes between restrictive and liberal transfusion strategies in cardiac surgery. This double-blind randomized study was designed to determine whether a restrictive strategy of red cell transfusion and a liberal strategy produce equivalent results in patients undergoing cardiac surgery. Methods Until November 2009 we enrolled 380 patients undergoing elective cardiac surgery and randomly assigned 185 patients to a restrictive strategy of transfusion, in which red cells were transfused if the hematocrit dropped below 24%, and 195 patients to a liberal strategy, in which transfusions were given when the hematocrit fell below 30%. Both transfusion strategies were followed in the operating room and during the ICU stay. We compared the rates of complications and death from all causes during the hospital stay and the clinical outcomes of patients after 90 days. Adult patients who gave written informed consent were enrolled if they were undergoing elective primary or redo cardiac surgery for coronary artery bypass grafting, a valve procedure or combined procedures. Results Overall, hospital mortality was similar in the two groups (4.7% vs 5.3%, P = 0.11). The rates of complications were similar in the two groups.
The deleterious effects of red blood cell (RBC) transfusion are well known [1] and restrictive transfusion practices are safe in patients without active haemorrhage [2]. Our objective was to determine transfusion practices in critically ill patients without evidence of ongoing bleeding, to establish our conformity with published guidelines [3]. Methods All adult ICU patients receiving RBC transfusions between 1 September 2008 and 31 August 2009 were included in the analysis. Data were collected on demographics, APACHE II score, ICU and hospital length of stay (LOS), ICU and hospital mortality, presence of ischaemic heart disease (IHD), and pre/post-transfusion haemoglobin concentrations in g/dl ([Hb]). Subgroup analyses were performed for patients with IHD, age <55 years or APACHE II ≤20. We analysed patients with IHD because benefit from liberal transfusion has not been confirmed [1], whereas the latter two subgroups have shown a mortality benefit with restrictive transfusion [2]. Results A total of 1,723 patients were admitted to the ICU during the study period. Two hundred and five patients (11.9%) received RBCs, of whom 47 had active bleeding and were excluded from further analysis. It is important to restrict transfusions to limit morbidity and mortality and to make efficient use of RBCs [4]. This study demonstrates the importance of regular audit and will be used to inform local guidelines.
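Transfusion triggers such as those compared in the randomized trial above reduce to a single threshold rule. A minimal sketch in Python (hypothetical helper, not the trial protocol):

```python
RESTRICTIVE_HCT = 24.0  # transfuse below this hematocrit (%), restrictive arm
LIBERAL_HCT = 30.0      # liberal arm threshold

def transfuse_red_cells(hematocrit: float, strategy: str) -> bool:
    """Return True if the strategy's trigger calls for red cell transfusion."""
    threshold = RESTRICTIVE_HCT if strategy == "restrictive" else LIBERAL_HCT
    return hematocrit < threshold

# A hematocrit of 26% triggers transfusion only under the liberal strategy
print(transfuse_red_cells(26.0, "restrictive"))  # False
print(transfuse_red_cells(26.0, "liberal"))      # True
```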
Introduction Transfusion of red blood cells has been used with the objective of improving oxygen delivery to the tissues in the critically ill patient. However, the appropriate hemoglobin (Hb) level is still controversial in septic shock patients [1,2]. The objective of this study was to evaluate the effect of red blood cell transfusion on central venous oxygen saturation (SvcO2) and lactate levels in patients with septic shock at two different levels of Hb. Methods Patients with less than 48 hours of septic shock and Hb levels under 9.0 g/dl were randomized to maintain Hb over 9.0 g/dl (Group 1) or over 7.0 g/dl (Group 2). Before and up to 1 hour after each transfusion, Hb, lactate and SvcO2 were determined. Results were expressed as median (25% to 75% percentile) or percentage and analyzed with the chi-square or Mann-Whitney test. P <0.05 was considered significant. Results Forty-six patients were included, with a total of 74 transfusions, 39 in Group 1 and 35 in Group 2, without any difference in demographic or hemodynamic data between the groups. Before and after Hb levels were 8.5 (8.2 to 8.7) g/dl and 9.4 (9.1 to 9.5) g/dl in Group 1 (P = 0.000), and 6.8 (6.6 to 6.9) g/dl and 7.6 (7.4 to 7.8) g/dl in Group 2 (P = 0.000). A reduction in median lactate after transfusion was found. In patients with altered lactate levels, a reduction was found regardless of the basal levels of hemoglobin (P = 0.002 and P = 0.001 for Groups 1 and 2, respectively). The same was demonstrated in patients with SvcO2 <70% (P = 0.007 and P = 0.000 for Groups 1 and 2, respectively). On the other hand, patients with normal levels of both perfusion parameters did not present a worsening of those variables, either in Group 1 or Group 2. Conclusions Red blood cell transfusion improved tissue perfusion in patients with signs of hypoperfusion regardless of basal levels of Hb. Transfusion did not seem to worsen tissue perfusion in patients without signs of hypoperfusion, even in the group with Hb levels above 9.0 g/dl.
Introduction Uncontrolled haemorrhage is a leading cause of early mortality in trauma. Administration of blood products, packed red cells and fresh frozen plasma (PRBC:FFP), in a ratio of 1:1 is associated with improved mortality in major haemorrhage in adult trauma. However, there is little evidence available for the management of major haemorrhage in the context of paediatric trauma. Methods We retrospectively analysed the local paediatric trauma database of all children following trauma call activation on arrival at the emergency department of The Royal London Hospital, a major urban trauma centre in London. The study period was 15 months, May 2008 to August 2009. During this period, there was no major haemorrhage policy for paediatric trauma within the centre. We collected data on demographic profile, injury severity scores, blood products transfused in the first 24 hours and outcomes. We defined massive transfusion as the requirement for packed red cells >40 ml/kg in the first 4 hours or >80 ml/kg in the first 24 hours. Results Two hundred and twenty-seven children presented to the emergency department during this period following major trauma call activation. The median age at presentation was 10.2 years. Thirteen (5.7%) children had major haemorrhage. The median ISS was 35 (IQR 10 to 60). All but one were male. Three had penetrating trauma, one of whom made it to theatre, but all died. Four had emergency damage control surgery.
Abnormal results were seen in three patients, each having one abnormal result (INR = 1.9 with APTT = 86; low Hb = 7.6; thrombocytopaenia = 63). Eight of the 13 patients received additional blood products such as fresh frozen plasma, platelets and cryoprecipitate. However, no patient received blood products in the PRBC:FFP ratio of 1:1 as practised in adult trauma. Two patients had no admission bloods done. Worsening coagulation parameters were seen in the two patients in whom they were measured post transfusion, and the remaining 11 patients did not have routine monitoring of blood parameters post transfusion. Eight (62%) patients died, of whom seven died in the emergency department. Conclusions Major haemorrhage is associated with a very high mortality in severely injured children. We recommend rigorous monitoring of laboratory parameters to guide appropriate administration of blood products. There is a need to institute a major haemorrhage policy in paediatric trauma and to consider point-of-care testing of blood parameters.
Methods Twenty-one pigs were anesthetized, instrumented and randomized into three groups: Control, HES and GEL. Animals in the HES and GEL groups were subjected to acute isovolemic anemia (AIA) to a target hematocrit of 15%, with volume replacement performed with HES 130/0.4 and GEL at a 1:1 ratio. The withdrawn blood was returned to the animals 120 minutes after the end of AIA. TNF, IL-1, IL-6 and IL-10 measurements were performed on blood samples collected from the femoral vein at the following time points: baseline, after instrumentation (INST), immediately after AIA (AIA), 60 minutes after AIA (60AIA), 120 minutes after AIA (120AIA), 60 minutes after blood infusion (60BI) and 120 minutes after blood infusion (120BI). The cytokines were measured in serum by a commercially available ELISA with specific monoclonal antibodies. Two-way analysis of variance (ANOVA) and a Tukey test were used to assess significance (P <0.05). Results TNF varied significantly among the groups at different time points (AIA: Control, 91 ± 16 and GEL, 221 ± 61; P <0.05; 60AIA: Control, 76 ± 15; HES, 172 ± 42; P <0.05 and GEL, 323 ± 77; P <0.001; 120AIA: Control, 98 ± 6 and GEL, 304 ± 66; P <0.01; 120BI: Control, 71 ± 7 and GEL, 211 ± 71; P <0.05). For IL-1, only the GEL group (224 ± 56) showed statistical significance, at time point 60AIA compared with Control (84 ± 17; P <0.05). For IL-6, GEL showed a statistically significant difference at 60AIA (GEL, 331 ± 65) compared with Control (146 ± 19; P <0.05). Serum IL-10 levels in GEL-treated animals were statistically significantly elevated at 60AIA (Control, 17 ± 3 and GEL, 46 ± 12; P <0.01) and 120BI (Control, 21 ± 5 and GEL, 59 ± 11; P <0.05). Conclusions Fluid replacement influences cytokine levels during acute isovolemic anemia, as expressed by serum TNF, IL-1, IL-6 and IL-10 increases, especially in the GEL group.
Introduction Community-acquired sepsis at an early stage is common, but the haemodynamic alterations remain unclear. The aim was to characterize cardiovascular alterations in patients of our ProFS study, a monocentric observational study characterizing patients with sepsis in the emergency department. Methods Systemic vascular resistance (SVR) and cardiac output (CO) were measured non-invasively using a TaskForce monitor (CNSystems, Graz, Austria) after admission, at 24 hours and at 72 hours. Indexed values were calculated (SVRI (dyn·second/cm⁵/m²), CI (l/minute/m²)).
Procalcitonin (PCT, ng/ml) was measured in serum. Results A sample of 64 of the 208 included patients received haemodynamic examination. Mean age was 61.8 ± 18.0 years, and 62.7% were male. Patients were divided by PCT <2 and ≥2. Age, gender and previous medical history were comparable in both groups. The heart rate was 99.8 ± 21.6 vs 104.6 ± 23.0/minute (P = NS) and the mean arterial pressure was 89.5 ± 15.6 vs 81.6 ± 21.5 mmHg (P <0.01). At the time of admission, mean SVRI was 2,934 ± 1,045 in patients with PCT <2 vs 2,376 ± 842 in patients with PCT ≥2 (P <0.05). No difference was found after 24 hours (2,959 ± 1,002 vs 2,924 ± 1,324, P = NS) or 72 hours (3,123 ± 931 vs 3,556 ± 1,524, P = NS). In contrast, for patients with PCT ≥2 the increase after 72 hours was significant (P <0.05). No differences after admission were observed for CI between patients with PCT <2 vs ≥2. Mean values after admission were 2.7 ± 1.0 vs 2.8 ± 0.8, after 24 hours 2.5 ± 0.8 vs 2.5 ± 0.5, and after 72 hours 2.3 ± 0.7 vs 2.3 ± 0.6 (all: P = NS). See Figure 1. Conclusions Patients with community-acquired sepsis in the emergency department had an elevated SVRI. At the time of admission, patients with high PCT had a significantly lower SVRI than patients with low PCT. The cardiac index at the time of admission was at the lower limit of the normal range in all patients. These findings are in strong contrast to the classic pattern of sepsis in the ICU, where the SVRI is markedly reduced and the CI elevated. They imply that patients with sepsis in the emergency department may benefit more from fluids and positive inotropic agents than from vasopressors.
Introduction A multicentre, prospective, observational study was conducted in four intensive therapy units (ITUs) in India from June 2006 to June 2009 to determine the incidence and outcome of severe sepsis among adult patients. Methods All patients admitted to the ITUs were screened daily for SIRS, organ dysfunction and severe sepsis as defined by the ACCP and SCCM. Patients with severe sepsis were further studied. Results A total of 5,478 ITU admissions were studied. SIRS with organ dysfunction was found in 1,385 (25%) patients, of which 731 (52.77%) episodes were due to sepsis. The incidence of severe sepsis was 16.45% of all admissions. The mean age of the study population was 58.17 years (SD 18.66), and 57.71% were male. The median APACHE II score was 13 (IQR 13 to 14), with predominantly (90.93%) medical admissions. ITU mortality of all admissions was 12.08%, and that of severe sepsis was 59.26%. Hospital mortality and 28-day mortality of severe sepsis were 65.2% and 64.6%, respectively. The standardized mortality ratio of severe sepsis patients was 1.45. The median duration of ITU stay for the severe sepsis cohort who survived was 13 days (IQR 11 to 17). Infection was the primary reason for ITU admission in 86.32% of episodes. Culture positivity was found in 61.6%. The lung was the predominant source of sepsis (57.45%). Gram-negative organisms were responsible for 72.45% of cases and Gram-positive for 13.13%; the rest were parasitic, viral and fungal infections. Conclusions Severe sepsis was common in Indian ITUs. ITU mortality was higher compared with the western literature. Gram-positive infections were less common, although the incidences of parasitic and viral infection were higher than in the West.
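A standardized mortality ratio such as the 1.45 quoted above is observed deaths divided by the deaths expected from a severity model (for example, APACHE II predicted risk). A minimal sketch with made-up numbers:

```python
# Hypothetical per-patient predicted death risks from a severity score
# (illustrative values only, e.g. APACHE II-derived probabilities)
predicted_risks = [0.35, 0.60, 0.20, 0.75, 0.50]
observed_deaths = 3

expected_deaths = sum(predicted_risks)  # 2.4 in this toy example
smr = observed_deaths / expected_deaths
print(f"SMR = {smr:.2f}")  # 1.25; a value above 1 means more deaths than predicted
```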
Methods In a randomised, prospective trial, 807 paediatric patients admitted to the interdisciplinary PICU of a tertiary university hospital were assigned to either a control or an interventional group, the latter receiving inline filtration (infusion filter Pall ELD96LLCE/NOE96E, Braun Intrapur Lipid/Intrapur Neonat Lipid) throughout the whole infusion therapy. Prior to this study, the infusion regimen was optimised to prevent precipitation and incompatibilities of solutions and drugs. Primary objectives included a reduction in the incidence of sepsis, thrombosis, SIRS, organ failure (liver, lung, kidney, circulation) and mortality. Results Eight hundred and seven children (343 female, 464 male) with a heterogeneous background of underlying diagnoses were included, with a Gaussian distribution, in either the control group (406 patients) or the inline filtration group (401 patients). According to the study criteria, a significant reduction in the incidence of SIRS for the interventional group (95% CI, 145 versus 200 patients, P <0.001) was evident. No differences were demonstrated for the occurrence of sepsis, thrombosis, organ failure (liver, lung, kidney, circulation) or mortality between the control and interventional groups. Conclusions The occurrence of SIRS often complicates treatment in intensive care medicine. Inline filtration is effective in reducing the incidence of SIRS and offers a novel therapeutic option.
Introduction Intravenous catheter-related bloodstream infections (ICR-BSI) are a major contributing factor to in-hospital mortality and morbidity, extending the inpatient stay by 10 days and expenditure per patient by £2,000 to £30,000 [1]. A prospective survey was conducted in our unit on all patients with central venous catheters to ascertain the incidence of ICR-BSI, identify the organisms and determine the occurrence of infection from the various sites: femoral, internal jugular and subclavian lines. The survey was carried out over a period of 13 weeks. Data collected from patients' case notes included site of central line insertion, length of time the line was in situ, reason for line removal and positive blood culture reports. Results During the study period, 104 patients were treated on the unit. Fifty-two central venous lines were inserted in 36 patients (63.5% femoral (n = 33), 32.7% internal jugular (n = 17) and 3.9% subclavian lines (n = 2)). The lines were reviewed daily and removed if indicated clinically (pyrexia or raised white cell count) or if no longer required. A total of 51.5% of femoral lines (n = 17) were removed due to clinical indications, as were 29% (n = 5) of internal jugular and 50% (n = 1) of subclavian lines. The average duration of a line remaining in situ was 4.5 days for femoral, 6 days for internal jugular and 5 days for subclavian lines. Blood cultures were taken at the time of line removal. These yielded positive results in eight femoral, seven internal jugular and one subclavian line. Our survey indicated that the incidence of ICR-BSI in our unit is 30.8% (of this, 62.5% coagulase-negative staphylococci (CNS), 12.5% each E. coli and Pseudomonas, and 6.25% each MSSA and MRSA). Conclusions The distribution of microorganisms causing bacteraemia in our unit is broadly similar to that in other teaching hospitals in the UK [2], in that CNS was the commonest organism isolated. However, E. coli and Pseudomonas were the next most common organisms, unlike other units where Staphylococcus aureus was the second most prevalent organism.
The incidence of bacteraemia from femoral lines (53.7/1,000 catheter-days) was lower than that from internal jugular lines (68.6/1,000 catheter-days), possibly due to a higher index of suspicion in the case of femoral lines and their earlier removal (Figure 1). Our study highlights the fact that femoral lines, which are often the safest option for unstable patients with head injury, can be effectively managed with strict adherence to guidelines to reduce ICR-BSI.
Introduction The central venous catheter (CVC) is indispensable in the ICU. Catheter-related bloodstream infection (CR-BSI) is the leading cause of healthcare-associated infections and one of the most important complications of the CVC. Nowadays there are two well-known ways to diagnose CR-BSI: the standard method (withdrawal of the catheter) and the conservative method (without withdrawal of the catheter). Our objective was to compare the in-hospital mortality between the two methods in patients with CR-BSI (short-term catheter) in the ICU. Methods This study was conducted in a 38-bed mixed ICU in a tertiary hospital. We reviewed all episodes of CR-BSI that occurred in our ICU from January 2000 to December 2008. The standard method was defined as a patient with a CVC with at least one positive blood culture obtained from a peripheral vein and a positive semiquantitative (>15 CFU) catheter segment culture, whereby the same organism (species and antibiogram) was isolated from the catheter segment and peripheral blood. The conservative method was defined as a patient with a CVC with at least one positive blood culture obtained from a peripheral vein and one of the following: a differential time to positivity of the CVC culture versus the peripheral culture of more than 2 hours, or simultaneous quantitative blood cultures with a ratio ≥5:1 (CVC versus peripheral). Results During the study period, 247 episodes of bloodstream infection were identified; of these, 192 were catheter-associated bloodstream infections and 55 were CR-BSI (39 standard method and 17 conservative method). Considering the CR-BSI patients, the mean age ± standard deviation (SD) was 64 ± 19.29 years, 75% were under mechanical ventilation, 55% were on vasopressors, 40% were on total parenteral nutrition, 40% were on hemodialysis, 72% of catheters were double lumen and the mean time ± SD in place was 16.32 ± 8.56 days. The in-hospital mortality of the standard method versus the conservative method did not show any difference (57% vs 75%, P = 0.208). Conclusions This study showed that there is no difference in in-hospital mortality between the standard and the conservative method in ICU patients with CR-BSI.
Introduction In previous guidelines of the Centers for Disease Control and Prevention (CDC) [1], and in the recently published guidelines of the Society for Healthcare Epidemiology of America/Infectious Diseases Society of America (SHEA/IDSA) [2], it is recommended to avoid the femoral access to reduce the risk of catheter-related bacteremia (CRB). However, these guidelines contain no recommendations on the catheter site in the presence of tracheostomy, and we have not found data on the incidence of CRB comparing a central jugular site with tracheostomy and a femoral site. The objective of this study was to determine whether a jugular site with tracheostomy may have a higher risk of CRB than a femoral site.
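Both the survey above and the study that follows express infection risk per 1,000 catheter-days; the calculation is events divided by catheter-days, scaled. A minimal sketch using the figures reported in the next abstract:

```python
def rate_per_1000_catheter_days(events: int, catheter_days: int) -> float:
    """Catheter-related infections per 1,000 catheter-days."""
    return 1000.0 * events / catheter_days

femoral = rate_per_1000_catheter_days(16, 1679)       # about 9.5 (reported as 9.52)
jugular_trach = rate_per_1000_catheter_days(10, 462)  # about 21.6 (reported as 21.64)
print(f"risk ratio = {jugular_trach / femoral:.2f}")  # 2.27, matching the reported ratio
```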
Methods A prospective, observational, 4-year study (from 1 May 2000 to 30 April 2004) was carried out in the medical-surgical ICU of the University Hospital of the Canary Islands (Tenerife, Spain). We included all patients undergoing central jugular catheterization with tracheostomy or femoral venous catheterization. During the study period, 208 catheters were used at the femoral site and 52 catheters at the central internal jugular site with tracheostomy. There were no significant differences between patients with central jugular access with tracheostomy and femoral access in age, sex, APACHE II score, diagnosis group, use of mechanical ventilation, use of antimicrobials, use of total parenteral nutrition, use of pulmonary artery catheter, and duration of the catheter. Results We diagnosed 16 CRB in 208 femoral catheters during 1,679 days of catheterization and 10 CRB in 52 central internal jugular catheters with tracheostomy during 462 days of catheterization. The incidence of CRB was higher at the central internal jugular site with tracheostomy than at the femoral site (21.64 vs 9.52 per 1,000 catheter-days; risk ratio = 2.27; 95% confidence interval = 1.04 to 4.97; P = 0.04). Conclusions The femoral site could be considered a safer venous access than the central internal jugular site in patients with tracheostomy to minimize the risk of CRB. Introduction The central venous catheter (CVC) is very useful in the ICU. A comparative study between antiseptic-impregnated and standard catheters is therefore of great value. Methods Alternating the type of CVC used in each patient, we recorded for each patient: sex, age, APACHE II score, GCS, site of the puncture, reason for withdrawal of the catheter and the type of catheter used. The tip of the catheter was cultured. The groups were divided as follows: group I (41 patients, 54 punctures) used the standard CVC, and group II (38 patients, 54 punctures) used the impregnated CVC. Results Sixty-two patients were included (48.38% male). We studied 108 periods, of which 54 were standard CVCs and 54 were impregnated CVCs. The average length of stay of the catheter was higher for impregnated CVCs (14.11 days) than for standard CVCs (10.7 days). Excluding deaths in both groups, the length of stay of the catheter in group I was 10.86 days, compared with 15.43 days in group II. Adding all periods of catheterization, group I had a total of 578 days and group II 762 days; the total duration for group II was 31.84% higher than for group I. Regarding the reason for withdrawal of the CVC, suspected infection predominated: 77.8% of the time for standard CVCs, and 49.1% of the time for impregnated CVCs. The culture of the catheter tip was positive in 10 (18.5%) standard CVCs, against eight (15.1%) CVCs in the impregnated group. Most patients had GCS <9. The average APACHE II score was 17.97 in group I, compared with 19.63 in group II. The predominant site of puncture was the subclavian vein (56.48%). See Figure 1. Conclusions According to our study, the length of stay of the impregnated CVC was higher (15.43 days). The rate of colonization was higher with the standard CVC. Patients who require a CVC for long periods benefit from the use of impregnated CVCs. Introduction The surgical strategy for infection control in complicated intra-abdominal infections is alone responsible for a 40% reduction in mortality. The second-look strategy in tertiary peritonitis reduces the mortality and morbidity of these patients.
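The catheter-day figures in the tracheostomy study above allow the reported incidences and risk ratio to be reproduced directly. The sketch below uses the standard log-method confidence interval for a person-time rate ratio; the abstract's interval (1.04 to 4.97) may have been computed with a slightly different method, so this is an approximation, not the authors' calculation.

# CRB rates per 1,000 catheter-days and rate ratio, from the counts above.
from math import exp, log, sqrt

crb_jug, days_jug = 10, 462     # jugular with tracheostomy
crb_fem, days_fem = 16, 1679    # femoral

rate_jug = 1000 * crb_jug / days_jug    # 21.6 per 1,000 catheter-days
rate_fem = 1000 * crb_fem / days_fem    # 9.5 per 1,000 catheter-days
rr = rate_jug / rate_fem                # about 2.27, as reported

se = sqrt(1 / crb_jug + 1 / crb_fem)    # SE of the log rate ratio
ci_low, ci_high = exp(log(rr) - 1.96 * se), exp(log(rr) + 1.96 * se)
print(f"RR = {rr:.2f} (95% CI {ci_low:.2f} to {ci_high:.2f})")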
Vacuum-assisted closure therapy can potentially decrease the concentration of proinflammatory cytokines and the bacterial count, while allowing management of third-space fluid and improved input and output monitoring. Methods The authors present a review of 60 patients with tertiary peritonitis admitted to the ICU after surgical infection control. All patients had septic shock at admission. Twelve percent of them were admitted with an open abdomen, and vacuum-assisted closure therapy was used. The parameters evaluated were hospital mortality, ICU stay (days) and time on mechanical ventilation (days). The patients submitted to vacuum-assisted closure therapy were similar to patients in whom the abdominal wall was closed regarding the risk factors for peritoneal infection (corticotherapy, oncologic disease, renal insufficiency, hepatic insufficiency, malnutrition, hypoalbuminemia, a high APACHE II score). Results The patients submitted to vacuum-assisted closure therapy had a longer ICU stay and time on mechanical ventilation but lower hospital mortality. Conclusions Vacuum-assisted closure therapy is superior to primary abdominal wall closure in patients with tertiary peritonitis. Introduction The Sequential Organ Failure Assessment (SOFA) score is a widely used method to describe organ dysfunction/failure in critically ill patients. Although the majority of studies use the SOFA score in outcome research, little is known about its correlation with left ventricular (LV) function in patients with severe sepsis and septic shock. The purpose of this study was to evaluate the correlation of the SOFA score with LV function assessed by tissue Doppler imaging in this patient population. Methods Sixty patients (age 66 ± 15 years, men n = 28) admitted to the ICU with severe sepsis or septic shock were prospectively enrolled. Transthoracic echocardiography was performed within 24 hours of admission to the ICU. We measured the LV end-diastolic volume index (LVEDVI), LV end-systolic volume index (LVESVI), LV ejection fraction (LVEF), mitral deceleration time of E velocity (Dct) and myocardial tissue Doppler strain imaging profiles at the basal, mid and apical portions of the LV septal wall in the LV apical four-chamber view. On the ICU admission day, the SOFA scores were calculated. Significantly correlated parameters were subjected to linear regression analysis. P <0.05 was considered significant. Results The mean heart rate was 90 ± 17 bpm; mean LV volumes, mean LVEF and mean Dct were within normal limits (LVEDVI 44.6 ± 15.4 ml/m2, LVESVI 18.7 ± 11.5 ml/m2, LVEF 58.9 ± 15.7%, Dct 175.8 ± 50.2 ms). The mean peak systolic strain measurements were -16.0 ± 5.0% (basal -18.1 ± 7.0%, mid -15.7 ± 5.6%, apical -14.1 ± 6.6%). The mean SOFA score was 10.8 ± 3.9. Linear regression analysis showed correlation between the SOFA score and LVEF (r = -0.40, P = 0.02), Dct (r = -0.34, P = 0.007), mean septal strain (r = 0.40, P = 0.002), mid-septal strain (r = 0.43, P = 0.0007) and apical septal strain (r = 0.36, P = 0.006). Conclusions These results suggest that patients with higher SOFA scores had lower LVEF, lower LV systolic strain and higher LV filling pressure in severe sepsis or septic shock. Assessment of left ventricular longitudinal contraction by tissue Doppler strain imaging may serve as a useful tool to evaluate multiple organ failure in this patient population. Introduction Myocardial dysfunction is reportedly a common complication of sepsis that requires specific management.
Although frequently reported in the literature, the clinical spectrum and frequency of this organ failure have not been fully appreciated. We sought to determine the frequency of myocardial dysfunction in severe sepsis and septic shock and to describe the clinical spectrum of this entity with transthoracic echocardiography. Methods Prospective single-center study capturing all patients admitted to the ICU with severe sepsis or septic shock from May 2007 to January 2009. All patients enrolled underwent comprehensive transthoracic echocardiography on admission. The exclusion criteria were age <18 years, pregnancy, and documented ischemic, valvular or congenital heart disease. All patients with LV systolic dysfunction, defined as LVEF <50%, received a repeat echocardiographic study at 5 days or upon discharge from the ICU. Results One hundred and six patients were enrolled; mean age was 65, and 50% were female. The mean SOFA score was 11. Central venous oxygen saturation was less than 70% in 37% of patients. Twenty-nine patients (27%) had global LV systolic dysfunction (14 mild, nine moderate, six severe), 33 patients (31%) had RV dysfunction (18 mild, nine moderate, six severe), 14 patients (13%) had biventricular involvement and 39 patients (37%) had diastolic filling abnormalities. Of the 29 patients with LV systolic dysfunction on initial examination, 28 received a follow-up echocardiogram and 96% of these patients (n = 27) improved in all parameters. Thirty-day mortality was 39%, and 6-month mortality 52%. Myocardial dysfunction did not predict mortality. Conclusions These results confirm that myocardial dysfunction is frequent in patients with severe sepsis and septic shock and has a broad spectrum. Right ventricular involvement and diastolic abnormalities should be considered part of the clinical spectrum of this entity. Myocardial dysfunction correlated poorly with mortality and was reversible regardless of presentation. We should not focus only on left ventricular ejection fraction to diagnose myocardial dysfunction in sepsis, since only a small portion of these patients had isolated LV systolic dysfunction. In sepsis, microcirculatory blood flow is impaired. Intravenous nitroglycerin (NTG) seems to have no effect on sublingual microcirculation. Introduction Elevated serum levels of cardiac-specific troponin I (cTnI) are claimed to be associated with the degree of sepsis-related cardiac dysfunction and with the disease outcome in the critically ill. We tested this association in a specific group of ICU patients with post-traumatic sepsis. Methods A prospective observational study was conducted in an ICU of a level I-equivalent trauma center between September 2008 and August 2009. No exclusion criteria were adopted. Serum cTnI concentrations were collected at peak sepsis-related cardiovascular dysfunction (defined by the highest pressure-adjusted heart rate (PAR) in the MODS score and by the inotrope dose). Data were gathered on demographics, severity of the inflammatory response (SIRS criteria) and organ dysfunction (MODS score), presence of superimposed acute coronary events, other potential reasons for cTnI elevation, and the disease outcome. Intergroup comparisons were made with Student's t test or nonparametric alternatives. The prognostic utility of cTnI was examined by the construction of ROC curves and its levels were categorized on the basis of the defined cut-off value.
Probable associations between these categories and the severity of organ dysfunction, as well as the outcome, were examined with Fisher's exact test. Results A total of 92 polytrauma patients were enrolled. The mean values (with 95% confidence intervals) of PAR, MODS and SIRS were 21.26 ± 2.72, 6.99 ± 2.94 and 3.75 ± 1.10, respectively. Blunt cardiac injury had been diagnosed in 20 patients (21.7%) on admission. Three patients (3.3%) developed acute coronary events during their ICU stay and 12 patients (13.2%) had other potential reasons for cTnI elevation. Significantly higher cTnI concentrations were found in the critically ill with myocardial contusion and with acute coronary events. However, the presence of blunt cardiac injury had no measurable effect on outcome. Troponin I levels were significantly higher in nonsurvivors (P = 0.00) and very good prognostic performance of cTnI was found (AUROC = 0.82; cut-off value = 0.05 ng/ml). The subsequent comparison between its categorized values and those of PAR and MODS showed a strong positive association. Elevated cTnI concentrations (>0.05 ng/ml) significantly increased the risk of death (RR = 5.58; 95% CI = 1.67 to 20.79; P = 0.002). Conclusions In a group of patients with post-traumatic sepsis, elevated serum cTnI levels proved to be a good marker of severe organ (including cardiovascular) dysfunction. They also showed reasonable prognostic value regarding the disease outcome. Methods Left ventricular ejection fraction (EF) was measured using Simpson's rule, and the early peak diastolic relaxation velocity of the septal mitral annulus (E', cm/second) was measured using tissue Doppler imaging. The APACHE II score was calculated at admission. By APACHE II score, two groups were formed: a low APACHE II score (LAS) was defined as 0 to 14 points and a high APACHE II score (HAS) as ≥15 points. In 65 of 208 patients an echocardiographic examination was performed. Results Age (61.0 ± 18.4 vs 63.5 ± 16.9 years, P = NS), male sex (64.5% vs 58.8%, P = NS) and mean arterial pressure (86.2 ± 15.7 vs 85.7 ± 21.2 mmHg, P = NS) were comparable, whereas patients with HAS had a higher heart rate (96.5 ± 24.5 vs 108.2 ± 17.1/minute, P <0.05). At admission, patients with HAS had a significantly lower EF than patients with LAS (55.1 ± 7.0 vs 50.5 ± 10.1%, P <0.05). No difference could be observed after 24 hours (53.6 ± 7.2 vs 52.4 ± 7.%, P = NS) and 72 hours (53.8 ± 7.3 vs 54.3 ± 7.3, P = NS); see Figure 1. Patients with HAS showed a lower initial E' than patients with LAS (6.2 ± 2.1 vs 5.1 ± 2.1, P <0.05) and after 24 hours (6.1 ± 2.2 vs 5.1 ± 1.3, P <0.05). Three days after admission, E' values in patients with LAS and HAS were comparable (5.6 ± 2.0 vs 5.7 ± 2.1, P = NS). Conclusions In patients with community-acquired sepsis and an APACHE II score ≥15 points, a significant depression of both systolic and diastolic function could be observed. After 24 hours the systolic function, and after 72 hours the diastolic function, of patients with APACHE II scores <15 and ≥15 points were similar. This could possibly be due to alterations in afterload. Methods Following a 10 mmHg drop in mean arterial pressure (MAP) from baseline, sheep were randomly assigned to receive intravenous infusions of either POV, AVP or the vehicle (0.9% NaCl; n = 6 each). POV and AVP were titrated to keep MAP above baseline minus 10 mmHg. All sheep were awake, mechanically ventilated and fluid resuscitated to maintain hematocrit at baseline ±3%. Data are expressed as mean ± SEM at 24 hours. Fluids are represented as the average over time to account for individual survival times.
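For readers unfamiliar with the risk-ratio figure quoted in the cTnI abstract above (RR = 5.58; 95% CI = 1.67 to 20.79), the sketch below shows the standard log-method computation from a two-by-two table. The counts used here are hypothetical placeholders, since the abstract does not report the underlying table.

# Relative risk with a log-method 95% CI from a 2x2 table.
from math import exp, log, sqrt

def rr_with_ci(a, b, c, d, z=1.96):
    """a, b: deaths, survivors with elevated cTnI; c, d: with low cTnI."""
    rr = (a / (a + b)) / (c / (c + d))
    se = sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))
    return rr, exp(log(rr) - z * se), exp(log(rr) + z * se)

# Hypothetical counts, for illustration only (not the study's data).
print(rr_with_ci(20, 25, 4, 43))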
Results Both treatment strategies significantly reduced fluid input (AVP: 7.7 ± 0.6 ml/kg/hour; POV: 5.8 ± 0.8 ml/kg/hour), positive net fluid balance (AVP: 4.9 ± 0.5 ml/kg/hour; POV: 0.4 ± 0.6 ml/kg/hour), thoracic (AVP: 0.8 ± 0.2 ml/kg/hour; POV: 0.3 ± 0.1 ml/kg/hour) and abdominal fluid (AVP: 0.10 ± 0.04 ml/kg/hour; POV: 0.01 ± 0.01 ml/kg/hour) as well as extravascular lung water (AVP: 22.6 ± 1.4 ml/kg; POV: 15.8 ± 1.9 ml/kg), and increased plasma oncotic pressures (AVP: 12.5 ± 0.9 mmHg; POV: 15.9 ± 1.1 mmHg) vs control animals (9.6 ± 0.4 ml/kg/hour; 7.7 ± 0.4 ml/kg/hour; 2.2 ± 0.4 ml/kg/hour; 0.29 ± 0.13 ml/kg/hour; 22.9 ± 2.8 ml/kg; 8.9 ± 0.1 mmHg, respectively). Notably, POV was superior to AVP in all these variables (P <0.05 each). Myocardial contractility was higher in the POV group vs AVP and control animals, as suggested by higher left ventricular stroke work indexes (66 ± 3 vs 48 ± 7 and 43 ± 3 g/m/m2, respectively; P <0.05 each) at lower or similar global end-diastolic volumes. Pulmonary dysfunction (decrease in the PaO2/FiO2 ratio) was attenuated in POV animals (364 ± 47 mmHg) as compared with AVP (195 ± 66 mmHg; P <0.05) and control animals (213 ± 76 mmHg; P <0.05). Whereas one AVP and three control animals had to be sacrificed prematurely, all POV-treated animals survived the 24-hour study period. Conclusions In MRSA pneumonia-induced ovine septic shock, the selective V1a-receptor agonist POV is superior to the mixed V1a/V2-receptor agonist AVP in reducing vascular leakage and cardiopulmonary dysfunction when administered as a first-line vasopressor. Introduction Improvements in survival for acute myocardial infarction, trauma and stroke have been realized through continuous quality improvement (CQI) initiatives that include early identification and implementation of time-sensitive therapies at the earliest stage of disease presentation. Challenges remain in the management of severe sepsis and septic shock with regard to implementation of an early sepsis resuscitation bundle (RB) in order to maximize outcome benefit. We examined the impact of a hospital-wide CQI initiative on septic RB compliance and outcome and investigated the mortality benefits beyond the 6-hour recommendation period. Methods Prospective cohort study of patients who met the definition of severe sepsis or septic shock over an 18-month period. The current sepsis RB comprised MAP ≥65 mmHg, CVP ≥8 mmHg and SvO2 ≥70% goals met within 6 hours and 18 hours of presentation. The 498 severe sepsis and septic shock patients enrolled were examined to determine the time period within which the RB would still be effective. Using a time cut-off, the Compliers at 18 hours and the NonCompliers at 18 hours were compared. Results There were 202 patients who completed the RB within 18 hours (Compliers at 18 hours) and 296 patients who never completed the RB within 18 hours (NonCompliers at 18 hours). The Compliers at 18 hours had a significant 10.2% lower hospital mortality at 37.1% (22% relative reduction) compared with the NonCompliers at 18 hours hospital mortality of 47.3% (P <0.03). Adjusting for differences in baseline illness severity, the Compliers at 18 hours had a greater reduction in predicted mortality (26.8% vs 9.4%, P <0.01). Compliance started at 30.4% and finished at 63%. Conclusions A CQI initiative for severe sepsis and septic shock with particular emphasis on the RB significantly improved bundle compliance and decreased hospital mortality.
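The mortality figures in the CQI abstract above illustrate the distinction between absolute and relative risk reduction; the arithmetic is reproduced below, with the number needed to treat added as a derived, illustrative quantity that the abstract itself does not report.

# Absolute vs relative mortality reduction for the 18-hour bundle compliers.
mort_noncompliers = 0.473          # 47.3% hospital mortality
mort_compliers = 0.371             # 37.1% hospital mortality

arr = mort_noncompliers - mort_compliers   # absolute reduction: 10.2 points
rrr = arr / mort_noncompliers              # relative reduction: about 22%
nnt = 1 / arr                              # derived NNT, about 10 (illustrative)
print(f"ARR = {arr:.1%}, RRR = {rrr:.0%}, NNT = {nnt:.0f}")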
It has previously been shown that compliance with the RB within 6 hours improves outcome; however, when the time of bundle completion is extended to 18 hours, the mortality benefits are still significant. Introduction To investigate the myocardial preservation effect of early goal-directed therapy (EGDT) in severe sepsis/septic shock patients in the ICU. Methods This is a prospective, randomized controlled study in which a total of 158 severe sepsis/septic shock patients from the ICU were randomly assigned to two groups (EGDT group, n = 81 and control group, n = 77). The concentrations of serum cardiac troponin I (cTnI) and high-sensitivity C-reactive protein (hs-CRP) and the APACHE II score were obtained on days 0, 3, 7 and 14 after fluid resuscitation therapy. Results The levels of cTnI were the same between the EGDT and control groups on day 0 (patients with elevated cTnI in the two groups: 0.41 vs 0.42; patients with normal cTnI: 0.05 vs 0.04; P >0.05). There was a dramatic decrease in cTnI among the elevated-cTnI patients of the EGDT group, in whom cTnI returned to normal after EGDT (day 14: 0.08 vs 0.16, P <0.05), but only a small difference between the groups in patients with normal cTnI (P >0.05). The level of hs-CRP changed in parallel with cTnI (P <0.05), and there was a positive correlation between cTnI and hs-CRP at each time point (P <0.05); APACHE II scores decreased markedly in the EGDT group (P <0.05). Meanwhile, the EGDT group had a markedly higher 28-day survival rate and longer survival time than the control group (74.1% vs 55.8%; 23.5 vs 19.6 days, P <0.05). Conclusions EGDT has a myocardial preservation effect and improves the survival rate of severe sepsis/septic shock patients in the ICU. Introduction Sepsis mortality rates in Brazil are high [1, 2]. Many studies have already shown that implementation of education programs based on the Surviving Sepsis Campaign 6-hour (6h SSC) and 24-hour bundles successfully reduces sepsis mortality. The objective of this study was to evaluate the impact of the SSC-based program on patient outcome in Brazil. Methods Severe sepsis and septic shock patients were included in voluntarily participating hospitals that had sent data for at least 2 years. The intervention was based on strategies aimed at improving compliance with the bundles. Regular reports containing compliance and mortality data were sent to the hospitals as a basis for local strategies to improve care. Results are presented by the semester of hospital inclusion in the campaign through 2 years, regardless of when those months occurred in the time frame of the study. Results Compliance with the 6h SSC increased from the first to the fourth semester as follows: lactate, 65.0% to 71.5%, P = 0.01; blood culture, 42.5% to 50.3%, P = 0.08; antibiotics, 38.9% to 45.3%, P = 0.02; achieving mean arterial pressure >65 mmHg, 43.7% to 75.4%, P <0.001; CVP achievement, 14.6% to 36.6%, P <0.001; SvO2 achievement, 10.4% to 22.7%, P <0.001; and completion of the whole bundle, 8.3% to 9.0%, P = 0.34. A 7% absolute reduction (relative reduction of 11.6%) in hospital mortality was found in the third semester of the campaign (OR 1.33 (1.04 to 1.69), P = 0.01). However, in the fourth semester mortality increased again to 60.6%. Conclusions Implementation of the SSC in Brazil has been associated with increased bundle adherence.
However, the impact on the mortality rate seems not to be sustained, probably due to the difficulty of maintaining an education program. It is important to develop strategies to increase motivation in Brazilian hospitals. Introduction Noradrenaline and dopamine are the standard catecholamines used in the treatment of septic shock. Loss of response after large continuous doses of noradrenaline is a common problem leading to patient loss, and has been termed catecholamine-refractory septic shock. Recently, vasopressin and its analog terlipressin have been used in the treatment of this catastrophic condition. Methods In a prospective controlled study we included 40 patients with catecholamine-resistant septic shock (that is, noradrenaline dose exceeding 0.6 μg/kg/minute) divided into two groups: 20 patients were treated conventionally according to the Surviving Sepsis Campaign 2008 and served as a control group; the other 20 patients were treated conventionally plus terlipressin at a dose of 1 mg intravenous bolus every 12 hours, started once the noradrenaline dose exceeded 0.6 μg/kg/minute, for a study period of 48 hours. Results Terlipressin therapy was associated with an increase in systemic vascular resistance from 546 ± 260 dyne·sec/cm5 to 986 ± 390 dyne·sec/cm5 after 48 hours, representing normalized arteriolar tone that is expected to allow better organ bed perfusion. There was a reduction of both stroke volume and cardiac output (from 63 ± 16 ml/beat to 51 ml/beat and from 7.8 l/minute to 5.3 l/minute, respectively), yet this was not associated with impaired organ perfusion, as marked by improved urine output (from 49 ml/hour to 133 ml/hour) and improved global perfusion with decreasing lactic acidosis (from 9.3 ± 3 mEq/l to 5.7 ± 3 mEq/l, P <0.002). There was a significant reduction of oxygen delivery (DO2 from 848 ml/minute to 610 ± 47 ml/minute after 48 hours, P <0.02). There was no difference in length of ICU stay between the groups (16 ± 6 days in the terlipressin group and 12 ± 6 days in the control group, P = 0.06). Terlipressin support showed nonsignificantly lower mortality (60% vs 70% in the control group). Regarding organ function, terlipressin improved the SOFA score from 11 ± 3.2 to 8 ± 5 (P <0.02). Conclusions Terlipressin is a rather safe, inexpensive and easy-to-administer alternative in the treatment of septic shock. Further studies are needed to determine the ideal timing for initiation of this therapy: early versus late, as an adjuvant or as an initial treatment. Introduction The benefit of the sepsis resuscitation bundle is less well established in patients with specific co-morbidities such as chronic liver disease. In this study we evaluated whether adherence to the resuscitation sepsis bundle improved the outcome of cirrhotic patients with septic shock admitted to the ICU. Methods The prospective observational cohort study included 38 patients with documented cirrhosis and septic shock admitted to a multidisciplinary ICU at a university hospital from January 2005 to June 2009. In each patient, compliance with four resuscitation interventions recommended by the SSC guidelines (that is, the 6-hour bundle) and the 30-day mortality were measured. Results The 6-hour bundle was completed in 50% of the patients. In these patients the MELD and SOFA scores (39 ± 11 and 18 ± 2) were higher (P <0.05) than those observed in patients without compliance with the 6-hour bundle (31 ± 12 and 15 ± 3). The 30-day mortality was 94.7% and 68.4% (P <0.05) in patients with and without 6-hour bundle compliance, respectively.
A Cox regression analysis, after adjustment for MELD and SOFA scores, indicated that neither the single sepsis interventions nor the 6-hour bundle was independently associated with 30-day survival. Conclusions Adherence to the resuscitation interventions recommended by the SSC evidence-based guidelines did not improve the survival rate of cirrhotic patients with septic shock admitted to our ICU. Recently, there has been further interest in the complications of DAA following a review by Gentry and colleagues suggesting a significantly increased incidence of SBEs and death in those with baseline bleeding precautions [1]. We are one of the largest single-centre users of DAA worldwide and thus are keen to report our experience. Methods In 2002 a scoring system, reporting tool and supporting database were developed to track the outcome of DAA-treated severe sepsis patients in our hospital. This is maintained by the clinical pharmacists and captures data on relative contraindications, adverse events and patient outcome. Classification of data is obtained from the electronic patient record. Any DAA administration outside agreed criteria is recorded, and bleeding incidents are reviewed with a consultant intensivist to confirm categorisation as serious or minor. Conclusions Our results are similar to those of a UK-wide audit of APC usage [1]. The mortality of our patients was higher (56% vs 45%), but our patients were probably sicker (median five organ failures vs three). Based on APACHE, actual outcome appears worse than predicted for patients receiving APC. Using the ICNARC method score, however, outcomes were ... Methods For LPS removal (hemoperfusion) we used an LPS adsorber (Alteco® LPS Adsorber; Alteco Medical AB, Lund, Sweden). The patient's blood was taken before and after adsorption, then the serum was separated and stored at -70°C. We determined the LPS concentration by LAL test; lipopolysaccharide-binding protein (LBP), sIL-1 RII and sCD14 by ELISA kits (Hycult Biotechnology, the Netherlands); and the proinflammatory and anti-inflammatory cytokines IL-6, TNFβ, IL-8, IL-2, INFγ, IL-12, TNFα, IL-4, IL-5, IL-1β and IL-10 by ELISA kit (Bender MedSystems). Results In all cases we saw a reliable reduction of the serum LPS level (1.8 to 2 times). After investigating cytokine levels we obtained similar results: IL-6, TNFβ and IL-8 levels were reduced by 30 to 40%, 87% and 62 to 76%, respectively. At the same time, serum IL-2, INFγ, IL-12, TNFα, IL-4 and IL-5 levels remained practically unchanged. However, in approximately one-third of cases an increase of serum IL-1β and IL-10 concentrations was observed (33% and 40%, respectively). Our research also showed that, in most cases after application of this sorbent, LBP decreased insignificantly (10 to 32%), whereas the sIL-1 RII level increased slightly (10 to 18%) in the majority of patients and decreased considerably (twofold) in one-third of the patients investigated. The CD14 concentration in patient serum showed no reliable change between before and after adsorption. The somatic status of patients stabilized after LPS adsorption. Introduction It has been assumed that pathologic activation of neutrophils and monocytes is associated with sepsis, acute lung injury/ARDS and multiple organ failure, and that removal of these cells from the circulation could reduce leukocyte-dependent tissue injury.
Cartridges containing polymyxin B (PMX) immobilized to fibers (Toraymyxin; Toray Industries, Tokyo, Japan) have been developed for selective adsorption of circulating endotoxin in patients with Gram-negative bacterial infection, and this treatment has proven to be highly effective. This study examined the effect of direct hemoperfusion through filters with immobilized PMX (PMX-DHP) on leukocyte function and plasma levels of cytokines in patients with septic shock. Methods We evaluated the effect of PMX-DHP on circulating leukocytes in patients with septic shock by assessing the changes in neutrophil and monocyte surface antigen expression after PMX-DHP. In another experiment, heparinized blood from patients with sepsis was passed through PMX filters in a laboratory circuit and then changes in the cell count and surface antigen expression of neutrophils and monocytes were assessed. After perfusion, neutrophils were isolated and the capacity of these cells to damage cultured endothelial monolayers was also determined. Results We found that PMX-DHP led to increased CXCR1 and CXCR2 expression along with a decrease of CD64 and CD11b expression by circulating neutrophils from septic patients. Plasma levels of cytokines, including IL-6, IL-8, IL-10 and high-mobility group box-1, were elevated in patients with septic shock compared with healthy controls, but cytokine levels were not altered by PMX-DHP. Ex vivo perfusion of heparinized blood from patients with sepsis through PMX filters in a laboratory circuit caused a significant decrease of the neutrophil and monocyte counts. Activated neutrophils with high CD11b/CD64 expression and low CXCR1/CXCR2 expression showed preferential adhesion to PMX filters. Neutrophils isolated from the blood after ex vivo PMX perfusion caused less damage to the endothelial cell monolayer than cells from sham-treated blood. Introduction Direct and rapid removal of pathogens or noxious metabolites from a patient is the most straightforward cure imaginable. Dialysis and plasma filtration/exchange are the current broadly applicable methods for direct removal of disease-causing factors from a patient. We describe the use of stable nanomagnets to rapidly and selectively remove heavy metal ions, overdosed steroid drugs and proteins from human blood. This nanomagnet-based purification method avoids fouling of filter membranes and benefits from a high external surface area and correspondingly fast diffusion. Methods Nanomagnets equipped with heavy metal complexants, digoxin antibody fragments and entire human IL-6 antibodies were added to a series of blood samples. The nanomagnets scan the blood by Brownian motion and capture their target. Afterwards, a small magnet was placed at the sample tube wall, accumulating the nanomagnets in the pole region of the external magnet. The purified supernatant can then easily be decanted. The concentrations of lead, digoxin and IL-6 in blood samples were determined by standard clinical methods. Blood integrity was monitored by rotational thromboelastography and by serum potassium, lactate dehydrogenase, bilirubin and haptoglobin levels. To measure the biological relevance of the IL-6 removal, the effect on caspase-3 activation was assessed in camptothecin-stimulated neutrophils. Results A significant decrease of lead, digoxin and IL-6 levels was measured after the blood purification procedure.
The extraction using nanomagnets showed a clear dose-effect dependence and could be accurately titrated. Treatment with nanomagnets did not significantly affect the integrity of the blood, and all levels remained within the clinical normal range. Caspase-3 assays showed a reduced anti-apoptotic effect after IL-6 removal, underlining the biological relevance of the achieved removal efficiency. Conclusions We demonstrate the extraction of lead, digoxin and IL-6 from whole blood as examples of the rapid treatment of heavy metal poisoning, drug overdosing and severe inflammation. The presented direct blood extraction could be combined with existing therapeutic strategies and may have major implications for the treatment of severe intoxications, sepsis (specific filtering of cytokines or toxins) [1], metabolic disorders (thyrotoxicosis) and autoimmune diseases. Methods Fifty-five septic patients were enrolled in this study. Every patient had four CPFA treatments (LINDA; Bellco-Mirandola, Italy) of 8 hours each with Qb = 200 ml/minute, Qultrafiltration = 30 ml/kg/hour and Qplasma = 20% of Qb. At T0 (basal), T1 (after the first cycle), T2 (after the second cycle), T3 (after the third cycle) and T4 (after the fourth cycle) we evaluated haemodynamic parameters, norepinephrine dosage, PaO2/FiO2 ratio, plasma IL-6, and procalcitonin (PCT). The ANOVA test was used to compare changes over the study time points. P <0.05 was considered statistically significant. Results Patients enrolled in the study underwent 256 CPFA treatments totalling 2,650 hours. Table 1 presents the main results of the study, including the fourth quartile of IL-6. ... was not different between groups. Group 1 patients had a trend towards a higher number of organ dysfunctions (P = 0.07) and more septic shock episodes (P = 0.09). Regarding treatment adequacy, compliance with any indicator or the entire SSC-6h bundle was not related to survival. However, achievement of the CVP target was associated with higher early mortality (Group 1: 69.6%, Group 2: 40.6%, P = 0.013; OR = 3.34 (1.33 to 8.40)). Conclusions In this observational study, factors related to disease severity rather than to evidence-based treatment compliance were associated with short-term death among severe septic patients. Introduction The sepsis mortality rate is higher in Brazilian public hospitals, with no clear differences in risk factors. We aim to identify risk factors that could explain why the mortality rate differs between public and private hospitals. Methods All severe sepsis and septic shock patients over 18 years old admitted to two private ICUs (Group 1) and one public ICU (Group 2) were prospectively included. Demographic, clinical and outcome features, including compliance with the 6-hour Surviving Sepsis Campaign bundle (SSC-6h), were collected in an electronic CRF. Duration of organ dysfunction was defined as the time elapsed between the onset of the dysfunction and its diagnosis by the healthcare provider. Results were expressed as percentages or as medians and interquartile ranges, and P <0.05 was considered significant. Introduction Critically ill cancer patients are at increased risk for acute kidney injury (AKI), but studies on these patients are scarce and have all been single-centered, conducted in specialized ICUs. The aim of this study was to evaluate the characteristics and outcomes in a prospective cohort of ICU cancer patients with AKI. Methods Prospective multicenter cohort study conducted in ICUs of 28 hospitals in Brazil over a 2-month period.
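To make the CPFA prescription above concrete, the sketch below converts the per-kilogram settings into absolute flows. The 70 kg body weight is an assumed example value, as patient weights are not reported in the abstract.

# CPFA prescription converted to absolute flows (assumed 70 kg patient).
weight_kg = 70                       # hypothetical body weight
q_blood = 200                        # ml/minute (Qb)
q_uf = 30 * weight_kg / 60           # 30 ml/kg/hour -> 35 ml/minute
q_plasma = 0.20 * q_blood            # 20% of Qb -> 40 ml/minute

session_min = 8 * 60                 # one 8-hour treatment
plasma_processed_l = q_plasma * session_min / 1000   # about 19.2 litres
print(f"Quf = {q_uf:.0f} ml/min, Qplasma = {q_plasma:.0f} ml/min, "
      f"plasma per session = {plasma_processed_l:.1f} l")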
Univariate and multivariate logistic regression were used to identify factors associated with hospital mortality. Results Of 717 ICU admissions, 87 (12%) had AKI and 36% of them received dialysis. Kidney injury developed more frequently in patients with hematological malignancies than in patients with solid tumors (26% vs 11%, P = 0.003). Ischemia/shock (76%) and sepsis (67%) were the main contributing factors, and kidney injury was multifactorial in 79% of the patients. The ICU mortality was 61% (53/87) and hospital mortality was 71% (62/87). Despite the lack of statistical significance, hospital mortality was higher in patients who received RRT later during the ICU stay (92%) in comparison with those who received RRT on the first day in the ICU (78%) and those who were not dialyzed (64%) (P = 0.105). End-of-life decisions (to withhold or to withdraw therapies) were taken in 18 (23%) patients. General and renal-specific severity-of-illness scores were inaccurate in predicting outcomes for these patients. In multivariate analysis, length of hospital stay prior to the ICU, acute organ dysfunctions, need for mechanical ventilation and a poor performance status were associated with increased mortality. Moreover, cancer-related characteristics were not associated with outcomes. Conclusions The present multicenter study confirmed that AKI in critically ill patients with cancer is frequent, usually multifactorial and still associated with high mortality rates. On the other hand, the current study also suggests that ICU admission and RRT should be considered in selected patients. Mortality in these patients depends mostly on the severity of the acute illness and the performance status rather than on cancer-related characteristics. Conclusions These results suggest significantly increased ICU mortality in the following groups: patients over 50 years old, smokers, patients with poor exercise tolerance, patients with IHD or PV, patients with COPD, and patients who drink alcohol to excess or those with ALD. While these results should not be used as a basis upon which to permit or refuse intensive care admission, they should be used to inform staff and patients of the likely outcome from intensive care. Introduction C-reactive protein (CRP) is an acute-phase protein, the blood levels of which increase rapidly in response to infection, trauma, ischemia, burns and other inflammatory conditions. Serum albumin decreases in critically ill patients with similar conditions. The use of these blood tests as risk markers or as predictors of organ failure and death has been studied previously [1, 2] and we wished to investigate their applicability to our population. Results Admission CRP correlated positively with length of stay (r = 0.14, P = 0.017) and APACHE II score (r = 0.13, P = 0.03) but did not significantly correlate with duration of mechanical ventilation (r = 0.10, P = 0.103). Admission albumin correlated negatively with length of stay (r = -0.15, P = 0.01), duration of mechanical ventilation (r = -0.15, P = 0.014) and APACHE II score (r = -0.17, P = 0.004). Conclusions CRP and serum albumin on admission are both predictors of intensive care mortality. CRP and serum albumin also correlate significantly with other severity markers such as length of stay and duration of mechanical ventilation, as well as APACHE II score. It has been suggested that statins may reduce mortality in critically ill patients. We aimed to investigate the association between statin therapy on admission and outcome from intensive care in our ICU.
Methods A prospective case-note review of 504 consecutive admissions to Glasgow Royal Infirmary ICU was undertaken over an 18-month period. Details of statin prescription, cardiovascular co-morbidity and smoking status were sought by hand from the patients' case notes, using details of current and previous admissions, clinical letters, results of investigations and correspondence from the patient's general practitioner, according to agreed criteria. Demographic, Acute Physiology and Chronic Health Evaluation II (APACHE II) score and outcome data were retrieved from the Ward Watcher system in the ICU. Results Complete data were available for 444 patients. One hundred and eleven (25%) of these were on statin therapy on admission to intensive care. All data are expressed as mean ± 95% confidence interval or median (interquartile range) [1]. The aim of this study was to establish children's own views as to their outcome in this regard. Methods A cohort of 102 children aged over 7 years, with no pre-existing learning difficulties, completed the PedsQL 4.0 Pediatric Quality of Life Inventory [2] and a post-traumatic stress screener in face-to-face interviews, 3 months after discharge from the PICU. Of this group, 76 also completed questionnaires, by post or telephone, at 12 months. Results Children's total PedsQL scores were significantly lower than those of healthy children at 3 months (PICU mean = 79.3; healthy mean = 83.9, P <0.01), but at 12 months they were comparable. By 12 months, the mean score for the PICU group on the physical functioning subscale had improved significantly (from 73.1 to 81.5, P <0.001) but was still below normal population levels (healthy mean = 88.5, P <0.01). The total PedsQL score at 12 months was not associated with PIM score or length of stay, but was significantly negatively associated with post-traumatic stress symptoms (r = -0.40, P <0.001). Within-group analyses revealed that electively admitted children reported higher emotional functioning than healthy children at 3 months (PICU elective mean = 91.0 vs healthy mean = 78.5, P <0.01). Conclusions The self-report version of the PedsQL proved to be a feasible and responsive tool for assessing HRQoL in this group of PICU survivors. However, in order to assess this aspect of outcome for the majority of children admitted to the PICU, who are younger and/or have significant cognitive impairment, administration of the parent-proxy version of this questionnaire would be necessary. Methods Potential participants were patients admitted for more than 48 hours to a 12-bed medical-surgical ICU. Patients or their proxies were asked to give informed consent and complete the SF-36. The SF-36 is a questionnaire measuring HRQOL on eight subscales [1]. The use of the SF-36 in proxies has been validated [2, 3]. Participants were asked to complete the SF-36 based on their situation 4 weeks before ICU admission. SF-36 scores were compared with normative data (n = 1,742) [1] by independent t tests. Results Fifty-one questionnaires were completed, of which 28 (51.9%) were completed by a proxy. HRQOL before ICU admission was significantly lower on all SF-36 domains compared with the general population (P <0.0001) (Figure 1). This is in line with findings in one other Dutch survey [4]. Conclusions HRQOL before admittance to the ICU is lower than HRQOL in the normal healthy population. This is likely to contribute to the diminished HRQOL after ICU discharge.
To measure the influence of critical illness and the ICU stay on HRQOL after an ICU stay, it is important to measure HRQOL before ICU admission. Introduction Creation of innovative retention strategies is a major focus for nursing administration as a shortage of nurses recurs and turnover of staff becomes a problem. Retention strategies, to be effective, need to be targeted specifically to the particular conditions of the nursing staff. Introduction Thiopental (TP) is used in severe brain trauma to control high intracranial pressure episodes. Deleterious side effects have been described that question its ability to reach its objectives safely and efficiently. We analyzed the impact of TP use on cerebral hemodynamics and caregivers' adherence to recommendations using a high-rate recording information system. Methods A set of 813 hours of data was recorded at a rate of 0.5 Hz. The study was observational. Deleterious episodes were detected using the Information System (IS) and validated by two experts. We detected episodes of low cerebral perfusion pressure (CPP), high intracranial pressure (HICP) and/or low mean arterial pressure (lMAP). We used the commonly accepted thresholds for ICP (20 mmHg) and CPP (60 mmHg). Medical orders intended to reach the recommended objectives were analyzed. Results Forty-eight periods stratified according to TP dose were analyzed in 16 patients. The cumulative duration was 20,520 minutes with TP and 26,294 minutes without TP. The mean dose of TP administered was 250 ± 20 mg/hour (3.4 ± 1.4 mg/kg/hour). The HICP incidence was 85 ± 16 per 100 hours with TP vs 95 ± 29 without TP (NS). HICP duration was longer with TP (27 ± 2 vs 19 ± 2 minutes, P <0.005). The lMAP incidence was the same with or without TP (56 ± 19 vs 34 ± 8 per 100 hours of monitoring). The duration of lMAP episodes was the same with or without TP (22 ± 2 vs 20 ± 2 minutes, NS). The incidence of orders intended to restore cerebral hemodynamics was equivalent in both situations (TP vs no TP, 79 ± 24 vs 104 ± 29 per 100 hours, NS). One hundred and eighty-eight medical orders were analyzed. Use of catecholamines was more frequent with TP (57% (n = 44) vs 43% (n = 33), P = 0.034). Conclusions The use of TP complies with the recommendations (prolonged HICP). It did not result in a significant increase in cerebral hypoperfusion episodes. We showed evidence of adaptations to its use by physicians. The doses are lower than those prescribed earlier in the literature, and the use of catecholamines is more frequent during TP infusion. This could result in better control of the deleterious side effects of TP and better compliance with recommendations. Our IS was efficient for the analysis of physicians' orders. Introduction Mistakes and errors may occur during the care process, particularly in the ICU, which is characterized by the large number of drugs administered to a single patient. One of the most important factors influencing intramural morbidity and mortality is indeed a harmful or unpredicted reaction to a drug, a so-called adverse drug event (ADE) [1]. By far the largest proportion (70%) of these ADEs is dose related [1]. The objectives of this study were to measure the frequency and severity of ADEs in the ICU and to determine the influence of severity of illness and nursing workload on their prevalence. Methods A cross-sectional study in the ICU of a tertiary referral hospital, based on a retrospective chart review.
A tool was developed for measuring the incidence and characteristics of ADEs based on the Global Trigger Tool for Measuring Adverse Drug Events [2]. If an ADE was identified, its severity was evaluated using the categories of the system for classifying medication errors of the National Coordinating Council for Medication Error Reporting and Prevention [3]. The severity of illness was calculated using the Sepsis-related Organ Failure Assessment score [4] and the nursing workload by the classical Therapeutic Intervention Scoring System-28 [5]. Results The review of 1,009 nursing days in 79 patients revealed 230 ADEs, which occurred in a total of 175 nursing days. The most commonly identified ADE was hypoglycemia of <50 mg/dl (n = 75), followed by hypokalemia (n = 67). Ninety-six percent of the ADEs were classified as category E, whereas only 4% were classified as category F. The mean severity of illness and nursing workload scores were significantly higher on nursing days when an ADE occurred (P <0.001 and P = 0.002, respectively). Conclusions ADEs are common in the ICU. The lack of a gold standard for reporting and collecting ADEs makes it difficult to compare with other studies and to assess the real value of this study. However, these data strongly and clearly indicate the influence of severity of illness and nursing workload on the prevalence of ADEs. Introduction Appropriate use of ICU resources is mandatory. When a patient is admitted to a unit able to provide a higher (lower) level of care than required, a waste (overuse) of resources can be argued. StART is an approach to identify possible mismatches between the level of care actually delivered, assumed to correspond to what is clinically required, and the level of care deliverable by the unit. Methods ICU beds are classified by the level of care deliverable as High (ventilator, monitor, and 720 minutes nurse time) and Low (monitor and 360 minutes nurse time) [1]. The level of care actually delivered is classified as High (invasive or non-invasive ventilation, or two vasoactive drugs, or at least two of the following: one vasoactive drug, dialysis, respiratory support), Low (single vasoactive drug, or dialysis, or respiratory support) and Ordinary (none of the above) [2]. Mismatches between the level of the beds available and the level of care delivered were evaluated both on admission and for each ICU-day of 4,237 patients in 28 ICUs. An ICU-day was judged as inappropriate, even without mismatch, if an Ordinary patient was present. [1]. Patients in critical care typically receive high-risk medications. Error reporting is integral to identifying common errors and to medication risk reduction. Methods A medication reporting form was developed to run alongside the official hospital incident reporting system for 2 weeks. Forms were distributed throughout the critical care facility. All members of the multidisciplinary team were asked to anonymously complete a form every time a medication error, or near miss, occurred. After 2 weeks, the submitted forms were analysed by the project team. Results In total, 112 reports were submitted. The largest number of incidents reported was due to prescribing errors (67%), followed by administration (15%), documentation (7%), electronic prescribing problems (6%), storage (3%) and monitoring (2%). Introduction Communication between healthcare professionals is a key step for patient safety, its failure accounting for over 60% of root causes in sentinel events [1].
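The StART classification of the delivered level of care described above is essentially a small decision rule; the sketch below renders it in Python. Function and parameter names are illustrative, not from the study, and the mismatch test is a simplified reading of the published definition.

# Sketch of the StART delivered-level-of-care rule described above.
def delivered_level(ventilated, n_vasoactive, dialysis, resp_support):
    """Classify one ICU-day as 'High', 'Low' or 'Ordinary'."""
    items = sum([n_vasoactive >= 1, dialysis, resp_support])
    if ventilated or n_vasoactive >= 2 or items >= 2:
        return "High"
    if items == 1:
        return "Low"
    return "Ordinary"

def day_inappropriate(bed_level, care_level):
    # A day is inappropriate on mismatch, or whenever an Ordinary
    # patient occupies an ICU bed, per the definition above.
    return care_level == "Ordinary" or bed_level != care_level

print(delivered_level(False, 1, True, False))   # -> 'High' (two Low items)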
Bedside rounds are important for teamwork communication and can be improved by an explicit approach [2] and by process-oriented information tools to organize and direct interprofessional rounds [3]. Methods As part of a quality improvement project, we observed the documentation of daily goals (DG) and best practices (BP) in a step-down unit (both tools had previously been added to patient flowsheets), before and after the introduction of a structured rounds process and team education. Our hypothesis was that these important tools were used before rounds, without input from all team members. Rounds were observed during two separate periods, and the observer noted whether DG and BP were documented and whether discussion took place before documentation. Differences in proportions between the two periods were analyzed with Fisher's exact test. P <0.05 was considered significant. Results We observed 100 bedside interactions in each period. Documentation remained unchanged for DG (pre 55% vs post 53%, P >0.05) and BP (pre 57% vs post 48%); however, the second period showed improved documentation after team discussion (DG: pre 2% vs post 31%, P <0.001; BP: pre 0% vs post 33%, P <0.001). Conclusions The intervention helped increase documentation after discussion, implying increased communication among the interprofessional team. However, about 50% of patients will still not have documentation after bedside rounds. Patient information was not collected; therefore our study is limited in providing information on clinical outcomes. Further research should focus on how best to implement these tools, how to qualitatively assess the content of daily goals, and how to demonstrate effects on patient-centered outcomes. Introduction The use of a high-fidelity simulator can create very realistic clinical situations that are sometimes difficult to manage psychologically. The aim of this study was to evaluate the psychological stress induced by simulation-based training, and the associated skills, in anesthesiology residents. Methods A cohort of 27 residents was studied. The psychological stress just before and after the simulation session was quantified by a self-evaluation scale (numeric scale 0 to 10) and by salivary amylase sampling [1]. Nontechnical skills were quantified by analysing videotapes and scoring the Anaesthetists' Non-Technical Skills [2]. Results The median stress numeric scale score before the simulation session was 5 (range 2 to 8), and after it was 7 (2 to 10) (P = 0.0004) (Figure 1). The stress score before the session was significantly lower in residents who had already undergone simulation-based training (P = 0.04). In 48% of residents, stress scores after the simulation session were above 8/10. Salivary amylase after the session was significantly higher than before (P = 0.008), corresponding to a 2.2-fold increase. There were no significant relationships between psychological stress parameters and nontechnical skills. Conclusions Psychological stress before the simulation session, but especially after simulation, appears to be high in anesthesiology residents, particularly in those performing a simulation session for the first time. This should be considered when organising such simulation-based teaching. Introduction The respiratory therapist has a central role in the management of critically ill patients. The objective of this study was to evaluate the impact of an education program aimed at improving the quality of respiratory therapy care.
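The daily-goals comparison in the bedside-rounds study above (documentation after discussion: 2 of 100 interactions before vs 31 of 100 after) is fully specified, so the reported P <0.001 can be checked with Fisher's exact test, the test the authors name. A minimal sketch:

# Fisher's exact test for daily-goals documentation after discussion.
from scipy.stats import fisher_exact

table = [[2, 98],     # pre-intervention: documented, not documented
         [31, 69]]    # post-intervention
odds_ratio, p = fisher_exact(table)
print(f"P = {p:.1e}")  # well below 0.001, as reported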
Methods A before-after study was designed to assess compliance with 15 respiratory therapy indicators, chosen for their relevance to patient care and their availability for objective measurement, in all patients admitted to the ICU, whether mechanically ventilated or not. Compliance was assessed during 1 month before implementation of the education program. The educational process comprised meetings, group training, written manuals and individual feedback for noncompliance situations. Six months after the implementation process, compliance assessment was performed again. Results ... [1]. We decided to study the level of adherence of Mexican physicians to the measures proposed in this checklist. Methods We conducted a 3-month prospective observational study in the adult ICU of an academic hospital in northeastern Mexico. Patients were sorted into one of two study groups: group 1 had a critical care physician in charge of the ICU management, while group 2 had a non-critical care physician in charge. We measured the adherence of the physicians to the FAST HUG checklist items. We arbitrarily defined good compliance as having fulfilled >4 items. Results One hundred and forty-seven patients were admitted to the ICU during the study period, but only 129 had complete data and were evaluated. There were 86 patients in group 1, while group 2 had 43 patients. Group 1 patients had a higher age and higher SOFA and APACHE scores on admission. We analyzed the subgroup of patients who had an initial APACHE score between 11 and 25, as they are usually the ones most likely to benefit from FAST HUG measures. These results are described in Table 1. We noticed that most of the group 1 patients had severe sepsis, while most of the group 2 patients had ischemic heart disease, which clearly explains the differences in mechanical ventilation and length of ICU stay. Results The cohorts examined yielded similar demographic and clinical data. As shown in Figure 1, the microorganisms isolated did not change significantly between the two periods. Figure 2 summarizes the differences in tracheal aspirate/blood culture isolation between the two periods. Conclusions We cannot confirm that a single-bed room location can prevent the diffusion of nosocomial infection. A limit of this study, as in previous ... Introduction The sniffing position is widely promoted for teaching airway positioning prior to intubation attempts, but whether this analogy results in novices placing the head and neck appropriately has not been evaluated. An alternate analogy, "win with the chin", or simple anatomic instructions may perform better. Methods Randomized controlled study comparing different instructions for enabling novices to adequately position a simulator mannequin's head and neck. Study volunteers included medical students and PGY1 residents in surgery and internal medicine. Subjects independently positioned a mannequin based upon their understanding of four randomly assigned written instructions: the sniffing position; the win with the chin analogy; anatomic instructions; and no instructions (control). Digital photographs following each instruction were analyzed by two airway experts for adequacy of overall positioning and for the three components of airway positioning. Results There were 81 volunteers. Positioning was adequate most often (43.2%) following the win with the chin analogy as compared with the other instructions (37.0% anatomic instructions; 19.8% control; 14.8% sniffing position analogy).
Positioning following the sniffing position instructions was not different from no instruction (P = 0.53). The win with the chin and anatomic instructions were better than no instructions (P = 0.002 and P = 0.023, respectively). Conclusions The win with the chin analogy resulted in adequate airway positioning significantly more often than the sniffing position or control, and maintained atlanto-occipital extension when compared with providing anatomic instructions. Overall, win with the chin was a superior teaching analogy and could replace the sniffing position analogy. ... with an average APACHE II score of 13 and an average SOFA score of 5. Four admissions (17%) were elective, due to severe co-morbidities, extreme weight or anticipated airway difficulties. Seven patients needed to go back to theatre one or more times for re-exploratory surgery due to postoperative complications. Three patients remained intubated for 19 days or more, two of whom required a tracheostomy. Three patients died: two from intra-abdominal haemorrhage and one from sepsis. Conclusions This is a high-risk cohort of patients, with 4.8% requiring an ICU admission. The need for patients to return to theatre for re-exploratory surgery was associated with longer ICU stays and multiple complications. This demand on critical care services is likely to increase as bariatric surgery continues to grow as a specialty, and the NHS will need to increase investment in ICUs accordingly. Results Over the 5-year period, both quantities and costs of drugs increased, following a nonsteady, nonparallel pattern. Four ATC codes accounted for 80% of both quantities and costs, with ATC code B (blood and haematopoietic organs) amounting to 63% of quantities and 41% of costs, followed by ATC code J (systemic anti-infectives, 20% of costs), ATC code N (nervous system, 11% of costs) and ATC code C (cardiovascular system, 8% of costs). Prescription by SC amounted to 1% of drug quantities, but 19% of drug costs. The rate of increase in quantities and costs was seven times larger for ICP than for SC (Figure 1). Some peak values in costs and quantities were related to a very limited number of patients. Conclusions A 5-year increase in the quantities and costs of drug prescription in an ICU is a matter of concern. Rather unexpectedly, total costs and cost increases were generated mainly by ICP. Careful follow-up is necessary to try to influence this evolution through an institutional policy co-opted by all professional categories involved in the process. Introduction One consequence of pressure on inpatient hospital bed numbers is a delay in the discharge of critically ill patients to the ward. Although this delay in discharge has been reported in a number of countries, the clinical and economic consequences of this delay have not been studied [1, 2]. Methods We examined data from a mixed London ICU (32 beds) using the medtrack software tool (MASH Ltd) in addition to real-time mapping of ICU bed utilisation and patient flows. We defined delayed discharge as beginning 3 hours after the actual medical/nursing decision to discharge. We described a set of criteria a priori to assess the effect of delayed patient discharge, searching for both beneficial and harmful effects. We also analysed the economic consequences of discharge delay using the medtrack capture of the critical care minimum dataset and available tariffs. Results In the 6-month analysis period, there were 15,320 hours of delayed discharge.
This resulted in the postponement of 45 operations for patients requiring level 3 intensive care and 177 operations for patients requiring level 2 intensive care. One hundred and fourteen tertiary referrals were refused because of bed blockage. Seventy-two patients were discharged between the hours of 22:00 and 06:00. There was a mean delay of 5.3 hours between identifying a patient requiring unplanned admission to critical care and their actual admission, due to delayed discharge. Only 21 patients acquired a hospital-acquired organism while awaiting discharge (seven infections). Family/patient interviews suggested that the negative effects of noise and of exposure to the resuscitation and/or death of ICU patients were offset by a perceived benefit of receiving critical care nursing for a prolonged period. This potential benefit was reflected by a 72% reduction in the readmission of critically ill patients to critical care. The estimated cost of delayed discharge was £342,000. Conclusions Delayed discharge of patients to the ward following an episode of critical illness is a common and increasing problem. This delay is not benign and should be considered when prioritising bed utilisation in acute hospitals.

Introduction Pain is the most relevant factor for prolonged hospital stay after thoracic surgery and is associated with stress, which is known to alter the Th1/Th2 ratio (Th = T-helper cell) in the immediate postoperative period. Thoracic epidural block (TEB), central α2-receptor stimulation via intravenous clonidine, and stimulation of opioid receptors can decrease pain and/or stress and might therefore influence this immune imbalance. The primary endpoint of the current study was the perioperative Th1/Th2 balance in lung surgery. The secondary endpoints were the incidence of pain and of pneumonia. Methods After approval by the ethics committee and informed consent, a total of 60 patients were randomized to receive, in a double-blind fashion, either intravenous remifentanil, intravenous remifentanil + clonidine, or epidural ropivacaine. Pain intensity was assessed by the numeric rating scale (NRS). The Th1/Th2 ratio was measured using a cytometric bead array. Pneumonia was diagnosed according to the hospital-acquired pneumonia criteria of the American Thoracic Society. Results The Th1/Th2 ratio adjusted for baseline differed between groups over time (P = 0.012). At the end of surgery there was no significant difference between the remifentanil and the remifentanil + clonidine groups (P = 0.679), but there was a significantly lower ratio in the ropivacaine group compared with the remifentanil (P = 0.004) and the remifentanil + clonidine (P = 0.019) groups. NRS scores immediately after surgery were lower in the ropivacaine group than in the remifentanil and remifentanil + clonidine groups, but this reached only borderline statistical significance. None of the patients developed pneumonia. Conclusions Intraoperative TEB decreases the Th1/Th2 ratio and provides better pain therapy immediately after surgery.

Introduction A protocol for perioperative use of beta blockers is employed in only 9% of major clinics in the USA. It is very common practice to discontinue beta blockers during the perioperative period or to reduce their doses as far as possible. The main purpose of this paper is to investigate the incidence of adverse effects of beta blockers during major trauma and orthopedic surgery under spinal anesthesia.
Methods In this open prospective study we evaluated the adverse effects of beta blockers in ASA II and III subjects (n = 37, mean age 67 ± 12) with coexisting metabolic syndrome who received beta blockers 1 to 12 hours preoperatively. Results were compared with an equal number of patients of the same ASA groups (mean age 57 ± 18) who had not received beta blockers. Patients in both groups received midazolam premedication 30 minutes before arriving in the operating room and spinal anesthesia with 0.5% bupivacaine. Patient surveillance consisted of continuous EKG, blood pressure and pulse (all non-invasive) and SpO2 monitoring. Systolic and diastolic pressure and pulse were recorded every 10 minutes. Results were statistically tested (M ± SD, t test). Results The results are shown in Figure 1. The group receiving beta blockers showed an 11% higher rate of hypertension on arrival in the operating room, and more hypotension after the start of spinal anesthesia. The occurrence of bradycardia (HR <60/minute) was increased by a statistically significant 24% (P <0.05), with atropine use increased by 27% (P <0.05). Arrhythmias were likewise increased in this group, by 27% (P <0.05), as was nausea, by 29% (P <0.05). Conclusions Due to their adverse effects, beta blockers should be discontinued before spinal anesthesia and before surgical procedures with significant circulating volume loss.

Introduction The surgery-related neuroendocrine stress response extends into the postoperative period, inducing insulin resistance and hyperglycaemia [1]. Continuous remifentanil infusion, by reducing pain and anxiety, may reduce postoperative insulin resistance and result in better glycaemic control. The aim of the study was to assess trends in glycaemia and HOMA scores (homeostatic model assessment of insulin resistance, computed from fasting glucose and insulin) in postoperative patients during remifentanil-based postoperative analgo-sedation. Methods Enrolled patients were those consecutively admitted to a surgical ICU after major abdominal interventions during a 1-month period (July 2009). Group R patients underwent analgo-sedation with continuous remifentanil infusion at the recommended doses [2]. Glycaemia and HOMA scores were evaluated at admission (T0) and after 12 and 24 hours. Control patients were those who underwent morphine-based continuous analgo-sedation (group M). Conclusions Remifentanil-based analgo-sedation was associated with lower glycaemia and HOMA scores in patients following major abdominal surgery, particularly at 24 hours from ICU admission. HOMA scores showed that morphine-treated patients were insulin resistant, although normoglycaemic. Better control of the neuroendocrine stress response in patients analgo-sedated with remifentanil might explain these results.

Introduction Ultralow-dose opioid antagonists can enhance opioid analgesia and prevent tolerance in rodent nociceptive pain assays. Methods A randomized, double-blind controlled trial was designed to investigate whether the addition of 5 ml of an ultralow-dose naltrexone solution (1 μg in a liter of sterile water) to morphine (0.05 mg/kg) changes the total opioid requirement and side effects. Results Two hundred and sixty-seven patients (18 to 45 years old) with moderate extremity trauma entered the study. Pain control measurements were taken every 15 minutes for the first hour, then every 30 minutes for the second and third hours, and finally at the fourth hour. Efficacy was measured by the 11-point numerical pain rating scale.
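To make the 'ultralow' label concrete, the delivered antagonist dose works out to the nanogram range (the 70 kg body weight below is an assumption for illustration, not a study figure):

$$5\ \text{ml} \times \frac{1\ \mu\text{g}}{1{,}000\ \text{ml}} = 5\ \text{ng naltrexone}, \qquad 0.05\ \text{mg/kg} \times 70\ \text{kg} = 3.5\ \text{mg morphine},$$

an antagonist:agonist mass ratio on the order of 1:700,000.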
The following side effects were evaluated: sedation, nausea, vomiting and pruritus. We found that opioid requirements did not differ significantly between groups; the morphine + naltrexone group on average required 0.04 mg more morphine during the 4 hours than the morphine group. Conclusions The morphine + naltrexone group had a lower incidence of nausea than the morphine + placebo group. However, the incidence of vomiting, pruritus and sedation was similar between the two groups. The combination of ultralow-dose naltrexone and morphine in moderate

Introduction Pulmonary hypertension (PH) is a life-threatening disease commonly seen in septic ICU patients and associated with poor outcome. New and better therapies are required, since the response to various agents such as NO, prostaglandins and phosphodiesterase inhibitors is usually partial and the mortality rate remains high. Inhaled drugs seem an attractive treatment option, since they are delivered directly to the pulmonary resistance vessels. A new device, the anesthetic conserving device (AnaConDa, ACD), which permits direct administration of volatile anaesthetics such as sevoflurane into the breathing circuit of a conventional ICU ventilator in a safe and effective way, has recently been introduced. The aim of the present study was to evaluate the efficacy and safety of sevoflurane administration via the ACD in a porcine model of acute PH during sepsis. Methods PH was induced in 16 anaesthetized, mechanically ventilated swine (25 kg) by intravenous infusion of 0.5 mg/kg LPS (Escherichia coli O111:B4) over a period of 30 minutes. After LPS, animals were randomized into two groups. In group A sevoflurane was administered via the ACD according to the manufacturer's recommendations to obtain an end-tidal concentration of 0.5%, whereas group B did not receive sevoflurane and served as control. Haemodynamic parameters (systemic and pulmonary) were recorded before (phase 0) and after LPS (phase 1) and every 20 minutes for the next 2 hours (phases 2 to 7). Results LPS produced PH (phase 1) and reduced arterial blood pressure in both groups. After sevoflurane administration, both systolic (PAPs) and diastolic (PAPd) pulmonary pressures exhibited a stepwise reduction, which became statistically significant from phase 3 onwards relative to phase 1 values. In contrast, in the group B animals not treated with sevoflurane, pulmonary pressures remained high throughout the study period. Systemic arterial pressure exhibited an endotoxin-related reduction in both groups, which was not affected by sevoflurane treatment. Conclusions Sevoflurane administration via the ACD reduced pulmonary pressures in a porcine model of endotoxin-induced acute PH without any detrimental effects on the systemic circulation. This may represent a significant advance in the treatment of acute PH; however, the potential clinical implications of the method merit further study.

Introduction Methyl-naltrexone (Relistor®), a peripheral μ-opioid receptor antagonist shown to induce laxation in patients receiving opioids, is licensed for use in palliative care; a similar benefit is proposed in critical care patients. Impaired gut motility and constipation are common issues in the critical care setting, with contributing factors including trauma, surgery and the use of opiate analgesics.
The effects of methyl-naltrexone were studied in seven patients in whom conventional treatment for constipation with stimulant and osmotic laxatives, faecal softeners and suppositories had failed. Methods Over a 6-week period, 15 critically ill patients with opiate-induced constipation (OIC) of more than 4 days' duration were prospectively studied. All patients were treated with conventional methods from admission to critical care, which included regular senna and sodium docusate, supplemented with glycerin suppositories and Picolax for resistant constipation. Methyl-naltrexone was administered to seven patients at 0.15 μg/kg subcutaneously on alternate days until laxation occurred. The remaining patients in the study continued with conventional therapy. Results Six out of seven patients responded to methyl-naltrexone, with laxation occurring between 12 and 24 hours. One patient did not respond due to faecal impaction but subsequently experienced laxation after manual disimpaction. Side effects were few (nausea 14%, vomiting 14%) and there was no increase in opiate requirement. The patients treated by conventional means eventually achieved laxation after a further 3 to 5 days. Conclusions Previous studies have shown that earlier return of gut motility improves patient progress, facilitates earlier respiratory weaning, prevents debilitation and results in reduced length of stay. Methyl-naltrexone is a safe and efficacious drug for the treatment of OIC in critical care patients in whom conventional treatments prove unsuccessful. Side effects are few and opiate requirements remain unchanged. Further investigation to elucidate the cost-effectiveness of methyl-naltrexone use in OIC of the critically ill patient is warranted.

Introduction Evidence suggests that prolonged sedation is associated with prolonged ventilation and mortality. Further evidence suggests that some sedatives and analgesics are more prone to cause oversedation than others, possibly due to altered pharmacokinetics in the context of acute organ failure. Our aim was to describe current sedation and analgesia practice in UK ICUs, and to determine whether recent evidence has changed practice in recent years. Methods We performed a web-based survey using a tool developed via a systematic, step-wise process following recognised practice for questionnaire design. These steps included question development and questionnaire formatting and testing before publication on the Zoomerang™ website.

Conclusions Propofol has been proposed as an alternative to benzodiazepines in severe alcohol withdrawal syndrome (AWS), but it could increase the rate of complications (especially seizures and respiratory infections), which could lead to an increase in the intubation rate and length of stay. The APACHE score could be a predictor of the risk of complications and intubation.

Introduction The objective of this study was to identify the proportion of patients who experienced agitation and/or pain during physiotherapy sessions in the intensive care unit (ICU). The prevalence and causes of distress in critically ill patients are poorly described. Identified stressors include physiotherapy [1,2], but no standards for sedation/analgesia management during physiotherapy exist. Recent evidence supporting routine ICU management at lighter levels of sedation potentially increases the importance of interventions to avoid distress during procedures such as physiotherapy. Methods A prospective observational study was undertaken.
Forty-nine patients admitted to the ICU requiring mechanical ventilation and physiotherapy were recruited into the study. A single session of physiotherapy was observed by an independent assessor. Agitation was measured using the Richmond Agitation Sedation Scale (RASS) and pain using the Behavioural Pain Scale (BPS). The RASS and BPS were recorded immediately before physiotherapy, after each intervention during physiotherapy, and at 5 and 30 minutes after physiotherapy. Results Sixteen participants (33%) experienced agitation (RASS score ≥1) while 48 participants (98%) experienced pain (BPS score ≥4). Figure 1 shows the number of participants who experienced agitation and pain in relation to their RASS level prior to physiotherapy. Of interest, the independent assessor observed little communication between nursing and physiotherapy staff regarding the management of sedation or pain at any stage during physiotherapy. Conclusions Regardless of the RASS level before physiotherapy, the majority of participants experienced pain. In contrast, only one-third of participants experienced agitation; this was more prevalent in those at a lighter sedation level. More routine use of the RASS and BPS, and communication with nursing staff, should be undertaken during physiotherapy to ensure that optimal levels of sedation and adequate levels of analgesia are achieved.

Introduction Unlike in wards that routinely manage chronic or acute pain (algology, gerontology, surgery, the recovery room, and so forth), a comparison of the five most popular self-report pain tools (vertical and horizontal Visual Analog Scale (VAS-V, VAS-H), 0 to 10 oral Numeric Rating Scale (NRS-O), 0 to 10 visually enlarged NRS (NRS-V), and Verbal Descriptor Scale (VDS)) has never been performed in an ICU setting. Methods Consecutive patients admitted to a medical-surgical ICU during 1 year were included when alert (RASS >-2) and able to follow simple commands. Exclusion criterion: previous self-report pain assessment without the presence of an investigator. Pain was assessed using the five scales in random order either at baseline (T1) and after (T2) administration of an analgesic, or, in the absence of pain at baseline, during a nociceptive procedure. Evaluated parameters: psychometric properties of the scales (feasibility, validity, responsiveness and preference). Nonparametric tests were used for statistical analysis (Statview 5.0). Data are expressed as median (25th to 75th percentile).

Results All basal levels were below the expected values. Following enteral administration, pharmacological levels were reached within 5 minutes, with a serum peak after 16 minutes (half-absorption time: 3 minutes 17 seconds). The maximum serum level observed was 11,040 pg/ml and the disappearance rate indicated a half-elimination time of 1 hour 34 minutes. Serum melatonin levels decreased significantly after midnight; pharmacological levels were maintained up to 10 hours following administration. No excessive sleepiness was reported in this patient group. Conclusions Critically ill patients exhibited reduced melatonin secretion, as reported in the literature. Despite the critical illness, oral bioavailability was satisfactory: serum levels after oral administration indicated essentially unchanged intestinal absorption, while the disappearance rate was slower than reported elsewhere in healthy volunteers.

Introduction The aim of our study was to describe the clinical course, need for mechanical ventilation (MV), complications and mortality of patients with delirium tremens (DT) admitted to our ICU.
Methods Patients with a diagnosis of DT admitted to the medical ICU of a tertiary hospital from January 2001 to December 2008 were included. We recorded: admission diagnosis, pathologies associated with DT, APACHE II score, treatment, need for MV and its duration, complications, length of stay in the ICU and total hospital stay, mortality, and survival at 2 years. Results There were 50 cases of DT. Median age was 45 years; 96% were male. Reasons for hospital admission: DT (68%), seizures (36%), sepsis (16%), brain injury (10%). Reasons for admission to the ICU: DT (54%), DT and seizures (26%), DT and brain injury (10%), DT and sepsis (8%), DT and other (2%). Median APACHE II score was 9 (range: 8 to 13). Seventy-four per cent of the patients were controlled without MV, using a midazolam infusion of 45 to 50 μg/kg/hour and haloperidol at 17 to 30 μg/kg/hour. Twenty-eight per cent (14 patients) required MV; 78% of these had DT plus another associated pathology on admission (Figure 1). Of the 28 patients admitted with DT and no associated pathology, only four needed MV. The median time on MV was 2 days (range: 1 to 9). Complications were present in 22%: ventilator-associated pneumonia and other infectious conditions. Total hospital length of stay was 14 days (SD: 11), with 5.5 days in the ICU (SD: 9). Registered hospital mortality was 4%. Survival at 2 years was 84%; of these survivors, 72% attended the emergency department repeatedly (≥5 consultations in 2 years) for episodes related to alcohol withdrawal or to complications of chronic alcohol abuse. Conclusions Even though our patients with DT are frequently admitted to the ICU, this pathology most often has a benign course, with low hospital mortality and a high rate of survival at 2 years. These patients are regular users of the health system. The occurrence of complications and the need for MV in our patients were low, and were confined to the group with DT plus other diagnoses.

Introduction Delirium in ICU patients increases time on mechanical ventilation, is an independent risk factor for death and can cause cognitive impairment in survivors [1,2]. The two most commonly used validated assessment methods for diagnosing delirium in intensive care are the Confusion Assessment Method for the ICU (CAM) and the Intensive Care Delirium Checklist Score (ISDCS) [3]. We wished to determine the prevalence of delirium in our unit, whether there was a difference in results between these methods of assessment, and whether good agreement could be achieved between two trained assessors. Methods We performed a prospective prevalence study in a single tertiary ICU. Patients were assessed between days 3 and 10 of their stay and tested provided their Richmond Agitation Sedation Score (RASS) was ≥-3 [1]. Each patient was independently assessed by the two trained assessors with the CAM and ISDCS, with randomisation of both the order of interview and the score used. Both sets of assessments were carried out within an hour of each other. Exclusions included readmissions, those who did not speak English, the deaf, and registered psychiatric patients. The CAM was completed by the assessors, while the ISDCS required observational answers from the bedside nurse. Results We performed 104 assessments (208 tests) on 52 patients. Mean (SD) age, APACHE II and total SOFA on the day of admission were 67.4 (14.0), 14.0 (5.3) and 5.7 (2.7), respectively. Delirium was found in 37.5% of patients using the CAM, but in only 14.4% using the ISDCS.
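The inter-rater statistics that follow report raw percentage agreement alongside Cohen's kappa, which corrects the observed agreement $p_o$ for the agreement $p_e$ expected by chance alone:

$$\kappa = \frac{p_o - p_e}{1 - p_e}.$$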
Inter-rater agreement for the CAM and ISDCS was 86.5% (κ 71.4%, SE 13.8%, P <0.001) and 94.2% (κ 76.7%, SE 13.8%, P <0.0001), respectively. There was significant underscoring of the RASS by the bedside nurses: when the trained assessors found a difference, their score was always lower than that documented by the bedside nurse. Conclusions There was good inter-rater agreement in the diagnosis of delirium between the two trained assessors, but the prevalence found was lower than previously reported [1] and varied considerably with the method used. The difference in results between the two scores may be due to limited familiarity of the bedside nurses with the ISDCS assessment process.

Introduction Delirium affects up to 33% of acutely hospitalized patients. In ICU patients it can prolong the length of stay and increase mortality [1]. The optimal management of delirium requires a calm environment, sleep hygiene and correction of underlying factors (for example, infection). This can be a challenge in the ICU, and drug therapy, commonly haloperidol, clonidine, benzodiazepines or propofol, is often used [1]. Quetiapine, an atypical antipsychotic, has been used in acute delirium outside the ICU [2]. It has few extrapyramidal side effects, a short half-life and is mildly sedating. Our impression was that it had potential in the treatment of delirium. Methods In a 30-bed tertiary ICU with more than 1,200 admissions annually, we reviewed the notes of patients admitted from February 2008 to November 2009 who had delirium treated with quetiapine. Patients were excluded if they were taking quetiapine prior to admission. The following data were recorded: Richmond Agitation and Sedation Score (RASS), duration of delirium, agents used, patient demographics and adverse events. Results Five patients met the inclusion criteria (Table 1). All were males, aged 37 to 85 years. All had prolonged delirium prior to the commencement of quetiapine and all were on a combination of four drugs (clonidine, haloperidol, lorazepam and propofol) that was not controlling their delirium. At 3 to 7 days following commencement of quetiapine the RASS scores were 0 and the other drugs had been ceased. No adverse effects were noted. Conclusions Quetiapine was successful in controlling prolonged ICU delirium and allowed weaning of other medications in these patients. It may be a useful delirium therapy. Further studies are required to demonstrate efficacy and safety.

vs 36.9%, respectively). The ICU length of stay tended to be longer in the SynColl group (14 versus 10 days, P = 0.055). Conclusions In patients with severe sepsis, fluid therapy with synthetic colloids (gelatin or the third-generation 6% HES 130/0.4) considerably increases the incidence of ARF and the need for RRT compared with crystalloids. These results confirm recent meta-analyses and RCTs, which demonstrated an increased incidence of ARF with synthetic colloids [2,3].

Introduction Dehydration is an important problem among patients admitted to the emergency department (ED); however, there is not yet a consensus on the ideal fluid for these patients. This study was designed to identify the ideal crystalloid for patients admitted to the ED with symptoms of dehydration. Methods We conducted a randomized controlled trial that included 90 dehydrated patients.
Patients were randomized into three groups of 30 to receive one of three solutions: lactated Ringer's, 0.9% NaCl or Isolyte® (Eczacıbaşı-Baxter, Turkey). Solutions were infused at a rate of 20 ml/kg/hour for 2 hours. Venous blood pH, Na+, K+, Cl- and HCO3- levels were evaluated at 0, 1 and 2 hours. Results We detected a decrease in serum pH (7.406 to 7.365) and HCO3- (23.1 to 21.5) levels at the second hour in the 0.9% NaCl group. In contrast, in the Isolyte® group an increase in both serum pH (7.410 to 7.434) and HCO3- (23.4 to 24.4) levels was observed. In the lactated Ringer's group,

Introduction The aim was to evaluate the effect of early continuous venovenous hemodiafiltration (CVVHDF) on mortality and morbidity in ICU patients with refractory septic shock and multiorgan failure. Methods Forty patients were prospectively studied and randomly allocated to either conventional treatment alone (20 patients; group II) or early CVVHDF (within 6 hours of maximal hemodynamic support) in addition to conventional treatment (20 patients; group I). Metabolic acidosis, serum lactate and serum procalcitonin (PCT) levels before and 5 days after CVVHDF were monitored to evaluate outcome. APACHE II and ΔSOFA scores were assessed before and 5 days after CVVHDF. Results Compared with group II, patients in group I had lower mortality (55% vs 70%), although this was not statistically significant (P = 0.54). Group I showed a nonsignificant difference in ΔSOFA (5.95 ± 4.39 vs 6.2 ± 3.3 in groups I and II, respectively, P = 0.66); regarding APACHE II scores, group I also showed statistically nonsignificant lower figures than group II (on admission, APACHE II scores were 39.35 ± 10.65 vs 41.85 ± 10 in groups I and II, respectively, P = 0.45, while on day 5 they were 34.8 ± 10.6 vs 36.1 ± 10.9, P = 0.41). Group I patients showed lower PCT on admission and at day 5 than group II (on admission 0.64 ± 0.18 vs 0.68 ± 0.17, P = 0.5; on day 5, 0.51 ± 0.15 vs 0.52 ± 0.17, P = 0.83). Among indicators of improvement, serum lactate at day 5 differed significantly between survivors and nonsurvivors in group I (P <0.001), while other indicators such as fever, renal profile, WBC count, metabolic acidosis, serum lactate on admission and platelet count did not (on admission P = 0.2, 0.55, 0.45, 0.41, 0.65 and 0.55, respectively, and on day 5 P = 0.37, 0.94, 0.71, 0.5, <0.001 and 0.88, respectively). There was a statistically significant difference between survivors and nonsurvivors in group I with respect to the number of failing organs, comparing three or fewer involved organs with more than three (P = 0.008).

Introduction Both inhaled hydrogen sulfide (H2S) [1] and intravenous H2S donors protected against kidney ischemia/reperfusion (I/R) injury [2-4], but all these data originate from unresuscitated rodent models. We therefore investigated the effect of the H2S donor Na2S in a clinically relevant porcine model of aortic occlusion-induced renal I/R injury. Methods Anesthetised and ventilated pigs received Na2S (n = 9) or vehicle (n = 10) for 2 hours before and 8 hours after 90 minutes of intra-aortic balloon occlusion-induced kidney ischemia. During reperfusion, noradrenaline was titrated to keep blood pressure at baseline levels.
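Of the renal function readouts listed in the next sentences, fractional sodium excretion is the derived quantity; it is conventionally computed from paired urinary (U) and plasma (P) sodium and creatinine concentrations as:

$$FE_{Na} = \frac{U_{Na} \times P_{Cr}}{P_{Na} \times U_{Cr}} \times 100\%.$$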
Before Na2S administration, prior to aortic occlusion, and at 1, 2, 4 and 8 hours of reperfusion, we measured renal blood flow and function (creatinine clearance and blood creatinine levels, fractional Na+ excretion), blood cytokines (TNFα, IL-6, IL-1β) and nitrates, renal tissue DNA damage (comet assay), HO-1 and caspase-3 expression (western blotting), and NF-κB activation (EMSA). Histological damage (glomerular tubularisation [5]) was assessed immediately post mortem. Results Na2S pretreatment was associated with a progressive fall in core temperature and significantly lower noradrenaline infusion rates needed to achieve the hemodynamic targets. While renal blood flow and fractional Na+ excretion were comparable, Na2S attenuated the fall in creatinine clearance and the rise in blood creatinine levels, which coincided with significantly lower IL-6, IL-1β and nitrate blood levels. Kidney glomerular and tissue DNA damage were markedly attenuated, whereas NF-κB activation was significantly higher in the Na2S-treated animals. Conclusions In a clinically relevant porcine model mimicking aortic cross-clamping-induced kidney I/R injury, Na2S attenuated tissue injury and organ dysfunction as a result of reduced systemic inflammation and oxidative stress. The higher NF-κB activation and the unchanged fractional Na+ excretion were most probably due to the drop in temperature [6] and to the direct effect of H2S on tubular Na+ absorption [7], respectively.

Introduction Sepsis has been identified as the most common cause of renal injury in ICUs, although the pathophysiology is not well understood. No large clinical studies are available that show an improvement of renal function in patients with sepsis, and this may be related to the lack of early diagnostic tests that indicate the onset of renal injury. The aim of the current study was to search for potential new early markers of renal injury during acute endotoxemia and to investigate whether renal injury can be ameliorated by the induction of lipopolysaccharide (LPS) tolerance. Methods Five healthy males received intravenous bolus injections of 2 ng/kg/day Escherichia coli LPS for 5 consecutive days. We used surface-enhanced laser desorption/ionization time-of-flight mass spectrometry (SELDI-TOF MS). This approach allows rapid, high-throughput profiling of multiple urine samples and detects low molecular weight biomarkers. Results Repeated LPS administration induced a 33 ± 7% decrease in glomerular filtration rate (P = 0.02) on day 2 and an 11 ± 3% increase in serum creatinine (P = 0.002) on day 3, which was associated with the appearance of 15 peak intensities in the urinary protein profile, including an increase in β-microglobulin levels (P = 0.04) 6 hours after the first LPS administration. Four of the 15 peak intensities on day 1 correlated with serum creatinine levels on day 3: 3,950, 4,445, 6,723 and 7,735 m/z (P = 0.03, 0.01, 0.02 and 0.05, respectively). With the development of LPS tolerance, renal function was restored, reflected by a decrease in serum creatinine and β-microglobulin levels to baseline (P = 0.2 and 0.4, respectively, between days 1 and 5) and by attenuated peak intensities in the urinary protein profile (P <0.0001 for all 15 peak intensities). In conclusion, renal injury occurs during repeated endotoxemia and can be predicted by new urinary markers identified by proteome research.
The four markers that correlated with the extent of renal injury may represent potential new biomarkers for renal injury and need further identification. The inflammation-induced renal injury subsided when LPS tolerance developed after 5 consecutive days of LPS administration.

We developed a technique for the detection of midazolam (MDZ) and 1-hydroxymidazolam glucuronide (1-OHMG). The lack of any commercial source of 1-OHMG had prevented assay development. 1-OHMG has pharmacological activity and is renally excreted. We postulated that ultrafiltrate (UFR) would be rich in 1-OHMG. We describe a method to purify 1-OHMG from UFR. Methods Ethics approval was granted. For purification, UFR was extracted on a C-18 column, washed and eluted. HPLC-MS separation was performed. The 1-OHMG (mol. wt 517/518 in MS) was desiccated and re-dissolved four times to maximise purity. Electrospray ionization (ESI) mass spectrometry (MS) was used to characterise 1-OHMG. Purity was calculated using NMR. A calibration plot of 1-OHMG, MDZ and diazepam (DZP; internal standard) for HPLC-MS was performed. Results The identity of 1-OHMG was confirmed using MS/MS (Figure 1). Two milligrams of 1-OHMG was purified from 5,000 ml of UFR. The 1-OHMG was 98% pure (NMR). The extinction coefficient was identical to that of MDZ. The calibration plot yielded a correlation of 0.912. The assay has been applied in clinical practice to report serum and UFR levels.

Introduction Complex care in the hemato-oncological ICU sometimes requires the use of CRRT. As hemato-oncological patients have particular characteristics, we retrospectively reviewed our results in patients with acute renal failure due to veno-occlusive disease (VOD) after allogeneic hematopoietic stem cell transplantation. VOD is one of the most severe early complications in this patient group and its mortality is high. Methods From 1 January 2007 to 30 October 2009 we performed 94 procedures (one procedure = 24 hours) in 15 patients with VOD and acute oligo-anuric renal failure. As standard in our ICU, we perform CVVH with postdilution on a Fresenius Multifiltrate machine. Because the key drug for the treatment of VOD is defibrotide and the patients were severely thrombocytopenic, we did not use any anticoagulation in the CVVH circuit. The dose of defibrotide ranged from 10 to 20 mg/kg/day according to the severity of thrombocytopenia and any hemorrhagic symptoms. Blood count, acid-base balance and biochemistry were monitored at least twice daily. The thrombocytopenia was corrected with donor platelets and the platelet count was sustained at 20 × 10⁹/l. Results The introduction of CVVH in all 15 patients dramatically helped to overcome the severe phase of VOD until defibrotide helped to restore the proper function of the vascular endothelium. None of the 15 patients had VOD as a cause of death. Surprisingly, the median patency of the CVVH tubing set was 46 hours despite the absence of anticoagulation; defibrotide appears efficient in preventing coagulation in the tubing set. No serious hemorrhagic events were observed. The smooth maintenance of fluid balance, the removal of metabolic waste products and the maintenance of acid-base balance were very important for the patient at the critical point of advanced VOD.
Conclusions CVVH is an integral part of intensive care in hemato-oncology, and the use of this therapeutic modality in patients with VOD and acute renal failure can significantly improve the results of our therapeutic efforts.

Comparison of the efficacy and safety of two regional citrate anticoagulation protocols using acid citrate dextrose A or

Introduction Continuous renal replacement therapy (CRRT) is the treatment of choice for acute renal failure in critically ill patients. Our study compared the efficacy and safety of regional citrate anticoagulation (RCA) with acid citrate dextrose A (ACD-A) against Prismocitrate 10/2 (Gambro). Use of Prismocitrate 10/2 eliminates the need for a separate infusion pump for ACD-A, as it doubles as replacement fluid and anticoagulant. This removes a source of error in the calculation of fluid balance. The combination of tri-sodium citrate and citric acid in Prismocitrate 10/2 avoids the metabolic alkalosis associated with ACD-A [1,2]. Methods This was a prospective sequential cohort study. All patients admitted to the surgical ICU who required CRRT for ARF were recruited. Group A, using ACD-A, was recruited from October 2007 to September 2008 (n = 23). Group B, using Prismocitrate 10/2, was recruited from October 2008 to September 2009 (n = 20). We evaluated the incidence of metabolic alkalosis and other biochemical changes, azotemia control, filter lifespan and complications. Results The incidence of metabolic alkalosis in Group B from treatment day 2 onwards was significantly lower than in Group A. The Group A median pH was 7.42 (6.81 to 7.62) compared with a Group B pH of 7.35 (7.2 to 7.49), P = 0.001. The control of electrolytes and azotemia was not significantly different. The mean filter duration was 58.8 hours (95% CI 38.0 to 79.6) for Group A and 61.8 hours (95% CI 45.8 to 77.8) for Group B (P = 0.678). Longitudinal analysis revealed a statistically significant reduction in metabolic alkalosis for Group B (standard bicarbonate P <0.001, base excess P <0.001). Repeat treatment sessions also showed a statistically significant reduction of metabolic alkalosis using Prismocitrate 10/2 rather than ACD-A (P = 0.029). Conclusions RCA with Prismocitrate 10/2 reduces the incidence of the metabolic alkalosis associated with ACD-A. This regimen is safe and feasible and improved patient safety, with no increase in complication rates. Our unit has now converted to Prismocitrate 10/2 for RCA.

injury (TBI) is common. We hypothesised that the use of such therapies would significantly augment creatinine clearances (CrCl) in this population. Methods Head-injured patients requiring hyperosmolar therapy with 3% or 20% saline solutions and/or norepinephrine infusion for the maintenance of a cerebral perfusion pressure (CPP) >60 mmHg were recruited into the study. Additional management was consistent with local practice and in line with the Brain Trauma Foundation guidelines [1]. An 8-hour CrCl, physiological variables, fluid balance and medications were recorded daily during active management of CPP. A further CrCl was collected just prior to discharge (off CPP therapy) and, if this was elevated, was repeated on the ward. Augmented renal clearance (ARC) was defined as a CrCl >160 ml/minute/1.73 m² for males and >150 ml/minute/1.73 m² for females [2]. Results Twenty consecutive patients were enrolled. The average ICU length of stay was 15 days (95% CI 11 to 18), and time to study entry averaged 2.3 days (95% CI 1.7 to 2.8).
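The ARC definition above refers to a measured creatinine clearance normalized to body surface area; for a timed (here 8-hour) urine collection the standard computation is, with $U_{Cr}$ and $P_{Cr}$ the urinary and plasma creatinine concentrations, $V$ the urine volume collected over time $t$, and $BSA$ the body surface area:

$$CrCl = \frac{U_{Cr} \times V}{P_{Cr} \times t} \times \frac{1.73}{BSA} \quad [\text{ml/minute}/1.73\ \text{m}^2].$$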
All patients received norepinephrine (n = 20), 85% (n = 17) received hypertonic saline, and therapy lasted on average 7.6 days (95% CI 5.6 to 9.5). ARC was demonstrated in 17 (85%) patients at some point during active management of CPP. The mean maximum CrCl was 179 ml/minute/1.73 m² while on CPP therapy (95% CI 159 to 198), returning to a mean CrCl of 111 ml/minute/1.73 m² (95% CI 91 to 131, P <0.001) when measured on the ward. The mean CrCl in the ICU while not receiving CPP therapy was 150 ml/minute/1.73 m² (95% CI 134 to 167, P = 0.03). The mean time to reach peak CrCl while on active treatment was 4.7 days (95% CI 3.0 to 6.4). Norepinephrine use, saline loading, mean arterial pressure and central venous pressure predicted CrCl on the day of measurement. Conclusions ARC is common in head-injured patients receiving active management of CPP and persists even after such therapy is ceased. This has significant implications for appropriate dosing of renally excreted drugs in this setting.

Introduction The kidney is the organ of the human body designed to sense and regulate intravascular volume. The KING (Kidney Instant Monitoring; Orvim, Paderno Dugnano, Italy) is a new device that allows continuous measurement of urinary electrolytes and of the real-time body response to changes in intrathoracic and intra-abdominal pressures and to lung injury. Methods Sixteen pigs (weight 20 ± 3 kg) were anaesthetized, tracheotomized, catheterized and mechanically ventilated. Pigs were ventilated for approximately 10 hours with a TV of 10 ml/kg, RR 15 breaths/minute, FiO2 0.5 and no PEEP. Thereafter the TV was changed and the pigs were divided into two groups: eight pigs with a lower TV and eight pigs with a higher TV. A variable dead space was added to maintain normocapnia. Pigs were mechanically ventilated with the new TV for up to 48 hours. The KING measured urinary output and the urinary concentrations of sodium and potassium every 10 minutes. NaCl 0.9% was infused exclusively. The average urinary concentration of each electrolyte was expressed as the total concentration

Introduction Neutrophil gelatinase-associated lipocalin (NGAL) levels are dramatically increased in urine and plasma from patients with acute kidney injury (AKI). In this study we tested the clinical utility of NGAL in critically ill patients with many confounding conditions, including sepsis. In addition, a new fully automated NGAL assay was validated with a selection of clinical samples from these patients. Methods Plasma and urinary NGAL were monitored (daily to alternate days) in 135 consecutive patients admitted to intensive care. AKI was classified according to the RIFLE criteria. Three patients were excluded because of incomplete data. NGAL was determined with an ELISA kit (BioPorto Diagnostics). Differences between maximal NGAL levels were analyzed nonparametrically by the Kruskal-Wallis test. Data are reported as median (interquartile range) in ng/ml. A fully automated immunoturbidimetric NGAL test (The NGAL Test; BioPorto Diagnostics) was evaluated. The NGAL Test was validated by means of linearity studies with dilutions of calibrator material, determination of the antigen excess zone, and a method comparison with the ELISA kit on 40 urine and 40 plasma samples with NGAL levels from 25 to 2,995 ng/ml as measured by ELISA. Results Sixty-two patients did not have AKI; 15 were classified as risk (R), 11 as injury (I) and 44 as failure (F).
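The R/I/F classes just cited come from the RIFLE consensus criteria; a minimal sketch of the serum creatinine arm of that staging (the urine output and GFR-decrease criteria, and the acute-rise proviso attached to the 4 mg/dl cut-off, are omitted; the function name is illustrative):

```python
# Creatinine arm of RIFLE staging (ADQI consensus), simplified:
# urine output and GFR-decrease criteria are not implemented here.
def rifle_stage(scr_now: float, scr_baseline: float) -> str:
    """Stage AKI from serum creatinine values in mg/dl."""
    ratio = scr_now / scr_baseline
    if ratio >= 3.0 or scr_now >= 4.0:  # Failure (F)
        return "F"
    if ratio >= 2.0:                    # Injury (I)
        return "I"
    if ratio >= 1.5:                    # Risk (R)
        return "R"
    return "no AKI"

print(rifle_stage(2.4, 0.8))  # -> F (threefold rise from baseline)
```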
Urinary NGAL was significantly (P <0.0001) higher in patients with AKI (R 377 (102 to 830), I 519 (321 to 1,845) and F 2,747 (559 to 7,621)) than in patients without AKI (51 (29 to 213)). Plasma NGAL levels were also significantly (P <0.0001) higher in patients with AKI (R 354 (259 to 511), I 546 (331 to 1,106) and F 1,062 (584 to 1,817)) than in patients without AKI (199 (128 to 308)). Linearity studies of The NGAL Test demonstrated a measuring range from 25 to 5,000 ng/ml. No deleterious effect of antigen excess was seen up to a level of 40,000 ng/ml. There was good agreement between the ELISA and The NGAL Test results, with a Pearson correlation coefficient of 0.98 for both urine and plasma samples. Conclusions NGAL is dramatically increased in the urine and plasma of unselected critically ill patients with AKI, and the degree of injury is reflected by the observed levels despite several confounding conditions. NGAL determination may therefore be useful for the diagnosis of renal injury and for monitoring the management of patients admitted to intensive care. Validation of The NGAL Test demonstrates a performance that complies with the expectations for a fully automated clinical laboratory assay, which can thus be a valuable tool for the management of critically ill patients.

Introduction Intra-abdominal hypertension (IAH) is associated with significant mortality in surgical and trauma patients. However, IAH can occur in medical patients as well. Thus, the main objective of this study was to assess whether IAH during the first 24 hours of admission was an independent predictor of 28-day mortality. Methods We conducted a prospective observational study. All patients admitted to the medical ICU were enrolled and underwent intra-abdominal pressure (IAP) measurement via the bladder. The primary outcome was the association between IAH and the 28-day mortality rate. Results Seventy-seven eligible patients were enrolled. The incidence of IAH was 44%. The mean age was 60.52 ± 18.34 years. In terms of 28-day mortality, there was no association between the presence of IAH and mortality outcome. However, regarding ICU mortality, nonsurvivors had a significantly higher mean IAP within 24 hours of admission than survivors: 11.55 (8.0 to 20.3) vs 9.95 (3.0 to 27.0) (P = 0.041). Conclusions Neither the mean IAP nor the presence of IAH within the first 24 hours of admission was an independent predictor of 28-day mortality. Nonetheless, this study demonstrated a difference in IAP between survivors and nonsurvivors with respect to ICU mortality in the medical ICU.

Introduction Intra-abdominal hypertension (IAH) and abdominal compartment syndrome (ACS) have become serious causes of morbidity and mortality in critically ill surgical and medical patients, especially over the past 10 years. IAP has become one of the routinely measured physiologic parameters in critically ill patients and carries prognostic information. In this study, our aim was to observe IAP increase, APB decrease and their clinical manifestations in ICU patients. Methods Eighty-nine ICU patients were included in this study. IAP levels were measured by infusing 25 ml of saline into the urinary bladder, with the symphysis pubis plane taken as the zero point. Patients were divided into groups according to their IAP and APB levels. Clinical follow-up, required medical care and survival were investigated.
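'APB' in this abstract presumably denotes the abdominal perfusion pressure (more commonly abbreviated APP), which falls as IAP rises; by convention, with MAP the mean arterial pressure:

$$APP = MAP - IAP,$$

and the WSACS consensus defines IAH as a sustained IAP ≥12 mmHg (a reference definition, not one stated by the authors).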
Results Of the 89 patients, 36 were diagnosed with IAH and 34 with low APB; 25 patients had both IAH and low APB. In these patients, statistically significant changes in SGOT, SGPT, PO2, urea and creatinine levels were found during the first 2 days of the study. We also found an increased need for positive inotropic and ventilatory support, together with a higher incidence of sepsis and multiple organ failure, in these patients. Mortality rates were strongly related to IAH and low APB levels. Conclusions In critically ill patients, IAP measurement, a very simple and valuable method, must be performed. IAH and low APB levels are indicators of high morbidity and mortality rates. Therefore, IAP measurement may become a routine element of follow-up and survival assessment in critically ill patients.

Introduction Acute liver failure (ALF) is a rare disease with a spectrum of presentations from mild coagulopathy and altered conscious level to multiple organ failure and intracranial hypertension (ICH). A new group, Acute Liver Failure studies in Europe (ALSiE), has initiated, and wishes to further develop, collaboration between centres with clinical expertise in the management of ALF. It aims to establish a pan-European database of clinical and demographic data in ALF and to initiate clinical studies. Methods We describe the experience of 13 centres in seven countries over a 3-month period to 31 March 2009. ALF was defined as an INR >1.5 and encephalopathy in the absence of chronic liver disease. Results are presented as median and interquartile range. Results Eighty-five patients were treated; acetaminophen was the dominant aetiology in the UK (60%) and accounted for 34% in the other centres. At presentation the INR was 3.8 (2.1 to 6.5), hepatic encephalopathy (HE) grade was I (0 to 2) and 20% required pressors. Grade III/IV coma was seen in 64% during the course of illness, and of these 25% developed ICH. Seventy-one per cent required ventilation, 58% renal replacement therapy and 65% pressors. Ninety per cent required management in a critical care environment. Overall survival was 75%; 42 cases fulfilled poor prognostic criteria (PPC), of whom 31 were transplanted, 28 (90%) surviving to hospital discharge. Of the 11 remaining, four survived and seven died. Thirty-two out of 43 who did not fulfil PPC survived. Patients who died were older and had a predominant aetiology of hypoxic hepatitis. Support of one organ or less was associated with 70% survival with medical management alone. Support of two or more organs was required in 55 patients, of whom 27% survived with medical management; the remainder required transplantation or died. Conclusions ALF has a high rate of progression to multiple organ failure, requiring a multidisciplinary approach (critical care, transplant surgery and hepatology) to achieve optimal outcome. The incidence of ICH is 25% in grade III/IV HE. Despite this, outcomes are good.

Introduction Following liver transplantation, hepatic artery stenosis and portal vein obstruction occur in 3 to 12% of patients, and more frequently in children than in adults. Today, the standard of care is daily Doppler ultrasound and liver enzyme assessment. Accordingly, detection of severe hypoperfusion may be delayed. The aim of the present study was to explore whether microdialysis catheters implanted in the left and right liver lobes, with measurement of glucose, lactate, pyruvate and glycerol every 2 hours, could detect vascular complications and rejection.
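The ischemia signal used in the microdialysis study below is the lactate/pyruvate (LP) ratio, which tracks the cytosolic NADH/NAD+ redox state through the lactate dehydrogenase equilibrium:

$$LP = \frac{[\text{lactate}]}{[\text{pyruvate}]};$$

values above roughly 20 to 25 are commonly taken to indicate anaerobic metabolism (an approximate, commonly cited threshold, not one stated by the authors).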
Methods Seventy-three patients undergoing 82 liver transplantations were included. Nine of the patients were children. Two microdialysis catheters were inserted in the liver and one in subcutaneous tissue by a split-needle technique. They were kept in place for as long as they functioned, up to a maximum of 4 weeks. Metabolic parameters (glucose, pyruvate, glycerol and lactate) were collected every 2 hours and measured at the bedside. Results The median age of the patients was 52 years (6 months to 70 years). The median indwelling time for catheters inserted in the liver was 9.5 days, with a range from 0.5 to 26 days. Six patients had hepatic artery stenosis/occlusion, and in five of them lactate increased to values >10 mM, with an LP ratio of several hundred and a concomitant decrease in glucose and increase in glycerol. In one patient (a 6-month-old child) lactate increased only to 2 mM and the LP ratio to 20. On the basis of the pathological metabolic values, all of these patients underwent immediate reoperation, with removal of blood clots and reanastomosis of the artery. In the child with low lactate values, arterial flow was less than 10% of total hepatic blood flow despite reanastomosis; thus, the exceptionally high portal flow delivered enough oxygen to prevent major ischemia despite very low flow in the artery. Thirteen patients had rejection verified by biopsy, and in six patients anti-rejection therapy was given based on liver function tests and clinical judgement. All patients had some increase in lactate during the period of rejection, but in all but four patients the increase was only 1 mM; in the remaining four it was 3 to 4 mM. Conclusions By using microdialysis catheters to measure metabolic parameters, hepatic artery occlusions can be detected very rapidly. Patients with rejection show a small, but significant, increase in lactate.

Introduction Liver dysfunction in critically ill patients represents a major concern. Many drugs used in the ICU have been associated with hepatotoxicity. Hepatotoxicity presents in three distinct patterns: cholestatic (alkaline phosphatase (AP) ≥2 x ULN and ratio (ALT/ULN)/(AP/ULN) ≤2), hepatocellular (ALT ≥3 x ULN and ratio ≥5) and mixed (AP ≥2 x ULN, ALT ≥3 x ULN and ratio between 2 and 5). No published studies have assessed the drug-induced cholestatic pattern of hepatotoxicity in the ICU. The aim of this study was to assess whether the use of pharmacological classes previously associated with cholestasis is associated with an increased risk of pure or mixed cholestasis in the ICU. Methods A nested case-control study assessed the potential association between use of specified pharmacological classes and cholestasis. Cases were identified from a cohort of patients admitted for ≥24 hours in whom at least one AP value <240 IU/l had been obtained in the 72 hours following admission. We excluded patients with an identified cause of cholestasis, patients with bone metastases, and pregnant women. Each case subject was matched to a control subject based on age, gender, length of ICU stay and admission year. Exposure to antiepileptics, penicillins, cephalosporins, carbapenems, macrolide antibiotics and parenteral nutrition was recorded and included in a multivariate conditional logistic regression analysis together with known risk factors.

Introduction Energy requirements of critically ill septic shock patients treated in the ICU are particularly difficult to determine.
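Several of the abstracts that follow benchmark energy intake against the Harris-Benedict basal energy expenditure (BEE). For reference, the classic equations (weight $W$ in kg, height $H$ in cm, age $A$ in years; result in kcal/day) are:

$$BEE_{men} = 66.5 + 13.75\,W + 5.003\,H - 6.755\,A,$$
$$BEE_{women} = 655.1 + 9.563\,W + 1.850\,H - 4.676\,A.$$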
Recent research shows that energy expenditure (EE) in these patients may be lower than previously believed. EE values in this group are affected by the severity of the disease process as well as by the treatment administered (sedation, inactivity of skeletal and respiratory muscles, control of fever). The purpose of this study was to establish the influence of septic shock on EE in sedated and mechanically ventilated patients treated in the ICU. Using indirect calorimetry, we assessed EE in these patients and compared the results with basal energy expenditure calculated according to the Harris-Benedict equation (BEE) and with EE values in patients under general anesthesia. Methods Two groups of patients were studied with regard to EE measured by means of indirect calorimetry. Group I consisted of 50 critically ill patients treated in the ICU for septic shock; they were evaluated using the APACHE II and SOFA scores, taking into account 28-day mortality. EE was measured continuously over the first 24 hours of treatment by means of the Datex-Ohmeda M-COVX indirect calorimeter. Group II comprised 50 patients (ASA I and II) whose energy expenditure was measured under general anesthesia during surgical treatment of disc herniation. These measurements were taken with the Datex-Ohmeda E-CAiOVX indirect calorimeter, adapted to work in an atmosphere containing anesthetic gases. Conclusions A decrease in the total metabolic rate of patients with severe sepsis and septic shock can constitute a prognostic indicator of heightened death risk in this group of ICU-treated patients.

were excluded from the study. Onset time and occurrence of pneumonia were extracted from the hospital's database; patients who died during the first 48 hours were then excluded. Patients staying for 2 days or longer in the ICU were divided into two groups (with or without a nasogastric tube (NGT)) and analyzed separately. Results One hundred and fifty-six patients were analyzed. CT scans showed an NGT in 30 patients (one malpositioned in the mid-esophagus). Gastric gas, liquid/solid and total volumes were not different in patients with and without an NGT (Z test; gas: 54 ± 147 vs 95 ± 168 ml, P = 0.179; liquid/solid: 226 ± 282 vs 199 ± 237 ml, P = 0.626; total: 280 ± 311 vs 295 ± 320 ml, P = 0.824). Twenty out of 153 patients developed pneumonia in the first 7 days, five of whom had an NGT and 15 did not (chi-squared, P = 0.459). There was no difference in gastric residuals between these patients and the 133 others (Mann-Whitney; Q1-median-Q3; gas: 11-25-126 vs 14-43-90 ml, P = 1; liquid/solid: 16-63-432 vs 22-115-290 ml, P = 0.799; total: 29-252-661 vs 68-186-371 ml, P = 0.873). Seventy-three patients were admitted to the ICU, of whom 23 had an NGT and 20 developed pneumonia. There was no difference between the gastric residuals of the 23 who had an NGT and the 50 others. Pneumonia developed in the first 3 days in 1/23 patients with an NGT vs 9/50 without (chi-squared with Yates' correction, P = 0.226), and in the first 7 days in 5/23 patients with an NGT vs 15/50 without (chi-squared, P = 0.462). There was no difference in gastric residuals between patients who developed early pneumonia (within the first 3 and 7 days) and those who did not, and no difference in ICU length of stay (10.8 vs 12.7 days, P = 0.825) or in onset time of pneumonia (4.2 vs 3.3 days, P = 0.234). Conclusions Our results suggest that gastric volume is high at admission in trauma patients, irrespective of the presence of an NGT.
In this study, pneumonia incidence was related neither to high gastric volume nor to NGT use.

McWhirter and Pennington stated in 1994 that 40% of patients are malnourished on admission to hospital [1]. Malnutrition increases morbidity and mortality in the ICU. Early feeding improves outcome, length of stay and septic complications. Methods We aimed to assess the standard of nutritional practice and to measure caloric achievement and caloric debt in the first 7 days of admission of critically ill patients. The European Society of Parenteral and Enteral Nutrition guidelines on enteral nutrition were set as the standard [2]. Between April and August 2009, all adult patients not fed orally within 3 days of admission to the ICU were included. The caloric target was 25 kcal/kg/day. All impediments to achievement of feeding targets were recorded. Results Among 30 patients (16 surgical and 14 medical), 11 patients were fed early (within 24 hours of admission) and 19 patients were fed later than 24 hours after admission. Twenty-three patients were fed by nasogastric tube and three patients received small bowel feeding. Parenteral nutrition (PN) was used alone in two patients and as a supplement to enteral nutrition (EN) in one patient. Conclusions Mean caloric achievement by day 7 was only 58.4% (Figure 1), well below target. The following recommendations have now been made to improve nutritional practice: a higher threshold for residual gastric volumes has been adopted; earlier aggressive use of prokinetic agents is recommended; strategies for earlier small bowel access are being pursued; and PN is now considered within 24 hours if EN is contraindicated, or as a supplement to EN if the caloric target is not reached after 48 hours.

Introduction Nutritional screening and nutritional support have an important role in preventing malnutrition, which worsens prognosis and increases the morbidity and mortality of patients. We aimed to retrospectively assess the patients consulted and followed by the nutritional support team over a 2-year period. Methods Demographic characteristics, subjective global assessment (SGA) scores, and the type, route, duration and complications of nutritional support were assessed. Results A total of 379 patients were consulted during the 2-year period. Two hundred and two (53.3%) were male and 177 (46.7%) were female, and the mean age was 63.5 ± 17.9 years (range 7 to 96 years). SGA scores were A in 35.1% (n = 133), B in 26.6% (n = 101) and C in 38.3% (n = 145). Combined parenteral and enteral nutritional support was administered in 31.9% (n = 121) of the patients, parenteral only in 29.6% (n = 112), enteral only in 25.6% (n = 97) and oral nutritional support in 12.9% (n = 49). Combined parenteral and enteral nutritional support was applied via nasogastric tube and peripheral parenteral routes in 54.5% (n = 66), nasogastric tube and central parenteral in 29.8% (n = 36), oral and central in 5.8% (n = 7), oral and peripheral parenteral in 6.6% (n = 8), nasojejunal and peripheral parenteral in 1.7% (n = 2) and gastrostomy and peripheral parenteral in 1.7% (n = 2). Parenteral nutritional support alone was applied via the peripheral route in 62.5% (n = 70) and the central route in 37.5% (n = 42). Enteral nutrition was most commonly applied by nasogastric tube, in 87.6% (n = 85), and by gastrostomy tube in 10.3% (n = 10). Nutritional support was applied for 18.2 ± 23.7 days (range 1 to 306 days).
The most common complications were constipation (7.4%), nasogastric tube displacement or obstruction (5.2%), diarrhea (2.6%), thrombophlebitis (3.4%), aspiration (1.8%) and hyperglycemia (1.6%). Conclusions Nutritional screening should be a component of the physical examination in both hospitalized patients and outpatients. Are we feeding our critically ill patients appropriately? Introduction Nutrition plays a significant part in the overall treatment plan for critically ill patients. In India, practices vary, including the use of kitchen feeds and/or commercial formula and the predominant use of bolus feeds. The multidisciplinary team in our critical care unit (CCU) includes a senior dietician, and we wanted to review whether our standardized practice of routine screening for malnutrition using subjective global assessment, early use of nutrition (preferably within 24 hours) and continuous enteral feeding using commercial formula helped achieve nutritional goals. Methods A retrospective chart review was conducted on 508 patients who received continuous enteral feeding on day 1 and day 5 of the ICU stay. Information on calories prescribed (using the Harris-Benedict equation) and delivered on day 1 and day 5 was collected. Achieving the nutritional goal was defined as successful delivery of 90% of prescribed calories.
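As a minimal sketch of the goal definition above (function names and the example patient are hypothetical; the study's own data handling is not described in the abstract):

# Hypothetical sketch: prescription via the Harris-Benedict equation and the
# 90%-of-prescribed-calories definition of achieving the nutritional goal.

def harris_benedict_kcal(weight_kg, height_cm, age_y, male):
    """Basal energy expenditure, original Harris-Benedict coefficients."""
    if male:
        return 66.47 + 13.75 * weight_kg + 5.003 * height_cm - 6.755 * age_y
    return 655.10 + 9.563 * weight_kg + 1.850 * height_cm - 4.676 * age_y

def goal_achieved(delivered_kcal, prescribed_kcal, threshold=0.90):
    """True if at least 90% of the prescribed calories were delivered."""
    return delivered_kcal >= threshold * prescribed_kcal

# Example: a 70 kg, 170 cm, 60-year-old man is prescribed ~1,474 kcal/day;
# a delivery of 1,200 kcal/day (~81%) falls short of the 90% goal.
prescription = harris_benedict_kcal(70, 170, 60, male=True)
print(round(prescription), goal_achieved(1200, prescription))  # 1474 False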
Introduction Early enteral nutrition (EN) is the preferred strategy for feeding the critically ill; however, it is not always possible to achieve sufficient calories and protein with EN alone. The use of supplemental parenteral nutrition (PN) has been advocated as a strategy to avoid the complications associated with the protein/calorie deficit from inadequate enteral feeding. The purpose of this study was to evaluate the effect of this practice on nutritional and clinical outcomes. Methods An international, observational study conducted in 2007 and again in 2008 examined nutrition practices in ICUs. Eligible patients were mechanically ventilated, remained in the ICU for >72 hours and received early EN within 48 hours of admission. Data were collected on patient characteristics and daily nutrition practices for up to 12 days. Patient outcomes were recorded after 60 days. We compared the outcomes of patients who received early EN alone, early EN + early PN, and early EN + late PN (after 48 hours of admission). Regression analyses were conducted to determine the effect of increasing age, APACHE score, days in hospital prior to ICU admission, gastrointestinal dysfunction, nutritional strategy and other baseline variables, and their relationship to being alive and discharged within 60 days. Introduction Liver function disturbances have been of concern in parenteral nutrition. The aim of these post hoc analyses of two pooled studies was to compare liver function parameters using fish-oil-containing vs soybean oil lipid emulsions. Methods Two prospective, controlled, randomized, parallel-group, double-blind, multicenter studies compared SMOFlipid 20%, a fish-oil-containing lipid emulsion (SMOF: soybean oil 60 g, medium-chain triglycerides 60 g, olive oil 50 g, fish oil 30 g per liter), vs a standard soybean oil emulsion (SO: 200 g soybean oil/l). The studies were as follows: A, postsurgical adult patients, 5 days of total parenteral nutrition, 100 SMOF vs 103 SO patients; B, adult patients receiving parenteral nutrition for 28 days, 22 SMOF vs 32 SO patients. Patients with data at baseline and study end were selected. The data were pooled, and the differences of laboratory data at 1 week minus baseline and at study end minus baseline were calculated. Analyses of variance were applied using the differences between 1-week/study-end values and baseline values of bilirubin (BIL), AST and ALT as dependent variables and treatment group as the independent variable. Covariates used were the baseline values of BIL, AST and ALT and the mean daily dose of fat per kg body weight. Results Baseline values were not significantly different between the two treatment groups. The mean daily intake of fat/kg bw after 1 week and at study end was the same in both treatment groups. One-week-minus-baseline changes of liver parameters (mean; SMOF group: BIL -5.1* μmol/l, AST -5.4** U/l, ALT 0.6*** U/l; SO group: BIL -2.0* μmol/l, AST 0.6** U/l, ALT 11.1*** U/l) were significantly different between treatment groups (P: 0.035*, 0.001**, 0.001***). Extending the analysis until 28 days did not alter the results. Conclusions Infusion of SMOFlipid 20% compared with a standard soybean oil lipid emulsion exerts a significant decrease in the values of BIL and AST, and significantly attenuates the rise seen in ALT. These results indicate that SMOFlipid 20% is promising in preventing parenteral nutrition-induced liver disturbances.
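A minimal sketch of the analysis of covariance described in the Methods above, assuming a pooled one-row-per-patient table; the column names and values are illustrative, not study data:

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical pooled table: change in bilirubin (1 week minus baseline),
# treatment group, baseline bilirubin and mean daily fat dose per patient.
df = pd.DataFrame({
    "delta_bil": [-5.0, -4.1, -6.2, -4.8, -1.8, -2.3, -1.5, -2.6],
    "group": ["SMOF", "SMOF", "SMOF", "SMOF", "SO", "SO", "SO", "SO"],
    "baseline_bil": [14.0, 12.5, 15.1, 13.6, 13.2, 14.8, 12.9, 14.1],
    "fat_g_per_kg": [1.0, 1.2, 0.9, 1.1, 1.1, 1.0, 1.2, 0.9],
})

# ANCOVA: change from baseline as dependent variable, treatment group as the
# factor of interest, baseline value and fat dose as covariates.
model = smf.ols("delta_bil ~ C(group) + baseline_bil + fat_g_per_kg", data=df).fit()
print(model.params)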
Introduction We hypothesized that the ratio between proinflammatory and anti-inflammatory cytokines in the peritoneal cavity and the systemic circulation has diagnostic significance in patients with abdominal sepsis, and we therefore tried to correct this imbalance with intravenous and enteral glutamine supplementation. Methods Prospective controlled randomized investigation of patients with abdominal sepsis (excluding pancreatitis). Group 1 (n = 16): standard therapy. Group 2 (n = 11): with intravenous infusion of glutamine (Dipeptiven; Fresenius Kabi, Germany). Group 3 (n = 11): with intravenous infusion of glutamine and enteral supplementation of glutamine (Intestamine; Fresenius Kabi). The groups were identical in severity of disease (APACHE II), standard of intensive care and volume of surgical care. We investigated the proinflammatory cytokines TNFα, IL-1, IL-6 and IL-8, and the anti-inflammatory cytokines soluble sTNF-RI and sTNF-RII, IL-1 receptor antagonist and IL-10, in blood serum and in the peritoneal cavity during the initial 3 days of intensive care (ELISA; BD Biosciences Pharmingen, San Diego, CA, USA; EASIAs; Biosource, Nivelles, Belgium). Results The probability of survival on day 28 was 73% in the standard therapy group, 78% in the intravenous glutamine group and 84% in the intravenous plus enteral glutamine group. No decrease in the duration of respiratory support was observed in any of the groups. The duration of acute intestinal injury differed significantly (49 hours in the standard group vs 38 hours with intravenous glutamine vs 35 hours with intravenous plus enteral glutamine). In the control group, proinflammatory cytokine concentrations prevailed in the peritoneal cavity and in blood serum according to the molar coefficient, and the molar coefficient correlated positively with the SOFA score. In group 2 (intravenous glutamine), the molar coefficients decreased towards a prevalence of anti-inflammatory cytokines (in serum on day 3, in the peritoneal cavity on day 2). In group 3 (intravenous and enteral glutamine), we observed a significant decrease of all cytokine levels in blood serum and in the peritoneal cavity. Conclusions Intravenous and enteral supplementation of glutamine improved the cytokine balance in blood and the peritoneal compartment. Introduction Glycemic control is mandatory in the critically ill, because hypoglycemia and hyperglycemia are associated with increased mortality [1]. In a prospective observational study, we compared three new point-of-care devices with the hexokinase reference method and evaluated whether their results would modify insulin titration. Methods Arterial blood glucose was simultaneously measured by the blood gas analyser RapidLab 1265, by three glucose meters (Accu-Chek Performa, Precision XceedPro, Nova StatStrip), and with the hexokinase reference method. All measurements were performed in duplicate and the average value of each was computed. Bland-Altman, Passing-Bablok, Kanji [2] and modified Kanji approaches were performed. Biases were expressed as the glucose result of the point-of-care method minus the reference method. We evaluated the theoretical impact on insulin titration by comparing glucose meter results with the hexokinase reference method on a dynamic sliding scale targeting a glycemia of 80 to 130 mg/dl. Results A total of 156 matched analyses were done in 80 patients. The mean flash SOFA score was 4.5. The range of the reference glucose was 25 to 327 mg/dl. The numbers of insulin-dosing discrepancies were, respectively, 8-5-6-14 at 0.1 U/hour, 0-6-3-6 at 0.2 U/hour and 2-3-0-0 at 0.3 U/hour; none was greater than 0.3 U/hour. Regarding the point-of-care results, the total theoretical insulin dose changes were, respectively, -11.7 U, -19.2 U, -22.2 U and +4.9 U for all these measurements (device order as in Methods). Table 1 presents the standard comparisons.
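A minimal sketch of the Bland-Altman part of this method comparison (the paired values are invented for illustration; bias is computed as the point-of-care result minus the reference, as in the abstract):

import numpy as np

# Hypothetical paired measurements (mg/dl): glucose meter vs hexokinase reference.
meter = np.array([92.0, 110.0, 145.0, 78.0, 210.0, 63.0])
reference = np.array([90.0, 115.0, 140.0, 80.0, 205.0, 60.0])

# Bland-Altman: bias = mean difference; limits of agreement = bias +/- 1.96 SD.
diff = meter - reference
bias = diff.mean()
sd = diff.std(ddof=1)
print(f"bias = {bias:.1f} mg/dl, "
      f"limits of agreement = {bias - 1.96 * sd:.1f} to {bias + 1.96 * sd:.1f}")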
Introduction Chronic critical illness is characterized by a severe metabolic disorder, caused by the loss of physiologic hypothalamic and pituitary functions and the onset of the so-called wasting syndrome, as many studies have confirmed [1]. In this study we assessed the correlation between the neuroendocrine pattern of chronic ICU patients and mortality. Methods Twenty-five patients were enrolled (18 male and five female), with a mean age of 67 years and an APACHE II score of 12 ± 5. We excluded premenopausal females and patients with previous endocrine problems or in therapy with dopamine, high-dose cortisone or amiodarone. In these patients we evaluated, on the seventh day of ICU stay (considered the chronic phase of critical illness), the mean value of four nocturnal hormonal measurements (LH, FSH, estrogen, testosterone, DHEAS, androstenedione, androstenediol, progesterone, TSH, FT3, FT4, RT3, GH, IGF1, prolactin, cortisol). Furthermore, we observed the hormonal patterns of patients who died in the ICU. Results We found a statistically significant correlation between high estrogen levels, related to increased aromatase function [2], and mortality. In the group of patients with estrogens above 50 pg/ml, six died, while among those with normal or low estrogens, none died (Figure 1). Survivors and nonsurvivors did not differ by mechanism of injury or APACHE score (11.6 vs 11.8). Mean serum estradiol levels were 44.96 pg/ml in survivors and 115.08 pg/ml in nonsurvivors. Conclusions To date, the role of these hormones in the pathophysiology of critical illness and the increase of estrogen levels remain uncertain. Further studies are required. Introduction Patients with severe burn injury experience a rapid elevation in multiple circulating proinflammatory cytokines, with the levels correlating with both injury severity and outcome. Accumulation of these cytokines has been observed in remote organs in animal models; however, data are lacking regarding early serial heart cytokine levels following burn injury and the therapeutic effects of estrogen on these levels. Using an animal model, we studied the acute effects of a full-thickness third-degree burn on cardiac levels of IL-1β, IL-6, IL-10 and TNFα. In addition, we analyzed the effect of estrogen on levels of cytochrome C, signifying apoptosis, and levels of NF-κB, signifying inflammation. Here, we hypothesized that acute estrogen treatment decreases inflammation in the heart and blocks the induction of apoptosis. Methods In this study, 144 male rats received third-degree 40% total body surface area burns. Fifteen minutes following burn injury, the animals received a subcutaneous injection of either placebo or 17β-estradiol (0.5 mg/kg). The hearts were harvested at 0.5, 1, 2, 4, 6, 8, 12, 18 and 24 hours after injury, and heart cytokine (IL-1β, IL-6, IL-10, TNFα) levels were measured by ELISA. In addition, we assessed the cytosolic levels of cytochrome C and NF-κB at the 24-hour time point using western blot analysis. Results In the burned rats, 17β-estradiol significantly decreased the cardiac levels of TNFα (~95%), IL-6 (~50%), IL-1β (~25%) and IL-10 (~20%) when compared with the placebo group. In addition, estradiol treatment restored cytosolic levels of NF-κB (65% increase) at the 24-hour time point, and estrogen decreased the cytosolic accumulation of cytochrome C (50% reduction) at the same time point. Conclusions Following severe burn injury, estrogens decrease cardiac inflammation and levels of the pro-apoptotic cytochrome C. In addition, estrogen signaling promotes cell survival, as indicated by the increase in NF-κB levels. Importantly, estrogen treatment following burn injury nearly ablated the deleterious burn-induced increase of TNFα in the heart, to levels similar to those of unburned control animals. Elevated cardiac levels of TNFα have previously been linked to a poor outcome. Introduction Glucocorticoids are known to have an anti-insulin action on glucose metabolism, leading to increased lactate production [1]. Alternatively, glucocorticoid-induced apoptosis is a well-recognized phenomenon initiated by mitochondrial dysfunction [2], and increased lactate production follows the loss of mitochondrial membrane potential during apoptosis. Both mechanisms may lead to clinically relevant hyperlactatemia following glucocorticoid administration during cardiac surgery requiring cardiopulmonary bypass. Methods All adult patients undergoing cardiac surgery with cardiopulmonary bypass from 5 October through 21 November 2005 in a large academic teaching hospital in Rotterdam, the Netherlands were included in this study. Dexamethasone (60 to 80 mg) was given perioperatively at the discretion of the anesthetist. Lactate levels were measured within 1 hour postoperatively. The association between dexamethasone treatment, serum lactate levels and possible confounders was evaluated using ANOVA and linear regression. Results A total of 82 patients 18 years and older underwent cardiopulmonary bypass for cardiac surgery in the above-mentioned period.
Three patients undergoing heart or lung transplantation, who thus received methylprednisolone, were excluded; a further two patients who received hydrocortisone for allergic reactions were also excluded. Data were incomplete for one patient, leaving a total of 76 patients for analysis. The mean lactate level was 1.2 mmol/l in the 47 patients who did not receive steroids and 2.6 mmol/l in the 29 patients who received dexamethasone (P <0.0001, Figure 1). When adjusting for potential confounders such as age, glucose level, duration of cardiopulmonary bypass and preoperative NYHA heart failure classification, this difference remained significant (P <0.0001). Conclusions Administration of dexamethasone during cardiac surgery requiring cardiopulmonary bypass is associated with significant hyperlactatemia. Introduction We previously observed that critically ill patients are vitD deficient, a status that cannot be restored by increasing the normally recommended intravenous vitD dose. The antimicrobial peptide LL-37 is expressed in macrophages upon stimulation by 25OHD3. sCD163 is a soluble pattern-recognition receptor, a marker of macrophage activation and a predictor of mortality in SIRS. The aim of this study was to unravel the impact of 25OHD3 deficiency and its normalization on the innate immunity parameters CRP, LL-37 and sCD163 in critical illness. Methods We randomly allocated patients upon ICU admission to receive either placebo (n = 13) or a 10-day treatment of 15 μg/day 25OHD3 intravenously after a loading dose of 200 μg intravenously (n = 11), on top of the normal vitD dose (200 IU). Patients with chronic bone or kidney disease, glucocorticoid treatment, age younger than 18 years, or an anticipated ICU length of stay of less than 10 days were excluded. On admission, patients were compared with healthy age-matched, gender-matched and BMI-matched controls (n = 24). Results Administration of the 25OHD3 regimen resulted in a rapid and sustained increase of serum 25OHD3 levels (P <0.05 vs placebo). 1,25(OH)2D3 levels rose only transiently (P = 0.07 on day 1, P = 0.08 on day 2 vs placebo). 25OHD3 and 1,25(OH)2D3 levels were lower than those of healthy controls (sampled during summer). 25OHD3 treatment had no significant effect on CRP. On admission, LL-37 levels (26 ± 6 μg/l) were comparable with those of healthy controls (22 ± 2 μg/l). The treatment tended to transiently increase LL-37 levels (change from baseline: P = 0.1 on day 6 vs placebo). Serum sCD163 levels tended to be higher at baseline (P = 0.06 vs healthy controls) and rose significantly over time (P <0.05 on day 5 and on day 10 vs day 0), but were not influenced by the treatment. Conclusions The preliminary results of this randomized placebo-controlled pilot study indicate that, in contrast to a high intravenous 500 IU vitD supplement, intravenous 25OHD3 treatment resulted in an immediate and persistent increase in serum 25OHD3, but did not affect 1,25(OH)2D3 levels. The plasma LL-37 alterations in these prolonged critically ill patients hint at ameliorated LL-37 production provided that 25OHD3 status is restored. Serum sCD163 and CRP levels evolved during the ICU stay but were not significantly altered by 25OHD3 supplementation. The results of this pilot study warrant further research into the potential modulation of innate immune processes by 25OHD3 in the critically ill.
Introduction Critical illness is characterized by lean tissue wasting, whereas adipose tissue is preserved. Obese critically ill patients may have a lower risk of death than nonobese patients, a recent observation that may suggest a protective role for adipose tissue during illness, a role that has not been previously investigated. We hypothesized that adipose tissue could function as a waste bin for potentially toxic metabolites, such as glucose, during critical illness. Methods We studied adipose tissue biopsies from 61 critically ill patients, examining morphology and the potential to take up and metabolize glucose, and compared them with biopsies from 20 matched controls. Results Adipose tissue biopsies from critically ill patients revealed a higher number and a smaller size of adipocytes compared with matched controls, coinciding with increased preadipocyte marker levels. Also, >95% of adipose biopsies from critically ill patients displayed positive macrophage staining. Gene and protein expression of insulin-independent GLUTs and tissue glucose content were increased. Glucokinase expression was upregulated, whereas glycogen and glucose-6-phosphate levels were low. Acetyl-CoA-carboxylase protein level and fatty-acid synthase activity were increased. A substantially increased activity of AMPK may play a crucial role. Conclusions The larger number of small adipocytes that develops in response to critical illness appears to have an increased ability to take up glucose and metabolize it into fatty acids. Such changes may render adipose tissue biologically active as a functional waste bin for toxic metabolites during critical illness, which could contribute to survival. Introduction Good communication with patients' relatives has humanitarian, professional and medico-legal benefits. The NHS Litigation Authority requires clear documentation of the information given, and monitoring that this is satisfactory [1]. This review was performed to ascertain relatives' opinions regarding the current standard of communication received from medical staff. Methods A paper questionnaire was made available to relatives after discussion with a member of the medical staff. Data for the first month are discussed below, based on an average of 50 admissions/month. The questionnaire was available at the main reception desk, and staff also offered it to relatives after each professional contact. Closed questions with single answers chosen from five options (including one neutral option) were asked, along with space for free comment. Results Eighteen questionnaires were completed; to our knowledge no relatives refused completion. Thirteen of 18 discussions took place between 8 am and 6 pm, Monday to Friday, three between 8 pm and 6 am on weekdays, and two at the weekend. All 18/18 respondents felt the doctor had explained their role in care. Thirteen of 18 discussions took place in the dedicated interview room, four at the bedside and one by telephone. Seventeen of 18 rated the service as very good for information given, clarity and the effect of the exchange on overall impressions of care; one questionnaire rated the service as good. Conclusions Communication with relatives is highly subjective but is known to influence the impressions formed of overall quality of care. National audit frameworks emphasise objective quantifiable standards, which do not capture the quality of interaction [2].
We monitored service quality based on relatives' subjective impressions; however, the response rate (<40%) carries the potential for a positive bias in our favour. Many discussions are complex, and questionnaires may not have been offered in certain situations. Additionally, despite full anonymity of the form, relatives may have been concerned that what they wrote could affect their family member's care. It is known that benefits in the processes of clinical care leading to improvements in quality can arise purely from measurement; that is, from performing service reviews such as this one [3]. If targets are to be used to benchmark quality of care in this area, there is a need for more consideration of the appropriate targets, or alternatively an acceptance that data capture in these areas may be lower. Introduction Patient satisfaction data collation is a priority for the UK's Care Quality Commission. In the adult intensive care unit (AICU), patients are often sedated for prolonged periods, so family satisfaction data may be useful to gauge the quality of service delivery. The Family Satisfaction-ICU (FS-ICU 34) questionnaire was developed and validated in the USA [1]. To our knowledge, we are the first group to use it in the UK. Methods The FS-ICU 34 is an anonymised questionnaire; we adapted the American terminology for use in the UK. Inclusion criteria were an AICU admission of 5 days or more and the presence of a next-of-kin. Patients who died were excluded. The questionnaire was handed personally to AICU patients' next-of-kin following discharge. Nursing staff from both the AICU and the general wards assisted in returning completed questionnaires. Key areas of questioning were: perception of the treatment of patient discomfort; coordination of AICU services; skill and competencies of AICU staff; consistency and frequency of communication; standard of family facilities; and emotional support. Relatives were asked to grade their answers on a five-point scale (excellent to poor). Results Data from the first 4 months of this survey have now been analysed. One hundred per cent of patients' relatives who fulfilled the inclusion criteria received a questionnaire. The response rate was 68%. The majority of respondents were satisfied with overall care and decision-making. Similarly to published data [2], families were most satisfied with nursing skill and competence (94.7% satisfied), and least satisfied with the waiting room atmosphere (42.4%) and the frequency of communication with doctors (71.2%). Conclusions The FS-ICU has enabled us to identify and target resources at key aspects of our service delivery. It has presented an opportunity for us to address misunderstandings and misconceptions regarding the ICU within our client base. This survey is unique in examining the links the ICU has with the community and informs the process of understanding that relationship. Methods A 16-item questionnaire on various issues of consciousness was presented to attendees at conferences in Europe. Data were obtained from 3,672 respondents (mean age 36 ± 16 years, range 14 to 88; 55% women; 34 EU countries) and were analyzed with SPSS v. 16.0. Results Sixty-seven percent (n = 2,454) agreed with withdrawal of artificial nutrition and hydration (ANH) in the chronic vegetative state (VS) (31%, n = 1,138 disagreed; 2%, n = 80 no response). Significant agreement was expressed by nonreligious respondents (vs religious; B = 0.70, P <0.001) and nonmedical professionals (vs doctors; B = 0.34, P = 0.001). Significant disagreement was expressed by women (vs men; B = -0.25, P = 0.004), central and south Europeans (vs northern; B = -0.85, P <0.001 and B = -1.23, P <0.001, respectively) and respondents of higher age (B = -0.008, P = 0.01). Eighty percent (n = 2,956) would not wish to be kept alive if they themselves were in a permanent VS (18%, n = 625 wished to stay alive; 2%, n = 64 no response). Seventy-eight percent considered that being in a permanent VS is worse than death for the patient's family (55% considered it worse than death for the patients themselves). Sixty-nine percent (n = 2,523) disagreed with ANH withdrawal in the chronic minimally conscious state (MCS) (29%, n = 1,073 agreed; 2%, n = 76 no response). Significant disagreement was expressed by central and south Europeans (vs northern; B = -0.58, P <0.001 and B = -1.3, P <0.001, respectively) and respondents of higher age (B = -0.007, P = 0.019); significant agreement was expressed by nonreligious respondents (vs religious; B = 0.65, P <0.001). Sixty-four percent (n = 2,355) would not wish to be kept alive if they themselves were in a permanent MCS (34%, n = 1,248 wished to stay alive; 2%, n = 69 no response). Forty percent considered that being in a MCS is worse than a VS for the patient's family (50% considered it worse than a VS for the patients themselves). Conclusions These findings raise important ethical issues concerning our care for patients with chronic disorders of consciousness. In light of the high rates of diagnostic error in these patients [2], the necessity for adapted standards of care is warranted.
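The B values reported above are SPSS regression coefficients; as a minimal sketch, here is an analogous logistic regression in Python (simulated respondents and illustrative variable names, since the authors' exact model specification is not given in the abstract):

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated respondent-level data: agreement with ANH withdrawal (0/1)
# modelled on religiosity, gender and age, loosely mirroring the analysis.
rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "nonreligious": rng.integers(0, 2, n),
    "female": rng.integers(0, 2, n),
    "age": rng.integers(18, 80, n),
})
true_logit = -0.5 + 0.7 * df["nonreligious"] - 0.2 * df["female"] - 0.01 * df["age"]
df["agree"] = rng.binomial(1, 1 / (1 + np.exp(-true_logit)))

model = smf.logit("agree ~ nonreligious + female + age", data=df).fit(disp=False)
print(model.params)  # the B coefficients of such an analysis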
Introduction The aim of this study was to demonstrate improvements in both staff and patient experiences of end-of-life care. The ICU is a setting where death is common; it has been suggested that 20% of patients in the USA die on the ICU [1]. Given that the majority of ICU deaths involve the withholding or withdrawal of treatment [2], the importance of end-of-life care is clear. Despite this frequency, studies suggest that the current quality of end-of-life care on the ICU is suboptimal [3,4]. As a result, we developed a new framework to address this issue. Methods We introduced our new framework over a 1-year period and circulated questionnaires to staff before and after the study to demonstrate any improvements in end-of-life care. Results Our framework was found to be helpful by 97% of respondents and was associated with an improvement in communication and knowledge of end-of-life care. We found an increase in the number of staff who felt that patients, along with having their analgesia/sedation needs met, were now experiencing care more conducive to a good quality of dying. The increase in the number of staff who felt confident in managing withdrawal of care approached statistical significance. See Figure 1 for details. Conclusions The quality of end-of-life care was improved with our new framework; however, further research is vital to ensure that our patients receive the same kind of evidence-based medicine in their final hours as they did during their acute illness. Introduction The acknowledgement of local practice with respect to end-of-life decisions, in accordance with laws and ethical principles, is essential for intensive care physicians in all countries. The first step towards the required social dialogue is to survey local customs so that they can be harmonised with ethical and legal regulations, as well as with the interests of physicians and patients. Methods In 2007 we performed the first Hungarian survey with the purpose of learning more about the local practice of end-of-life decisions.
Questionnaires were sent out electronically to 743 registered members of the Hungarian Society of Anaesthesiology and Intensive Care. Respecting anonymity, we statistically evaluated 103 replies (response rate 13.8%) and compared them with data from other European countries. Results As expected, the replies showed that the practice of Hungarian intensive care physicians is very paternalistic, and that this is promoted by legal regulations of a similar character. Intensive care physicians generally make their decisions alone (3.75/5 points), without respecting the opinion of the patient (2.57/5 points), the relatives (2.14/5 points) or other medical personnel (2.37/5 points). Furthermore, they prefer not starting a therapy over withdrawing an ongoing treatment. Nevertheless, the frequency of end-of-life decisions (3 to 9% of ICU patients) is similar to that in other European countries. Conclusions Hungarian intensive care physicians make end-of-life decisions routinely. They usually decide based on their own opinion, giving only slight consideration to the opinions of nursing personnel, patients or their relatives, and they are not supported in these decisions by Hungarian legal regulation. Although the living will and the advance directive are both acknowledged, they are not as widespread as required. Our study is the first step towards commencing the social dialogue needed for the evolution of end-of-life decision-making procedures.
