Checking
Kinds of Checking: Selected Abstracts

Checking the Route to Cluster Helicates
EUROPEAN JOURNAL OF INORGANIC CHEMISTRY, Issue 24 2008
Manuel R. Bermejo
Abstract: The aim of the work described here was to test the general applicability of our recently reported route to cluster helicates and to carry out a systematic study to relate the structural and coordinative properties of the organic strands with the microarchitectures of the resulting cluster helicates. Nine new ZnII, CuI and AgI complexes were prepared from three Schiff base ligands [H2La: bis(4-methyl-3-thiosemicarbazone)-2,6-diacetylpyridine; H2Lb: bis(4-methyl-3-thiosemicarbazone)-2,6-diacetylbenzene; H2Lc: bis(4-ethyl-3-thiosemicarbazone)-2,6-diacetylbenzene]. The experimental data confirm that AgI and CuI tetranuclear cluster helicates were obtained with a [M4(Lx)2] stoichiometry, and this finding demonstrates the general applicability of the synthetic route. The cluster helicates presented in this work were characterized for the first time in solution by NMR spectroscopy. In addition, six of the nine complexes were characterized by X-ray diffraction studies, and three of them were found to be tetranuclear cluster helicates. A detailed study of these three crystal structures leads us to conclude that the changes introduced in the organic strands do not prevent the assembly of the tetranuclear cluster dihelicates, but they do affect the microarchitectures. (© Wiley-VCH Verlag GmbH & Co. KGaA, 69451 Weinheim, Germany, 2008) [source]

Sensitivity of comparative analyses to population variation in trait values: clutch size and cavity excavation tendencies
JOURNAL OF AVIAN BIOLOGY, Issue 4 2000
Mikko Mönkkönen
The importance of within-species (population) variation in trait values to correlations of traits among species has received very little attention in comparative analyses. We use randomization and bootstrapping techniques to provide a sensitivity analysis of the influence of population variation on correlations between clutch size and propensity to excavate. These traits are predicted to be negatively correlated under the limited breeding opportunities hypothesis, but opposing results have been found by two studies using different population estimates for western Palearctic Paridae. Our analyses support the limited breeding opportunities hypothesis and suggest low sensitivity to within-species variation in trait values. Yet, a small proportion of population data provide non-significant results. Checking for the effects of this variation on the postulated association between traits is necessary in comparative studies if one wishes to avoid type I and type II errors. [source]
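The bootstrap sensitivity analysis described above can be illustrated with a short sketch: for each species, one population-level trait estimate is drawn at random, the across-species correlation is recomputed, and the spread of the resulting correlations shows how sensitive the trait association is to within-species variation. The data layout, values and cut-off below are illustrative assumptions, not the study's actual procedure.

```java
import java.util.Random;

public class BootstrapSensitivity {
    // Each row is a species; each entry is one population's estimate of the trait.
    // All values are made up for illustration.
    static double[][] clutchSize = {{8.1, 7.6, 8.4}, {9.2, 9.8}, {6.5, 6.9, 7.1}, {10.2, 10.6}};
    static double[][] excavation = {{0.9, 0.8, 0.85}, {0.1, 0.15}, {0.7, 0.75, 0.6}, {0.05, 0.1}};

    public static void main(String[] args) {
        Random rng = new Random(42);
        int weak = 0, reps = 10_000;
        for (int r = 0; r < reps; r++) {
            double[] x = new double[clutchSize.length];
            double[] y = new double[clutchSize.length];
            for (int s = 0; s < clutchSize.length; s++) {
                // Draw one population estimate per species.
                x[s] = clutchSize[s][rng.nextInt(clutchSize[s].length)];
                y[s] = excavation[s][rng.nextInt(excavation[s].length)];
            }
            if (Math.abs(pearson(x, y)) < 0.8) weak++;  // crude illustrative cut-off
        }
        System.out.printf("fraction of resamples with weak correlation: %.3f%n",
                          weak / (double) reps);
    }

    static double pearson(double[] x, double[] y) {
        int n = x.length;
        double mx = 0, my = 0;
        for (int i = 0; i < n; i++) { mx += x[i] / n; my += y[i] / n; }
        double sxy = 0, sxx = 0, syy = 0;
        for (int i = 0; i < n; i++) {
            sxy += (x[i] - mx) * (y[i] - my);
            sxx += (x[i] - mx) * (x[i] - mx);
            syy += (y[i] - my) * (y[i] - my);
        }
        return sxy / Math.sqrt(sxx * syy);
    }
}
```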
Trigger arrhythmia to confirm the position of totally implantable access ports (TIAP)
JOURNAL OF SURGICAL ONCOLOGY, Issue 5 2007
Huan-Fa Hsieh MD
Abstract: Background: Placement of totally implantable access ports (TIAP) by the cutdown method has few complications, but it normally requires a fluoroscopic system for assessment. Methods: We present a method to confirm the position of the TIAP catheter without fluoroscopic assistance: we use the cutdown method and trigger arrhythmia while introducing the TIAP catheter. Results: This method was applied in 54 patients and no complications were found. Conclusions: Checking the position by triggering arrhythmia while performing TIAP placement with cephalic vein cutdown is simple and safe when a C-arm is not available. J. Surg. Oncol. 2007;96:436-437. © 2007 Wiley-Liss, Inc. [source]

Checking the checklist: a content analysis of expert- and evidence-based case-specific checklist items
MEDICAL EDUCATION, Issue 9 2010
Agatha M Hettinga
Medical Education 2010;44:874-883. Objectives: Research on objective structured clinical examinations (OSCEs) is extensive. However, relatively little has been written on the development of case-specific checklists on history taking and physical examination. Background information on the development of these checklists is a key element of the assessment of their content validity. Usually, expert panels are involved in the development of checklists. The objective of this study is to compare expert-based items on OSCE checklists with evidence-based items identified in the literature. Methods: Evidence-based items covering both history taking and physical examination for specific clinical problems and diseases were identified in the literature. Items on nine expert-based checklists for OSCE examination stations were evaluated by comparing them with items identified in the literature. The data were grouped into three categories: (i) expert-based items; (ii) evidence-based items; and (iii) evidence-based items with a specific measure of their relevance. Results: Out of 227 expert-based items, 58 (26%) were not found in the literature. Of 388 evidence-based items found in the literature, 219 (56%) were not included in the expert-based checklists. Of these 219 items, 82 (37%) had a specific measure of importance, such as an odds ratio for a diagnosis, making that diagnosis more or less probable. Conclusions: Expert-based, case-specific checklist items developed for OSCE stations do not coincide with evidence-based items identified in the literature. Further research is needed to ascertain what this inconsistency means for test validity. [source]

Latest news and product developments
PRESCRIBER, Issue 6 2008
Article first published online: 24 APR 2008

Government responds to NICE report
The Government has published its response to the Health Select Committee's report into NICE, broadly arguing that the Committee's recommendations are either already being dealt with or are not appropriate. The Committee recommended appraisals for all new drugs, shorter, rapid appraisals to coincide with their launch, and improved mechanisms for setting drug prices. The Government says its negotiations on the PPRS preclude a detailed response but suggests a rapid system may not be transparent or legally robust. It is exploring how high-cost drugs can be brought within the payment-by-results tariff. While defending NICE's reliance on QALYs, the Government accepts the need to explore how wider economic factors can be considered. As for the threshold cost per QALY by which NICE defines cost effectiveness, it says this is being validated scientifically and NICE will continue to determine the threshold. More topically, the Committee criticised the quality of clinical trial data available to NICE. The Government sees no need to compel pharmaceutical companies to disclose information and says NICE is already becoming more involved with research programmes. All clinical trials must be registered (confidentially) with the EU, and the Government believes mandatory registration in the UK would be ineffective and illegal.

Prescription charge up again from April
The Government has raised the prescription charge by 25p to £7.10 per item with effect from 1 April. Prescription prepayment certificates will cost £27.85 for three months and £102.50 for 12 months. The increase, below the annual rate of inflation for the 10th successive year, will be levied on the 12 per cent of prescriptions that are liable for the charge: 5 per cent via prepayment certificates and 7 per cent from other prescriptions. The charge will generate £435 million in England in 2008/09; this excludes money from prescriptions written by dispensing doctors, which is retained by the PCT. Following criticism of the charge by the Health Select Committee, the Government says it has reviewed the charge and is now consulting on 'cost-neutral' options.

MHRA safety update
The MHRA warns of possible dose errors associated with the Boots Medisure Domiciliary Dosage System in its latest issue of Drug Safety Update (2008;1:issue 8). One case has been reported in which incomplete sealing allowed tablets to mix between compartments. No other cases are known and the MHRA says no harm was reported, but the risk is serious. The system should be carefully sealed and inspected visually and physically. The MHRA reaffirms its plans to reclassify all pseudoephedrine and ephedrine products to prescription-only status in 2009 if the new restrictions on sales do not reduce misuse. Other topics in this month's Update include revised indications for oral ketoconazole (Nizoral), restricting its use to selected conditions unresponsive to topical therapy; reformulation of the injectable antibiotic Tazocin (piperacillin plus tazobactam); the risk of peripheral neuropathy associated with pegylated interferon and telbivudine (Sebivo) in the treatment of hepatitis B; and serious adverse events associated with modafinil (Provigil).

First oral anticoagulant since warfarin
In January this year the EMEA issued a positive opinion to recommend marketing authorisation of the oral, fixed-dose, direct thrombin inhibitor dabigatran etexilate (Pradaxa) for the primary prevention of venous thromboembolism (VTE) in adult patients who have undergone elective knee or hip replacement surgery. Marketing authorisation for the EU (including the UK) is expected from the European Commission in the next few weeks, making dabigatran the first oral anticoagulant since warfarin was introduced in 1954. Dabigatran etexilate has been shown to be as safe and effective as enoxaparin (Clexane), with a similar adverse event profile, in the noninferiority phase III RE-NOVATE (Lancet 2007;370:949-56) and RE-MODEL (J Thromb Haemost 2007;5:2178-85) trials, which investigated the efficacy and safety of dabigatran compared with enoxaparin in reducing the risk of VTE after total hip and knee surgery, respectively. Dabigatran has the practical advantage over low-molecular-weight heparin of oral postoperative administration and no risk of heparin-induced thrombocytopenia and, unlike warfarin, does not require monitoring or dose titration.

Risk scale predicts anticholinergic effects
US investigators have developed a scale for predicting the risk of anticholinergic side-effects from older patients' medicines (Arch Intern Med 2008;168:508-13). The scale assigns a score from 1 (low) to 3 (high) for the risk of anticholinergic effects such as dry mouth, constipation and dizziness associated with commonly prescribed medicines. Checking the scale retrospectively in older patients in residential care, a higher score was associated with a 30 per cent increased risk of side-effects after adjustment for age and number of medicines. When this was repeated prospectively in a primary-care cohort, the increased risk was 90 per cent.
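As a rough illustration of how such an additive risk scale works, the sketch below sums per-medicine scores of 1 to 3 over a patient's medication list. The drug names and score values are hypothetical placeholders, not the published scale.

```java
import java.util.List;
import java.util.Map;

public class AnticholinergicRiskScore {
    // Hypothetical scores (1 = low, 3 = high anticholinergic risk); the real
    // scale's values appear in Arch Intern Med 2008;168:508-13.
    static final Map<String, Integer> SCORE = Map.of(
            "drugA", 1,
            "drugB", 2,
            "drugC", 3);

    // Total burden for one patient: sum of the scores of the medicines they
    // take; medicines absent from the scale contribute 0.
    static int totalScore(List<String> medications) {
        return medications.stream()
                          .mapToInt(m -> SCORE.getOrDefault(m, 0))
                          .sum();
    }

    public static void main(String[] args) {
        List<String> patientMeds = List.of("drugA", "drugC");
        System.out.println("Anticholinergic burden score: " + totalScore(patientMeds));
    }
}
```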
HRT cancer risk persists
The latest analysis of the Women's Health Initiative (WHI) trial of HRT shows that the small increase in the risk of cancer persists for up to three years after stopping treatment (J Am Med Assoc 2008;299:1036-45). WHI was stopped after 5.6 years' follow-up when it became clear the risks of HRT outweighed its benefits. This follow-up after a further three years (mean 2.4) involved 15 730 women. The annual risk of cardiovascular events was similar for HRT (1.97 per cent) and placebo (1.91 per cent). Cancers were more common among women who had taken HRT (1.56 vs 1.26 per cent), in particular breast cancer (0.42 vs 0.33 per cent). All-cause mortality was higher, but not statistically significantly so, with HRT (1.20 vs 1.06 per cent).

Tight glycaemic control may increase falls
Maintaining HbA1c at or below 6 per cent with insulin is associated with an increased risk of falls, a US study suggests (Diabetes Care 2008;31:391-6). The Health, Aging and Composition study involved 446 older people with type 2 diabetes (mean age 74) followed up for approximately five years. The incidence of falls ranged from 22 to 30 per cent annually. Comparing subgroups with HbA1c of ≤6 per cent and >8 per cent, an increased risk of falls was associated with insulin use (odds ratio 4.4) but not with oral hypoglycaemic drugs. Copyright © 2008 Wiley Interface Ltd [source]

Latest news and product developments
PRESCRIBER, Issue 5 2008
Article first published online: 3 APR 2008

Newer antidepressants no better than placebo?
A new meta-analysis suggests that newer antidepressants are not superior to placebo in most patients with depression; the exception is patients with very severe depression, who can expect a small benefit. Writing in the online-only open access journal PLoS Medicine (5:e45; doi:10.1371/journal.pmed.0050045), researchers from Hull and the US analysed published and unpublished trials submitted to the Food and Drug Administration in marketing applications for fluoxetine, paroxetine, venlafaxine (Efexor) and nefazodone (no longer available). Using the Hamilton Rating Scale for Depression (HRSD) score as an endpoint, meta-analysis of 35 trials involving 5133 patients and lasting six to eight weeks showed that mean HRSD score improved by 9.6 points with drug treatment and 7.8 with placebo. The authors say the difference of 1.8 was statistically significant but below the criterion for clinical significance (3.0) set by NICE in its clinical guideline on depression. A review of the study by the NHS Knowledge Service (www.nhs.uk) points out that it omits trials published after the drugs were licensed (1999) and those not sponsored by the pharmaceutical industry. It did not include any patients with severe depression and only one trial in patients with moderate depression. An earlier US study of data submitted to the FDA (N Engl J Med 2008;358:252-60) showed that published trials of antidepressants were more likely to be positive (37/38) than unpublished ones (3/25). Further, FDA analysts concluded that 51 per cent of trials (published and unpublished) demonstrated positive findings, compared with 94 per cent of those that were published.

Audit reveals variations in hospital psoriasis care
There are unacceptably large variations in the quality of care for patients with psoriasis in UK hospitals, a report by the British Association of Dermatologists and the Royal College of Physicians reveals. The audit of 100 hospital units found that 39 per cent restricted access to biological therapies because of cost, and over one-third of pharmacies could not supply 'specials' such as topical coal tar preparations. More positively, the units are adequately resourced to provide timely communication with GPs.

RCGP responds to Public Accounts Committee
The Royal College of General Practitioners has agreed with the Commons Public Accounts Committee that drug package labelling should include the cost of the medication. The suggestion was made by the Committee in its report Prescribing Costs in Primary Care. While recognising the importance of generic prescribing, the RCGP cautions against frequent medication switches because they may unsettle patients. 'Any changes must be carried out for sound clinical reasons with good communication between GPs and their patients,' it adds.

Statins for patients with kidney disease?
Statins reduce cardiovascular risk in people with chronic kidney disease, a new study suggests, but their effects on renal function remain unclear (BMJ 2008; published online, doi:10.1136/bmj.39472.580984.AE). The meta-analysis of 50 trials involving a total of 30 144 patients found that statins reduced lipids and cardiovascular events regardless of the severity of kidney disease. However, all-cause mortality was unaffected and, although proteinuria improved slightly, there was no change in the rate of decline of glomerular filtration rate. An accompanying editorial (BMJ 2008; published online, doi:10.1136/bmj.39483.665139.80) suggests that the indications for statin therapy to reduce cardiovascular risk in patients with chronic kidney disease should be the same as for those with normal renal function.

New NICE guidance
New clinical guidelines from NICE (see New from NICE, pages 14-15) include the diagnosis and management of irritable bowel syndrome in adults in primary care, the care and management of osteoarthritis in adults, and the diagnosis and treatment of prostate cancer. In a public health guideline on smoking cessation services, NICE endorses the use of nicotine replacement patches for 12-17 year olds.

Suspect additives in children's medicines
The Food Commission (www.foodcomm.org.uk) has drawn attention to the presence in children's medicines of food additives it says are linked with hyperactivity. The Commission, a national nonprofit organisation campaigning for 'the right to safe, wholesome food', says that seven common additives (including tartrazine, sodium benzoate and Ponceau 4R) are associated with hyperactivity in susceptible children. Checking the SPCs, it found that 28 of 70 children's medicines, including formulations of paracetamol, ibuprofen, amoxicillin, erythromycin and codeine phosphate throat linctus, contain at least one suspect additive.

Digoxin may increase mortality in AF patients
An observational study has suggested that digoxin may increase deaths in patients with atrial fibrillation (Heart 2008;94:191-6). The study was a planned subgroup analysis of a trial evaluating anticoagulant therapy in 7329 patients with atrial fibrillation. Of these, 53 per cent were treated with digoxin. Mortality was significantly higher among digoxin users than nonusers (4.22 vs 2.66 per cent per year); myocardial infarction and other vascular deaths (but not stroke, systemic embolic episodes and major bleeding events) were significantly more frequent with digoxin.
Poor communications cause readmission
Elderly hospital patients are often discharged with inadequate information or arrangements for care, causing almost three-quarters to be readmitted within a week, say investigators from Nottingham (Qual Saf Health Care 2008;17:71-5). A retrospective review of records for 108 consecutive patients aged over 75 found that readmission was related to medication in 38 per cent of cases and, of these, 61 per cent were considered avoidable. Almost two-thirds had no discharge letter or were readmitted before the letter was typed; two-thirds of discharge letters had incomplete documentation of medication changes. Copyright © 2008 Wiley Interface Ltd [source]

The implementation and assessment of a comprehensive communication skills training curriculum for oncologists
PSYCHO-ONCOLOGY, Issue 6 2010
Carma L. Bylund
Abstract: Objective: The objective of this paper is to report the implementation and assessment of the Comskil Training Curriculum at Memorial Sloan-Kettering Cancer Center. Method: Twenty-eight attending physicians and surgeons participated in communication skills training modules as part of a train-the-trainer program. Doctors were video recorded in clinical consultations with patients two times before training and two times after training, resulting in 112 video recordings for analysis. Recordings were coded using the Comskil Coding System. Results: Communication skills related to two of the six major skill sets, Establishing the Consultation Framework and Checking, increased following training. Limited changes emerged in three skill sets, while one skill set, Shared Decision Making, did not change. Doctors who attended more training modules had higher levels of change. Female participants demonstrated three skills more frequently than males post-training. Conclusions: The intervention produced significant communication skills uptake in a group of experienced attending clinicians, mediated by the amount of training. Future research should focus on the dose of training necessary to achieve skills uptake and the effect of skills training on patient outcomes. Copyright © 2009 John Wiley & Sons, Ltd. [source]

Checking the Map: Critiquing Joanne Martin's Metatheory of Organizational Culture and Its Uses in Communication Research
COMMUNICATION THEORY, Issue 3 2006
Bryan C. Taylor
Joanne Martin's scholarship has significantly influenced the study of organizational culture by communication scholars. Martin's recent metatheory seeks to "map" the "terrain" of perspectives commonly used to study organizational culture and argues for the use of multiple perspectives to produce more fruitful research. While acknowledging the benefits of this metatheory, we critique two of its problematic elements. Both arise from Martin's claims about the phenomena of organizational culture and the various perspectives through which they might be known. The first problem involves Martin's decoupling of ontology and epistemology, as well as her subsequent oscillation between two conflicting clusters of "onto-epistemological" claims. Partly as a result, Martin also overemphasizes the ideational dimensions of organizational culture, thereby inhibiting analysis of its production in and through communication. These problems may negatively affect how communication scholars conceptualize organizational cultural phenomena and analyze data. To mitigate these problems, we offer two readings derived from social constructionism, poststructuralism, and critical realism. These readings aid communication scholars in successfully using Martin's metatheory. We conclude by considering the implications of this critique for the development of metatheory in communication. [source]

Gender differences in obsessive-compulsive symptom dimensions
DEPRESSION AND ANXIETY, Issue 10 2008
Javier Labad M.D.
Abstract: The aim of our study was to assess the role of gender in OCD symptom dimensions with a multivariate analysis while controlling for history of tic disorders and age at onset of OCD. One hundred and eighty-six consecutive outpatients with a DSM-IV diagnosis of OCD were interviewed. The Yale-Brown Obsessive-Compulsive Scale (Y-BOCS), Y-BOCS Symptom Checklist, and Hamilton Depression and Anxiety Scales were administered to all patients. Lifetime history of tic disorders was assessed with the tic inventory section of the Yale Global Tic Severity Scale. Age at onset of OCD was assessed by direct interview. Statistical analysis was carried out through logistic regression to calculate adjusted female:male odds ratios (OR) for each dimension. A relationship was found between gender and two main OCD dimensions: contamination/cleaning (higher in females; female:male OR = 2.02, P = 0.03) and sexual/religious (lower in females; female:male OR = 0.41, P = 0.03). We did not find gender differences in the aggressive/checking, symmetry/ordering, or hoarding dimensions. We also found a greater history of tic disorders in those patients with symptoms from the symmetry/ordering dimension (P < 0.01). Both the symmetry/ordering and sexual/religious dimensions were associated with an earlier age at onset of OCD (P < 0.05). Gender is a variable that plays a role in the expression of OCD, particularly in the contamination/cleaning and sexual/religious dimensions. Our results underscore the need to examine the relationship between OCD dimensions and clinical variables such as gender, tics, age at onset and severity of the disorder to improve the identification of OCD subtypes. Depression and Anxiety 2007. © 2007 Wiley-Liss, Inc. [source]

A comparative study of Java and C performance in two large-scale parallel applications
CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 15 2009
Aamir Shafi
Abstract: In the 1990s the Message Passing Interface Forum defined MPI bindings for Fortran, C, and C++. With the success of MPI, these relatively conservative languages have continued to dominate in the parallel computing community. There are compelling arguments in favour of more modern languages like Java, including portability, better runtime error checking, modularity, and multi-threading. But these arguments have not converted many HPC programmers, perhaps due to the scarcity of full-scale scientific Java codes and the lack of evidence for performance competitive with C or Fortran. This paper tries to redress this situation by porting two scientific applications to Java. Both of these applications are parallelized using our thread-safe Java messaging system, MPJ Express. The first application is the Gadget-2 code, a massively parallel structure formation code for cosmological simulations. The second application uses the finite-difference time-domain method for simulations in the area of computational electromagnetics. We evaluate and compare the performance of the Java and C versions of these two scientific applications, and demonstrate that the Java codes can achieve performance comparable with legacy applications written in conventional HPC languages. Copyright © 2009 John Wiley & Sons, Ltd. [source]
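MPJ Express programs follow the mpiJava-style API. The minimal point-to-point sketch below is a plausible example of the messaging style the paper benchmarks, not code from the paper itself; the exact class and method names assume the mpiJava 1.2 binding that MPJ Express implements.

```java
import mpi.MPI;

// Minimal MPJ Express-style point-to-point exchange: rank 0 sends an array
// of doubles to rank 1. Launch with at least two processes, e.g. (assuming
// the standard MPJ Express launcher): mpjrun.sh -np 2 PingMessage
public class PingMessage {
    public static void main(String[] args) throws Exception {
        MPI.Init(args);
        int rank = MPI.COMM_WORLD.Rank();

        double[] buf = new double[4];
        if (rank == 0) {
            buf = new double[] {1.0, 2.0, 3.0, 4.0};
            // Send(buffer, offset, count, datatype, destination, tag)
            MPI.COMM_WORLD.Send(buf, 0, buf.length, MPI.DOUBLE, 1, 99);
        } else if (rank == 1) {
            MPI.COMM_WORLD.Recv(buf, 0, buf.length, MPI.DOUBLE, 0, 99);
            System.out.println("rank 1 received " + java.util.Arrays.toString(buf));
        }
        MPI.Finalize();
    }
}
```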
The effect of uncontrolled concurrency on model checking
CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 12 2008
Donna M. Carter
Abstract: Correctness of concurrent software is usually checked by techniques such as peer code reviews or code walkthroughs and testing. These techniques, however, are subject to human error, and thus do not achieve an in-depth verification of correctness. Model-checking techniques, which can systematically identify and verify every state that a system can enter, are a powerful alternative method for verifying concurrent systems. However, the usefulness of model checking is limited because the number of states for concurrent models grows exponentially with the number of processes in the system. This is often referred to as the 'state explosion problem'. Some processes are a central part of the software operation and must be included in the model. However, we have found that some exponential complexity results from uncontrolled concurrency introduced by the programmer rather than from the intrinsic characteristics of the software being modeled. We have performed tests on multimedia synchronization to show the effect of abstraction as well as uncontrolled concurrency using the Promela/SPIN model checker. We begin with a sequential model not expected to have exponential complexity but that results in exponential complexity. In this paper, we provide alternative designs and explain how uncontrolled concurrency can be removed from the code. Copyright © 2007 John Wiley & Sons, Ltd. [source]
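The state explosion the abstract describes is easy to reproduce with a toy explicit-state search, a plain-Java stand-in for what SPIN does over Promela models: with N freely interleaved processes that each step through K local states, the reachable global state space has K^N states. The model below is purely illustrative, not taken from the paper.

```java
import java.util.ArrayDeque;
import java.util.Arrays;
import java.util.Deque;
import java.util.HashSet;
import java.util.Set;

// Toy explicit-state reachability: N processes, each with K local states,
// interleaved freely. The count of reachable global states is K^N.
public class StateExplosion {
    public static void main(String[] args) {
        final int K = 3; // local states per process
        for (int n = 1; n <= 8; n++) {
            System.out.printf("processes=%d reachable states=%d%n", n, explore(n, K));
        }
    }

    static long explore(int n, int k) {
        Set<String> seen = new HashSet<>();
        Deque<int[]> frontier = new ArrayDeque<>();
        int[] init = new int[n];              // all processes start in local state 0
        seen.add(Arrays.toString(init));
        frontier.add(init);
        while (!frontier.isEmpty()) {
            int[] s = frontier.poll();
            for (int p = 0; p < n; p++) {     // nondeterministic scheduler:
                if (s[p] + 1 < k) {           // any process may take its next step
                    int[] t = s.clone();
                    t[p]++;
                    if (seen.add(Arrays.toString(t))) frontier.add(t);
                }
            }
        }
        return seen.size();
    }
}
```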
A test suite for parallel performance analysis tools
CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 11 2007
Michael Gerndt
Abstract: Parallel performance analysis tools must be tested as to whether they perform their task correctly, which comprises at least three aspects. First, it must be ensured that the tools neither alter the semantics nor distort the run-time behavior of the application under investigation. Next, it must be verified that the tools collect the correct performance data as required by their specification. Finally, it must be checked that the tools perform their intended tasks and detect relevant performance problems. Focusing on the latter (correctness) aspect, testing can be done using synthetic test functions with controllable performance properties, possibly complemented by real-world applications with known performance behavior. A systematic test suite can be built from synthetic test functions and other components, possibly with the help of tools to assist the user in putting the pieces together into executable test programs. Clearly, such a test suite can be highly useful to builders of performance analysis tools. It is surprising that, up until now, no systematic effort has been undertaken to provide such a suite. In this paper we describe the APART Test Suite (ATS) for checking the correctness (in the above sense) of parallel performance analysis tools. In particular, we describe a collection of synthetic test functions which allows one to easily construct both simple and more complex test programs with desired performance properties. We briefly report on experience with MPI and OpenMP performance tools when applied to the test cases generated by ATS. Copyright © 2006 John Wiley & Sons, Ltd. [source]

Simulating multiple inheritance in Java
CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 12 2002
Douglas Lyon
Abstract: The CentiJ system automatically generates code that simulates multiple inheritance in Java. The generated code inputs a series of instances and outputs specifications that can be combined using multiple inheritance. The multiple inheritance of implementation is obtained by simple message forwarding. The reflection API of Java is used to reverse engineer the instances, so the program can generate source code but does not require source code as input. Advantages of CentiJ include compile-time type checking, speed of execution, automatic disambiguation (name space collision resolution) and ease of maintenance. Simulation of multiple inheritance was previously available only to Java programmers who performed manual delegation or who made use of dynamic proxies. The technique has been applied at a major aerospace corporation. Copyright © 2002 John Wiley & Sons, Ltd. [source]
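The message-forwarding pattern that CentiJ automates can be written by hand in a few lines: a class implements several interfaces and forwards each call to a private delegate instance, so the compiler still type-checks every call. The interface and class names below are invented for illustration; CentiJ's actual generated code will differ.

```java
// Two independent implementations whose behavior we want to "inherit".
interface Swimmer { void swim(); }
interface Flyer   { void fly();  }

class SwimmerImpl implements Swimmer {
    public void swim() { System.out.println("swimming"); }
}

class FlyerImpl implements Flyer {
    public void fly() { System.out.println("flying"); }
}

// Simulated multiple inheritance: Duck "inherits" both implementations by
// holding delegate instances and forwarding messages to them. Name clashes
// between interfaces would be disambiguated here, in the forwarding methods.
public class Duck implements Swimmer, Flyer {
    private final Swimmer swimmer = new SwimmerImpl();
    private final Flyer   flyer   = new FlyerImpl();

    public void swim() { swimmer.swim(); } // forwarded, still statically typed
    public void fly()  { flyer.fly();   }

    public static void main(String[] args) {
        Duck d = new Duck();
        d.swim();
        d.fly();
    }
}
```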
Longitudinal assessment of symptom and subtype categories in obsessive-compulsive disorder
DEPRESSION AND ANXIETY, Issue 7 2007
Lutfullah Besiroglu M.D.
Abstract: Although it has been postulated that symptom subtypes are potential predictors of treatment response, few data exist on the longitudinal course of symptom and subtype categories in obsessive-compulsive disorder (OCD). Putative subtypes of OCD have gradually gained more recognition, but as yet there is no generally accepted subtype discrimination. Subtypes, it has been suggested, could perhaps be discriminated based on autogenous versus reactive obsessions stemming from different cognitive processes. In this study, our aim was to assess whether symptom and subtype categories change over time. Using the Yale-Brown Obsessive Compulsive Symptom Checklist (Y-BOCS-SC), we assessed 109 patients who met DSM-IV criteria for OCD to establish baseline values, then reassessed 91 (83%) of the initial group after 36 ± 8.2 months. Upon reassessment, we found significant changes from baseline within aggressive, contamination, religious, symmetry and miscellaneous obsessions, and within checking, washing, repeating, counting and ordering compulsion categories. Sexual, hoarding, and somatic obsessions, and hoarding and miscellaneous compulsions, did not change significantly. In accordance with the relevant literature, we also assigned patients to one of three subtypes: autogenous, reactive, or mixed. Though some changes in subtype categories were found, no subtype shifts (e.g., autogenous to reactive or reactive to autogenous) were observed during the course of the study. Significantly more patients in the autogenous group did not meet OCD criteria at follow-up than did patients in the other groups. Our results suggest that the discrimination between these two types of obsession might be highly valid, because autogenous and reactive obsessions are quite different, both in the development and maintenance of their cognitive mechanisms and in their outcome. Depression and Anxiety 24:461-466, 2007. © 2006 Wiley-Liss, Inc. [source]

ROLE OF ENDOSCOPY IN SCREENING OF EARLY PANCREATIC CANCER AND BILE DUCT CANCER
DIGESTIVE ENDOSCOPY, Issue 2009
Kiyohito Tanaka
In the screening of early pancreatic cancer and bile duct cancer, the first issue is what types of abnormality in laboratory data and symptoms occur in cases of early disease. Early cancer in the pancreaticobiliary region has almost no symptoms; however, epigastralgia without abnormality in the gastrointestinal (GI) tract is a sign of early-stage pancreaticobiliary cancer. Sudden onset and aggravation of diabetes mellitus is an important change in the case of pancreatic cancer. Extracorporeal ultrasonography is a very useful procedure for checking for changes in pancreatic and biliary lesions. As for the role of endoscopy in screening, endoscopic ultrasonography (EUS) is the most effective means of cancer detection in the pancreas, and endoscopic retrograde cholangiopancreatography (ERCP) is the most useful diagnostic tool for abnormalities of the common bile duct. ERCP is also an important modality for sampling diagnostic materials. EUS-guided fine needle aspiration (EUS-FNA) also has a role in the histological diagnosis of pancreatic mass lesions; in particular, in cases of pancreatic cancer without evidence of cancer on pancreatic juice cytology and brushing cytology, EUS-FNA is essential. Intraductal ultrasonography (IDUS) and peroral cholangioscopy (POCS) are useful for determining mucosal extent in extrahepatic bile duct cancer. With further improvements in endoscopic technology, endoscopic procedures are expected to become even more useful in the detection and diagnosis of early pancreatic and bile duct cancers. [source]

The Race Towards Transparency: An Experimental Investigation
ECONOMIC NOTES, Issue 3 2002
Marco Rossi
To understand the current tendency toward transparency, we studied the effects of accounting disclosure in a laboratory. In our experiment, transparency in the financial accounts of the listed companies improved information efficiency; but, even after checking for fundamentals, the transparency increased the volatility of market prices. Moreover, transparency improved investors' utility, so that their preference for more certain assets emerged. Therefore, we argue that the current race toward transparency may be better explained by firms' and markets' intention to attract household investments rather than to improve market efficiency. (J.E.L.: G92, D44, D81, G12, G28) [source]

Hourly surface wind monitor consistency checking over an extended observation period
ENVIRONMETRICS, Issue 4 2009
Scott Beaver
Abstract: A consistency checking methodology is presented to aid in identifying biased values in extended historical records of hourly surface wind measurements obtained from a single station. The method is intended for screening extended observation periods for values which do not fail physical consistency checks (i.e., standard or complex quality assurance methods), yet nonetheless exhibit statistical properties which differ from the bulk of the record. Several specific types of inconsistencies common in surface wind monitoring datasets are considered: annual biases, unexpected values, and discontinuities. The purely empirical method checks for self-consistency in the temporal distribution of the wind measurements by explicitly modeling the diurnal variability. Each year of data is modeled using principal component analysis (PCA) (or empirical orthogonal functions, EOF), then hierarchical clustering with nearest neighbor linkage is used to visualize any annual biases existing in the measurements. The diurnal distributions for wind speed and direction are additionally estimated and visualized to determine any periods of time which are inconsistent with the typical diurnal cycle for a given monitor. The robust consistency checking method is applied to a set of 44 monitors operating in the San Joaquin Valley (SJV) of Central California over a 9-year period. Monitors from the SLAMS, CIMIS, and RAWS networks are considered. Similar inconsistencies are detected in all three networks; however, network-specific types of inconsistencies are found as well. Copyright © 2008 John Wiley & Sons, Ltd. [source]
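A drastically simplified version of the idea, assuming hourly speeds already averaged by year and hour of day, is sketched below: build each year's mean 24-hour profile and flag years whose profile sits far from the all-year average. The real method models diurnal variability with PCA/EOF and hierarchical clustering; this sketch replaces that machinery with a plain distance check purely for illustration.

```java
public class DiurnalConsistencyCheck {
    /**
     * speeds[y][h] = mean wind speed for year y at hour-of-day h (0-23),
     * precomputed from the hourly record. Flags years whose diurnal profile
     * deviates from the all-year mean profile by more than `threshold` RMS.
     */
    static void flagInconsistentYears(double[][] speeds, double threshold) {
        int years = speeds.length, hours = 24;
        double[] meanProfile = new double[hours];
        for (double[] year : speeds)
            for (int h = 0; h < hours; h++) meanProfile[h] += year[h] / years;

        for (int y = 0; y < years; y++) {
            double sq = 0;
            for (int h = 0; h < hours; h++) {
                double d = speeds[y][h] - meanProfile[h];
                sq += d * d;
            }
            double rms = Math.sqrt(sq / hours);
            if (rms > threshold)
                System.out.printf("year index %d: RMS deviation %.2f -> possible bias%n", y, rms);
        }
    }
}
```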
Volatile compounds of original African black and white shea butter from Tchad and Cameroon
EUROPEAN JOURNAL OF LIPID SCIENCE AND TECHNOLOGY, Issue 7 2006
Sabine Krist
Abstract: Shea butter is used as an edible vegetable fat in many African countries. It can be utilized as a substitute or complete replacement for cocoa butter in various applications and plays an important role in traditional African medicinal practice. Although detection of volatile compounds by solid-phase micro-extraction gas chromatography-mass spectrometry (SPME-GC-MS) is a very reliable and reproducible technique, which can be used as an important part of authenticity checking, production monitoring and contamination detection, no published data about volatile compounds of shea butter have been available so far. In this investigation, the characteristic volatiles in the headspace of original African shea butter samples were identified by using SPME-capillary-GC coupled to a mass selective detector. Almost 100 different volatile components were identified, e.g. fatty acids, saturated and unsaturated aldehydes and ketones, terpenes, and typical Maillard reaction products such as methylfuranes and pyrazines. Furthermore, the samples were olfactorily evaluated by a panel of professional flavorists and trained analytical chemists. It can be stated that variations in the processing conditions of shea butter result in considerable differences in the composition of headspace volatiles, as detected by SPME-GC-MS and human olfaction. [source]

Towards a platform for the metabonomic profiling of different strains of Drosophila melanogaster using liquid chromatography-Fourier transform mass spectrometry
FEBS JOURNAL, Issue 22 2009
Muhammad A. Kamleh
A platform based on hydrophilic interaction chromatography in combination with Fourier transform mass spectrometry was developed in order to carry out metabonomics of Drosophila melanogaster strains. The method was able to detect ~230 metabolites, mainly in the positive ion mode, after checking to eliminate false positives caused by isotope peaks, adducts and fragment ions. Two wild-type strains, Canton S and Oregon R, were studied, plus two mutant strains, Maroon Like and Chocolate. In order to observe the differential expression of metabolites, liquid chromatography-mass spectrometry analyses of the different strains were compared using SIEVE 1.2 software to extract metabolic differences. The output from SIEVE was searched against a metabolite database using an Excel-based macro written in-house. Metabolic differences were observed between the wild-type strains, and also between both Chocolate and Maroon Like compared with Oregon R. It was established that a metabonomic approach could produce results leading to the generation of new hypotheses. In addition, the structure of a new class of lipid with a histidine head group, found in all of the strains of flies but lower in Maroon Like, was elucidated. [source]

Multivariate GARCH Modeling of Exchange Rate Volatility Transmission in the European Monetary System
FINANCIAL REVIEW, Issue 1 2000
Colm Kearney
C32/F31/G15. Abstract: We construct a series of 3-, 4- and 5-variable multivariate GARCH models of exchange rate volatility transmission across the important European Monetary System (EMS) currencies, including the French franc, the German mark, the Italian lira, and the European Currency Unit. The models are estimated without imposing the common restriction of constant correlation, on both daily and weekly data from April 1979 to March 1997. Our results indicate the importance of checking for specification robustness in multivariate generalized autoregressive conditional heteroskedasticity (GARCH) modeling. We find that increased temporal aggregation reduces observed volatility transmission, and that the mark plays a dominant position in terms of volatility transmission. [source]
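For readers unfamiliar with the GARCH family, the sketch below shows the univariate GARCH(1,1) conditional-variance recursion, sigma2[t] = omega + alpha * r[t-1]^2 + beta * sigma2[t-1], which the paper's multivariate models generalize. The parameter values are arbitrary illustrations, and the paper's multivariate specification (estimated without the constant-correlation restriction) is far richer.

```java
public class Garch11 {
    /**
     * Filters a return series through a GARCH(1,1) recursion:
     *   sigma2[t] = omega + alpha * r[t-1]^2 + beta * sigma2[t-1]
     * Returns the conditional variance path. Parameters are illustrative,
     * not estimates from the paper's EMS data.
     */
    static double[] conditionalVariance(double[] returns, double omega, double alpha, double beta) {
        double[] sigma2 = new double[returns.length];
        // Initialize with the unconditional variance omega / (1 - alpha - beta).
        sigma2[0] = omega / (1.0 - alpha - beta);
        for (int t = 1; t < returns.length; t++) {
            sigma2[t] = omega + alpha * returns[t - 1] * returns[t - 1] + beta * sigma2[t - 1];
        }
        return sigma2;
    }

    public static void main(String[] args) {
        double[] r = {0.2, -0.5, 1.1, -0.3, 0.05, 0.7};   // daily returns, made up
        double[] v = conditionalVariance(r, 0.05, 0.08, 0.90);
        for (int t = 0; t < v.length; t++)
            System.out.printf("t=%d sigma^2=%.4f%n", t, v[t]);
    }
}
```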
Raw materials: the importance of quality and safety. A review
FLAVOUR AND FRAGRANCE JOURNAL, Issue 5 2010
Abstract: Aromatic plants and spices are used throughout the world for flavouring food and beverages, as well as for food supplements, novel foods and as a source of essential oils and aromatic extracts. The non-availability or inadequacy of standards for checking and assuring the quality of aromatic plants and spices is one of the main problems that arise for industry when using such raw materials. As many aromatic plants are harvested from the wild, standardization to assure their quality is important for their safe and effective utilization in the food and beverage industries. On the other hand, there are numerous parameters that influence the chemical composition of plants, which play an important role in the final quality of the product and possibly in any risk arising to the consumer. Also, from a safety point of view, aromatic plants and spices should be free of undeclared contaminants and adulterants, such as toxic botanicals, pathogenic microorganisms and excessive levels of microbial toxins, pesticides or fumigation agents. We focus on these aspects and examine ways to assure their appropriate utilization from the quality and safety standpoint. The regulatory situation of medicinal and aromatic plants (MAPs) is very complicated; several differences in standards and regulations between countries can be found, a situation that can result in more health risks arising for consumers. To clarify some of the existing problems, the major regulations of the USA and the European Union (EU), the borderlines between food supplements and medicines, and other international standards are briefly described and discussed. Copyright © 2010 John Wiley & Sons, Ltd. [source]

P- and S-velocity images of the lithosphere-asthenosphere system in the Central Andes from local-source tomographic inversion
GEOPHYSICAL JOURNAL INTERNATIONAL, Issue 1 2006
Ivan Koulakov
Summary: About 50 000 P and S arrival times and 25 000 values of t* recorded at seismic arrays operated in the Central Andes between 20°S and 25°S in the time period from 1994 to 1997 have been used for locating more than 1500 deep and crustal earthquakes and creating 3-D P, S velocity and Qp models. The study volume in the reference model is subdivided into three domains: slab, continental crust and mantle wedge. A starting velocity distribution in each domain is set from a priori information: in the crust it is based on controlled-source seismic studies; in the slab and mantle wedge it is defined using relations between P and S velocities, temperature and composition given by mineral physics. Each iteration of tomographic inversion consists of the following steps: (1) absolute location of sources in the 3-D velocity model using P and S arrival times; (2) double-difference relocation of the sources; and (3) simultaneous determination of P and S velocity anomalies, P and S station corrections and source parameters by inverting one matrix. Velocity parameters are computed in a mesh with the density of nodes proportional to the ray density, with double-sided nodes at the domain boundaries. The next iteration is repeated with the updated velocity model and source parameters obtained at the previous step. Different tests aimed at checking the reliability of the obtained velocity models are presented. In addition, we present the results of inversion for Vp and Vp/Vs parameters, which appear to be practically equivalent to Vp and Vs inversion. A separate inversion for Qp has been performed using the ray paths and source locations in the final velocity model. The resulting Vp, Vs and Qp distributions show a complicated, essentially 3-D structure in the lithosphere and asthenosphere. P and S velocities appear to be well correlated, suggesting an important role for variations of composition, temperature, water content and degree of partial melting. [source]

Three Secondary Reference Materials for Lithium Isotope Measurements: Li7-N, Li6-N and LiCl-N Solutions
GEOSTANDARDS & GEOANALYTICAL RESEARCH, Issue 1 2007
Jean Carignan
Keywords: reference materials; Li isotopes; Li solutions; QUAD-ICP-MS; MC-ICP-MS
The CRPG (Nancy, France) has prepared secondary reference materials for Li isotope measurements by mixing 7Li or 6Li spikes and either L-SVEC or IRMM-016 certified reference materials to produce solutions having a known Li concentration and isotopic composition. The Li7-N and Li6-N solution samples (1.5 mol l−1 HNO3) have nominal δ7Li isotopic compositions of 30.1‰ and −9.7‰, respectively, relative to L-SVEC, and concentrations of 100 mg l−1. Repeated measurement of these samples using the QUAD-ICP-MS at the CRPG yielded δ7Li of 30.4 ± 1.1‰ (n = 13) and −8.9 ± 0.9‰ (n = 9) at the 2s level of confidence. An additional LiCl-N solution was measured and yielded a delta value of 9.5 ± 0.6‰ (n = 3). Identical results were obtained at the BRGM (Orléans, France) from determinations performed with a Neptune MC-ICP-MS (30.2 ± 0.3‰, n = 89 for Li7-N; −8.0 ± 0.3‰, n = 38 for Li6-N; and 10.1 ± 0.2‰, n = 46 for LiCl-N, at the 2s level of confidence). The deviation of the measured composition from the nominal value for the Li6-N solution might be explained by either contamination during preparation or an error during sample weighing. These secondary reference materials, previously passed through ion exchange resin or directly analysed, may be used for checking the accuracy of Li isotopic measurements over a range of almost 40‰ and will be available to the scientific community upon request to J. Carignan or N. Vigier, CRPG. [source]
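Delta notation, in which the abstract reports its results, expresses a sample's isotope ratio relative to the L-SVEC standard: δ7Li = ((7Li/6Li)sample / (7Li/6Li)standard − 1) × 1000, in per mil. A small sketch of the conversion follows; the ratio values are placeholders, not measured or certified data.

```java
public class LithiumDelta {
    // 7Li/6Li ratio of the L-SVEC reference standard, used as the denominator.
    // The value below is a placeholder, not the certified number.
    static final double R_LSVEC = 12.0;

    // delta-7Li in per mil: (R_sample / R_standard - 1) * 1000
    static double delta7Li(double rSample) {
        return (rSample / R_LSVEC - 1.0) * 1000.0;
    }

    public static void main(String[] args) {
        double rSample = 12.36; // hypothetical measured 7Li/6Li ratio
        System.out.printf("delta7Li = %.1f per mil%n", delta7Li(rSample));
    }
}
```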
Application of Six Sigma Methods for Improving the Analytical Data Management Process in the Environmental Industry
GROUND WATER MONITORING & REMEDIATION, Issue 2 2006
Christopher M. French
Honeywell applied the rigorous and well-documented Six Sigma quality-improvement approach to the complex, highly heterogeneous, and mission-critical process of remedial site environmental data management to achieve a sea change in terms of data quality, environmental risk reduction, and overall process cost reduction. The primary focus was to apply both qualitative and quantitative Six Sigma methods to improve electronic management of analytical laboratory data generated for environmental remediation and long-term monitoring programs. The process includes electronic data delivery, data QA/QC checking, data verification, data validation, database administration, regulatory agency reporting, and linkage to spatial information and real-time geographical information systems. The analysis identified that automated, centralized web-based software tools delivered through a Software as a Service (SaaS) model are optimal for improving the process, resulting in cost reductions while simultaneously improving data quality and long-term data usability and preservation. A pilot project was completed that quantified cycle time and cost improvements of 50% and 65%, respectively. [source]

Sources of information on adverse effects: a systematic review
HEALTH INFORMATION & LIBRARIES JOURNAL, Issue 3 2010
Su Golder
Background: Systematic reviews can provide accurate and timely information on adverse effects. An essential part of the systematic review process is a thorough search of the literature, which often requires searching many different sources. However, it is unclear which sources are most effective at providing information on adverse effects. Objective: To identify and summarise studies that have evaluated sources of information on adverse effects. Methods: Studies were located by searching 10 databases as well as by reference checking, hand searching, citation searching and contacting experts. Results: A total of 6218 citations were retrieved, yielding 19 studies which met the inclusion criteria. The included studies tended to focus on the adverse effects of drug interventions and compared the relative value of different sources using the number of relevant references retrieved from searches of each source. However, few studies were conducted recently with a large sample of references. Conclusions: This review suggests that EMBASE, Derwent Drug File, MEDLINE and industry submissions may potentially provide the greatest number of relevant references for information on adverse effects of drugs. However, a systematic evaluation of the current value of different sources of information for adverse effects is urgently required. [source]

Feature-space clustering for fMRI meta-analysis
HUMAN BRAIN MAPPING, Issue 3 2001
Cyril Goutte
Abstract: Clustering functional magnetic resonance imaging (fMRI) time series has emerged in recent years as a possible alternative to parametric modeling approaches. Most of the work so far has been concerned with clustering raw time series. In this contribution we investigate the applicability of a clustering method applied to features extracted from the data. This approach is extremely versatile and encompasses previously published results [Goutte et al., 1999] as special cases. A typical application is in data reduction: as the increase in temporal resolution of fMRI experiments routinely yields fMRI sequences containing several hundreds of images, it is sometimes necessary to invoke feature extraction to reduce the dimensionality of the data space. A second interesting application is in the meta-analysis of fMRI experiments, where features are obtained from a possibly large number of single-voxel analyses. In particular, this allows the checking of the differences and agreements between different methods of analysis. Both approaches are illustrated on an fMRI data set involving visual stimulation, and we show that the feature-space clustering approach yields nontrivial results and, in particular, shows interesting differences between individual voxel analyses performed with traditional methods. Hum. Brain Mapping 13:165-183, 2001. © 2001 Wiley-Liss, Inc. [source]
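A minimal sketch in the spirit of the feature-space approach: each voxel is reduced to a small feature vector (for instance an activation strength and a delay extracted from its time series), and a generic k-means pass groups voxels in that space. The feature values, their meaning and the choice of k are illustrative assumptions; the paper's actual feature extraction and clustering details differ.

```java
import java.util.Arrays;
import java.util.Random;

// Generic k-means over per-voxel feature vectors. Random Gaussians stand in
// for real extracted features purely so the example runs self-contained.
public class FeatureSpaceKMeans {
    public static void main(String[] args) {
        Random rng = new Random(1);
        double[][] features = new double[200][2];        // 200 voxels, 2 features
        for (double[] f : features) { f[0] = rng.nextGaussian(); f[1] = rng.nextGaussian(); }

        int k = 3;
        double[][] centers = {features[0].clone(), features[1].clone(), features[2].clone()};
        int[] label = new int[features.length];

        for (int iter = 0; iter < 50; iter++) {
            // Assignment step: each voxel joins the nearest centre in feature space.
            for (int i = 0; i < features.length; i++) {
                double best = Double.MAX_VALUE;
                for (int c = 0; c < k; c++) {
                    double d = dist2(features[i], centers[c]);
                    if (d < best) { best = d; label[i] = c; }
                }
            }
            // Update step: each centre moves to the mean of its members.
            double[][] sum = new double[k][2];
            int[] count = new int[k];
            for (int i = 0; i < features.length; i++) {
                count[label[i]]++;
                for (int d = 0; d < 2; d++) sum[label[i]][d] += features[i][d];
            }
            for (int c = 0; c < k; c++)
                if (count[c] > 0)
                    for (int d = 0; d < 2; d++) centers[c][d] = sum[c][d] / count[c];
        }
        System.out.println("cluster centres: " + Arrays.deepToString(centers));
    }

    static double dist2(double[] a, double[] b) {
        double s = 0;
        for (int i = 0; i < a.length; i++) s += (a[i] - b[i]) * (a[i] - b[i]);
        return s;
    }
}
```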
Ergonomic weighted scores to evaluate critical instructions for improvements in a printed circuit assembly factory
HUMAN FACTORS AND ERGONOMICS IN MANUFACTURING & SERVICE INDUSTRIES, Issue 1 2003
Rabindra Nath Sen
A survey was conducted on the efficacy of printed circuit assembly operations using newly designed Ergonomic Weighted Scores (EWSs) to evaluate Critical Instructions (CIs) to operators. This helped in choosing priorities among problematic operations. Five EWSs were drafted, tried, and finalized by a team of experts for the purpose of checking adherence to CIs, failure in any of which could adversely affect product quality. The top three priority operations were identified, and the follow-up studies resulted in an increase in monthly revenue of US$227,880 and improvements in quality, productivity, and the occupational health and safety of the operators. © 2003 Wiley Periodicals, Inc. Hum Factors Man 13:41-58, 2003. [source]

Apparent/spurious multifractality of data sampled from fractional Brownian/Lévy motions
HYDROLOGICAL PROCESSES, Issue 15 2010
Shlomo P. Neuman
Abstract: Many earth and environmental variables appear to be self-affine (monofractal) or multifractal, with spatial (or temporal) increments having exceedance probability tails that decay as powers of −α, where 1 < α ≤ 2. The literature considers self-affine and multifractal modes of scaling to be fundamentally different, the first arising from additive and the second from multiplicative random fields or processes. We demonstrate theoretically that data having finite support, sampled across a finite domain from one or several realizations of an additive Gaussian field constituting fractional Brownian motion (fBm) characterized by α = 2, give rise to positive square (or absolute) increments which behave as if the field was multifractal when in fact it is monofractal. Sampling such data from additive fractional Lévy motions (fLm) with 1 < α < 2 causes them to exhibit spurious multifractality. Deviations from apparent multifractal behaviour at small and large lags are due to nonzero data support and finite domain size, unrelated to noise or undersampling (the causes cited for such deviations in the literature). Our analysis is based on a formal decomposition of anisotropic fLm (fBm when α = 2) into a continuous hierarchy of statistically independent and homogeneous random fields, or modes, which captures the above behaviour in terms of only E + 3 parameters, where E is Euclidean dimension. Although the decomposition is consistent with a hydrologic rationale proposed by Neuman (2003), its mathematical validity is independent of such a rationale. Our results suggest that it may be worth checking how closely variables considered in the literature to be multifractal (e.g. experimental and simulated turbulent velocities, some simulated porous flow velocities, landscape elevations, rain intensities, river network area and width functions, river flow series, soil water storage and physical properties) fit the simpler monofractal model considered in this paper (such an effort would require paying close attention to the support and sampling window scales of the data). Parsimony would suggest associating variables found to fit both models equally well with the latter. Copyright © 2010 John Wiley & Sons, Ltd. [source]
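Mono- versus multifractal scaling is commonly diagnosed with sample structure functions S_q(lag) = mean |increment over lag|^q: if S_q scales as lag^xi(q) with xi(q) linear in q, the record is consistent with monofractal scaling, while concave xi(q) suggests multifractality (or, as the abstract argues, a sampling artifact). A bare-bones sketch of the computation, with a synthetic random walk standing in for real field data:

```java
import java.util.Random;

// Sample structure functions S_q(lag) = mean |x[i+lag] - x[i]|^q.
// Log-log slopes of S_q versus lag give the scaling exponents xi(q):
// linear xi(q) ~ monofractal, concave xi(q) ~ apparent multifractality.
public class StructureFunctions {
    public static void main(String[] args) {
        double[] x = new double[4096];
        Random rng = new Random(7);
        for (int i = 1; i < x.length; i++)        // synthetic Brownian-type walk,
            x[i] = x[i - 1] + rng.nextGaussian(); // a stand-in for measured data

        double[] qs = {0.5, 1.0, 2.0, 3.0};
        for (double q : qs) {
            System.out.printf("q=%.1f:", q);
            for (int lag = 1; lag <= 256; lag *= 4)
                System.out.printf("  S(%d)=%.3f", lag, structure(x, lag, q));
            System.out.println();
        }
    }

    static double structure(double[] x, int lag, double q) {
        double sum = 0;
        int n = x.length - lag;
        for (int i = 0; i < n; i++)
            sum += Math.pow(Math.abs(x[i + lag] - x[i]), q);
        return sum / n;
    }
}
```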
Improving Parsing of 'BA' Sentences for Machine Translation
IEEJ TRANSACTIONS ON ELECTRICAL AND ELECTRONIC ENGINEERING, Issue 1 2008
Dapeng Yin, Non-member
Abstract: Research on Chinese-Japanese machine translation has been going on for many years, and the field is becoming increasingly refined. In practical machine translation systems, the processing of simple, short Chinese sentences gives reasonably good results, but the translation of complex, long Chinese sentences still presents difficulties. For example, such systems are still unable to solve the translation problem of complex 'BA' sentences. In this article a new method of parsing 'BA' sentences for machine translation, based on valency theory, is proposed. A 'BA' sentence is one that contains the prepositional word 'BA'. The structural character of a 'BA' sentence is that the original verb comes after the object word, and the object word following the 'BA' preposition is used as an adverbial modifier of an active word. First, a large number of grammar items from Chinese grammar books were collected, and elementary judgment rules were set by classifying and consolidating the collected grammar items. These judgment rules were then applied to actual Chinese text and modified by checking their results immediately, using statistical information from an actual corpus. A five-segment model for 'BA' sentence translation was then put forward on the basis of this analysis. Finally, we applied the proposed model in our machine translation system and evaluated the experimental results. It achieved a 91.3% rate of accuracy, and this satisfying result verifies the effectiveness of our five-segment model for 'BA' sentence translation. Copyright © 2007 Institute of Electrical Engineers of Japan. Published by John Wiley & Sons, Inc. [source]

A note on formulas for localized failure of frictional materials in compression and biaxial loading modes
INTERNATIONAL JOURNAL FOR NUMERICAL AND ANALYTICAL METHODS IN GEOMECHANICS, Issue 10 2001
Matthias Lambrecht
Abstract: This paper investigates aspects of the localization analysis of frictional materials. We derive closed formulas and diagrams for the inclination angle of critical discontinuity surfaces which develop in homogeneous compression and biaxial loading tests. The localization analysis is based on a Drucker-Prager-type elastoplastic hardening model for non-associated plastic flow at small strains, which we represent in spectral form. For this type of constitutive model, general analytical formulas for the so-called critical hardening modulus and the inclination angle of critical discontinuity surfaces are derived for the plane strain case. The subsequent treatment then specializes these formulas for the analysis of compression and biaxial loading modes. The key contribution is a detailed analysis of plane strain deformation modes in which localized failure occurs after subsequent plastic flow. The derived formulas and diagrams can be applied to check an accompanying localization analysis of frictional materials in finite-element computations. Copyright © 2001 John Wiley & Sons, Ltd. [source]