Traditional Methods
Selected Abstracts

Rapid plant diversity assessment using a pixel nested plot design: A case study in Beaver Meadows, Rocky Mountain National Park, Colorado, USA
DIVERSITY AND DISTRIBUTIONS, Issue 4 2007. Mohammed A. Kalkhan

ABSTRACT Geospatial statistical modelling and thematic maps have recently emerged as effective tools for the management of natural areas at the landscape scale. Traditional methods for the collection of field data pertaining to questions of landscape were developed without consideration for the parameters of these applications. We introduce an alternative field sampling design based on smaller, unbiased random plot and subplot locations called the pixel nested plot (PNP). We demonstrate the applicability of the 15 m × 15 m PNP design for assessing patterns of plant diversity and species richness across the landscape at Rocky Mountain National Park (RMNP), Colorado, USA in a time- (cost-) efficient manner for field data collection. Our results were comparable to those of a previous study in the Beaver Meadow study (BMS) area within RMNP, where there was a demonstrated focus of plant diversity. Our study used the smaller PNP sampling design for field data collection, which could be linked to geospatial information data and used for landscape-scale analyses and assessment applications. In 2003, we established 61 PNP in the eastern region of RMNP. We present a comparison between this approach, using a sub-sample of 19 PNP from this data set, and 20 Modified Whittaker nested plots (MWNP) of 20 m × 50 m that were collected in the BMS area. The PNP captured 266 unique plant species while the MWNP captured 275 unique species. Based on a comparison of PNP and MWNP in the Beaver Meadows area, RMNP, the PNP required less time and area sampled to achieve a similar number of species sampled.
Using the PNP approach for data collection can facilitate the ecological monitoring of these vulnerable areas at the landscape scale in a time- and therefore cost-effective manner. [source]

Evaluating Cardiac Sources of Embolic Stroke with MRI
ECHOCARDIOGRAPHY, Issue 3 2007. Asu Rustemli, M.D.

The evaluation of patients with stroke includes identifying its etiology in order to appropriately tailor therapy. Currently, the diagnostic work-up includes imaging of the brain, the arteries of the head and neck, the aorta, and the heart. Traditional methods of imaging include magnetic resonance imaging (MRI) and magnetic resonance angiography (MRA), duplex ultrasound, and transthoracic echocardiography (TTE) and/or transesophageal echocardiography (TEE). While echocardiography remains a cornerstone in the field of cardiac imaging, MRI is increasingly able to assess for the most common causes of cardioembolic stroke such as left atrial/left atrial appendage thrombus, left ventricular thrombus, aortic atheroma, cardiac masses and patent foramen ovale. This review will focus on the advantages and limitations of echocardiography and cardiac magnetic resonance (CMR) imaging in diagnosing patients suspected of having an embolic stroke and the role these modalities play in clinical practice today. [source]

FIGHTING FINANCIAL CRIME: A UK PERSPECTIVE
ECONOMIC AFFAIRS, Issue 1 2007. Mike Bowron

Financial crime has a devastating impact on individuals, companies and governments. Traditional methods of control, predominantly investigation and prosecution, have failed to abate the rise of both fraud and money laundering offences. Tackling financial crime is best approached from the perspective of prevention, an activity that requires co-operation between all those affected by this widespread and corrosive social problem.
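The plant-diversity abstract above reports 19 PNP of 15 m × 15 m capturing 266 species versus 20 MWNP of 20 m × 50 m capturing 275 species. As a rough illustration of the area efficiency it claims, those numbers can be compared directly; the species-per-hectare metric below is an illustrative choice, not the authors' own analysis:

```python
# Back-of-the-envelope comparison of the two sampling designs described
# in the abstract (pixel nested plots vs. Modified Whittaker nested plots).
# The efficiency metric (species per hectare sampled) is illustrative only.

pnp_plots, pnp_species = 19, 266      # 15 m x 15 m pixel nested plots
mwnp_plots, mwnp_species = 20, 275    # 20 m x 50 m Modified Whittaker plots

pnp_area_m2 = pnp_plots * 15 * 15     # total area sampled by PNP
mwnp_area_m2 = mwnp_plots * 20 * 50   # total area sampled by MWNP

pnp_eff = pnp_species / (pnp_area_m2 / 10_000)    # species per hectare
mwnp_eff = mwnp_species / (mwnp_area_m2 / 10_000)

print(f"PNP:  {pnp_area_m2} m2 sampled, {pnp_eff:.0f} species/ha")
print(f"MWNP: {mwnp_area_m2} m2 sampled, {mwnp_eff:.0f} species/ha")
```

On these figures the PNP design recovers several times more species per unit area sampled, consistent with the abstract's conclusion that it needed less time and area for a similar species count.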
[source]

A fuzzy approach to active usage parameter control in IEEE 802.11b wireless networks
EXPERT SYSTEMS, Issue 5 2004. David Soud

Abstract: Usage parameter control (UPC) provides support for quality of service across heterogeneous networks. For the network operator, UPC assists in limiting network usage through traffic shaping, to prevent unacceptable delay. Traditional methods to apply UPC involve the generic cell rate algorithm or 'leaky bucket' algorithm, now commonly implemented in asynchronous transfer mode networks. This paper proposes a novel form of UPC for 802.11b wireless networks. The method proposed measures the rate of individual network flows to actively manage link utilization using a fuzzy logic controller (FLC). The FLC monitors the flow rate and adjusts the sending transmissions to stabilize flows as close to the optimum desired rate as possible. Imposing UPC and using the FLC within a packet-switched TCP network enforces cooperation between competing streams of traffic. Experiments carried out within a wireless network show that the results obtained significantly improve upon a 'best effort' service. [source]

X-linked mental retardation and epigenetics
JOURNAL OF CELLULAR AND MOLECULAR MEDICINE, Issue 4 2006. Guy Froyen

Abstract The search for the genetic defects in constitutional diseases has so far been restricted to direct methods for the identification of genetic mutations in the patients' genome. Traditional methods such as karyotyping, FISH, mutation screening, positional cloning and CGH have been complemented with newer methods including array-CGH and PCR-based approaches (MLPA, qPCR). These methods have revealed a high number of genetic or genomic aberrations that result in an altered expression or reduced functional activity of key proteins.
For a significant percentage of patients with congenital disease, however, the underlying cause has not been resolved, strongly suggesting that yet other mechanisms could play important roles in their etiology. Alterations of the 'native' epigenetic imprint might constitute such a novel mechanism. Epigenetics, the study of heritable changes that do not rely on the nucleotide sequence, has already been shown to play a determining role in embryonic development, X-inactivation, and cell differentiation in mammals. Recent progress in the development of techniques to study these processes on a full genome scale has stimulated researchers to investigate the role of epigenetic modifications in cancer as well as in constitutional diseases. We will focus on mental impairment because of the growing evidence for the contribution of epigenetics to memory formation and cognition. Disturbance of the epigenetic profile due to direct alterations at genomic regions, or failure of the epigenetic machinery due to genetic mutations in one of its components, has now been demonstrated in cognitive derangements in a number of neurological disorders. It is therefore tempting to speculate that the cognitive deficit in a significant percentage of patients with unexplained mental retardation results from epigenetic modifications. [source]

Pathogen inactivation technology: cleansing the blood supply
JOURNAL OF INTERNAL MEDICINE, Issue 3 2005. H. G. Klein

Abstract. Klein HG (The Johns Hopkins School of Medicine and Warren G. Magnuson Clinical Center, National Institutes of Health, Bethesda, MD, USA). Pathogen inactivation technology: cleansing the blood supply (Review). J Intern Med 2005; 257: 224–237. The calculated residual infectious risk of HIV, hepatitis B virus (HBV) and hepatitis C virus (HCV) from blood transfusion is extremely low.
However, the risk of bacterial contamination remains, and a variety of other agents including emerging viruses, protozoa and tick-borne agents threaten blood supplies and undermine public confidence in blood safety. Traditional methods of donor screening and testing have limited ability to further reduce disease transmission and cannot prevent an emerging infectious agent from entering the blood supply. Pathogen inactivation technologies have all but eliminated the infectious risks of plasma-derived protein fractions, but as yet no technique has proved sufficiently safe and effective for traditional blood components. Half-way technologies can reduce the risk of pathogen transmission from fresh frozen plasma and cryoprecipitate. Traditional methods of mechanical removal such as washing and filtration have had limited success in reducing the risk of cell-associated agents, but methods aimed at sterilizing blood have proved toxic either to the cells or to the recipients of blood components. Several promising methods that target pathogen nucleic acid have recently entered clinical testing. [source]

Effect of lipid oxidation and frozen storage on muscle proteins of Atlantic mackerel (Scomber scombrus)
JOURNAL OF THE SCIENCE OF FOOD AND AGRICULTURE, Issue 5 2002. Suhur Saeed

Abstract The effect of storage on the lipids and proteins in Atlantic mackerel stored for up to 24 months at −20 and −30 °C was studied. Traditional methods including the peroxide value, thiobarbituric acid-reactive substances (TBARS) and a reverse-phase HPLC method were used to determine the primary and secondary lipid oxidation products. All tests showed an increase in lipid oxidation products with storage time, and at the higher storage temperature of −20 °C compared with samples stored at −30 °C. Antioxidants had a significant effect (P < 0.01) on the inhibition of lipid oxidation, as shown by the reduction in peroxide value and hydroxides, and in malondialdehyde formation.
Similarly, deterioration of protein structure and functionality in mackerel stored for 3, 6, 12 and 24 months was greater at −20 than at −30 °C. ATPase activity in the myosin extract of Atlantic mackerel showed a significant decrease (P < 0.01) with progressive frozen storage. Protein solubility in high salt concentration (0.6 M NaCl) decreased (P < 0.01) during storage at both −20 and −30 °C, but the decrease was greater at −20 °C. Interestingly, the antioxidants BHT, vitamin C and vitamin E protected the proteins against complete loss of ATPase activity and protein solubility to a significant level (P < 0.01) for up to 1 year at −20 °C compared with samples stored without antioxidants. This study confirms the deleterious effect of lipid oxidation products on protein structure and function in frozen fatty fish. © 2002 Society of Chemical Industry [source]

Measuring SPIO and Gd contrast agent magnetization using 3 T MRI
NMR IN BIOMEDICINE, Issue 8 2009. Pádraig Cantillon-Murphy

Abstract Traditional methods of measuring magnetization in magnetic fluid samples, such as vibrating sample magnetometry (VSM), are typically limited to maximum field strengths of about 1 T. This work demonstrates the ability of MRI to measure the magnetization associated with two commercial MRI contrast agents at 3 T by comparing analytical solutions to experimental imaging results for the field pattern associated with agents in cylindrical vials. The results of the VSM and fitted MRI data match closely. The method represents an improvement over VSM measurements since results are attainable at imaging field strengths. The agents investigated are Feridex, a superparamagnetic iron oxide suspension used primarily for liver imaging, and Magnevist, a paramagnetic, gadolinium-based compound used for tumors, inflammation and vascular lesions.
MR imaging of the agents took place in sealed cylindrical vials in the presence of a surrounding volume of deionized water, where the contrast agents had a measurable effect on the water's magnetization in the vicinity of the compartment of contrast agent. A pair of phase images was used to reconstruct a B0 fieldmap. The resultant B0 maps in the water region, corrected for shimming and container edge effects, were used to predict the agent's magnetization at 3 T. The results were compared with the results from VSM measurements up to 1.2 T, and close correlation was observed. The technique should be of interest to those seeking quantification of the magnetization associated with magnetic suspensions beyond the traditional scope of VSM. The magnetization needs to be sufficiently strong (Ms ≥ 50 A m²/kg Fe for Feridex and χm ≥ 5 × 10⁻⁵ m³/kg Gd for Magnevist) for a measurable dipole field in the surrounding water. For this reason, the technique is mostly suitable for undiluted agents. Copyright © 2009 John Wiley & Sons, Ltd. [source]

Demystifying Online Genetic Databases
NURSING & HEALTH SCIENCES, Issue 2 2006. Carolyn Driscoll

There has been an explosion of genetic information, and keeping current can be difficult. Traditional methods for obtaining information may be obsolete. Many sources for genetic information are now found on the internet, although they may be confusing to navigate and interpret. The purpose of this presentation is to outline commonly used genetic databases and demonstrate how they may be accessed and used to interpret genetic data. The mission of the National Center for Biotechnology Information (NCBI), a resource for molecular biology information, is to develop new information technologies to support understanding of molecular and genetic processes related to health and disease. NCBI services include PubMed, Nucleotide, and the BLAST algorithm for sequence comparison. In this presentation, several genetic databases will be explored.
Each database will be defined, the available genetic information described, database access demonstrated, and website information displayed. This presentation will provide education related to several genetic databases as a means of facilitating and promoting access to this information by a larger audience of nurses and health care providers involved with genetic health care. [source]

Hypothesis: Research in Otolaryngology Is Essential for Continued Improvement in Health Care
THE LARYNGOSCOPE, Issue 6 2002. Robert H. Mathog, MD

Abstract The present report, in the form of a research proposal, is based on the hypothesis that research in otolaryngology is essential for continued improvement in health care. Examples of advances in otolaryngology as a result of research are noted, but for continued success, otolaryngology must maintain and find better ways to train clinically directed researchers. Traditional methods of training such as hands-on experience, courses in the basic principles of research, protected time, and mentoring are discussed and evaluated. Barriers to success such as age, time, and debt are noted. Potential solutions are presented with an emphasis on integration of research and clinical training. Success of faculty will continue to depend on laboratory and financial support, technical assistance, protected time, salary equivalent to that of other faculty, and accessibility of research funds. For research to gain support and enthusiasm, and to remain strong and productive, its cost-effectiveness and value must be recognized. [source]

High-throughput localization of organelle proteins by mass spectrometry: a quantum leap for cell biology
BIOESSAYS, Issue 8 2006. Denise J.L. Tan

Cells are the fundamental building blocks of organisms, and their organization holds the key to our understanding of the processes that control development and physiology as well as the mechanisms that underlie disease.
Traditional methods of analysis of subcellular structure have relied on the purification of organelles and the painstaking biochemical description of their components. The arrival of high-throughput genomic and, more significantly, proteomic technologies has opened hitherto unforeseen possibilities for this task. Recently, two reports(1,2) have shown how much can be gleaned from the combination of analytical centrifugation, mass spectrometry and advanced statistical techniques focused on a high-throughput analysis of the content and organization of plant and animal cells. The results reveal intriguing possibilities for the future, among them the possibility of mapping much of the known proteome onto our current map of the cell. BioEssays 28: 780–784, 2006. © 2006 Wiley Periodicals, Inc. [source]

Development of liquid chromatography/mass spectrometry methods for the quantitative analysis of herbal medicine in biological fluids: a review
BIOMEDICAL CHROMATOGRAPHY, Issue 1 2010. Michael J. Gray

Abstract The development of liquid chromatography-mass spectrometry (LC-MS) and tandem MS/MS methods for the analysis of bioactive components and their metabolites of herbal medicines in biological fluids is reviewed, with the aim of providing an overview of the current techniques and methods used. The issues and challenges associated with various stages of analytical method development are discussed using Ginkgo biloba and Panax ginseng as case studies. LC-MS offers selectivity and specificity in both the chromatographic separation and detection steps. This is necessary in order to measure compounds at extremely low concentrations, as is often observed in plasma and urine samples. Traditional methods of detection (UV-visible) do not offer the selectivity and specificity needed. The strategies and pitfalls involved in the measurement of such compounds are discussed in this review.
Matrix effects, that is, 'unseen' matrix suppression and ionization enhancement effects, can significantly reduce the accuracy and precision of the measurement. The impact of the correct choice of chromatography column format on signal-to-noise ratio is also discussed. Analytical methods from sample preparation to mass spectrometric detection are outlined in order to provide good direction for analysts intent on the measurement of bioavailable compounds from herbal medicines in plasma and urine samples. Copyright © 2009 John Wiley & Sons, Ltd. [source]

Preclinical Manufacture of Anti-HER2 Liposome-Inserting scFv-PEG-Lipid Conjugate
BIOTECHNOLOGY PROGRESS, Issue 1 2005

Analytical methods optimized for the micellar F5cys-MP-PEG(2000)-DSPE protein-lipopolymer conjugate are presented. The apparent micelle molecular weight, determined by size exclusion chromatography, ranged from 330 to 960 kDa. The F5cys antibody and conjugate melting points, determined by differential scanning calorimetry, were near 82 °C. Traditional methods for characterizing monodisperse protein species were inapplicable to conjugate analysis. The isoelectric points of F5cys (9.2) and the conjugate (8.9) were determined by capillary isoelectric focusing (cIEF) after addition of the zwitterionic detergent CHAPS to the buffer. Conjugate incubation with phospholipase B selectively removed DSPE lipid groups and dispersed the conjugate prior to separation by chromatographic methods. Alternatively, adding 2-propanol (29.4 vol%) and n-butanol (4.5 vol%) to buffers for salt-gradient cation exchange chromatography provided gentler, non-enzymatic dispersion, resulting in well-resolved peaks. This method was used to assess stability, identify contaminants, establish lot-to-lot comparability, and determine the average chromatographic purity (93%) of conjugate lots, described previously. The F5cys amino acid content was confirmed after conjugation.
The expected conjugate avidity for immobilized HER-2/neu was measured by biomolecular interaction analysis (BIAcore). Mock therapeutic assemblies were made by conjugate insertion into preformed doxorubicin-encapsulating liposomes for antibody-directed uptake of doxorubicin by HER2-overexpressing cancer cells in vitro. Together, these assays established that the manufacturing method described in the first part of this study consistently produced F5cys-MP-PEG(2000)-DSPE of sufficient purity, stability, and functionality for use in preclinical toxicology investigations. [source]

Chemical Approaches to Controlling Intracellular Protein Degradation
CHEMBIOCHEM, Issue 1 2005. John S. Schneekloth Jr.

Inactive. Recent advances have yielded many ways to study proteins by means of inactivation. Traditional methods of genetic knockout are complemented by newer techniques, including RNAi and small molecules that induce proteolysis (see scheme). Although seemingly in competition, these techniques each offer solutions to specific problems in proteomic analysis. [source]

Participatory evaluation (I): sharing lessons from fieldwork in Asia
CHILD: CARE, HEALTH AND DEVELOPMENT, Issue 3 2007. B. Crishna

Abstract Background There is a need to study methodologies for evaluating social development projects. Traditional methods of evaluation are often not able to capture or measure the 'spirit of change' in people, which is the very essence of human development. Using participatory methodologies is a positive way to ensure that evaluations encourage an understanding of the value of critical analysis among service providers and other stakeholders. Participatory evaluation provides a systematic process of learning through experiences. Methods Practical experiences of conducting a number of evaluation studies in social development projects have led the author to develop four basic principles of participatory evaluation strategies.
This has been further conceptualized through an extensive literature search. The article develops and shares these principles through descriptions of field experiences in Asia. Results The article illustrates that the role of any evaluation remains a learning process, one which promotes a climate of reflection and self-assessment. It shows how using participatory methods can create this environment of learning. However, one needs to keep in mind that participatory evaluation takes time, and that the role and calibre of the facilitator are crucial. Conclusion Participatory evaluation methods have been recommended for social development projects to ensure that stakeholders remain in control of their own lives and decisions. [source]

Out-of-Core and Dynamic Programming for Data Distribution on a Volume Visualization Cluster
COMPUTER GRAPHICS FORUM, Issue 1 2009. S. Frank

I.3.2 [Computer Graphics]: Distributed/network graphics; C.2.4 [Distributed Systems]: Distributed applications

Abstract Ray-directed volume-rendering algorithms are well suited for parallel implementation in a distributed cluster environment. For distributed ray casting, the scene must be partitioned between nodes for good load balancing, and a strict view-dependent priority order is required for image composition. In this paper, we define the load-balanced network distribution (LBND) problem and map it to the NP-complete precedence-constrained job-shop scheduling problem. We introduce a kd-tree solution and a dynamic programming solution. To process a massive data set, either a parallel or an out-of-core approach is required. Parallel preprocessing is performed by render nodes on data allocated using a static data structure. Volumetric data sets often contain a large portion of voxels that will never be rendered, or empty space. Parallel preprocessing fails to take advantage of this.
Our slab-projection slice, introduced in this paper, tracks empty space across consecutive slices of data to reduce the amount of data distributed and rendered. It is used to facilitate out-of-core bricking and kd-tree partitioning. Load balancing using each of our approaches is compared with traditional methods using several segmented regions of the Visible Korean data set. [source]

An Ultrasonic Profiling Method for the Inspection of Tubular Structures
COMPUTER-AIDED CIVIL AND INFRASTRUCTURE ENGINEERING, Issue 6 2007. Francisco Gomez

These graphs not only show the inner contour of the pipe but also integrate the intensity of the echoes employed to create the profile. The enhanced profile is generated by superimposing the peak intensity from the returning echoes at the calculated x, y, and z coordinates where they reflected from the pipe wall. The proposed method is capable of showing anomalous conditions, inside pipes filled with liquid, with dimensions smaller than the theoretical lateral and axial resolution of the transducer, in contrast to traditional methods, where these kinds of defects are not disclosed. The proposed inspection method and its capabilities were validated through simulations and experiments. The presented approach was particularly developed with the aim of scanning internal sections of pipes filled with liquid using rotary ultrasonic sonars, but it is expected that this research could be expanded to the inspection of other submerged structures, such as water tanks or pressurized vessels. [source]

Dynamic Optimal Traffic Assignment and Signal Time Optimization Using Genetic Algorithms
COMPUTER-AIDED CIVIL AND INFRASTRUCTURE ENGINEERING, Issue 4 2004. H. R. Varia

A simulation-based approach is employed for the case of multiple-origin-multiple-destination traffic flows.
The artificial intelligence technique of genetic algorithms (GAs) is used to minimize the overall travel cost in the network, both with fixed signal timings and with optimization of signal timings. The proposed method is applied to an example network and the results are discussed. It is concluded that GAs allow the relaxation of many of the assumptions that may be needed to solve the problem analytically by traditional methods. [source]

Genetic Algorithms for Optimal Urban Transit Network Design
COMPUTER-AIDED CIVIL AND INFRASTRUCTURE ENGINEERING, Issue 3 2003. Partha Chakroborty

This article attempts to highlight the effectiveness of genetic algorithm (GA)-based procedures in solving the urban transit network design problem (UTNDP). The article analyzes why traditional methods have problems in solving the UTNDP. The article also suggests procedures to alleviate these problems using GA-based optimization techniques. The thrust of the article is three-fold: (1) to show the effectiveness of GAs in solving the UTNDP, (2) to identify features of the UTNDP that make it a difficult problem for traditional techniques, and (3) to suggest directions, through the presentation of GA-based methodologies for the UTNDP, for the development of GA-based procedures for solving other optimization problems having features similar to the UTNDP. [source]

A MASTER CLASS IN UNDERSTANDING VARIATIONS IN HEALTHCARE
CYTOPATHOLOGY, Issue 2006. M. Mohammed

That there is widespread variation in healthcare outcomes cannot be denied. The question is what the variation means and what we can do about it. Using a series of well-known case studies, which include data from the Bristol and Shipman Inquiries, fundamental limitations of traditional methods of understanding variation will be highlighted. These methods, which include comparison with standards, league tables and statistical testing, have flaws, and they offer little or no guidance on how to react to the variation.
Fortunately, there is a theory of variation that overcomes these limitations and provides useful guidance on reacting to variation, developed by Walter Shewhart in the 1920s in an industrial setting. Shewhart's theory of variation found widespread application and won him the accolade 'Father of modern quality control'. His work is central to philosophies of continual improvement. Application of Shewhart's theory of variation, also known as Statistical Process Control (SPC), to case studies from healthcare will be demonstrated, whilst highlighting the implications and challenges for performance management/monitoring and continual improvement in healthcare. References: 1. M. A. Mohammed, K. K. Cheng, A. Rouse, T. Marshall. "Bristol, Shipman and clinical governance: Shewhart's forgotten lessons." The Lancet 2001; 357: 463–7. 2. P. Adab, A. Rouse, M. A. Mohammed, T. Marshall. "Performance league tables: the NHS deserves better." British Medical Journal 2002; 324: 95–98. [source]

A green to red photoconvertible protein as an analyzing tool for early vertebrate development
DEVELOPMENTAL DYNAMICS, Issue 2 2007. Stephan A. Wacker

Abstract Lineage labeling is one of the most important techniques in developmental biology. Most recently, a set of photoactivatable fluorescent proteins originating from marine cnidarians became available. Here, we introduce the application of the green to red photoconvertible protein EosFP as a novel technique for analyzing early vertebrate development. Both injection of EosFP mRNA and injection of purified, recombinant EosFP, followed by light-driven green to red conversion, allow lineage labeling in virtually any temporal and spatial dimension during embryonic development for at least 2 weeks. Specific staining of cells from non-surface layers is greatly facilitated by light-driven conversion of EosFP compared with traditional methods.
Therefore, green to red photoactivatable proteins promise to be a powerful tool with the potential to satisfy the increasing demand for methods enabling detailed phenotypical analyses after manipulations of morphogenetic events, gene expression, or signal transduction. Developmental Dynamics 236:473–480, 2007. © 2006 Wiley-Liss, Inc. [source]

A pictographic method for teaching spelling to Greek dyslexic children
DYSLEXIA, Issue 2 2002. Th.D. Mavrommati

Abstract In the Greek orthography every letter consistently represents the same sound, but the same sound can be represented by different letters or pairs of letters. This makes spelling more difficult than reading. Two methods of teaching spelling to Greek dyslexic children are compared. The first involved pictograms (specially drawn pictures) for use when alternative spellings are possible; this is referred to as the 'PICTO' method. The second was in effect a combination of two traditional methods: the first involved the teaching of letter-sound correspondences in a multisensory way; the second involved the use of concepts derived from linguistics, the children being taught the derivations of words and shown how the same root morphemes, derivative morphemes, etc., were consistently represented by the same spelling pattern. This combination of methods is referred to as 'TRAD', signifying 'traditional'. There were 72 subjects in the study, aged between 9 and 11 years. Four different teachers, each using both PICTO and TRAD, took part in the teaching sessions. The PICTO method proved considerably more effective, and possible reasons are suggested as to why this might be so. Copyright © 2002 John Wiley & Sons, Ltd. [source]

Particle swarm optimization of TMD by non-stationary base excitation during earthquake
EARTHQUAKE ENGINEERING AND STRUCTURAL DYNAMICS, Issue 9 2008. A. Y. T. Leung

Abstract There are many traditional methods for finding the optimum parameters of a tuned mass damper (TMD) subject to stationary base excitations.
It is very difficult to obtain the optimum parameters of a TMD subject to non-stationary base excitations using these traditional optimization techniques. In this paper, by applying the particle swarm optimization (PSO) algorithm as a novel evolutionary algorithm, the optimum parameters, including the optimum mass ratio, damper damping and tuning frequency, of a TMD system attached to a viscously damped single-degree-of-freedom main system subject to non-stationary excitation can be obtained, taking either the displacement or the acceleration mean square response, or their combination, as the cost function. For simplicity of presentation, the non-stationary excitation is modeled in the paper by an evolutionary stationary process. By means of three numerical examples for different types of non-stationary ground acceleration models, the results indicate that PSO can be used to find the optimum mass ratio, damper damping and tuning frequency of the non-stationary TMD system, and that it is quite easy to program for practical engineering applications. Copyright © 2008 John Wiley & Sons, Ltd. [source]

Do Upgradings and Downgradings Convey Information? An Event Study of the French Bond Market
ECONOMIC NOTES, Issue 3 2006

This study has two purposes:
1. To present an alternative method for the study of events related to bond spreads, applicable when only a small number of events is available;
2. To analyse the impact of downgradings and upgradings on the French financial market.
A small number of events can render the use of traditional methods based on the analysis of abnormal returns difficult. We suggest examining the stationarity of relative spreads and dating a possible interruption in the series by carrying out tests in increasingly wider time windows. This method has been applied to assess the role of rating agencies in the French financial market.
The results obtained are, in general, not only similar to those previously obtained in other markets, but also more accurate. The aggregate analysis shows an absence of reaction for upgradings, while downgradings do trigger a reaction on financial markets. However, if we expand the analysis to single issuers, we find that downgradings had no relevant effect on financial markets in most cases. Only two issuers (France Telecom and Vivendi), with initially good, but rapidly deteriorating, credit reputations, experienced a significant rise in their spreads. In these cases, financial markets reacted prior to the downgrading by the agency. Tests based only on the analysis of the whole set of events would have led us, in the case of downgradings, to partially flawed conclusions. [source] A high-throughput on-line microdialysis-capillary assay for D-serine. ELECTROPHORESIS, Issue 7-8 2003. Kylie B. O'Brien. Abstract A high-throughput method is described for the analysis of D-serine and other neurotransmitters in tissue homogenates. Analysis is performed by microdialysis-capillary electrophoresis (CE) with laser-induced fluorescence (LIF) detection in a sheath flow detection cell. Sample pretreatment is not required, as microdialysis sampling excludes proteins and cell fragments. Primary amines are derivatized on-line with o-phthaldialdehyde (OPA) in the presence of β-mercaptoethanol, followed by on-line CE-LIF analysis. Under the separation conditions described here, D-serine is resolved from L-serine and other primary amines commonly found in biological samples. Each separation requires less than 22 s. Eliminating the need for sample pretreatment and performing the high-speed CE analysis on-line significantly reduces the time required for D-serine analysis when compared with traditional methods. This method has been used to quantify D-serine levels in larval tiger salamander retinal homogenates, as well as dopamine, γ-amino-n-butyric acid (GABA), glutamate and L-aspartate.
D-serine release from an intact retina was also detected. [source] Extreme value predictions based on nonstationary time series of wave data. ENVIRONMETRICS, Issue 1 2006. Christos N. Stefanakos. Abstract A new method for calculating return periods of various level values from nonstationary time series data is presented. The key idea of the method is a new definition of the return period, based on the MEan Number of Upcrossings of the level x* (MENU method). In the present article, the case of Gaussian periodically correlated time series is studied in detail. The whole procedure is numerically implemented and applied to synthetic wave data in order to test the stability of the method. Results obtained by using several variants of traditional methods (Gumbel's approach and the POT method) are also presented for comparison purposes. The results of the MENU method showed extraordinary stability, in contrast to the wide variability of the traditional methods. The predictions obtained by means of the MENU method are lower than the traditional predictions. This is in accordance with the results of other methods that also take into account the dependence structure of the examined time series. Copyright © 2005 John Wiley & Sons, Ltd. [source] Teaching and assessment of professional attitudes in UK dental schools - Commentary. EUROPEAN JOURNAL OF DENTAL EDUCATION, Issue 3 2010. J. Field. Abstract The General Dental Council expects professionalism to be embedded and assessed throughout the undergraduate dental programme. Curricula therefore need to accommodate these recommendations. A straw poll of UK dental schools provided a basis for understanding the current methods of teaching and assessing professionalism. All respondent schools recognised the importance of professionalism and reported that this was taught and assessed within their curriculum. For most, the methods involved were largely traditional, relying on lectures and seminars taught throughout the course.
The most common form of assessment was by grading and providing formative feedback after a clinical encounter. Whilst clinical skills and knowledge can perhaps be readily taught and assessed using traditional methods, those involved in education are challenged to identify and implement effective methods of not only teaching, but also assessing, professionalism. A variety of standalone methods need to be developed that assess professionalism, and this will, in turn, allow the effectiveness of teaching methods to be assessed. [source] Improved Auxiliary for the Synthesis of Medium-Sized Bis(lactams). EUROPEAN JOURNAL OF ORGANIC CHEMISTRY, Issue 2 2008. Jasper Springer. Abstract Our auxiliary-based method for the synthesis of bis(lactams) has been optimized. A novel auxiliary is described that is inserted into the backbone of a linear peptide, allowing the mutually reactive terminal groups to approach one another for a cyclization reaction. A subsequent ring contraction mechanism leads to the bis(lactams) with the remnants of the auxiliary still attached. Functionalized seven- and eight-membered bis(lactams) have been prepared that are difficult to access using traditional methods. Removal of the auxiliary from the bis(lactams) is described, along with the possible side reactions that can occur. (© Wiley-VCH Verlag GmbH & Co. KGaA, 69451 Weinheim, Germany, 2008) [source] Calibration model of microbial biomass carbon and nitrogen concentrations in soils using ultraviolet absorbance and soil organic matter. EUROPEAN JOURNAL OF SOIL SCIENCE, Issue 4 2008. X. Xu. Summary There is a need for a rapid, simple and reliable method of determining soil microbial biomass (SMB) for all soils because traditional methods are laborious. Earlier studies have reported that SMB-C and -N concentrations in grassland and arable soils can be estimated by measurement of UV absorbance in soil extracts.
However, these previous studies focused on soils with small soil organic matter (SOM) contents, and there was no consideration of SOM content as a covariate to improve the estimation. In this study, using tropical and temperate forest soils with a wide range of total C (5–204 mg C g−1 soil) and N (1–12 mg N g−1 soil) contents and pH values (4.1–5.9), it was found that the increase in UV absorbance of soil extracts at 280 nm (UV280) after fumigation could account for 92–96% of the variance in estimates of the SMB-C and -N concentrations measured by chloroform fumigation and extraction (P < 0.001). The data were combined with those of earlier workers to calibrate UV-based regression models for all the soils, by taking into account their varying SOM content. The validation analysis of the calibration models indicated that the SMB-C and -N concentrations in the 0–5 cm forest soils simulated by using the increase in UV280 and SOM could account for 86–93% of the variance in concentrations determined by chloroform fumigation and extraction (P < 0.001). The slope values of linear regression equations between measured and simulated values were 0.94 ± 0.03 and 0.94 ± 0.04, respectively, for the SMB-C and -N. However, simulation using the regression equations obtained by using only the data for forest profile soils gave poorer agreement with measured values. Hence, the calibration models obtained by using the increase in UV280 and SOM can provide a rapid, simple and reliable method of determining SMB for all soils. [source] Organ-specific ligation-induced changes in harmonic components of the pulse spectrum and regional vasoconstrictor selectivity in Wistar rats. EXPERIMENTAL PHYSIOLOGY, Issue 1 2006. Tse Lin Hsu. It has been shown previously that the amplitudes of the harmonic components of the pulse spectrum vary in specific patterns when the arteries leading to different organs are ligated, with the variations in the harmonics being linearly additive.
Since ligation can be regarded as a vast increase in organ resistance, the present study examined the potential of using these ligation-induced variations in the pulse spectrum as reference parameters for an increase in vascular resistance and for regional vasoconstrictor selectivity. A vasoconstrictor, either arginine vasopressin (AVP) or angiotensin II (Ang II), was infused into anaesthetized Wistar rats via the femoral vein for 1 h. The distinct harmonic-specific drug effects on the pulse spectrum were simulated by combining renal artery and superior mesenteric artery ligations in different ratios, the ratio with the lowest mean square difference determining the regional drug selectivity. The ratios indicated that the effect of AVP on the pulse spectrum was attributable to the combined effect of ligating the renal and superior mesenteric arteries, while the effect of Ang II was attributable to ligation of the renal artery. The results are comparable with those of investigations of regional vascular resistance performed using traditional methods. Our findings indicate that the ligation-induced variations in the pulse spectrum can be used to determine regional increases in vascular resistance. This implies that blood pressure can be used as the sole parameter to determine which arterial bed has been affected by the vasoconstrictor, and how seriously. [source] |
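Because the ligation-induced harmonic changes are linearly additive, the ratio-fitting step described in the abstract above reduces to a one-parameter least-squares scan. The sketch below illustrates the idea with hypothetical harmonic values; the function name, the numbers, and the 101-step grid are all assumptions for illustration, not the authors' data or code:

```python
import numpy as np

# Hypothetical ligation-induced changes in the first four harmonic amplitudes
# (illustrative values only; the study's measured spectra are not reproduced here).
delta_renal = np.array([0.8, -0.3, 0.5, -0.1])      # renal artery ligation
delta_mesenteric = np.array([0.2, 0.6, -0.4, 0.3])  # superior mesenteric artery ligation

def best_ratio(drug_effect, d1, d2, steps=101):
    """Scan combination ratios r*d1 + (1-r)*d2 and return the ratio
    with the lowest mean square difference from the drug effect."""
    ratios = np.linspace(0.0, 1.0, steps)
    errors = [np.mean((r * d1 + (1 - r) * d2 - drug_effect) ** 2) for r in ratios]
    i = int(np.argmin(errors))
    return ratios[i], errors[i]

# A drug whose spectral effect is a 70:30 mix of the two ligation patterns
drug = 0.7 * delta_renal + 0.3 * delta_mesenteric
r, err = best_ratio(drug, delta_renal, delta_mesenteric)
print(round(r, 2))  # 0.7 — the fitted renal:mesenteric weighting
```

In the study itself, the reference changes would come from the measured renal and superior mesenteric ligation spectra; a scan of this kind would then report a mixed ratio for a drug like AVP and a renal-only fit for one like Ang II.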