Additional Effort (additional + effort)

Selected Abstracts


Noise exposures aboard catcher/processor fishing vessels

AMERICAN JOURNAL OF INDUSTRIAL MEDICINE, Issue 8 2006
Richard L. Neitzel MS
Abstract Background: Commercial fishing workers have extended work shifts and the potential for 24 hr exposure to high noise. However, exposures in this industry have not been adequately characterized. Methods: Noise exposures aboard two catcher/processors (C/P) were assessed using dosimetry, sound-level mapping, and self-reported activities and hearing protection device (HPD) use. These data were combined to estimate work-shift, non-work, and 24 hr overall exposure levels using several metrics. The length of time during which HPDs were worn was also used to calculate the effective protection received by crew members. Results: Nearly all workers had work-shift and 24 hr noise levels that exceeded the relevant limits. After HPD use was accounted for, half of the 24 hr exposures remained above those limits. Non-work-shift noise contributed essentially nothing to 24 hr exposure levels. HPDs reduced the average exposure by about 10 dBA, but not all workers wore them consistently. Conclusions: The primary risk of hearing loss aboard the monitored vessels comes from work-shift noise; smaller vessels or vessels with different layouts may present more risk of hearing damage from non-work periods. Additional efforts are needed to increase the use of HPDs or implement noise controls. Am. J. Ind. Med. 2006. © 2006 Wiley-Liss, Inc. [source]
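The 24 hr metrics referred to above are energy averages of the individual exposure segments, not arithmetic means. Below is a minimal sketch of that standard arithmetic; the levels, durations, and attenuation figures are illustrative assumptions, not the study's data.

```python
import math

def leq_combined(levels_dba, hours):
    """Energy-average exposure segments into one equivalent level (dBA)."""
    total = sum(hours)
    energy = sum(t * 10 ** (level / 10) for level, t in zip(levels_dba, hours))
    return 10 * math.log10(energy / total)

def effective_level(level_dba, attenuation_db, fraction_worn):
    """Exposure level after part-time hearing-protector (HPD) use.

    Unprotected time dominates quickly because the average is taken on an
    energy (10^(L/10)) scale, which is why inconsistent HPD use erodes
    most of the nominal attenuation.
    """
    protected = fraction_worn * 10 ** ((level_dba - attenuation_db) / 10)
    unprotected = (1 - fraction_worn) * 10 ** (level_dba / 10)
    return 10 * math.log10(protected + unprotected)

# Illustrative numbers: a 12 hr shift at 97 dBA plus 12 hr off-shift at 75 dBA.
print(leq_combined([97, 75], [12, 12]))  # ~94 dBA over 24 hr
# A 20 dB protector worn 90% of the time yields ~87 dBA, not 77 dBA.
print(effective_level(97, 20, 0.9))
```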


Studies of the Nature of the Catalytic Species in the α-Olefin Polymerisation Processes Generated by the Reaction of Diamido(cyclopentadienyl)titanium Complexes with Aluminium Reagents as Cocatalysts

EUROPEAN JOURNAL OF INORGANIC CHEMISTRY, Issue 2 2005
Vanessa Tabernero
Abstract The reaction of the diamido(chloro)cyclopentadienyltitanium compounds TiCpRx[1,2-C6H4(NR′)2]Cl [CpRx = η5-C5H5, η5-C5(CH3)5, η5-C5H4(SiMe3); R′ = CH2tBu, Pr] with the Grignard reagent MgClR (R = Me, CH2Ph) affords the monomethyl and monobenzyl derivatives TiCpRx[1,2-C6H4(NR′)2]R. Upon addition of methylaluminoxane (MAO), the chloro- and alkyltitanium complexes show low activity towards the polymerisation of ethylene and styrene. However, no methylation was observed upon treatment of the chloro compounds TiCpRx[1,2-C6H4(NR′)2]Cl with trimethylaluminium. Instead, these reactions give the dinuclear aluminium complexes Al2[1,2-C6H4(NR′)2]Me4 (R′ = CH2tBu, Pr) through transmetallation of the diamido ligand, suggesting a deactivation pathway for the catalysts in the olefin polymerisation reactions. In an additional effort to model the catalytic species, stoichiometric reactions between the methyl derivatives TiCpRx[1,2-C6H4(NR′)2]Me and solid methylaluminoxane (MAO) were studied by NMR spectroscopy. Monitoring of these reactions revealed the formation of zwitterionic species, depending on the nature of the solvent. (© Wiley-VCH Verlag GmbH & Co. KGaA, 69451 Weinheim, Germany, 2005) [source]


Agency problems and audit fees: further tests of the free cash flow hypothesis

ACCOUNTING & FINANCE, Issue 2 2010
Paul A. Griffin
JEL classifications: G34; G35; M41; M42. Abstract This study finds that the agency problems of companies with high free cash flow (FCF) and low growth opportunities induce auditors of companies in the US to raise audit fees to compensate for the additional effort. We also find that high-FCF companies with high growth prospects have higher audit fees. In both cases, higher debt levels moderate the increased fees, consistent with the role of debt as a monitoring mechanism. Other mechanisms to mitigate the agency costs of FCF, such as dividend payout and share repurchase (not studied earlier), do not moderate the higher audit fees. [source]
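The fee-and-moderation pattern described here is typically tested with an audit-fee regression containing interaction terms. A hedged sketch of such a specification follows; the variable names, data file, and functional form are illustrative assumptions, not the paper's actual model.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical sample of US companies; column names are assumptions.
df = pd.read_csv("audit_sample.csv")

# ln(audit fee) on free cash flow, growth, leverage, and interactions.
# Under the abstract's pattern one would expect a positive coefficient on
# fcf and a negative fcf:leverage interaction (debt moderates the premium).
model = smf.ols(
    "log_audit_fee ~ fcf * leverage + fcf * growth + log_assets",
    data=df,
).fit(cov_type="HC1")  # heteroskedasticity-robust standard errors
print(model.summary())
```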


Designing materials with prescribed elastic properties using polygonal cells

INTERNATIONAL JOURNAL FOR NUMERICAL METHODS IN ENGINEERING, Issue 3 2003
Alejandro R. Diaz
Abstract An extension of the material design problem is presented in which the base cell that characterizes the material microgeometry is polygonal. The setting is the familiar inverse homogenization problem as introduced by Sigmund. Using basic concepts in periodic planar tiling it is shown that base cells of very general geometries can be analysed within the standard topology optimization setting with little additional effort. In particular, the periodic homogenization problem defined on polygonal base cells that tile the plane can be replaced and analysed more efficiently by an equivalent problem that uses simple parallelograms as base cells. Different material layouts can be obtained by varying just two parameters that affect the geometry of the parallelogram, namely, the ratio of the lengths of the sides and the internal angle. This is an efficient way to organize the search of the design space for all possible single-scale material arrangements and could result in solutions that may be unreachable using a square or rectangular base cell. Examples illustrate the results. Copyright © 2003 John Wiley & Sons, Ltd. [source]
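A parallelogram base cell is fully determined by the two parameters named above. The following sketch, under assumed conventions for the lattice vectors, shows how the side ratio and internal angle define the cell and how points are wrapped back into it for periodic analysis; it is an illustration, not the paper's code.

```python
import numpy as np

def lattice_vectors(r, theta):
    """Lattice vectors of a parallelogram cell with side ratio r = |a2|/|a1|
    and internal angle theta (radians) between the sides."""
    a1 = np.array([1.0, 0.0])
    a2 = r * np.array([np.cos(theta), np.sin(theta)])
    return a1, a2

def wrap_to_cell(points, a1, a2):
    """Map arbitrary points into the base cell (fractional coordinates mod 1),
    which is how periodic quantities are evaluated on the cell."""
    A = np.column_stack([a1, a2])          # cartesian = A @ fractional
    frac = np.linalg.solve(A, points.T).T % 1.0
    return frac @ A.T

a1, a2 = lattice_vectors(r=1.5, theta=np.deg2rad(60))
pts = np.array([[2.3, 0.9], [-0.7, 1.4]])
print(wrap_to_cell(pts, a1, a2))
```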


New stabilized finite element method for time-dependent incompressible flow problems

INTERNATIONAL JOURNAL FOR NUMERICAL METHODS IN FLUIDS, Issue 2 2010
Yueqiang Shang
Abstract A new stabilized finite element method is considered for the time-dependent Stokes problem, based on the lowest-order P1-P0 and Q1-P0 elements that do not satisfy the discrete inf-sup condition. The new stabilized method is characterized by the features that it does not require approximation of the pressure derivatives, specification of mesh-dependent parameters, or edge-based data structures, and that it always leads to symmetric linear systems; hence it can be applied to existing codes with little additional effort. The stability of the method is derived under some regularity assumptions. Error estimates for the approximate velocity and pressure are obtained by applying the technique of the Galerkin finite element method. Some numerical results are also given, which show that the new stabilized method is highly efficient for the time-dependent Stokes problem. Copyright © 2009 John Wiley & Sons, Ltd. [source]
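The abstract does not spell out the stabilization term, but the structure it describes (symmetric, parameter-free, acting only on the pressure block) fits a saddle-point system of the following generic form. The sketch below solves that system with dense stand-in matrices; the assembly of A, B, and S from the P1-P0 spaces is assumed, not shown.

```python
import numpy as np

def solve_stabilized_stokes(A, B, S, f, g):
    """Solve the symmetric saddle-point system [[A, B^T], [B, -S]] [u; p] = [f; g].

    A: (nu, nu) SPD velocity matrix, B: (np_, nu) divergence matrix,
    S: (np_, np_) symmetric positive semidefinite pressure stabilization.
    The -S block restores solvability for inf-sup-deficient pairs such as
    P1-P0 while keeping the overall system symmetric (indefinite), which
    is why existing symmetric solvers can be reused.
    """
    nu = A.shape[0]
    K = np.block([[A, B.T], [B, -S]])
    x = np.linalg.solve(K, np.concatenate([f, g]))
    return x[:nu], x[nu:]   # velocity dofs, pressure dofs
```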


Accelerating the analyses of 3-way and 4-way PARAFAC models utilizing multi-dimensional wavelet compression

JOURNAL OF CHEMOMETRICS, Issue 11-12 2005
Jeff Cramer
Abstract Parallel factor analysis (PARAFAC) is one of the most popular methods for evaluating multi-way data sets, such as those typically acquired by hyphenated measurement techniques. One reason for PARAFAC's popularity is its ability to extract directly interpretable chemometric models with little a priori information, together with its capability to handle unknown interferents and missing values. However, PARAFAC requires long computation times that often prohibit sufficiently fast analyses for applications such as online sensing. An additional challenge faced by PARAFAC users is the handling and storage of very large, high-dimensional data sets. Accelerating computations and reducing storage requirements in multi-way analyses are the topics of this manuscript. This study introduces a data pre-processing method based on multi-dimensional wavelet transforms (WTs), which enables highly efficient data compression applied prior to data evaluation. Because multi-dimensional WTs are linear, the intrinsic underlying linear data construction is preserved in the wavelet domain. In almost all studied examples, computation times for analyzing the much smaller, compressed data sets could be reduced so much that the additional effort for wavelet compression was more than offset. For 3-way and 4-way synthetic and experimental data sets, acceleration factors of up to 50 were achieved; these data sets could be compressed down to a few per cent of their original size. Despite the high compression, accurate and interpretable models were derived, which are in good agreement with conventionally determined PARAFAC models. This study also found that the wavelet type used for compression is an important factor determining acceleration factors, data compression ratios, and model quality. Copyright © 2006 John Wiley & Sons, Ltd. [source]
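A sketch of the pre-processing pipeline the abstract describes follows, using PyWavelets for the multi-dimensional transform and TensorLy for the trilinear fit. The library calls are real, but the pipeline details (one decomposition level, keeping only the all-approximation sub-band, the chosen wavelet and rank) are illustrative choices, not the authors' implementation.

```python
import numpy as np
import pywt
import tensorly as tl
from tensorly.decomposition import parafac

rng = np.random.default_rng(0)
X = rng.random((128, 64, 32))          # stand-in for a 3-way data set

# One level of a 3-D discrete wavelet transform; keeping only the 'aaa'
# (all-approximation) sub-band roughly halves every mode, i.e. close to
# 8x compression in 3-D (boundary handling adds a few coefficients).
coeffs = pywt.dwtn(X, wavelet="db4")
X_small = coeffs["aaa"]
print(X.shape, "->", X_small.shape)

# Fit the trilinear model on the compressed array; because the WT is
# linear, the trilinear structure survives in the wavelet domain.
weights, factors = parafac(tl.tensor(X_small), rank=3, n_iter_max=200)
```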


Directed Attention in Normal and High-Risk Pregnancy

JOURNAL OF OBSTETRIC, GYNECOLOGIC & NEONATAL NURSING, Issue 2 2006
Mary Ann Stark
Objective: To compare the ability to direct attention in women having a high-risk pregnancy with that of women having an uncomplicated pregnancy. Design: Descriptive comparative. Setting: A tertiary-care hospital. Participants: Women in the 3rd trimester of pregnancy receiving care from perinatologists were recruited for this study and comprised the high-risk group (n = 67). Women in their 3rd trimester with uncomplicated pregnancies who were enrolled in prenatal classes were the comparison group (n = 57). Main outcome measures: Objective measures of directed attention included digit span forward, digit span backward, Trailmaking A, and Trailmaking B. Subjective measures included the Attentional Function Index and Mental Effort in Tasks. Results: Women having a complicated pregnancy had significantly more difficulty directing attention on all measures than women having normal pregnancies. When all covariates were considered, women having a high-risk pregnancy had significantly more difficulty directing attention as measured by Trailmaking A, Trailmaking B, and Mental Effort in Tasks. Conclusions: Women having high-risk pregnancies may have more difficulty with activities that require directed attention than women having normal pregnancies. Learning new information and skills, problem solving, and planning may require additional effort for women having complicated pregnancies. JOGNN, 35, 241-249; 2006. DOI: 10.1111/J.1552-6909.2006.00035.x [source]


Viability for codifying and documenting architectural design decisions with tool support

JOURNAL OF SOFTWARE MAINTENANCE AND EVOLUTION: RESEARCH AND PRACTICE, Issue 2 2010
Rafael Capilla
Abstract Current software architecture practices focus on modeling and documenting the architecture of a software system by means of several architectural views. In practice, standard architecture documentation lacks an explicit description of the decisions made and their underlying rationale, which often leads to knowledge loss. This strongly affects maintenance activities, as additional effort must be spent to replay the decisions made and to understand the changes performed in the design. Hence, codifying this architectural knowledge is a challenging task that requires adequate tool support. In this research, we test the capabilities of the Architecture Design Decision Support System (ADDSS), a web-based tool for supporting the creation, maintenance, use, and documentation of architectural design decisions (ADDs) together with their architectures. We used ADDSS to codify architectural knowledge and to maintain trace links between the design decisions and other software artefacts that help in maintenance operations. We illustrate the usage of the tool through four different experiences and discuss the potential benefits of using this architectural knowledge and its impact on maintenance and evolution activities. Copyright © 2009 John Wiley & Sons, Ltd. [source]
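What a codified design decision with trace links might look like as data is sketched below; the field names and statuses are hypothetical illustrations, not the ADDSS schema.

```python
from dataclasses import dataclass, field

@dataclass
class DesignDecision:
    identifier: str
    description: str
    rationale: str                    # the "why" that view-based docs omit
    status: str = "approved"          # e.g. proposed / approved / obsolete
    supersedes: list[str] = field(default_factory=list)  # decision evolution
    traces_to: list[str] = field(default_factory=list)   # linked artefacts

# Hypothetical decision with trace links to other software artefacts,
# the kind of record that supports replaying decisions during maintenance.
d = DesignDecision(
    identifier="ADD-017",
    description="Use a message queue between ingestion and processing.",
    rationale="Decouples services so load spikes do not cascade.",
    traces_to=["component:ingest-service", "doc:deployment-view"],
)
```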


The regulatory and business roles of a Study Director

QUALITY ASSURANCE JOURNAL, Issue 4 2005
Celeste A. Rose
Abstract The role of Study Director can be a challenging one, with technical, administrative, and compliance responsibilities. The Study Director often does not have direct responsibility for the personnel and other resources required for his/her studies. In this article, we have attempted to identify traits and mechanisms that help the Study Director do his/her job in a professional and compliant manner. In addition to regulatory and scientific expertise, a Study Director's quality attributes include appropriate soft skills and character traits, which are key to the success of a study. Study Directors must have an aptitude for effective communication, relationship building, training, mentoring, and delegation. While these soft skills and desirable traits often require additional effort on the part of the Study Director, they have a large impact on the rate of success, efficiency, and compliance of the study overall. The workshop presented by the authors took a 'hands-on' approach, building on the creativity and experience of the Study Directors, supervisors, managers, and Quality Assurance (QA) personnel who participated. The participants deliberated problem scenarios from a Study Director's perspective. The tables and discussion in this article summarize the compliant solutions that arose from the participants' experience and resourcefulness. Copyright © 2005 John Wiley & Sons, Ltd. [source]


The influence of self-citation corrections and the fractionalised counting of multi-authored manuscripts on the Hirsch index

ANNALEN DER PHYSIK, Issue 9 2009
M. Schreiber
Abstract The Hirsch index or h-index is widely used to quantify the impact of an individual's scientific research output, determining the highest number h of a scientist's papers that have each received at least h citations. Fractionalised counting of the publications is an appropriate way to distribute the impact of a multi-authored manuscript among all of its coauthors in an easy way, leading to a simple modification hm of the h-index. On the other hand, excluding self-citations allows one to sharpen the index, which is appropriate because self-citations usually do not reflect the significance of a publication. I have previously analysed the citation records of 26 physicists, discussing the sharpened index hs as well as the modification hm of the original h-index. In the present investigation I combine these two procedures, yielding the modified sharpened index hms. For a better comparison, interpolated indices are utilized. The correlations between the indices are rather strong, but nevertheless the positions of some datasets change, in a few cases significantly, depending on whether they are ordered according to the values of h, hm, hs, or hms. This leads to the conclusion that the additional effort of determining the modified sharpened index hms is worthwhile, in order to obtain a fairer evaluation of the citation records. [source]
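The four indices compared here can be computed from per-paper citation, self-citation, and author counts. The sketch below follows the effective-rank construction for the fractionalised hm; the input numbers are hypothetical, and the interpolation refinement used in the paper is omitted.

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations."""
    cites = sorted(citations, reverse=True)
    return sum(1 for rank, c in enumerate(cites, start=1) if c >= rank)

def h_m_index(citations, n_authors):
    """Fractionalised hm: each paper advances the rank by only 1/#authors."""
    papers = sorted(zip(citations, n_authors), reverse=True)
    eff_rank, h_m = 0.0, 0.0
    for c, a in papers:
        eff_rank += 1.0 / a
        if c >= eff_rank:       # citations still cover the effective rank
            h_m = eff_rank
    return h_m

# Hypothetical record: citations, self-citations, and author counts per paper.
cites = [25, 18, 12, 9, 4, 3]
selfcites = [5, 2, 4, 1, 0, 1]
authors = [1, 3, 2, 4, 2, 5]

h = h_index(cites)                                           # original h
h_s = h_index([c - s for c, s in zip(cites, selfcites)])     # sharpened hs
h_m = h_m_index(cites, authors)                              # fractionalised hm
h_ms = h_m_index([c - s for c, s in zip(cites, selfcites)], authors)  # hms
print(h, h_s, round(h_m, 2), round(h_ms, 2))
```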


Neural tube defect rates before and after food fortification with folic acid

BIRTH DEFECTS RESEARCH, Issue 11 2004
James L. Mills
Abstract BACKGROUND: Since 1998, enriched cereal grains sold in the United States have been fortified with folic acid to reduce the incidence of neural tube defects (NTDs). The Centers for Disease Control and Prevention (CDC) recently reported that NTD rates have decreased 26% since fortification, but that additional effort is needed to achieve the national goal of a 50% reduction. However, accurate determination of NTD rates requires counting antenatally detected cases; the CDC study noted that the number of prenatally diagnosed cases was likely underestimated. METHODS AND RESULTS: We examined studies from the United States and Canada that compared rates of NTDs before and after very similar fortification programs were instituted in each country. The U.S. studies had incomplete ascertainment of prenatally diagnosed NTD cases and, as a result, underreported the number of NTDs prevented. The Canadian studies, in which ascertainment was more complete, showed decreases in NTD rates of up to 54%. CONCLUSIONS: There is a strong correlation between the completeness of ascertainment and the percentage decrease in NTD rates. The studies with the most complete case ascertainment show that folic acid fortification is preventing around 50% of NTDs. The percentage of NTDs that are folate-preventable in the United States is uncertain, but is probably 50-60%. Thus, we may be quite close to achieving the optimum level of protection at current fortification levels. Birth Defects Research (Part A), 2004. Published 2004 Wiley-Liss, Inc. [source]


Solving crystal structures in P1: an automated procedure for finding an allowed origin in the correct space group

JOURNAL OF APPLIED CRYSTALLOGRAPHY, Issue 2 2000
Maria Cristina Burla
Crystal structure solution in P1 may be particularly suitable for complex crystal structures that crystallize in other space groups. However, additional effort and human intervention are often necessary to correctly locate the structural model so obtained with respect to an allowed origin of the actual space group. An automated procedure is described that is able to perform this task, allowing routine passage to the correct space group and subsequent structure refinement. Some tests are presented that prove the effectiveness of the procedure. [source]