Software Packages (software + packages)


Selected Abstracts


Researching Internet-Based Populations: Advantages and Disadvantages of Online Survey Research, Online Questionnaire Authoring Software Packages, and Web Survey Services

JOURNAL OF COMPUTER-MEDIATED COMMUNICATION, Issue 3 2005
Kevin B. Wright
This article examines the advantages and disadvantages of conducting online survey research. It reviews the current features, pricing, issues, and limitations of online questionnaire authoring software packages and of the services offered by web survey businesses to facilitate the online survey process. The review shows that current online survey products and services vary considerably in available features, consumer costs, and limitations. It concludes that online survey researchers should carefully assess their research goals, timeline, and financial situation before choosing a specific product or service. [source]


Software packages for everything

INTERNATIONAL JOURNAL OF NONPROFIT & VOLUNTARY SECTOR MARKETING, Issue 4 2001
Article first published online: 12 JUL 2001, Peter Flory
This paper details the ways in which voluntary sector organisations can benefit from software packages designed for the commercial sector. The administrative activities of nonprofit and voluntary sector managers revolve around money, and the paper demonstrates how income generation, administration and service provision can be managed more efficiently with existing software. Off-the-shelf systems can often be adapted by little more than a change of terminology. The paper concludes that, although the costs are high, they are more than offset by the potential benefits. Copyright © 2001 Henry Stewart Publications [source]


Animated instructional software for mechanics of materials: Implementation and assessment

COMPUTER APPLICATIONS IN ENGINEERING EDUCATION, Issue 1 2006
Timothy A. Philpot
Abstract During the past 3 years, the Basic Engineering Department at the University of Missouri-Rolla has been developing a second-generation suite of instructional software called MecMovies for the Mechanics of Materials course. The MecMovies suite consists of over 110 animated example problems, drill-and-practice games, and interactive exercises. Students generally respond favorably to software of this type; however, much of the data gathered to assess the effectiveness of similar software has been anecdotal. The method by which instructional software is incorporated into the engineering class is partly responsible for this lack of systematic evaluation. Often, software packages have been implemented in the classroom as supplemental material, recommended but not required. In the Fall 2003 semester, MecMovies was integrated thoroughly into the course assignments for one of the six UMR Mechanics of Materials sections. Four professors were involved in the study, and student performance in the experimental MecMovies section was compared with performance in the five control sections through a common final exam. At the end of the semester, students who used the MecMovies software also completed a survey questionnaire consisting of a number of subjective rating items. This paper presents a comparison of student performance in the experimental and control sections along with a discussion of student qualitative ratings and comments. © 2006 Wiley Periodicals, Inc. Comput Appl Eng Educ 14: 31-43, 2006; Published online in Wiley InterScience (www.interscience.wiley.com); DOI 10.1002/cae.20065 [source]


A reference model for grid architectures and its validation

CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 11 2010
Wil van der Aalst
Abstract Computing and data-intensive applications in physics, medicine, biology, graphics, and business intelligence require large and distributed infrastructures to address the challenges of the present and the future. For example, process mining applications are faced with terabytes of event data and computationally expensive algorithms. Computer grids are increasingly being used to deal with such challenges. However, grid computing is often approached in an ad hoc and engineering-like manner. Despite the availability of many software packages for grid applications, a good conceptual model of the grid is missing. This paper provides a formal description of the grid in terms of a colored Petri net (CPN). This CPN can be seen as a reference model for grids, as it clarifies the basic concepts at the conceptual level. Moreover, the CPN allows for various kinds of analyses, ranging from verification to performance analysis. We validate our model with real-life experiments using a testbed grid architecture available in our group, and we show how the model can be used to estimate throughput times for scientific workflows. Copyright © 2009 John Wiley & Sons, Ltd. [source]


A study into the feasibility of using two parallel sparse direct solvers for the Helmholtz equation on Linux clusters

CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 7 2006
G. Z. M. Berglund
Abstract Two state-of-the-art parallel software packages for the direct solution of sparse linear systems based on LU-decomposition, MUMPS and SuperLU_DIST, have been tested as black-box solvers on problems derived from finite difference discretizations of the Helmholtz equation. The target architecture has been Linux clusters, for which no consistent set of tests of the algorithms implemented in these packages has been published. The investigation consists of a series of memory and time scalability checks and focuses on the applicability of the algorithms when processing very large sparse matrices on Linux cluster platforms. Special emphasis has been put on monitoring the behaviour of the packages when the equation systems must be solved for multiple right-hand sides, which is the case, for instance, when modelling a seismic survey. The outcome of the tests points to poor efficiency of the tested algorithms during application of the LU-factors in the solution phase on this type of architecture, where communication becomes the bottleneck. Copyright © 2005 John Wiley & Sons, Ltd. [source]
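
A concrete way to see why multiple right-hand sides stress the solution phase: the LU factorization is computed once, but a pair of triangular solves must be repeated per right-hand side (e.g. per seismic source position). Below is a minimal serial sketch using SciPy's SuperLU interface, not the parallel MUMPS/SuperLU_DIST setup benchmarked in the paper; the matrix is a toy 2D Helmholtz-like operator with arbitrary size and wavenumber.

```python
# Minimal sketch: factor a sparse Helmholtz-like matrix once, then reuse the
# LU factors for many right-hand sides (the usage pattern the study tests).
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

n = 200                       # grid points per side (toy size)
k = 0.5                       # wavenumber term (assumed)
I = sp.identity(n)
T = sp.diags([1.0, -2.0, 1.0], [-1, 0, 1], shape=(n, n))
A = (sp.kron(I, T) + sp.kron(T, I) + k**2 * sp.identity(n * n)).tocsc()

lu = spla.splu(A)             # factorization phase: done once
rng = np.random.default_rng(0)
for shot in range(10):        # solution phase: one triangular solve per RHS
    b = rng.random(n * n)     # e.g. one source position per seismic shot
    x = lu.solve(b)
print("last residual:", np.linalg.norm(A @ x - b))
```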


Soil-solution speciation of Cd as affected by soil characteristics in unpolluted and polluted soils

ENVIRONMENTAL TOXICOLOGY & CHEMISTRY, Issue 3 2005
Erik Meers
Abstract Total metal content by itself is an insufficient measure of actual environmental risk. Understanding the mobility of heavy metals in the soil and their speciation in the soil solution is of great importance for accurately assessing the environmental risks posed by these metals. In a first explorative study, the effects of general soil characteristics on Cd mobility were evaluated and expressed as empirical formulations. The most important factors influencing the mobility of Cd proved to be pH and total soil content. This may indicate that current legislation expressing the requirement for soil sanitation in Flanders (Belgium) as a function of total soil content, organic matter, and clay does not successfully reflect actual risks. Current legal frameworks focusing on total content, therefore, should be amended with criteria that are indicative of metal mobility and availability and are based on physicochemical soil properties. In addition, soil-solution speciation was performed using two independent software packages (Visual Minteq 2.23 and the Windermere Humic Aqueous model VI [WHAM VI]). The two programs were largely in agreement concerning Cd speciation in all 29 soils under study. Depending on soil type, the free ion and the organically complexed forms were the most abundant species. Additional inorganic soluble species were sulfates and chlorides. Minor species in solution were nitrates, hydroxides, and carbonates, whose relative importance was deemed insignificant in comparison with the four major species. [source]
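
An empirical formulation of the kind the abstract mentions is typically a regression of (log-transformed) soluble Cd on soil properties such as pH and total content. The sketch below shows the general form on synthetic data; the variables, coefficients and noise are illustrative assumptions, not the paper's fitted values.

```python
# Hedged sketch: regress log soluble Cd on pH and log total soil Cd,
# mimicking the form of an empirical mobility formulation (synthetic data).
import numpy as np

rng = np.random.default_rng(1)
n = 29                                    # the study covered 29 soils
pH = rng.uniform(4.0, 7.5, n)             # synthetic soil data
log_cd_total = rng.uniform(-1.0, 2.0, n)  # log10 total Cd (mg/kg), assumed
log_cd_sol = 1.2 - 0.5 * pH + 0.9 * log_cd_total + rng.normal(0, 0.2, n)

X = np.column_stack([np.ones(n), pH, log_cd_total])
coef, *_ = np.linalg.lstsq(X, log_cd_sol, rcond=None)
print("intercept, pH slope, log(total Cd) slope:", coef.round(2))
```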


Methodology for Thermomechanical Simulation and Validation of Mechanical Weld-Seam Properties

ADVANCED ENGINEERING MATERIALS, Issue 3 2010
Wolfgang Bleck
A simulation and validation of the mechanical properties of submerged-arc-weld seams is presented, which combines numerical simulation of the thermal cycle in the weld using the SimWeld software with an annealing and testing procedure. The weld-seam geometry and the thermal profile near the weld seam can be computed from the simulation of an equivalent heat source describing the energy input and its distribution in the weld seam. Defined temperature-time cycles are imposed on tensile specimens, allowing annealing experiments with fast cooling rates. This enables the direct evaluation of welded structures and the simple generation of input data for mechanical simulations in FE software packages. [source]


Computational Methods for the Development of Polymeric Biomaterials

ADVANCED ENGINEERING MATERIALS, Issue 1-2 2010
Aurora D. Costache
This review focuses on polymeric biomaterials and provides a selective overview of the computational modeling approaches used to predict their properties and biological responses. Also, a short overview of existing databases and software packages for the biomaterials field is presented. The review summarizes the research in this area since the year 2000. [source]


Sequential methods and group sequential designs for comparative clinical trials

FUNDAMENTAL & CLINICAL PHARMACOLOGY, Issue 5 2003
Véronique Sébille
Abstract Comparative clinical trials are performed to assess whether a new treatment has superior efficacy to a placebo or a standard treatment (one-sided formulation) or whether two active treatments differ in efficacy (two-sided formulation) in a given population. The reference approach is the single-stage design, in which the statistical test is performed after inclusion and evaluation of a predetermined sample size. In practice, the single-stage design is sometimes difficult to implement because of ethical concerns and/or economic reasons. Thus, specific early termination procedures have been developed that allow repeated statistical analyses of accumulating data and stop the trial as soon as the information is sufficient to conclude. Two main approaches can be used. The first is derived from strictly sequential methods and includes the sequential probability ratio test and the triangular test. The second is derived from group sequential designs and includes the Peto, Pocock, and O'Brien and Fleming methods, α and β spending functions, and one-parameter boundaries. We review all these methods, describe the bases on which they rely, and present their statistical properties. We also compare these methods and comment on their advantages and drawbacks. We present software packages available for the planning, monitoring and analysis of comparative clinical trials with these methods and discuss the practical problems encountered when using them. The latest versions of all these methods can offer substantial sample size reductions compared with the single-stage design, not only in the case of clear efficacy but also in the case of complete lack of efficacy of the new treatment. The software packages make their use quite simple. However, it must be stressed that using these methods requires efficient logistics with real-time data monitoring and, apart from survival studies or long-term clinical trials with censored endpoints, is most appropriate when the endpoint is obtained quickly relative to the recruitment rate. [source]
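
To illustrate the spending-function idea, the sketch below evaluates the Lan-DeMets α-spending functions that approximate the O'Brien and Fleming and Pocock boundaries at four equally spaced interim analyses. Turning the spent increments into actual stopping boundaries requires the joint distribution of the correlated interim statistics, which dedicated packages compute and this sketch omits.

```python
# Hedged sketch of two classical alpha-spending functions (Lan-DeMets forms).
import numpy as np
from scipy.stats import norm

alpha = 0.05
t = np.array([0.25, 0.5, 0.75, 1.0])   # information fractions at each look

obf = 2 * (1 - norm.cdf(norm.ppf(1 - alpha / 2) / np.sqrt(t)))  # O'Brien-Fleming type
pocock = alpha * np.log(1 + (np.e - 1) * t)                     # Pocock type

for ti, a1, a2 in zip(t, obf, pocock):
    print(f"t={ti:.2f}  cumulative alpha spent: OBF={a1:.4f}  Pocock={a2:.4f}")
```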


Relative importance of evaluation criteria for enterprise systems: a conjoint study

INFORMATION SYSTEMS JOURNAL, Issue 3 2006
Mark Keil
Abstract While a large body of research exists on the development and implementation of software, organizations are increasingly acquiring enterprise software packages [e.g. enterprise resource planning (ERP) systems] instead of custom developing their own software applications. To be competitive in the marketplace, software package development firms must manage the three-pronged trade-off between cost, quality and functionality. Surprisingly, prior research has made little attempt to investigate the characteristics of packaged software that influence management information system (MIS) managers' likelihood of recommending purchase. As a result, both the criteria by which MIS managers evaluate prospective packaged systems and the attributes that lead to commercially competitive ERP software products are poorly understood. This paper examines this understudied issue through a conjoint study. We focus on ERP systems, which are among the largest and most complex packaged systems purchased by organizations. In the conjoint study, 1008 evaluation decisions based on hypothetical ERP software package profiles were completed by managers in 126 organizations. The study represents the first empirical investigation of the relative importance that managers ascribe to various factors believed to be important in evaluating packaged software. The results provide important insights both for organizations that acquire such systems and for those that develop them. They show that functionality, reliability, cost, ease of use and ease of customization are judged to be important criteria, while ease of implementation and vendor reputation were not found to be significant. Functionality and reliability were the most heavily weighted factors. We conclude the paper with a detailed discussion of the results and their implications for software acquisition and development practice. [source]
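
The mechanics of a conjoint study can be sketched compactly: each respondent rates hypothetical package profiles, part-worths are estimated by regressing the ratings on dummy-coded attribute levels, and relative importance follows from the part-worth ranges. The attribute names below follow the abstract; the profiles, ratings and two-level coding are synthetic assumptions.

```python
# Hedged sketch: recover attribute importance weights from ratings of
# hypothetical ERP package profiles (synthetic data, one respondent).
import numpy as np

rng = np.random.default_rng(0)
attrs = ["functionality", "reliability", "cost", "ease_of_use", "customization"]
X = rng.integers(0, 2, size=(16, len(attrs)))   # 16 profiles, low/high levels
true_w = np.array([0.9, 0.8, 0.5, 0.4, 0.3])    # assumed part-worths
y = X @ true_w + rng.normal(0, 0.1, 16)         # the respondent's ratings

Xd = np.column_stack([np.ones(16), X])
coef, *_ = np.linalg.lstsq(Xd, y, rcond=None)
ranges = np.abs(coef[1:])                       # part-worth range per attribute
importance = 100 * ranges / ranges.sum()
for a, w in zip(attrs, importance):
    print(f"{a:>15}: {w:.1f}%")
```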


Factors associated with constructive staff-family relationships in the care of older adults in the institutional setting

INTERNATIONAL JOURNAL OF EVIDENCE BASED HEALTHCARE, Issue 4 2006
Emily Haesler BN PGradDipAdvNsg
Abstract Background: Modern healthcare philosophy espouses the virtues of holistic care and acknowledges that family involvement is appropriate and something to be encouraged due to the role it plays in physical and emotional healing. In the aged care sector, the involvement of families is a strong guarantee of a resident's well-being. The important role family plays in the support and care of the older adult in the residential aged care environment has been enshrined in the Australian Commonwealth Charter of Residents' Rights and Responsibilities and the Aged Care Standards of Practice. Despite wide acknowledgement of the importance of family involvement in the healthcare of the older adult, many barriers to the implementation of participatory family care have been identified in past research. For older adults in the healthcare environment to benefit from the involvement of their family members, healthcare professionals need an understanding of the issues surrounding family presence in the healthcare environment and the strategies to best support it. Objectives: The objectives of the systematic review were to present the best available evidence on the strategies, practices and organisational characteristics that promote constructive staff-family relationships in the care of older adults in the healthcare setting. Specifically, this review sought to investigate how staff and family members perceive their relationships with each other; staff characteristics that promote constructive relationships with the family; and interventions that support staff-family relationships. Search strategy: A literature search was performed using the following databases for the years 1990-2005: Ageline, APAIS Health, Australian Family and Society Abstracts (FAMILY), CINAHL, Cochrane Library, Dare, Dissertation Abstracts, Embase, MEDLINE, PsycINFO and Social Science Index. Personal communication from expert panel members was also used to identify studies for inclusion. A second search stage was conducted through review of the reference lists of studies retrieved during the first search stage. The search was limited to published and unpublished material in the English language. Selection criteria: The review was limited to studies involving residents and patients aged over 65 years within acute, subacute, rehabilitation and residential settings, their family and healthcare staff. Papers addressing family members' and healthcare staff's perceptions of their relationships with each other were considered for this review. Studies in this review also included those relating to interventions to promote constructive staff-family relationships, including organisational strategies, staff-family meetings, case conferencing and environmental approaches. The review considered both quantitative and qualitative research and opinion papers for inclusion. Data collection and analysis: All retrieved papers were critically appraised for eligibility for inclusion and methodological quality independently by two reviewers, and the same reviewers collected details of eligible research. Appraisal forms and data extraction forms designed by the Joanna Briggs Institute as part of the QARI and NOTARI systematic review software packages were used for this review. Findings: Family members' perceptions of their relationships with staff showed that a strong focus was placed on opportunities for the family to be involved in the patient's care. Staff members also expressed theoretical support for the collaborative process; however, this belief often did not translate into the staff members' clinical practice. In the studies included in the review, staff were frequently found to rely on traditional medical models of care in their clinical practice and to maintain control over the environment, rather than fully collaborating with families. Four factors were found to be essential to interventions designed to support a collaborative partnership between family members and healthcare staff: communication, information, education and administrative support. Based on the evidence analysed in this systematic review, staff and family education on relationship development, power and control issues, communication skills and negotiating techniques is essential to promoting constructive staff-family relationships. Managerial support, such as addressing workloads and staffing issues, introducing care models focused on collaboration with families, and providing practical support for staff education, is essential to gaining sustained benefits from interventions designed to promote constructive family-staff relationships. [source]


Review article: Basic steps in adapting response surface methodology as mathematical modelling for bioprocess optimisation in the food systems

INTERNATIONAL JOURNAL OF FOOD SCIENCE & TECHNOLOGY, Issue 9 2010
Titus U. Nwabueze
Summary Choosing process combinations for optimisation without due consideration of relevant experimental designs is scientifically unreliable and irreproducible. Mathematical modelling, of which response surface methodology (RSM) is one form, provides a precise map leading to successful optimisation. This paper identifies key process variables, model building and solution searching through multivariate regression analysis, and interpretation of the resulting polynomial equations and response surface/contour plots as the basic steps in adapting the central composite design to achieve process optimisation. It also gives guidance on appropriate RSM software packages, on the choice of model order in RSM, and on data economy: reducing the factorial experiments from a large number of parameter combinations to far fewer runs without losing any information, including quadratic and interaction (if present) effects. It is expected that this paper will afford many food scientists and researchers the opportunity to adapt RSM as a mathematical model for achieving bioprocess optimisation in food systems. [source]
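
The basic steps the review names (choose a design, run it, regress, interpret) can be illustrated in a few lines: build a two-factor rotatable central composite design and fit the second-order polynomial by multivariate regression. The response function and noise below are invented for the example.

```python
# Hedged sketch of an RSM cycle: central composite design + quadratic fit.
import itertools
import numpy as np

axial = np.sqrt(2)                           # rotatable axial distance, 2 factors
factorial = np.array(list(itertools.product([-1, 1], repeat=2)))
axials = np.array([[axial, 0], [-axial, 0], [0, axial], [0, -axial]])
centers = np.zeros((5, 2))                   # replicated centre points
X = np.vstack([factorial, axials, centers])  # coded design matrix

def response(x1, x2):                        # stand-in for the measured response
    return 60 + 5*x1 + 3*x2 - 4*x1**2 - 2*x2**2 + 1.5*x1*x2

y = response(X[:, 0], X[:, 1]) + np.random.default_rng(2).normal(0, 0.5, len(X))

# second-order model: b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
M = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                     X[:, 0]**2, X[:, 1]**2, X[:, 0] * X[:, 1]])
b, *_ = np.linalg.lstsq(M, y, rcond=None)
print("fitted coefficients:", b.round(2))
```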


Inconsistencies between reported test statistics and p-values in two psychiatry journals

INTERNATIONAL JOURNAL OF METHODS IN PSYCHIATRIC RESEARCH, Issue 4 2007
David Berle
Abstract A recent survey of the British Medical Journal (BMJ) and Nature revealed that inconsistencies in reported statistics were common. We sought to replicate that survey in the psychiatry literature. We checked the consistency of reported t-test, F-test and χ2-test values with their corresponding p-values in the 2005 issues of the Australian and New Zealand Journal of Psychiatry (ANZJP) and compared this with the 2000 issues of the ANZJP and with a similar journal, Acta Psychiatrica Scandinavica (APS). A reported p-value was 'inconsistent' if it differed (at its reported number of decimal places) from our calculated p-values (using three different software packages), which we based on the reported test statistic and degrees of freedom. Of the 546 results that we checked, 78 (14.3%) of the p-values were inconsistent with the corresponding degrees of freedom and test statistic. Similar rates of inconsistency were found in APS and ANZJP, and when comparing the ANZJP between 2000 and 2005. The percentages of articles with at least one inconsistency were 8.5% for ANZJP 2005, 9.9% for ANZJP 2000 and 12.1% for APS. We conclude that inconsistencies in p-values are common and may reflect errors of analysis and rounding, typographic errors or typesetting errors. Suggestions for reducing the occurrence of such inconsistencies are provided. Copyright © 2008 John Wiley & Sons, Ltd. [source]
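
The check itself is easy to reproduce: recompute the p-value from the reported statistic and degrees of freedom, then compare at the reported precision. A minimal sketch with SciPy (the survey used three packages; this shows the idea with one, and assumes two-tailed t-tests):

```python
# Hedged sketch of a p-value consistency check from reported statistics.
from scipy import stats

def check_p(test, stat, df, reported_p, decimals=3):
    if test == "t":
        p = 2 * stats.t.sf(abs(stat), df)   # two-tailed t-test assumed
    elif test == "F":
        p = stats.f.sf(stat, *df)           # df = (df1, df2)
    elif test == "chi2":
        p = stats.chi2.sf(stat, df)
    else:
        raise ValueError(f"unknown test: {test}")
    return round(p, decimals) == round(reported_p, decimals), p

ok, p = check_p("t", 2.10, 28, reported_p=0.045)
print("consistent:", ok, " recomputed p =", round(p, 4))
```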


New approach for the analysis and design of negative-resistance oscillators: Application to a quasi-MMIC VCO

INTERNATIONAL JOURNAL OF RF AND MICROWAVE COMPUTER-AIDED ENGINEERING, Issue 4 2006
Jeffrey Chuan
Abstract This article proposes a new approach for the analysis and design of negative-resistance oscillators using computer-aided engineering tools. The method presented does not require any special probe and makes oscillator design similar to the methodology applied to amplifiers. It speeds up convergence and avoids uncertainties in the solution. The negative-resistance oscillator is split into two parts: an active-amplifying part and a resonator part. A chain is constructed by linking both parts and repeating them several times, a procedure known as repeated circuit simulation. This allows the signal flowing between the two parts to be separated. Small-signal AC-sweep and harmonic-balance techniques, both available in several commercial software packages, are applied. The method is theoretically justified and converges in fewer iterations. Furthermore, it is more robust than standard harmonic-balance probes in the case of multiple frequencies of oscillation. It has been demonstrated in the design of a quasi-MMIC VCO with an external resonator circuit (coaxial resonator and varactor) and a MMIC negative-resistance circuit, manufactured using ED02AH p-HEMT technology (OMMIC). © 2006 Wiley Periodicals, Inc. Int J RF and Microwave CAE, 2006. [source]


Reliable computing in estimation of variance components

JOURNAL OF ANIMAL BREEDING AND GENETICS, Issue 6 2008
I. Misztal
Summary The purpose of this study is to present guidelines for the selection of statistical and computing algorithms for variance components estimation when computing involves software packages. Two major methods are considered: residual maximum likelihood (REML) and Bayesian estimation via Gibbs sampling. Expectation-Maximization (EM) REML is regarded as a very stable algorithm that is able to converge when covariance matrices are close to singular, but it is slow, and convergence problems can occur with random regression models, especially if the starting values are much lower than those at convergence. Average Information (AI) REML is much faster for common problems, but it relies on heuristics for convergence and may be very slow or even diverge for complex models. REML algorithms for general models become unstable with a larger number of traits. REML by canonical transformation is stable in such cases but supports only a limited class of models. In general, REML algorithms are difficult to program. Bayesian methods via Gibbs sampling are much easier to program than REML, especially for complex models, and they can support much larger datasets; however, the termination criterion can be hard to determine, and the quality of estimates depends on a number of details. Computing speed varies with computing optimizations, with which some large data sets and complex models can be supported in a reasonable time; however, optimizations increase the complexity of programming and restrict the types of models applicable. Several examples from past research are discussed to illustrate the fact that different problems require different methods. [source]
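
To give a flavour of the Gibbs-sampling route the summary describes, the sketch below estimates the two variance components of a one-way random-effects model, far simpler than the animal and random regression models at issue, using standard normal and scaled inverse chi-square full conditionals under vague priors.

```python
# Hedged sketch: Gibbs sampler for y_ij = mu + u_i + e_ij with
# u_i ~ N(0, s2u) and e_ij ~ N(0, s2e), vague priors (illustration only).
import numpy as np

rng = np.random.default_rng(3)
q, per = 50, 10                               # 50 groups, 10 records each
u_true = rng.normal(0, np.sqrt(2.0), q)
g = np.repeat(np.arange(q), per)
y = 10 + u_true[g] + rng.normal(0, np.sqrt(5.0), q * per)

mu, s2u, s2e = y.mean(), 1.0, 1.0
draws = []
for it in range(2000):
    # group effects u_i | rest: normal full conditional
    sums = np.bincount(g, weights=y - mu, minlength=q)
    v = 1.0 / (per / s2e + 1.0 / s2u)
    u = rng.normal(sums / s2e * v, np.sqrt(v))
    # overall mean mu | rest
    r = y - u[g]
    mu = rng.normal(r.mean(), np.sqrt(s2e / y.size))
    # variances | rest: scaled inverse chi-square full conditionals
    s2u = (u @ u) / rng.chisquare(q - 2)
    e = y - mu - u[g]
    s2e = (e @ e) / rng.chisquare(y.size - 2)
    if it >= 500:                             # discard burn-in
        draws.append((s2u, s2e))
print("posterior means (s2u, s2e):", np.mean(draws, axis=0).round(2))
```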


Multiparameter models for performance analysis of UASB reactors

JOURNAL OF CHEMICAL TECHNOLOGY & BIOTECHNOLOGY, Issue 8 2008
C M Narayanan
Abstract BACKGROUND: UASB (upflow anaerobic sludge blanket) bioreactors have the distinct advantage that they do not demand support particles and provide a high rate of bioconversion even with high-strength feedstocks. Although the reactors are apparently simple in construction, their performance analysis involves a high degree of mathematical complexity. Most simulation models reported in the literature are rudimentary in nature, as they involve gross approximations. In the present paper, two multiparameter simulation packages are presented that make no simplifying assumptions and hence are more rigorous. RESULTS: The first package assumes the sludge bed to be a plug-flow reactor (PFR) and the sludge blanket to be an ideal continuous stirred tank reactor (CSTR). The second package equates the reactor to a plug flow dispersion reactor (PFDR), the axial dispersion coefficient being a function of axial distance. The three-phase nature of the sludge blanket has been considered, and the variation of gas velocity in the axial direction has been taken into account. Three different kinetic equations have been considered. Resistance to diffusion of substrate into sludge granules has been accounted for by incorporating appropriately defined effectiveness factors. The applicability of the simulation packages developed has been ascertained by comparison with real-life data collected from industrial/pilot plant/laboratory UASB reactors. The maximum deviation observed is ±15%. CONCLUSIONS: Although the software packages developed have a high computational load, their applicability has been successfully ascertained, and they may be recommended for the design and installation of industrial UASB reactors and for the rating of existing installations. Copyright © 2008 Society of Chemical Industry [source]
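
The first package's idealisation can be miniaturised as a Monod-kinetics plug-flow section (sludge bed) feeding a steady-state CSTR (sludge blanket). In the sketch below the kinetic parameters and residence times are assumed for illustration, and the three-phase behaviour, axial dispersion and diffusion resistance that the real packages model are omitted.

```python
# Hedged sketch: PFR (sludge bed) + CSTR (sludge blanket) with Monod kinetics.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import brentq

mu_max, Ks, X_bio, Y = 0.25, 0.5, 5.0, 0.1    # assumed kinetic parameters

def rate(S):                                  # substrate uptake, g COD/(L·d)
    return mu_max * X_bio * S / (Y * (Ks + S))

S_in, tau_pfr, tau_cstr = 10.0, 0.1, 0.2      # feed (g/L), residence times (d)

# PFR section: dS/dtau = -r(S) over the bed residence time
sol = solve_ivp(lambda t, S: [-rate(S[0])], [0, tau_pfr], [S_in])
S_bed = sol.y[0, -1]

# CSTR section: steady-state balance S_bed - S_out = tau * r(S_out)
S_out = brentq(lambda S: S_bed - S - tau_cstr * rate(S), 1e-9, S_bed)
print(f"substrate: feed {S_in:.2f} -> bed exit {S_bed:.2f} -> effluent {S_out:.2f} g/L")
```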


Some common misunderstandings in chemometrics

JOURNAL OF CHEMOMETRICS, Issue 7-8 2010
Karin Kjeldahl
Abstract This paper describes a number of issues and tools in practical chemometric data analysis that are often either misunderstood or misused. The choice of relevant samples and variables, the (mis)use of common model diagnostics, and interpretational issues are addressed in relation to component models such as PCA and PLS. Along with simple misunderstandings, chemometric software packages may contribute to the mistakes if not used critically, and a main conclusion is thus that good data analysis practice requires the analyst to take responsibility and do what is relevant for the given purpose. Copyright © 2010 John Wiley & Sons, Ltd. [source]


Comparison study of multi-attribute decision analytic software

JOURNAL OF MULTI CRITERIA DECISION ANALYSIS, Issue 2-3 2005
Simon French
Abstract In this paper, we discuss the functionality and interfaces of five MCDM software packages. We recognize that no single package is appropriate for all decision contexts and processes. Thus our emphasis is not so much to compare the functionality of the packages per se, but to consider their fit with different decision-making processes. In doing so, we hope to provide potential users with guidance on selecting a package that is compatible with their needs. Moreover, we reflect on the further functionality that we believe should be developed and included in MCDM packages. Copyright © 2006 John Wiley & Sons, Ltd. [source]


The Pros and Cons of Data Analysis Software for Qualitative Research

JOURNAL OF NURSING SCHOLARSHIP, Issue 4 2000
Winsome St John
Purpose: To explore the use of computer-based qualitative data analysis software packages. Scope: The advantages and capabilities of qualitative data analysis software are described, and concerns about their effects on methods are discussed. Findings: Advantages of using qualitative data analysis software include being freed from manual and clerical tasks, saving time, being able to deal with large amounts of qualitative data, having increased flexibility, and having improved validity and auditability of qualitative research. Concerns include increasingly deterministic and rigid processes; privileging of coding and retrieval methods; reification of data; increased pressure on researchers to focus on volume and breadth rather than on depth and meaning; time and energy spent learning to use computer packages; increased commercialism; and distraction from the real work of analysis. Conclusions: We recommend that researchers consider the capabilities of the package, their own computer literacy and knowledge of the package (or the time required to gain these skills), and the suitability of the package for their research. The intelligence and integrity that a researcher brings to the research process must also be brought to the choice and use of tools and analytical processes. Researchers should be as critical of the methodological approaches to using qualitative data analysis software as they are about the fit between research question, methods, and research design. [source]


Inference for two-stage adaptive treatment strategies using mixture distributions

JOURNAL OF THE ROYAL STATISTICAL SOCIETY: SERIES C (APPLIED STATISTICS), Issue 1 2010
Abdus S. Wahed
Summary Treatment of complex diseases such as cancer, leukaemia, acquired immune deficiency syndrome and depression usually follows complex treatment regimes consisting of time-varying multiple courses of the same or different treatments. The goal is to achieve the largest overall benefit defined by a common end point such as survival. An adaptive treatment strategy refers to a sequence of treatments that are applied at different stages of therapy based on the individual's history of covariates and intermediate responses to earlier treatments. In many cases, however, treatment assignment depends only on intermediate response and prior treatments. Clinical trials are often designed to compare two or more adaptive treatment strategies. A common approach in these trials is sequential randomization. Patients are randomized on entry into available first-stage treatments and then, on the basis of the response to the initial treatments, are randomized to second-stage treatments, and so on. The analysis often ignores this feature of randomization and frequently conducts separate analyses for each stage. Recent literature has suggested several semiparametric and Bayesian methods for inference related to adaptive treatment strategies from sequentially randomized trials. We develop a parametric approach using mixture distributions to model the survival times under different adaptive treatment strategies. We show that the proposed estimators are asymptotically unbiased and can be easily implemented using existing routines in statistical software packages. [source]
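
The mixture idea can be sketched compactly: under a given strategy, overall survival is a mixture over response classes, S(t) = πS1(t) + (1 - π)S2(t). The sketch below fits an uncensored two-component exponential mixture by maximum likelihood; the paper's setting, with censoring and sequential randomization, is more general.

```python
# Hedged sketch: fit a two-component exponential mixture of survival times
# by maximum likelihood (uncensored data; illustration only).
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

rng = np.random.default_rng(4)
n, p_true, r1, r2 = 500, 0.4, 0.2, 1.0         # assumed mixing prob. and rates
resp = rng.random(n) < p_true                  # latent response indicator
t = np.where(resp, rng.exponential(1/r1, n), rng.exponential(1/r2, n))

def negloglik(theta):
    p = expit(theta[0])                        # keep p in (0, 1)
    l1, l2 = np.exp(theta[1:])                 # keep rates positive
    dens = p * l1 * np.exp(-l1 * t) + (1 - p) * l2 * np.exp(-l2 * t)
    return -np.log(dens).sum()

fit = minimize(negloglik, x0=[0.0, 0.0, 0.0], method="Nelder-Mead")
p_hat, (l1_hat, l2_hat) = expit(fit.x[0]), np.exp(fit.x[1:])
print(f"p = {p_hat:.2f}, rates = {l1_hat:.2f}, {l2_hat:.2f}")
```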


Pluralism and diversity: trends in the use and application of ordination methods 1990-2007

JOURNAL OF VEGETATION SCIENCE, Issue 4 2009
Henrik Von Wehrden
Abstract Question: What are the trends and patterns in the application of ordination techniques in vegetation science since 1990? Location: Worldwide literature analysis. Methods: Evaluation of five major journals of vegetation science; search of all ISI-listed ecological journals. Data were analysed with ANCOVAs, Spearman rank correlations, GLMs, biodiversity indices and simple graphs. Results: The ISI search retrieved fewer papers that used ordinations than the manual evaluation of five selected journals. Both retrieval methods revealed a clear trend in increasing frequency of ordination applications from 1990 to the present. Canonical Correspondence Analysis was far more frequently detected by the ISI search than any other method. Applications such as Correspondence Analysis/Reciprocal Averaging and Detrended Correspondence Analysis have increasingly been used in studies published in "applied" journals, while Canonical Correspondence Analysis, Redundancy Analysis and Non-Metric Multidimensional Scaling were more frequently used in journals focusing on more "basic" research. Overall, Detrended Correspondence Analysis was the most commonly applied method within the five major journals, although the number of publications slightly decreased over time. Use of Non-Metric Multidimensional Scaling has increased over the last 10 years. Conclusion: The availability of suitable software packages has facilitated the application of certain techniques such as Non-Metric Multidimensional Scaling. However, choices of ordination techniques are currently less driven by the constraints imposed by the software; there is also limited evidence that the choice of methods follows social considerations such as the need to use fashionable methods. Methodological diversity has been maintained or has even increased over time and reflects the researcher's need for diverse analytical tools suitable to address a wide range of questions. [source]


SEDVIEW, Real-time Sedimentation Analysis

MACROMOLECULAR BIOSCIENCE, Issue 7 2010
David B. Hayes
Abstract The ability to obtain a sedimentation coefficient distribution as the run proceeds, and thus to get an early idea of the quality of a particular sample, has not been available in real time in any of the existing software packages. It is desirable on many occasions to be able to see the number of components present in a sample at an early stage of the run. The ability to ascertain the extent of heterogeneity of a sample early would greatly reduce the time needed to obtain that information. Most software packages currently available require that the run be completed before analysis is carried out, or at least that some of the early scans be analyzed off-line to determine whether the run should continue. We have developed a software package called SEDVIEW that allows early analysis in real time. [source]


Analysis of characteristics of a U-slot patch antenna using the finite-difference time-domain method

MICROWAVE AND OPTICAL TECHNOLOGY LETTERS, Issue 9 2006
Hsing-Yi Chen
Abstract The finite-difference time-domain (FDTD) method is successfully used to analyze the characteristics of a U-slot patch antenna without using commercial software packages such as HFSS, IE3D, and CST. The method proves to be an efficient tool for in-depth studies of complicated patch antennas. Numerical results for return loss, radiation pattern, current distribution, and antenna efficiency are presented. The FDTD results are also compared with measurement data and shown to be in good agreement. © 2006 Wiley Periodicals, Inc. Microwave Opt Technol Lett 48: 1687-1694, 2006; Published online in Wiley InterScience (www.interscience.wiley.com). DOI 10.1002/mop.21804 [source]
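
At its core, FDTD is a leapfrog update of interleaved electric and magnetic fields. The 1D free-space sketch below shows that scheme in miniature; the article's analysis is of course a full 3D simulation of the U-slot patch geometry with feeds and boundaries.

```python
# Hedged sketch: 1D free-space FDTD leapfrog update with a Gaussian source.
import numpy as np

nz, nt = 400, 800
eps0, mu0 = 8.854e-12, 4e-7 * np.pi
c = 1.0 / np.sqrt(eps0 * mu0)
dz = 1e-3
dt = dz / (2 * c)                        # within the 1D Courant stability limit

Ex = np.zeros(nz)
Hy = np.zeros(nz - 1)
for n in range(nt):
    Hy += dt / (mu0 * dz) * np.diff(Ex)             # update H from curl of E
    Ex[1:-1] += dt / (eps0 * dz) * np.diff(Hy)      # update E from curl of H
    Ex[nz // 4] += np.exp(-((n - 60) / 20.0) ** 2)  # soft Gaussian source
print("peak |Ex| after propagation:", np.abs(Ex).max().round(3))
```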


A practical guide to methods of parentage analysis

MOLECULAR ECOLOGY RESOURCES, Issue 1 2010
ADAM G. JONES
Abstract The use of molecular techniques for parentage analysis has been a booming science for over a decade. The most important technological breakthrough was the introduction of microsatellite markers to molecular ecology, an advance that was accompanied by a proliferation and refinement of statistical techniques for the analysis of parentage data. Over the last several years, we have seen steady progress in a number of areas related to parentage analysis, and the prospects for successful studies continue to improve. Here, we provide an updated guide for scientists interested in embarking on parentage analysis in natural or artificial populations of organisms, with a particular focus on computer software packages that implement various methods of analysis. Our survey of the literature shows that there are a few established methods that perform extremely well in the analysis of most types of parentage studies. However, particular experimental designs or study systems can benefit from some of the less well-known computer packages available. Overall, we find that parentage analysis is feasible and satisfying in most systems, and we try to provide a simple roadmap to help other scientists navigate the confusing topography of statistical techniques. [source]


convert: A user-friendly program to reformat diploid genotypic data for commonly used population genetic software packages

MOLECULAR ECOLOGY RESOURCES, Issue 2 2004
Jeffrey C. Glaubitz
Abstract convert is a user-friendly, 32-bit Windows program that facilitates the ready transfer of codominant, diploid genotypic data amongst commonly used population genetic software packages. convert reads input files in its own 'standard' data format, easily produced from an Excel file of diploid, codominant marker data, and can convert these to the input formats of the following programs: gda, genepop, arlequin, popgene, microsat, phylip, and structure. convert can also read input files in genepop format. In addition, convert can produce a summary table of allele frequencies in which private alleles and the sample sizes at each locus are indicated. [source]
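
The kind of translation convert automates can be sketched in a few lines; here a toy diploid genotype table is written out in genepop format. The input layout is an assumption for illustration, not convert's actual 'standard' format.

```python
# Hedged sketch: write a toy diploid genotype table in GENEPOP format.
rows = [  # (population, individual, [(allele1, allele2) per locus])
    ("pop1", "ind1", [(1, 2), (3, 3)]),
    ("pop1", "ind2", [(2, 2), (1, 3)]),
    ("pop2", "ind3", [(1, 1), (2, 3)]),
]
loci = ["locus1", "locus2"]

lines = ["Toy genotypes exported for GENEPOP"] + loci
current_pop = None
for pop, ind, genotypes in rows:
    if pop != current_pop:          # GENEPOP separates populations with "Pop"
        lines.append("Pop")
        current_pop = pop
    alleles = " ".join(f"{a:02d}{b:02d}" for a, b in genotypes)
    lines.append(f"{ind} , {alleles}")
print("\n".join(lines))
```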


Comparative LC-MS: A landscape of peaks and valleys

PROTEINS: STRUCTURE, FUNCTION AND BIOINFORMATICS, Issue 4 2008
Antoine H. P. America Dr.
Abstract Quantitative proteomics approaches using stable isotopes are well known and used in many labs nowadays. More recently, high-resolution quantitative approaches have been reported that rely on LC-MS quantitation of peptide concentrations by comparing peak intensities between multiple runs obtained by continuous detection in MS mode. Characteristic of these comparative LC-MS procedures is that they do not rely on the use of stable isotopes; the procedure is therefore often referred to as label-free LC-MS. To compare peak intensity data at a comprehensive scale across multiple LC-MS datasets, dedicated software is required for the detection, matching and alignment of peaks. The high accuracy in the quantitative determination of peptide abundances provides an impressive level of detail. This approach also requires an experimental set-up in which quantitative aspects of protein extraction and reproducible separation conditions are well controlled. In this paper we provide insight into the critical parameters that affect the quality of the results and give an overview of the most recent software packages available for this procedure. [source]


Comparison of PDQuest and Progenesis software packages in the analysis of two-dimensional electrophoresis gels

PROTEINS: STRUCTURE, FUNCTION AND BIOINFORMATICS, Issue 10 2003
Arsi T. Rosengren
Abstract Efficient analysis of protein expression using two-dimensional electrophoresis (2-DE) data relies on automated image processing techniques. The overall success of this research depends critically on the accuracy and reliability of the analysis software. In addition, the software has a profound effect on the interpretation of the results obtained and on the amount of user intervention demanded during the analysis. The choice of the analysis software that best meets specific needs is therefore of interest to the research laboratory. In this paper we compare two advanced analysis software packages, PDQuest and Progenesis. Their evaluation is based on quantitative tests at three different levels of standard 2-DE analysis: spot detection, gel matching and spot quantitation. As test materials we use three gel sets previously used in a similar comparison of Z3 and Melanie, and three sets of gels from our own research. We observed that the quality of the test gels critically influences the spot detection and gel matching results. Both packages were sensitive to parameter and filter settings with respect to their tendency to find true-positive and false-positive spots. Quantitation results were very accurate for both analysis software packages. [source]


A Computer Implementation of the Separate Maintenance Model for Complex-system Reliability

QUALITY AND RELIABILITY ENGINEERING INTERNATIONAL, Issue 7 2006
M. Tortorella
Abstract Reliability modeling and quantitative reliability prediction for all but the simplest system architectures demand intensive computer support for the required numerical computations. Many commercial and academic reliability modeling software packages provide support for the Markov-chain state diagram system reliability model. Other system reliability models, such as those offering non-exponential life and/or repair time distributions, transient analysis, or other special handling, may sometimes be desirable, and users have fewer choices of software supporting these options. This paper describes SUPER, a software package developed at Bell Laboratories that provides computational support for the separate maintenance model as well as for some other useful system reliability descriptions. SUPER is an acronym for System Used for Prediction and Evaluation of Reliability. The paper also includes a brief tutorial to assist practitioners with system reliability model selection, a review of the models contained in SUPER and their theoretical bases, and a discussion of implementation issues. SUPER has been used in the telecommunications industry for over 15 years, and the paper includes an example from this experience. Copyright © 2005 John Wiley & Sons, Ltd. [source]
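
The Markov-chain state-diagram model that most packages support reduces, for steady-state questions, to solving πQ = 0 with Σπ = 1 for the generator matrix Q. The sketch below does this for a two-unit repairable system; the rates and the single-repair-crew assumption are invented for the example and are not from SUPER.

```python
# Hedged sketch: steady-state availability of a two-unit repairable system
# modelled as a continuous-time Markov chain (illustration only).
import numpy as np

lam, mu = 0.01, 0.5          # assumed failure and repair rates per hour
# states: 0 = both units up, 1 = one up, 2 = both down (one repair crew)
Q = np.array([
    [-2 * lam,      2 * lam,  0.0],
    [      mu, -(mu + lam),   lam],
    [     0.0,          mu,   -mu],
])
# solve pi Q = 0 subject to sum(pi) = 1 (least squares on the stacked system)
A = np.vstack([Q.T, np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print("steady-state availability (at least one unit up):", (pi[0] + pi[1]).round(6))
```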


A comparative study of the accuracy of several de novo sequencing software packages for datasets derived by matrix-assisted laser desorption/ionisation and electrospray

RAPID COMMUNICATIONS IN MASS SPECTROMETRY, Issue 21 2008
Scott Bringans
First page of article [source]