Future Improvements

Selected Abstracts

A methodology for simulating power system vulnerability to cascading failures in the steady state

Murali Kumbale
Abstract Simulations of power system conditions and event sequences leading to either local or widespread blackouts have acquired increasing importance following the wide-impact network collapses that have occurred over the past several years in North America and Europe. This paper summarizes an analytical framework that has been developed, implemented, and practically used by Southern Company to evaluate system vulnerability to cascading failures using the steady state model. This methodology was first used by Southern Company in 1999. The studies performed at Southern Company have already influenced and motivated certain transmission development projects. Future improvements to the method could include better modeling and sequencing of cascading steps, including the time sequence of failures using equipment response times. Significant interest exists in developing preventive methods and procedures and in the application of this technology in the infrastructure security arena. Copyright © 2008 John Wiley & Sons, Ltd. [source]

The effect of lack of insurance, poverty and paediatrician supply on immunization rates among children 19–35 months of age in the United States

James L. Becton Jr MD
Rationale, aims and objectives: Previous studies found that the increasing number of paediatricians in the United States was associated with improved childhood immunization coverage, while rising poverty and lack of health insurance reduced access to health care. We evaluated whether changes in the number of paediatricians, the poverty level and health insurance coverage affected childhood immunization coverage at the state level in the United States. Methods: Data were collected primarily from the US National Immunization Surveys, series 4:3:1:3:3, for the years 1995 and 2003. Ordinal logistic regression analysis was used to analyse the relationships among variables. Results: Over the 8 years studied, immunization coverage for children aged 19–35 months increased from 52.3% to 79.8% across the 50 states. The average number of paediatricians per 1000 births increased 28.7%, while the percentage of children without health insurance declined 15.6% and the percentage of children living in poverty declined 17.3%. In 1995, states with higher immunization coverage were associated with higher numbers of paediatricians [odds ratio (OR), 32.73; 95% confidence interval (CI), 5.96–179.77]. In 2003, higher numbers of paediatricians still played a role in increased immunization coverage (OR, 4.69; 95% CI, 1.01–21.78); however, the rate of uninsured children in 2003 had an even greater effect on immunization coverage. Compared with states with lower rates of uninsured children, states with intermediate and higher rates of uninsured children had sixfold (OR, 0.16; 95% CI, 0.03–0.81) and 16-fold (OR, 0.06; 95% CI, 0.01–0.40) lower childhood immunization coverage, respectively. Conclusion: Between 1995 and 2003 in the United States, the lack of health insurance became more prominent than the supply of paediatricians in affecting immunization coverage for children aged 19–35 months.
Future improvements in insurance coverage for children will likely lead to greater immunization coverage. [source]
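As a sanity check on the effect sizes quoted above, an odds ratio below 1 can be restated as a fold decrease (1/OR), which is how the abstract's "sixfold" and "16-fold" figures follow from ORs of 0.16 and 0.06. A minimal sketch (the function name is ours, not the study's):

```python
def fold_decrease(or_point, ci_low, ci_high):
    """Restate an odds ratio < 1 and its 95% CI as a fold decrease."""
    # Inverting the OR flips the CI bounds: 1/ci_high becomes the lower bound.
    return 1.0 / or_point, (1.0 / ci_high, 1.0 / ci_low)

# Figures quoted in the abstract
fold_mid, ci_mid = fold_decrease(0.16, 0.03, 0.81)   # ~6.3-fold decrease
fold_hi, ci_hi = fold_decrease(0.06, 0.01, 0.40)     # ~16.7-fold decrease
```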

Evaluation of a rotary tablet press simulator as a tool for the characterization of compaction properties of pharmaceutical products

F. Michaut
Abstract The Stylcam 100R, a rotary press simulator, was designed to reproduce the speed profiles of rotary tablet presses. The simulator has been qualified by numerous laboratories, but its suitability for studying the behaviour of powders under pressure remains to be established. The purpose of this work was therefore to investigate the performance of the Stylcam 100R in characterizing the compaction behaviour and tabletting properties of pharmaceutical powders. The compressibility of three pharmaceutical excipients (microcrystalline cellulose, dicalcium phosphate dihydrate and α-lactose monohydrate) was studied at four compression speeds on the compaction simulator. Force–displacement cycles were associated with two energy parameters, the specific total energy (Estot) and the specific expansion energy (Esexp). The mean yield pressure was calculated from Heckel plots obtained with the in-die method. The diametral tensile strength of compacts was measured in order to evaluate mechanical properties. To evaluate the accuracy of all these parameters, a comparative study was carried out on an instrumented eccentric press. The values of the energy parameters and the tensile strengths of tablets were close between the eccentric press and the compaction simulator, regardless of the compression speed on the latter. The mean yield pressure values obtained with the two presses, however, differed. Overall, the Stylcam 100R appears to be a good tool for characterizing the tabletting properties of powders, except for the Heckel model, probably because of an ill-suited deformation equation and a lack of accuracy of the displacement transducers. Future improvements should correct these two points. © 2009 Wiley-Liss, Inc. and the American Pharmacists Association J Pharm Sci 99: 2874–2885, 2010 [source]
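The Heckel treatment mentioned above is a linear fit: ln(1/(1−D)) is regressed on compaction pressure P over the linear region, and the mean yield pressure is the reciprocal of the slope. A minimal sketch with synthetic data (not the paper's measurements):

```python
import numpy as np

# In-die Heckel analysis: ln(1/(1 - D)) = K*P + A, where D is relative
# density and P the compaction pressure; mean yield pressure Py = 1/K.
# The data points below are synthetic, for illustration only.
P = np.array([50.0, 75.0, 100.0, 125.0, 150.0])    # pressure, MPa
D = np.array([0.800, 0.850, 0.885, 0.910, 0.928])  # relative density

y = np.log(1.0 / (1.0 - D))
K, A = np.polyfit(P, y, 1)   # linear fit over the assumed linear region
Py = 1.0 / K                 # mean yield pressure, MPa
```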

Regenerative Medicine for Cardiovascular Disorders-New Milestones: Adult Stem Cells

A. Ruchan Akar
Abstract: Cardiovascular disorders are the leading causes of mortality and morbidity in the developed world. Cell-based modalities have received considerable scientific attention over the last decade for their potential use in this clinical arena. This review is intended as a brief overview of the therapeutic potential of adult stem cells in cardiovascular medicine, covering basic science findings and the current status of clinical applications. The historical perspective and basic concepts are reviewed, and a description of current applications and potential adverse effects in cardiovascular medicine is given. Future improvements in cell-based therapies will likely provide remarkable gains in survival and quality of life for millions of patients with cardiovascular disorders. [source]

Can we predict future improvement in glycaemic control?

R. Singh
Abstract Aims To determine the factors responsible for poor glycaemic control in diabetes and whether any such factors are associated with likely improvement in glycaemic control. Methods A prospective cohort study of 130 diabetic patients with poor glycaemic control (HbA1c ≥ 10.0%) with 1-year follow-up in a teaching hospital Diabetes Clinic. Changes in HbA1c were measured after 1 year. Results Poor glycaemic control was attributed to one of 15 possible causes. Those cases due to recent diagnosis of diabetes, inadequate treatment with diet, oral glucose-lowering agents or insulin, exacerbation of co-existent medical problems, recent stressful life-events and missed clinic appointments were all associated with significant improvement in HbA1c at 12 months. Patients with low mood or alcohol excess, inadequate blood glucose monitoring, poor exercise/sedentary lifestyle, refusal to take tablets or underdosing and refusal to take insulin at all or to increase the dose were all associated with continuing poor glycaemic control at 12 months. The patients were divided almost equally between the two groups. Conclusions In patients with poor glycaemic control, it is possible by simple features identified at clinic to predict which individuals are likely to show improvement in control and which will not. These findings have not been reported previously and suggest that about half of individuals with poor control will improve within our current diabetes clinic practice. Additional strategies will be required to address those individuals who are not likely to respond. [source]

Distributed MEMS transmission lines for tunable filter applications

Yu Liu
Abstract This paper describes the design and fabrication of a distributed MEMS transmission line (DMTL), used to realize a transmission line with a voltage-variable electrical length for microwave circuits. The DMTL is a coplanar waveguide periodically loaded with continuously variable MEMS capacitors. A tunable bandpass filter was designed and fabricated on 700 µm thick glass substrates using three capacitively coupled DMTL sections as variable shunt resonators. The measured results demonstrate a 3.8% tuning range at 20 GHz with 3.6 dB minimum insertion loss. Issues for future improvement are discussed. © 2001 John Wiley & Sons, Inc. Int J RF and Microwave CAE 11: 254–260, 2001. [source]
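The tuning principle behind a DMTL can be sketched with the standard loaded-line relations: the periodic MEMS bridge capacitance adds to the line's distributed capacitance, lowering both the loaded impedance and the phase velocity, and hence changing the electrical length. The numbers below are hypothetical, not the geometry of this paper:

```python
import math

# Textbook DMTL loaded-line relations: a CPW line with unloaded impedance
# Z0 and effective permittivity eps_eff, periodically loaded every s metres
# with a MEMS bridge capacitance Cb.  All values are illustrative.
c0 = 2.998e8                        # free-space speed of light, m/s
Z0, eps_eff = 100.0, 2.6            # assumed unloaded CPW parameters
v_u = c0 / math.sqrt(eps_eff)       # unloaded phase velocity, m/s
Lt = Z0 / v_u                       # per-length inductance, H/m
Ct = 1.0 / (Z0 * v_u)               # per-length capacitance, F/m
s, Cb = 400e-6, 50e-15              # bridge spacing (m) and capacitance (F)

Z_loaded = math.sqrt(Lt / (Ct + Cb / s))        # loaded impedance, ohms
v_loaded = 1.0 / math.sqrt(Lt * (Ct + Cb / s))  # loaded phase velocity, m/s
```

Pulling the bridges down raises Cb, which further slows the line; that change in phase velocity is what retunes the resonators.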

Efficacy of trap and lure types for detection of Agrilus planipennis (Col., Buprestidae) at low density

J. M. Marshall
Abstract Development of effective trapping tools for forest pests, and evaluation of the key components of these tools, is necessary to locate early-stage infestations and develop management responses to them. Agrilus planipennis Fairmaire (emerald ash borer) is an introduced pest of ash (Fraxinus spp. L.) in North America. The effectiveness of different trap and lure combinations was tested in areas with low and high density populations of A. planipennis. At low density sites, purple prism traps outperformed green traps and girdled ash trap trees in capture rates (adults per day) and rates of detection of A. planipennis. Also, manuka oil lures, used as a standard lure in a national survey programme, captured higher rates of A. planipennis than did the previous standard of girdled ash trap trees. There was no logistic relationship between the detection of A. planipennis on a trap and the diameter of the ash tree from which the trap was suspended, possibly because of the use of artificial lures with these traps. There was also no difference in the mean number of A. planipennis captured per day between ash species or among vigour ratings of the ash associated with the traps. However, traps placed in open-grown and dominant trees captured more beetles than traps placed in lower canopy class trees. At sites defined as low and high density, there was no difference in the larval density per cm3 of phloem. This suggests that exposure time to A. planipennis has been shorter at those low density sites. Exploiting the trap and tree characteristics that improve A. planipennis capture rates and detection efficacy should lead to future improvement in the management of this pest. If detection can occur before infested ash trees exhibit signs and symptoms, there may be a potential for reducing the mortality of ash within stands. [source]

Health economics of asthma: assessing the value of asthma interventions

ALLERGY, Issue 12 2008
J. D. Campbell
The aim of this systematic review was to summarize and assess the quality of asthma intervention health economic studies from 2002 to 2007, compare the study findings with clinical management guidelines, and suggest avenues for future improvement of asthma health economic studies. Forty of the 177 studies met our inclusion criteria. We assessed the quality of studies using the Quality of Health Economic Studies validated instrument (total score range: 0–100). Six studies (15%) fell in quality category 2, 26 studies (65%) achieved quality category 3, and the remaining eight studies (20%) were scored at the highest quality level, category 4. Overall, the findings from this review are in line with the Global Initiative for Asthma clinical guidelines. Many asthma health economic studies lacked time horizons long enough to match the chronic nature of the disease and suffered from effectiveness measures that did not capture all disease-related risks and benefits. We recommend that new asthma simulation models be flexible enough to allow long-term time horizons, focus on levels of asthma control in their structure, and estimate both long-term asthma-specific outcomes, such as well-controlled time, and generic outcomes, such as quality-adjusted survival. [source]

A comparative proteomic analysis of HepG2 cells incubated with S(−) and R(+) enantiomers of the anti-coagulant drug warfarin

Jing Bai
Abstract Warfarin is a commonly prescribed oral anti-coagulant with a narrow therapeutic index. It interferes with the vitamin K cycle to achieve its anti-coagulating effect. Warfarin has two enantiomers, S(−) and R(+), and undergoes stereoselective metabolism, with the S(−) enantiomer being more effective. We report the intracellular protein profiles of HepG2 cells incubated with S(−) and R(+) warfarin, obtained using iTRAQ-coupled 2-D LC-MS/MS. In samples incubated with S(−) or R(+) warfarin alone, the multi-task protein SET showed significant elevation in cells incubated with S(−) warfarin but not in those incubated with R(+) warfarin. In cells incubated with the individual enantiomers in the presence of vitamin K, protein disulfide isomerase A3, known as a glucose-regulated protein, was down-regulated in cells incubated with S(−) warfarin compared with those incubated with R(+) warfarin. In addition, protein DJ-1 and 14-3-3 protein were down-regulated in cells incubated with either S(−) or R(+) warfarin regardless of the presence of vitamin K. Our results indicate that protein DJ-1 may act as an enzyme for the expression of essential enzymes in the vitamin K cycle. Taken together, our findings provide molecular evidence for a comprehensive protein profile of the warfarin–cell interaction, which may shed new light on future improvement of warfarin therapy. [source]

Are we reaching the limits of our ability to detect skin effects with our current testing and measuring methods for consumer products?

Miranda A. Farage
Testing for potential adverse skin effects is a key part of both the overall safety assessment for many consumer products and the evaluation of potential product improvements in mildness. Whilst modern tissue and paper products (i.e. facial tissues, catamenial products, baby wipes and baby and adult diapers) are inherently very mild to skin, current test methodology may not be robust enough to evaluate future improvements in such products. This article provides a commentary on several technologies we have been exploring to improve the sensitivity of test methods for tissue and paper products. The focus has been on three approaches: (i) further exaggerating exposure conditions using novel approaches to sample application, (ii) increasing the sensitivity of the manner in which we score for irritant effects, either visually or via instrumentation and (iii) quantitatively measuring additional endpoints, i.e. subjective sensory effects. [source]

Comparison of the Medical Priority Dispatch System to an Out-of-hospital Patient Acuity Score

Michael J. Feldman MD
Abstract Background: Although the Medical Priority Dispatch System (MPDS) is widely used by emergency medical services (EMS) dispatchers to determine dispatch priority, there is little evidence that it reflects patient acuity. The Canadian Triage and Acuity Scale (CTAS) is a standard patient acuity scale widely used by Canadian emergency departments and EMS systems to prioritize patient care requirements. Objectives: To determine the relationship between MPDS dispatch priority and out-of-hospital CTAS. Methods: All emergency calls on a large urban EMS communications database for a one-year period were obtained. Duplicate calls, nonemergency transfers, and canceled calls were excluded. Sensitivity and specificity to detect high-acuity illness, as well as positive predictive value (PPV) and negative predictive value (NPV), were calculated for all protocols. Results: Of 197,882 calls, 102,582 met inclusion criteria. The overall sensitivity of MPDS was 68.2% (95% confidence interval [CI] = 67.8% to 68.5%), with a specificity of 66.2% (95% CI = 65.7% to 66.7%). The most sensitive protocol for detecting high acuity of illness was the breathing-problem protocol, with a sensitivity of 100.0% (95% CI = 99.9% to 100.0%), whereas the most specific protocol was the one for psychiatric problems, with a specificity of 98.1% (95% CI = 97.5% to 98.7%). The cardiac-arrest protocol had the highest PPV (92.6%, 95% CI = 90.3% to 94.3%), whereas the convulsions protocol had the highest NPV (85.9%, 95% CI = 84.5% to 87.2%). The best-performing protocol overall was the cardiac-arrest protocol, and the protocol with the overall poorest performance was the one for unknown problems. Sixteen of the 32 protocols performed no better than chance alone at identifying high-acuity patients. Conclusions: The Medical Priority Dispatch System exhibits at least moderate sensitivity and specificity for detecting high acuity of illness or injury. 
This performance analysis may be used to identify target protocols for future improvements. [source]
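The four reported metrics are simple ratios over a 2×2 table of dispatch priority against reference acuity. A sketch, with hypothetical counts chosen only so the margins reproduce the abstract's overall sensitivity and specificity (the study's raw tables are not given):

```python
def dispatch_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 table comparing
    dispatch priority (test) against out-of-hospital acuity (reference)."""
    sensitivity = tp / (tp + fn)  # high-acuity calls dispatched as priority
    specificity = tn / (tn + fp)  # low-acuity calls dispatched as routine
    ppv = tp / (tp + fp)          # priority dispatches that were high acuity
    npv = tn / (tn + fn)          # routine dispatches that were low acuity
    return sensitivity, specificity, ppv, npv

# Hypothetical counts: margins match the abstract's 68.2% / 66.2%
sens, spec, ppv, npv = dispatch_metrics(tp=682, fp=338, fn=318, tn=662)
```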


EVOLUTION, Issue 3 2007
L. Lacey Knowles
Patterns of genetic variation can provide valuable insights for deciphering the relative roles of different evolutionary processes in species differentiation. However, population-genetic models for studying divergence in geographically structured species are generally lacking. Since these are the biogeographic settings where genetic drift is expected to predominate, not only are population-genetic tests of hypotheses in geographically structured species constrained, but generalizations about the evolutionary processes that promote species divergence may also be potentially biased. Here we estimate a population-divergence model in montane grasshoppers from the sky islands of the Rocky Mountains. Because this region was directly impacted by Pleistocene glaciation, both the displacement into glacial refugia and recolonization of montane habitats may contribute to differentiation. Building on the tradition of using information from the genealogical relationships of alleles to infer the geography of divergence, here the additional consideration of the process of gene-lineage sorting is used to obtain a quantitative estimate of population relationships and historical associations (i.e., a population tree) from the gene trees of five anonymous nuclear loci and one mitochondrial locus in the broadly distributed species Melanoplus oregonensis. Three different approaches are used to estimate a model of population divergence; this comparison allows us to evaluate specific methodological assumptions that influence the estimated history of divergence. A model of population divergence was identified that fits the data significantly better than the other approaches, based on per-site likelihood scores of the multiple loci, and that provides clues about how divergence proceeded in M. oregonensis during the dynamic Pleistocene.
Unlike the approaches that either considered only the most recent coalescence (i.e., information from a single individual per population) or did not consider the pattern of coalescence in the gene genealogies, the population-divergence model that best fits the data was estimated by considering the pattern of gene lineage coalescence across multiple individuals, as well as loci. These results indicate that sampling of multiple individuals per population is critical to obtaining an accurate estimate of the history of divergence so that the signal of common ancestry can be separated from the confounding influence of gene flow, even though estimates suggest that gene flow is not a predominant factor structuring patterns of genetic variation across these sky island populations. They also suggest that the gene genealogies contain information about population relationships, despite the lack of complete sorting of gene lineages. What emerges from the analyses is a model of population divergence that incorporates both contemporary distributions and historical associations, and shows a latitudinal and regional structuring of populations reminiscent of population displacements into multiple glacial refugia. Because the population-divergence model itself is built upon the specific events shaping the history of M. oregonensis, it provides a framework for estimating additional population-genetic parameters relevant to understanding the processes governing differentiation in geographically structured species and avoids the problems of relying on overly simplified and inaccurate divergence models. The utility of these approaches, as well as the caveats and future improvements, for estimating population relationships and historical associations relevant to genetic analyses of geographically structured species are discussed. [source]

From the Atomic Jump Frequencies to the Phenomenological Transport Coefficients

M. Nastar
The Self-Consistent Mean-Field (SCMF) theory, based on an atomic model of atom–vacancy exchange frequencies, yields general expressions for the phenomenological coefficients Lij of a multi-component alloy with any crystallographic structure. The limitations and future improvements of the SCMF approach are readily related to the statistical approximation of the thermodynamic correlations and to the time-dependent effective interactions used to describe the kinetic correlations induced by the vacancy diffusion mechanism. [source]

Laboratory identification of factor VIII inhibitors in the real world: the experience from Australasia

HAEMOPHILIA, Issue 4 2010
Summary: The laboratory has a key role in the initial detection of factor inhibitors and an ongoing role in the measurement of inhibitor titres during the course of inhibitor eradication therapy. The most commonly seen factor inhibitors are those directed against factor VIII (FVIII), usually detected using either the original or the Nijmegen-modified Bethesda assay. In view of previously demonstrated high variability in laboratory results for inhibitor assays, we have examined laboratory performance in the identification of FVIII inhibitors more extensively. Over the past 3 years, we conducted two questionnaire-based surveys and two wet-challenge surveys utilizing eight samples comprising no FVIII inhibitor (n = 1), low-titre (n = 2), medium-titre (n = 3) or high-titre (n = 2) FVIII inhibitors. Four samples were tested by 42 laboratories in 2007, and four by 52 laboratories in 2009. High inter-laboratory variation was evident, with CVs around 50% not uncommon, and some 10% of all laboratories (around 15% of laboratories using the Bethesda method) failed to detect low-level inhibitors of around 1 BU mL−1. Laboratories using the Nijmegen method appeared to perform better than those using a standard Bethesda assay, with lower assay variation and no false negatives. There was a wide variety of laboratory practice, with no two laboratories using exactly the same process for testing and interpretation of factor inhibitor findings. In conclusion, our study indicates that there is still much need for standardization and improvement in factor inhibitor detection, and we hope that our findings provide a basis for future improvements in this area. [source]

Energetic, exergetic and thermoeconomic analysis of Bilkent combined cycle cogeneration plant

C. Ozgur Colpan
Abstract This paper is a case study of thermodynamic and economic analyses applied to an existing gas/steam combined cycle cogeneration plant. Basic thermodynamic properties of the plant are determined by energy analysis using its main operating conditions. Exergy destructions within the plant and exergy losses to the environment are investigated to determine thermodynamic inefficiencies and to assist in guiding future improvements in the plant. Cost balances and auxiliary equations are applied to several subsystems in the plant, so that cost formation in the plant can be observed. Additionally, the cost rate of each product of the plant is calculated. Copyright © 2005 John Wiley & Sons, Ltd. [source]
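An exergy analysis of this kind rests on the specific flow exergy, ex = (h − h0) − T0(s − s0), evaluated for each stream against a dead state (T0, p0). A minimal ideal-gas sketch with illustrative numbers (not the plant's actual stream data):

```python
import math

# Specific flow exergy of a gas stream, neglecting kinetic and potential
# terms: ex = (h - h0) - T0*(s - s0).  Ideal-gas air with constant cp is
# assumed; the state (T, p) below is a hypothetical turbine-inlet stream.
cp, R = 1.005, 0.287        # kJ/(kg K), air
T0, p0 = 298.15, 101.325    # dead state: K, kPa
T, p = 850.0, 1200.0        # hypothetical stream state: K, kPa

dh = cp * (T - T0)                                  # h - h0, kJ/kg
ds = cp * math.log(T / T0) - R * math.log(p / p0)   # s - s0, kJ/(kg K)
ex = dh - T0 * ds                                   # specific flow exergy, kJ/kg
```

Summing m_dot*ex over inlets and outlets of each subsystem gives the exergy destruction that the cost balances are then written around.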

Thermodynamic optimization of internal structure in a fuel cell

Jose V. C. Vargas
Abstract This paper shows that the internal structure (relative sizes, spacings) of a fuel cell can be optimized so that performance is maximized at the global level. The optimization of flow geometry begins at the smallest (elemental) level, where the fuel cell is modelled as a unidirectional flow system. The polarization curve, power and efficiency are obtained as functions of temperature, pressure, geometry and operating parameters. Although the model is illustrated for an alkaline fuel cell, it may be applied to other fuel cell types by changing the reaction equations and accounting for the appropriate energy interactions. The optimization of the internal structure is subject to a fixed total volume. There are four degrees of freedom in the optimization, which account for the relative thicknesses of the two (anode and cathode) diffusion layers, two reaction layers and the space occupied by the electrolyte solution. The available volume is distributed optimally through the system so that the total power is maximized. Numerical results show that the optima are sharp, and must be identified accurately. Temperature and pressure gradients play important roles, especially as the fuel and oxidant flow paths increase. The optimized internal structure is reported in dimensionless form. Directions for future improvements in flow architecture (constructal design) are discussed. Copyright © 2004 John Wiley & Sons, Ltd. [source]

Individual-based models of cod movement and population dynamics

H. J. Edwards
Many fish species undergo seasonal changes in distribution, as a result of horizontal migrations between feeding, nursery and spawning grounds. Exploring the processes involved in these movements may be the key to understanding interactions with other species, man and the environment, and is therefore crucial to effective fisheries management. Recent tagging experiments providing information on the distribution of migratory fish stocks have indicated pronounced regional and temporal differences in the migratory behaviour of cod, suggesting complex interactions between this commercially important fish species and the environment. This paper presents a model of the horizontal movements of demersal fish, principally cod, using an individual-based modelling approach to explore and predict the relationship between demersal fish movements and key environmental and ecological factors. The model simulates the basic biological processes of growth, movement and mortality, and is driven by the analysis of physical tagging data recorded by electronic data storage tags (DSTs). Results show that the incorporation of behavioural data from DSTs into spatially explicit individual-based models can provide realistic simulations of large-scale fish stocks, thus giving a better understanding of their basic ecology and allowing more effective management of commercially important fish species. Possibilities of future improvements and extensions to the model are discussed. [source]

Introduction to the special issue on Bayesian journey-to-crime modelling

Ned Levine
Abstract In this special issue of the Journal of Investigative Psychology and Offender Profiling, we explore a Bayesian approach to journey-to-crime (JTC) estimation with an emphasis on the statistical models used. The approach conceptualises the probability that an offender lives at one location as the product of the probability distribution from a JTC estimate along with the probability distribution of other offenders who committed crimes in the same locations. The Bayesian approach is appropriate as the second part is conditional on the first part. The introduction gives the background behind the methodology and suggests how future improvements can be made by integrating new information. Finally, the papers in the special issue are introduced. Copyright © 2009 John Wiley & Sons, Ltd. [source]

Labor Reform and Dual Transitions in Brazil and the Southern Cone

María Lorena Cook
ABSTRACT The sequencing of transitions to democracy and to a market economy shaped the outcome of labor law reform and prospects for expanded labor rights in Argentina, Brazil, and Chile. Argentina and Brazil experienced democratic transitions before market economic reforms were consolidated in the 1990s. During the transition, unions obtained prolabor reforms and secured rights that were enshrined in labor law. In posttransition democratic governments, market reforms coincided with efforts to reverse earlier labor protections. Unable to block many harmful reforms, organized labor in Argentina and Brazil did conserve core interests linked to organizational survival and hence to future bargaining leverage. In Chile this sequence was reversed. Market economic policies and labor reform were consolidated under military dictatorship. During democratic transition, employers successfully resisted reforms that would expand labor rights. This produced a limited scope of organizational resources for Chilean unions and reduced prospects for future improvements. [source]

Explosion energies, nickel masses and distances of Type II plateau supernovae

D. K. Nadyozhin
ABSTRACT The hydrodynamical modelling of Type II plateau supernova (SNIIP) light curves predicts a correlation between three observable parameters (plateau duration, absolute magnitude and photospheric velocity at the middle of the plateau) on the one hand, and three physical parameters (explosion energy E, mass of the expelled envelope and pre-supernova radius R) on the other. The correlation is used, together with distances adopted from the expanding photosphere method, to estimate E, the envelope mass and R for a dozen well-observed SNIIP. For this set of supernovae, the resulting value of E varies within a factor of 6 (0.5 ≲ E/10^51 erg ≲ 3), whereas the envelope mass remains within a narrower range. The pre-supernova radius is typically 200–600 R☉, but can reach ~1000 R☉ for the brightest supernovae (e.g. SN 1992am). A new method of determining the distance of SNIIP is proposed. It is based on the assumption of a correlation between the explosion energy E and the 56Ni mass required to power the post-plateau light-curve tail through 56Co decay. The method is useful for SNIIP with well-observed bolometric light curves during both the plateau and radioactive-tail phases. The resulting distances and future improvements are discussed. [source]

Orthodontically stressed periodontium of transgenic mouse as a model for studying mechanical response in bone: The effect on the number of osteoblasts

Dubravko Pavlin
A better understanding of the cellular and molecular mechanisms involved in the response to mechanical stress is a prerequisite for future improvements in orthodontic treatment. To expand the application of molecular biology techniques in this area of research, we developed and characterized a mouse tooth movement model. The aim of this study was to characterize this model biomechanically and to evaluate the effect of orthodontic stress on the proliferation of periodontal osteoblasts. We used an orthodontic coil-spring appliance with a low force/deflection rate, which produced an average force of 10–12 g. This design provided a predictable tipping movement of the molar with the center of rotation at the level of the root apices. Histological observations of paradental tissues revealed a response favoring a fast onset of tooth movement, with deposition of new osteoid starting after 3 days of treatment. The effect of treatment on the histomorphometric parameter of the number of osteoblasts per unit bone perimeter was determined after 1, 2, 3, 4, 6 and 12 days of treatment. Starting with day 2, the osteoblast number showed a modest but consistent increase in treated periodontal sites at all time-points, ranging from 14 to 39% and becoming significant only at day 6. The only moderate increase in the number of osteoblasts in areas of otherwise intense bone matrix synthesis suggests that, during bone formation, proliferation of cells plays a smaller role than the marked increase in differentiation of individual cells. The mouse model, which allows for controlled, reproducible orthodontic mechanical loading, can be applied to both wild-type and transgenic animals and should enhance research on the transduction of mechanical orthodontic signals into a biological response. [source]

Empirical evaluation of an extended similarity theory for the stably stratified atmospheric surface layer

Harald Sodemann
Abstract The theory of the atmospheric stable boundary layer (SBL) has recently been extended by a distinction between nocturnal and long-lived SBLs. The latter SBL type, which includes influences of the free atmosphere on fluxes in the surface layer, requires a modification of the traditional Monin–Obukhov similarity theory. In the present study, the applicability of this extended theory for long-lived SBLs is evaluated and the required coefficients are estimated using data from Antarctica. Changes in wind and temperature gradients due to different weather conditions are shown to exert a strong influence on the estimation of the new coefficients CuN and CθN. Using the wind gradient as classification criterion, the momentum flux coefficient CuN is estimated to range between 0.51±0.03 and 2.26±0.08. Using the temperature gradient as classification criterion, the heat flux coefficient CθN is estimated to range between 0.022±0.002 and 0.040±0.001. At present, the proposed new scaling theory is still in a preliminary stage. Possible future improvements should take into account factors influencing the wind and temperature gradients, such as weather conditions. An artificial background correlation strongly imprints upon the parameter estimation, suggesting that both the methodology for estimating the new coefficients CuN and CθN and the choice of the nondimensional variables for this extended scaling theory may require some revision. Copyright © 2004 Royal Meteorological Society [source]
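The coefficient estimation described above can be illustrated with synthetic data: assuming the extended similarity function for momentum is linear in the stability parameter, phi_m(zeta) = 1 + CuN*zeta, the coefficient is recovered by least squares. The "observations" below are synthetic, with a CuN value chosen inside the cited range, not the Antarctic measurements:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic nondimensional wind gradients: phi_m = 1 + C_uN * zeta plus noise.
# C_uN_true is a hypothetical value inside the cited 0.51-2.26 range.
C_uN_true = 1.4
zeta = rng.uniform(0.05, 2.0, 200)                 # stability parameter
phi_m = 1.0 + C_uN_true * zeta + rng.normal(0.0, 0.05, zeta.size)

# Least-squares slope of (phi_m - 1) against zeta, forced through the origin
C_uN_hat = float(zeta @ (phi_m - 1.0) / (zeta @ zeta))
print(round(C_uN_hat, 2))
```

The same regression applied to dimensionless temperature gradients would yield the heat flux coefficient CθN.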

Strategies of plant breeding for improved rumen function

Summary In general, breeding programmes directed at the improvement of forage have concentrated on easily measurable phenotypes such as yield, digestibility and resistance to lodging. Selection programmes have improved forage production but historically have addressed relatively few quality considerations. In addition, selection for quality has been limited by the availability of suitable analytical techniques. With the current emphasis on quality rather than quantity, and the public's desire for greater understanding of where their food comes from, quality should be a greater target in future breeding programmes. This review briefly covers previous improvements in the quality of grazed and silage forages and considers how new technologies might be employed to realise targets for future improvements. In particular, we address the concept that interactions between rumen micro-organisms and ingested plant material in the rumen are not static but dynamic. This has implications for post-ingestion biology and feed utilisation. [source]

Power generation from coal and biomass based on integrated gasification combined cycle concept with pre- and post-combustion carbon capture methods

Calin-Cristian Cormos
Abstract Gasification is a process in which a solid fuel is partially oxidised by oxygen and steam/water to produce a combustible gas called syngas (mainly a mixture of hydrogen and carbon monoxide). Syngas can be used either for power generation or processed to obtain various chemicals (hydrogen, ammonia, methanol, etc.). This article evaluates the possibilities of solid fuel decarbonisation by capturing the carbon dioxide resulting from the thermo-chemical conversion of solid fuel by gasification. The evaluation focuses on power generation using syngas produced by solid fuel gasification (the so-called integrated gasification combined cycle, IGCC). The case studies analysed in the article use a mixture of coal and biomass (sawdust) to produce around 400 MW of electricity while capturing about 90% of the feedstock carbon. Various carbon dioxide capture options (post- and pre-combustion) are compared with the no-capture case in terms of plant configuration, energy penalty, CO2 emissions, etc. The plant options are modelled in ChemCAD, and the simulation results are used to assess plant performance. Plant flexibility and future improvements are also discussed. Copyright © 2009 Curtin University of Technology and John Wiley & Sons, Ltd. [source]
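The emissions bookkeeping behind such capture comparisons reduces to simple carbon stoichiometry. A minimal sketch, with an assumed feedstock carbon flow and the roughly 400 MW output and 90% capture rate mentioned above (the carbon flow itself is a placeholder, not a figure from the article):

```python
# Assumed carbon flow entering with the coal + sawdust feedstock (placeholder)
fuel_carbon_kg_s = 10.0
capture_fraction = 0.90          # ~90% of feedstock carbon captured
net_power_mw = 400.0             # ~400 MW electricity

# Uncaptured carbon leaves as CO2: scale by the molar mass ratio 44/12
co2_emitted_kg_s = fuel_carbon_kg_s * (1 - capture_fraction) * 44.0 / 12.0
specific_emission = co2_emitted_kg_s * 3600.0 / net_power_mw   # kg CO2 per MWh

print(round(co2_emitted_kg_s, 2), round(specific_emission, 1))
```

The same two lines, applied to each plant configuration, give the CO2-emission entries of a comparison table; the energy penalty enters through a lower net_power_mw for the capture cases.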

Multivariate data analysis on historical IPV production data for better process understanding and future improvements

Yvonne E. Thomassen
Abstract Historical manufacturing data can harbor a wealth of information for process optimization and for enhancing efficiency and robustness. To extract useful information, multivariate data analysis (MVDA) using projection methods is often applied. In this contribution, the results obtained from applying MVDA to data from inactivated polio vaccine (IPV) production runs are described. Data from over 50 batches at two different production scales (700-L and 1,500-L) were available. The explorative analysis performed on single unit operations indicated consistent manufacturing. Known outliers (e.g., rejected batches) were identified using principal component analysis (PCA). The source of operational variation was pinpointed to variation of inputs such as media. Other relevant process parameters were in control and, using these manufacturing data, could not be correlated to product quality attributes. The knowledge of the IPV production process gained, not only from the MVDA but also from digitalizing the available historical data, has proven useful for troubleshooting, for understanding the limitations of the available data and for identifying opportunities for improvement. Biotechnol. Bioeng. 2010;107: 96–104. © 2010 Wiley Periodicals, Inc. [source]
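The PCA-based outlier screening mentioned above can be sketched on toy batch data: records are mean-centred, projected onto the leading principal components, and flagged by their distance in score space. The matrix below is synthetic (50 normal batches plus one deliberately aberrant batch), not the IPV records:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-in for batch records: 50 batches x 6 process parameters,
# plus one deliberately aberrant "rejected" batch appended as row 50.
X = rng.normal(0.0, 1.0, (50, 6))
X = np.vstack([X, np.full(6, 6.0)])

# PCA via SVD on the mean-centred data
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T                # projection onto the first two components

# Flag the batch with the most extreme score-space distance
d = np.linalg.norm(scores, axis=1)
outlier = int(np.argmax(d))
print(outlier)
```

In practice the flagging threshold would come from a control limit (e.g., Hotelling's T-squared) rather than a simple argmax, but the projection step is the same.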

Techno-economic evaluation of a two-step biological process for hydrogen production

Mattias Ljunggren
Abstract An integrated biological process for the production of hydrogen based on thermophilic and photo-heterotrophic fermentation was evaluated from a technical and economic standpoint. Besides the two fermentation steps, the process also includes pretreatment of the raw material (potato steam peels) and purification of the hydrogen by amine absorption. The study aimed neither at determining the absolute cost of biohydrogen nor at an economic optimization of the production process, but rather at studying the effects of different parameters on the production cost of biohydrogen as a guideline for future improvements. The effects of the key parameters (hydrogen productivity, hydrogen yield and substrate concentration in the two fermentations) on the cost of the hydrogen produced were studied. The selection of the process conditions was based mainly on laboratory data. The process was simulated with the software Aspen Plus, and the capital costs were estimated using the Aspen Icarus Process Evaluator. The study shows that the photo-fermentation is the main contributor to the hydrogen production cost, mainly because of the cost of the plastic tubing for the photo-fermentors, which represents 40.5% of the hydrogen production cost. The costs of the capital investment and chemicals were also notable contributors. Major economic improvements could be achieved by increasing the productivity of the two fermentation steps on a medium- to long-term scale. © 2009 American Institute of Chemical Engineers Biotechnol. Prog., 2010 [source]
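The leverage of the tubing cost can be made concrete with a one-line sensitivity calculation using only the 40.5% share reported above (the normalised total and the reduction scenario are assumptions of the sketch):

```python
# Normalise the hydrogen production cost to 100 and use the reported share
# of the photo-fermentor plastic tubing (~40.5% of the total).
total_cost = 100.0
tubing_share = 0.405

def cost_after_tubing_reduction(reduction):
    """Total cost if the tubing cost falls by the given fraction (0..1)."""
    return total_cost * (1 - tubing_share * reduction)

print(cost_after_tubing_reduction(0.5))   # e.g. halving the tubing cost
```

Halving the tubing cost alone would cut roughly a fifth off the total, which is why the photo-fermentor design dominates the improvement agenda.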

A single nutrient feed supports both chemically defined NS0 and CHO fed-batch processes: Improved productivity and lactate metabolism

Ningning Ma
Abstract A chemically defined nutrient feed (CDF) coupled with basal medium preloading was developed to replace a hydrolysate-containing feed (HCF) for a fed-batch NS0 process. The CDF not only enabled a completely chemically defined process but also increased the recombinant monoclonal antibody titer by 115%. Subsequent tests of the CDF in a CHO process indicated that it could replace the hydrolysate-containing nutrient feed in this expression system as well, providing an 80% increase in product titer. In both the CDF NS0 and CHO processes, peak lactate concentrations were lower and, more interestingly, lactate metabolism shifted markedly from net production to net consumption when cells transitioned from the exponential to the stationary growth phase. Subsequent investigations of the lactate metabolic shift in the CHO CDF process were carried out to identify its cause(s). These investigations revealed several metabolic features of the CHO cell line studied. First, glucose consumption and lactate consumption are strictly complementary to each other: the combined cell-specific glucose and lactate consumption rate was constant across the exponential and stationary growth phases. Second, lactate dehydrogenase (LDH) activity fluctuated during the fed-batch process; LDH activity was at its lowest when the lactate concentration started to decrease. Third, a steep glucose gradient exists across the plasma membrane: the intracellular glucose concentration was more than two orders of magnitude lower than that in the medium. Fourth, a large quantity of citrate was diverted out of the mitochondria into the medium, suggesting a partially truncated tricarboxylic acid (TCA) cycle in CHO cells. Finally, other intermediates in or linked to the glycolytic pathway and the TCA cycle, including alanine, citrate, isocitrate, and succinate, showed a metabolic shift similar to that of lactate.
Interestingly, all these metabolites lie in, or are linked to, the pathway downstream of pyruvate but upstream of fumarate in glucose metabolism. Although the specific mechanisms behind the metabolic shift of lactate and the other metabolites remain to be elucidated, the increased understanding of the metabolism of CHO cultures could lead to future improvements in medium and process development. © 2009 American Institute of Chemical Engineers Biotechnol. Prog., 2009 [source]
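The "constant combined specific consumption" observation above can be made concrete with a small rate calculation. The concentration and cell-density time courses below are fabricated to reproduce the qualitative behaviour (lactate production flipping to consumption while the combined cell-specific rate stays flat), not the paper's measurements:

```python
import numpy as np

# Hypothetical fed-batch time courses (not the paper's data)
t = np.array([0.0, 24.0, 48.0, 72.0])            # hours
viable_cells = np.array([1.0, 2.0, 3.0, 3.0])    # 1e6 cells/mL
glucose = np.array([30.0, 20.6, 10.6, 3.8])      # mM
lactate = np.array([0.0, 4.0, 5.0, 1.0])         # mM: produced, then consumed

def specific_rate(conc):
    """Per-interval cell-specific consumption rate (mM per 1e6 cells per h)."""
    dt = np.diff(t)
    avg_cells = (viable_cells[1:] + viable_cells[:-1]) / 2.0
    return -np.diff(conc) / (dt * avg_cells)

q_glc = specific_rate(glucose)
q_lac = specific_rate(lactate)          # negative while lactate is produced
print(np.round(q_glc + q_lac, 4))       # combined rate stays constant
```

The sign flip of q_lac between the second and third intervals marks the metabolic shift, while the invariance of q_glc + q_lac mirrors the strict complementarity reported for the CHO line.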