Process Understanding

Distribution by Scientific Domains


Selected Abstracts


Dealing with Landscape Heterogeneity in Watershed Hydrology: A Review of Recent Progress toward New Hydrological Theory

GEOGRAPHY COMPASS (ELECTRONIC), Issue 1 2009
Peter A. Troch
Predictions of hydrologic system response to natural and anthropogenic forcing are highly uncertain due to the heterogeneity of the land surface and subsurface. Landscape heterogeneity results in spatiotemporal variability of hydrological states and fluxes, scale-dependent flow and transport properties, and incomplete process understanding. Recent community activities, such as the Prediction in Ungauged Basins initiative of the International Association of Hydrological Sciences, have recognized the impasse that current catchment hydrology is facing and have called for a focused research agenda toward new hydrological theory at the watershed scale. This new hydrological theory should recognize the dominant control of landscape heterogeneity on hydrological processes, explore novel ways to account for its effect at the watershed scale, and build on an interdisciplinary understanding of how feedback mechanisms between hydrology, biogeochemistry, pedology, geomorphology, and ecology affect catchment evolution and functioning. [source]


Geomorphology Fluid Flow Modelling: Can Fluvial Flow Only Be Modelled Using a Three-Dimensional Approach?

GEOGRAPHY COMPASS (ELECTRONIC), Issue 1 2008
R. J. Hardy
The application of numerical models to gain insight into flow processes is becoming a prevalent research methodology in fluvial geomorphology. The advantage of this approach is that models are particularly useful for identifying emergent behaviour in the landscape, where combinations of processes act over several scales. However, there is a wide range of available models, and it is not always apparent which methodological approach should be chosen. The decision about the amount of process representation required needs to be balanced against both the spatial and temporal scales of interest. In this article, it is argued that in order to gain a complete, high-resolution process understanding of flow within the fluvial system, a full three-dimensional modelling approach with a complete physical basis is required. [source]


TOPCAT-NP: a minimum information requirement model for simulation of flow and nutrient transport from agricultural systems

HYDROLOGICAL PROCESSES, Issue 14 2008
P. F. Quinn
Abstract Future catchment planning requires a good understanding of the impacts of land use and management, especially with regard to nutrient pollution. A range of readily usable tools, including models, can play a critical role in underpinning robust decision-making. Modelling tools must articulate our process understanding, make links to a range of catchment characteristics and scales and have the capability to reflect future land-use management changes. Hence, the model application can play an important part in giving confidence to policy makers that positive outcomes will arise from any proposed land-use changes. Here, a minimum information requirement (MIR) modelling approach is presented that creates simple, parsimonious models based on more complex physically based models, which makes the model more appropriate to catchment-scale applications. This paper shows three separate MIR models that represent flow, nitrate losses and phosphorus losses. These models are integrated into a single catchment model (TOPCAT-NP), which has the advantage that certain model components (such as soil type and flow paths) are shared by all three MIR models. The integrated model can simulate a number of land-use activities that relate to typical land-use management practices. The modelling process also gives insight into the seasonal and event nature of nutrient losses exhibited at a range of catchment scales. Three case studies are presented to reflect the range of applicability of the model. The three studies show how different runoff and nutrient loss regimes in different soil/geological and global locations can be simulated using the same model. The first case study models intense agricultural land uses in Denmark (Gjern, 114 km2), the second is an intense agricultural area dominated by high superphosphate applications in Australia (Ellen Brook, 66 km2) and the third is a small research-scale catchment in the UK (Bollington Hall, 2 km2). 
Copyright © 2007 John Wiley & Sons, Ltd. [source]
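As an illustrative aside, the minimum information requirement idea above favours parsimonious stores over fully distributed physics. A generic single linear-reservoir water balance (not the TOPCAT-NP formulation itself; the recession constant `k`, initial storage and rainfall series below are invented for the sketch) shows how few parameters such a model can need:

```python
import numpy as np

def linear_store_runoff(rain_mm, k=0.3, s0=10.0):
    """Runoff (mm/step) from a single linear store.

    S[t+1] = S[t] + rain[t] - Q[t], with Q[t] = k * S[t].
    """
    s = s0
    q = []
    for r in rain_mm:
        out = k * s          # outflow proportional to current storage
        s = s + r - out      # water-balance update
        q.append(out)
    return np.array(q)

# synthetic six-step rainfall event
rain = np.array([0.0, 12.0, 5.0, 0.0, 0.0, 0.0])
q = linear_store_runoff(rain)
```

With only two parameters (`k`, `s0`), calibration against catchment-scale observations stays tractable, which is the core of the MIR argument.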


Scaling up and out in runoff process understanding: insights from nested experimental catchment studies

HYDROLOGICAL PROCESSES, Issue 11 2006
C. Soulsby
First page of article [source]


Case studies in Bayesian segmentation applied to CD control

INTERNATIONAL JOURNAL OF ADAPTIVE CONTROL AND SIGNAL PROCESSING, Issue 5 2003
A.R. Taylor
Identifying step changes in historical process and controller output variables can lead to improved process understanding and fault resolution in control system performance analysis. This paper describes an application of Bayesian methods in the search for statistically significant temporal segmentations in the data collected by a cross-directional (CD) control system in an industrial web forming process. CD control systems give rise to vector observations, which are often transformed through orthogonal bases for control and performance analysis. In this paper, two models which exploit basis function representations of vector time series data are segmented. The first is a power spectrum model based on an asymptotic Chi-squared approximation, which allows large data sets to be processed. The second, a special case of the multivariate linear model, is more capable of detecting small changes but, as a result, more computationally demanding. Given the statistical model of the data, inference regarding the number and location of the change-points is based on numerical Bayesian methods known as Markov chain Monte Carlo (MCMC). The methods are applied to real data and the resulting segmentation relates to real process events. Copyright © 2003 John Wiley & Sons, Ltd. [source]
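The underlying change-point idea can be illustrated without MCMC: for a single change in the mean of a Gaussian series with known noise, the posterior over the change-point location can be enumerated exactly. This is a didactic sketch with synthetic data, not the paper's multivariate MCMC implementation:

```python
import numpy as np

rng = np.random.default_rng(0)
# synthetic series: mean shifts from 0 to 2 at index 60
y = np.concatenate([rng.normal(0.0, 1.0, 60), rng.normal(2.0, 1.0, 40)])

def changepoint_posterior(y, sigma=1.0):
    """Posterior over a single change-point location (flat prior)."""
    n = len(y)
    log_post = np.full(n, -np.inf)
    for tau in range(1, n):              # change occurs after index tau-1
        a, b = y[:tau], y[tau:]
        # Gaussian log-likelihood with each segment at its own mean estimate
        ll = -0.5 * (np.sum((a - a.mean()) ** 2)
                     + np.sum((b - b.mean()) ** 2)) / sigma ** 2
        log_post[tau] = ll
    log_post -= log_post.max()           # stabilize before exponentiating
    p = np.exp(log_post)
    return p / p.sum()

post = changepoint_posterior(y)
tau_hat = int(np.argmax(post))           # MAP change-point location
```

MCMC becomes necessary when, as in the paper, the number of change-points is unknown and the observation model is multivariate, making exhaustive enumeration infeasible.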


Monitoring the film coating unit operation and predicting drug dissolution using terahertz pulsed imaging

JOURNAL OF PHARMACEUTICAL SCIENCES, Issue 12 2009
Louise Ho
Abstract Understanding the coating unit operation is imperative to improve product quality and reduce output risks for coated solid dosage forms. Three batches of sustained-release tablets coated with the same process parameters (pan speed, spray rate, etc.) were subjected to terahertz pulsed imaging (TPI) analysis followed by dissolution testing. Mean dissolution times (MDT) from conventional dissolution testing were correlated with terahertz waveforms, which yielded a multivariate partial least squares (PLS) regression model with an R2 of 0.92 for the calibration set and 0.91 for the validation set. This two-component PLS model was built from batch I, which was coated under the same environmental conditions (air temperature, humidity, etc.) as batch II but under different environmental conditions from batch III. The MDTs of batch II were predicted in a nondestructive manner with the developed PLS model, and the accuracy of the predicted values was subsequently validated against conventional dissolution testing and found to be in good agreement. The terahertz PLS model was also shown to be sensitive to changes in the coating conditions, successfully identifying the larger coating variability in batch III. In this study, we demonstrated that TPI in conjunction with PLS analysis can be employed to assist with film-coating process understanding and to provide predictions of drug dissolution. © 2009 Wiley-Liss, Inc. and the American Pharmacists Association J Pharm Sci 98:4866–4876, 2009 [source]


Near infrared spectroscopy in the development of solid dosage forms

JOURNAL OF PHARMACY AND PHARMACOLOGY: AN INTERNATIONAL JOURNAL OF PHARMACEUTICAL SCIENCE, Issue 2 2007
Eetu Räsänen
The use of near infrared (NIR) spectroscopy has grown rapidly, partly due to the demands of process analytical applications in the pharmaceutical industry. Furthermore, recent regulatory guidelines have encouraged the adoption of NIR technologies. The non-destructive and non-invasive nature of the measurements makes NIR a powerful tool for the characterization of pharmaceutical solids, and these benefits, among others, often make NIR advantageous over traditional analytical methods. However, a wide variety of other tools is also available for analysis in pharmaceutical development and manufacturing, and these can sometimes be more suitable for a given application. The versatility and speed of NIR will ensure its contribution to increased process understanding, better process control and improved quality of drug products. This review concentrates on the use of NIR spectroscopy from a process research perspective and highlights recent applications in the field. [source]


Multivariate data analysis on historical IPV production data for better process understanding and future improvements

BIOTECHNOLOGY & BIOENGINEERING, Issue 1 2010
Yvonne E. Thomassen
Abstract Historical manufacturing data can potentially harbor a wealth of information for process optimization and enhancement of efficiency and robustness. To extract useful information, multivariate data analysis (MVDA) using projection methods is often applied. In this contribution, the results obtained from applying MVDA to data from inactivated polio vaccine (IPV) production runs are described. Data from over 50 batches at two different production scales (700-L and 1,500-L) were available. The explorative analysis performed on single unit operations indicated consistent manufacturing. Known outliers (e.g., rejected batches) were identified using principal component analysis (PCA). The source of operational variation was pinpointed to variation of inputs such as media. Other relevant process parameters were in control and, using these manufacturing data, could not be correlated to product quality attributes. The knowledge of the IPV production process gained, not only from the MVDA but also from digitalizing the available historical data, has proven useful for troubleshooting, understanding the limitations of the available data and identifying opportunities for improvement. Biotechnol. Bioeng. 2010;107:96–104. © 2010 Wiley Periodicals, Inc. [source]
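The PCA-based outlier screening mentioned above can be sketched in a few lines: project batch records into a low-dimensional score space and flag batches with an unusually large Hotelling-style distance. All data below are synthetic, with one batch deliberately shifted to play the role of a rejected batch:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)
# 50 batches x 8 process parameters (synthetic, standardized scale)
batches = rng.normal(size=(50, 8))
batches[7] += 6.0                        # one deliberately deviant batch

pca = PCA(n_components=2)
scores = pca.fit_transform(batches)      # project onto first two PCs

# squared distance in score space, scaled by each component's variance
d2 = np.sum(scores ** 2 / pca.explained_variance_, axis=1)
outliers = np.where(d2 > np.percentile(d2, 95))[0]
```

On real manufacturing data the flagged batches would then be cross-checked against batch records, as in the troubleshooting use described above.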


CHO gene expression profiling in biopharmaceutical process analysis and design

BIOTECHNOLOGY & BIOENGINEERING, Issue 2 2010
Jochen Schaub
Abstract Increases in both productivity and product yields in biopharmaceutical process development with recombinant-protein-producing mammalian cells can be attributed mainly to advancements in cell line development, media, and process optimization. Only recently have genome-scale technologies enabled a system-level analysis to elucidate the complex biomolecular basis of protein production in mammalian cells, promising increased process understanding and the deduction of knowledge-based approaches for further process optimization. Here, the use of gene expression profiling for the analysis of a low-titer (LT) and a high-titer (HT) fed-batch process using the same IgG-producing CHO cell line was investigated. We found that gene expression (i) differed significantly between HT and LT process conditions due to differences in the applied chemically defined, serum-free media, (ii) changed over the time course of the fed-batch processes, and (iii) affected both metabolic pathways and 14 biological functions such as cellular growth or cell death. Furthermore, detailed analysis of metabolism in a standard process format revealed the potential use of transcriptomics for rational media design, as is shown for the case of lipid metabolism, where the product titer could be increased by about 20% using a lipid-modified basal medium. The results demonstrate that gene expression profiling can be an important tool for mammalian biopharmaceutical process analysis and optimization. Biotechnol. Bioeng. 2010;105:431–438. © 2009 Wiley Periodicals, Inc. [source]
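A common first step in the kind of LT-versus-HT comparison described above is a fold-change screen over replicate expression measurements. The sketch below uses synthetic counts; the gene names and the 2-fold threshold are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
genes = np.array(["lipid_synthase", "housekeeping_a", "stress_marker"])

# 4 replicate cultures x 3 genes, synthetic expression levels
lt = rng.normal(loc=[100, 500, 80], scale=5, size=(4, 3))   # low-titer process
ht = rng.normal(loc=[250, 505, 20], scale=5, size=(4, 3))   # high-titer process

# log2 fold change of mean expression, HT relative to LT
log2fc = np.log2(ht.mean(axis=0) / lt.mean(axis=0))

# flag genes changing more than 2-fold in either direction
changed = genes[np.abs(log2fc) > 1.0]
```

Real profiling pipelines add significance testing and multiple-testing correction on top of the fold change, but the screen above is the core of how candidate pathways (e.g., lipid metabolism) get shortlisted for media design.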


Application of agent-based system for bioprocess description and process improvement

BIOTECHNOLOGY PROGRESS, Issue 3 2010
Ying Gao
Abstract Modeling plays an important role in bioprocess development for design and scale-up. Predictive models can also be used in biopharmaceutical manufacturing to assist decision-making, either to maintain process consistency or to identify optimal operating conditions. To predict whole-bioprocess performance, the strong interactions present in a processing sequence must be adequately modeled. Traditionally, bioprocess modeling considers process units separately, which makes it difficult to capture the interactions between units. In this work, a systematic framework is developed to analyze bioprocesses based on a whole-process understanding, considering the interactions between process operations. An agent-based approach is adopted to provide a flexible infrastructure for the necessary integration of process models. This enables the prediction of overall process behavior, which can then be applied during process development or once manufacturing has commenced, in both cases leading to the capacity for fast evaluation of process improvement options. The multi-agent system comprises a process knowledge base, process models, and a group of functional agents. In this system, agent components cooperate with each other in performing their tasks: describing whole-process behavior, evaluating process operating conditions, monitoring operating processes, predicting critical process performance, and providing guidance for decision-making when coping with process deviations. During process development, the system can be used to evaluate the design space for process operation. During manufacture, the system can be applied to identify abnormal process operation events and then to provide suggestions as to how best to cope with the deviations. In all cases, the function of the system is to ensure an efficient manufacturing process. 
The implementation of the agent-based approach is illustrated via selected application scenarios, which demonstrate how such a framework may enable the better integration of process operations by providing a plant-wide process description to facilitate process improvement. © 2009 American Institute of Chemical Engineers Biotechnol. Prog., 2010 [source]
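The cooperation pattern described above, functional agents sharing a common knowledge base and handing results to one another, can be sketched minimally. The class names, the titer threshold and the advice string are all invented for this illustration:

```python
class KnowledgeBase:
    """Shared store that all agents read from and write to."""
    def __init__(self):
        self.records = {}

class MonitorAgent:
    """Watches a process measurement and classifies its status."""
    def act(self, kb, measurement):
        kb.records["titer"] = measurement           # publish to the shared KB
        return "deviation" if measurement < 0.8 else "normal"

class AdvisorAgent:
    """Turns a status report into guidance for the operator."""
    def act(self, kb, status):
        if status == "deviation":
            return f"investigate feed: titer={kb.records['titer']}"
        return "no action"

kb = KnowledgeBase()
monitor, advisor = MonitorAgent(), AdvisorAgent()
# monitor flags a low titer; advisor reads the shared KB to give guidance
advice = advisor.act(kb, monitor.act(kb, 0.55))
```

A production framework would add asynchronous messaging and model-backed agents, but the division of labor (monitoring, prediction, advice over a shared knowledge base) is the same.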


Good modeling practice for PAT applications: Propagation of input uncertainty and sensitivity analysis

BIOTECHNOLOGY PROGRESS, Issue 4 2009
Gürkan Sin
Abstract Uncertainty and sensitivity analysis are evaluated for their usefulness as part of model-building within Process Analytical Technology (PAT) applications. A mechanistic model describing a batch cultivation of Streptomyces coelicolor for antibiotic production was used as a case study. The input uncertainty resulting from the assumptions of the model was propagated using a Monte Carlo procedure to estimate the output uncertainty. The results showed that significant uncertainty exists in the model outputs. Moreover, the uncertainty in the biomass, glucose, ammonium and base-consumption predictions was found to be low compared to the large uncertainty observed in the antibiotic and off-gas CO2 predictions. The output uncertainty was observed to be lower during the exponential growth phase and higher in the stationary and death phases, meaning the model describes some periods better than others. To understand which input parameters are responsible for the output uncertainty, three sensitivity methods (standardized regression coefficients, Morris screening and differential analysis) were evaluated and compared. The results from these methods were mostly in agreement with each other and revealed that only a few parameters (about 10) out of a total of 56 were mainly responsible for the output uncertainty. Among these significant parameters are ones related to fermentation characteristics such as biomass metabolism, chemical equilibria and mass transfer. Overall, uncertainty and sensitivity analysis are found to be promising for helping to build reliable mechanistic models and to interpret model outputs properly. These tools form part of good modeling practice, which can contribute to successful PAT applications for increased process understanding, operation and control purposes. © 2009 American Institute of Chemical Engineers Biotechnol. Prog., 2009 [source]
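The Monte Carlo propagation and standardized regression coefficient (SRC) steps described above can be sketched on a toy model. Here a Monod-type growth expression stands in for the full 56-parameter fermentation model, and the parameter ranges are invented for the example:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 2000  # Monte Carlo sample size

# draw uncertain inputs from assumed uniform ranges
mu_max = rng.uniform(0.3, 0.5, n)     # max growth rate (1/h)
ks = rng.uniform(0.05, 0.5, n)        # affinity constant (g/L)
s = 2.0                               # fixed substrate level (g/L)

# propagate: evaluate the (toy) model once per input draw
growth = mu_max * s / (ks + s)        # Monod-type output

# SRCs: regression coefficients on standardized inputs and output
X = np.column_stack([mu_max, ks])
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
ys = (growth - growth.mean()) / growth.std()
src, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
```

The spread of `growth` across draws is the propagated output uncertainty; the SRC magnitudes rank which inputs drive it (here `mu_max` dominates, since the substrate level keeps the `ks` term nearly saturated), mirroring how about 10 of 56 parameters were identified in the study.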


Application of Multivariate Data Analysis for Identification and Successful Resolution of a Root Cause for a Bioprocessing Application

BIOTECHNOLOGY PROGRESS, Issue 3 2008
Alime Ozlem Kirdar
Multivariate Data Analysis (MVDA) can be used for supporting key activities required for successful bioprocessing. These activities include process characterization, process scale-up, process monitoring, fault diagnosis and root cause analysis. This paper examines an application of MVDA towards root cause analysis for identifying scale-up differences and parameter interactions that adversely impact cell culture process performance. Multivariate data analysis and modeling were performed using data from small-scale (2 L), pilot-scale (2,000 L) and commercial-scale (15,000 L) batches. The input parameters examined included bioreactor pCO2, glucose, lactate, ammonium, raw materials and seed inocula. The output parameters included product attributes, product titer, viable cell density, cell viability and osmolality. Time course performance variables (daily, initial, peak and end point) were also evaluated. Application of MVDA as a diagnostic tool was successful in identifying the root cause and designing experimental conditions to demonstrate and correct it. Process parameters and their interactions that adversely impact cell culture performance and product attributes were successfully identified. MVDA was successfully used as an effective tool for collating process knowledge and increasing process understanding. [source]
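One simple way to expose the kind of parameter interaction described above is to compare a main-effects-only regression with one that includes a cross term. The sketch below uses synthetic data in which a pCO2 x glucose interaction (names borrowed from the abstract, data invented) drives titer while neither input matters alone:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 200
pco2 = rng.normal(size=n)
glucose = rng.normal(size=n)
# titer depends only on the interaction of the two inputs, plus noise
titer = 1.0 - 0.5 * pco2 * glucose + rng.normal(scale=0.1, size=n)

def r2(X, y):
    """R^2 of an ordinary least squares fit with intercept."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1.0 - resid.var() / y.var()

r2_main = r2(np.column_stack([pco2, glucose]), titer)          # main effects only
r2_int = r2(np.column_stack([pco2, glucose, pco2 * glucose]), titer)
```

The jump in fit quality when the cross term is added is the signature of an interaction that a one-parameter-at-a-time analysis would miss, which is why projection-based MVDA over all parameters at once is effective for root cause work.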