Design Paradigm (design + paradigm)

Selected Abstracts


A predictive high-throughput scale-down model of monoclonal antibody production in CHO cells

BIOTECHNOLOGY & BIOENGINEERING, Issue 6 2009
Rachel Legmann
Abstract Multi-factorial experimentation is essential in understanding the link between mammalian cell culture conditions and the glycoprotein product of any biomanufacturing process. This understanding is increasingly demanded as bioprocess development is influenced by the Quality by Design paradigm. We have developed a system that allows hundreds of micro-bioreactors to be run in parallel under controlled conditions, enabling factorial experiments of much larger scope than is possible with traditional systems. A high-throughput analytics workflow was also developed using commercially available instruments to obtain product quality information for each cell culture condition. The micro-bioreactor system was tested by executing a factorial experiment varying four process parameters: pH, dissolved oxygen, feed supplement rate, and reduced glutathione level. A total of 180 micro-bioreactors were run for 2 weeks during this DOE experiment to assess this scaled-down micro-bioreactor system as a high-throughput tool for process development. Online measurements of pH, dissolved oxygen, and optical density were complemented by offline measurements of glucose, viability, titer, and product quality. Model accuracy was assessed by regressing the micro-bioreactor results against those obtained in conventional 3-L bioreactors. Excellent agreement was observed between the micro-bioreactor and the bench-top bioreactor. The micro-bioreactor results were further analyzed to link parameter manipulations to process outcomes via leverage plots, and to examine the interactions between process parameters. The results show that feed supplement rate has a significant effect (P < 0.05) on all performance metrics, with higher feed rates resulting in greater cell mass and product titer. Culture pH impacted terminal integrated viable cell concentration, titer, and intact immunoglobulin G titer, with better results obtained at the lower pH set point. The results demonstrate that a micro-scale system can be an excellent model of larger scale systems, while providing data sets broader and deeper than are available by traditional methods. Biotechnol. Bioeng. 2009; 104: 1107-1120. © 2009 Wiley Periodicals, Inc. [source]
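
As a concrete illustration of the kind of factorial analysis the abstract describes, the sketch below estimates main effects from a two-level screen over the four process parameters. It is a minimal sketch, not the paper's analysis: the coded design, the simulated titer response, and the effect sizes are all invented, loosely echoing the reported feed-rate and pH findings.

```python
import itertools

import numpy as np

# Hypothetical 2^4 full factorial over the four process parameters,
# coded to -1/+1 levels. The response below is simulated, not the
# paper's data: a strong feed-rate effect plus a smaller pH effect.
factors = ["pH", "DO", "feed_rate", "glutathione"]
X = np.array(list(itertools.product([-1.0, 1.0], repeat=4)))

rng = np.random.default_rng(0)
titer = 1.0 + 0.40 * X[:, 2] - 0.15 * X[:, 0] + rng.normal(0, 0.05, len(X))

# Least-squares fit of an intercept-plus-main-effects model.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, titer, rcond=None)
for name, c in zip(["intercept"] + factors, coef):
    print(f"{name:12s} estimate: {c:+.3f}")
```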


A Practical Approach to the Design, Monitoring, and Optimization of In Situ MTBE Aerobic Biobarriers

GROUND WATER MONITORING & REMEDIATION, Issue 1 2010
Paul C. Johnson
A paradigm for the design, monitoring, and optimization of in situ methyl tert-butyl ether (MTBE) aerobic biobarriers is presented. In this technology, an oxygen-rich biologically reactive treatment zone (the "biobarrier") is established in situ and downgradient of the source of dissolved MTBE contamination in groundwater, typically gasoline-impacted soils resulting from leaks and spills at service station sites or other fuel storage and distribution facilities. The system is designed so that groundwater containing dissolved MTBE flows to, and through, the biobarrier treatment zone, ideally under natural gradient conditions so that no pumping is necessary. As the groundwater passes through the biobarrier, the MTBE is converted by microorganisms to innocuous by-products. The system also reduces concentrations of other aerobically degradable chemicals dissolved in the groundwater, such as benzene, toluene, xylenes, and tert-butyl alcohol. This design paradigm is based on experience gained while designing, monitoring, and optimizing pilot-scale and full-scale MTBE biobarrier systems. It is largely empirically based, although the design approach does rely on simple engineering calculations. The paradigm emphasizes gas injection-based oxygen delivery schemes, although many of the steps would be common to other methods of delivering oxygen to aquifers. [source]
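
The abstract does not spell out the "simple engineering calculations", but a typical first-pass sizing step for an aerobic biobarrier is a stoichiometric oxygen-demand estimate: complete aerobic mineralization of MTBE (C5H12O + 7.5 O2 → 5 CO2 + 6 H2O) requires about 2.7 g of O2 per g of MTBE. The sketch below applies that ratio to a hypothetical mass flux through the treatment zone; every site parameter is a made-up placeholder, not a value from the paper.

```python
# Back-of-the-envelope oxygen demand for an aerobic MTBE biobarrier.
# Stoichiometry: C5H12O + 7.5 O2 -> 5 CO2 + 6 H2O.
# All site parameters below are hypothetical placeholders.

MW_MTBE = 88.15          # g/mol
MW_O2 = 32.00            # g/mol
o2_per_mtbe = 7.5 * MW_O2 / MW_MTBE   # ~2.72 g O2 per g MTBE

darcy_velocity = 0.1     # m/day, groundwater specific discharge (hypothetical)
width = 30.0             # m, biobarrier width transverse to flow
thickness = 3.0          # m, treated aquifer thickness
c_mtbe = 2.0             # g/m^3 (= mg/L), dissolved MTBE entering the zone

water_flux = darcy_velocity * width * thickness   # m^3/day through the zone
mtbe_flux = water_flux * c_mtbe                   # g MTBE/day
o2_demand = mtbe_flux * o2_per_mtbe               # g O2/day, stoichiometric minimum

safety_factor = 3.0      # margin for other aerobically degradable compounds
print(f"MTBE mass flux:        {mtbe_flux:.1f} g/day")
print(f"O2 demand (min):       {o2_demand:.1f} g/day")
print(f"O2 target with margin: {safety_factor * o2_demand:.1f} g/day")
```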


A combined iterative scheme for identification and control redesigns

INTERNATIONAL JOURNAL OF ADAPTIVE CONTROL AND SIGNAL PROCESSING, Issue 8 2004
Paresh Date
Abstract This work proposes a unified algorithm for identification and control. Frequency domain data of the plant is weighted to satisfy the given performance specifications. A model is then identified from this weighted frequency domain data and a controller is synthesised using the H∞ loop-shaping design procedure. The cost function used in the identification stage essentially minimizes a tight upper bound on the difference between the achieved and the designed performance in the sense of the H∞ loop-shaping design paradigm. Given a model, a method is also suggested to readjust the model and weighting transfer functions to further reduce the worst-case chordal distance between the weighted true plant and the model. Copyright © 2004 John Wiley & Sons, Ltd. [source]
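
For intuition about the fit criterion, the pointwise chordal distance between two complex frequency-response values p and q is |p - q| / (sqrt(1 + |p|^2) sqrt(1 + |q|^2)), and the worst case is its supremum over frequency. A minimal numpy sketch follows; the two transfer functions are arbitrary stand-ins for the weighted plant and the identified model, not systems from the paper.

```python
import numpy as np

def chordal_distance(p, q):
    """Pointwise chordal distance between two complex frequency responses."""
    return np.abs(p - q) / (np.sqrt(1 + np.abs(p) ** 2) * np.sqrt(1 + np.abs(q) ** 2))

# Arbitrary stand-ins for the weighted "true" plant and the identified model:
#   G(s)  = 1 / (s^2 + 0.4 s + 1)
#   Gm(s) = 1 / (s^2 + 0.5 s + 1.1)
w = np.logspace(-2, 2, 500)
s = 1j * w
G = 1.0 / (s**2 + 0.4 * s + 1.0)
Gm = 1.0 / (s**2 + 0.5 * s + 1.1)

d = chordal_distance(G, Gm)
print(f"worst-case chordal distance: {d.max():.4f} at w = {w[d.argmax()]:.3f} rad/s")
```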


Polynomial control: past, present, and future

INTERNATIONAL JOURNAL OF ROBUST AND NONLINEAR CONTROL, Issue 8 2007
Vladimír Kučera
Abstract Polynomial techniques have made important contributions to systems and control theory. Engineers in industry often find polynomial and frequency domain methods easier to use than state equation-based techniques. Control theorists show that results obtained in isolation using either approach are in fact closely related. Polynomial system description provides input-output models for linear systems with rational transfer functions. These models display two important system properties, namely poles and zeros, in a transparent manner. A performance specification in terms of polynomials is natural in many situations; see pole allocation techniques. A specific control system design technique, called the polynomial equation approach, was developed in the 1960s and 1970s. The distinguishing feature of this technique is a reduction of controller synthesis to a solution of linear polynomial equations of a specific (Diophantine or Bézout) type. In most cases, control systems are designed to be stable and meet additional specifications, such as optimality and robustness. It is therefore natural to design the systems step by step: stabilization first, then the additional specifications one at a time. For this it is obviously necessary to have any and all solutions of the current step available before proceeding any further. This motivates the need for a parametrization of all controllers that stabilize a given plant. In fact, this result has become a key tool for the sequential design paradigm. The additional specifications are met by selecting an appropriate parameter. This is simple, systematic, and transparent. However, the strategy suffers from an excessive growth of the controller order. This article is a guided tour through polynomial control system design. The origins of the parametrization of stabilizing controllers, called the Youla-Kučera parametrization, are explained. Standard results on reference tracking, disturbance elimination, pole placement, deadbeat control, H2 control, l1 control, and robust stabilization are summarized. New and exciting applications of the Youla-Kučera parametrization are then discussed: stabilization subject to input constraints, output overshoot reduction, and fixed-order stabilizing controller design. Copyright © 2006 John Wiley & Sons, Ltd. [source]
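
As a small worked instance of the polynomial equation approach, the sketch below does pole placement by solving the Diophantine (Bézout) equation a(s)x(s) + b(s)y(s) = c(s) through its Sylvester matrix. This is a textbook illustration under simplifying assumptions (minimal controller degree, invented plant and target polynomial), not code from the article.

```python
import numpy as np

def shifted(p, k, n_rows):
    """Coefficients of p(s) * s^k, left-padded with zeros to length n_rows."""
    q = np.concatenate([p, np.zeros(k)])
    return np.concatenate([np.zeros(n_rows - len(q)), q])

def solve_diophantine(a, b, c):
    """Solve a(s)x(s) + b(s)y(s) = c(s) for the minimal-order x, y.

    Coefficient arrays are highest power first (numpy convention);
    deg x = deg y = deg a - 1 is assumed, so c must have degree 2*deg(a) - 1.
    """
    n = len(a) - 1                  # deg a
    m = n - 1                       # deg x = deg y
    rows = len(c)
    cols = [shifted(a, k, rows) for k in range(m, -1, -1)]   # x coefficients
    cols += [shifted(b, k, rows) for k in range(m, -1, -1)]  # y coefficients
    sol = np.linalg.solve(np.column_stack(cols), c)
    return sol[: m + 1], sol[m + 1:]

# Plant b/a = 1 / (s^2 + 3s + 2); place the closed-loop poles via the
# target polynomial c(s) = s^3 + 4s^2 + 6s + 4 (arbitrary, stable).
a = np.array([1.0, 3.0, 2.0])
b = np.array([1.0])
c = np.array([1.0, 4.0, 6.0, 4.0])
x, y = solve_diophantine(a, b, c)
print("x(s) coefficients:", x)   # controller denominator
print("y(s) coefficients:", y)   # controller numerator
# Check: a*x + b*y reproduces c.
print("a*x + b*y =", np.polyadd(np.polymul(a, x), np.polymul(b, y)))
```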


Reliable multicast via satellite: a comparison survey and taxonomy

INTERNATIONAL JOURNAL OF SATELLITE COMMUNICATIONS AND NETWORKING, Issue 1 2001
Martin W. Koyabe
Abstract IP multicasting is an important service, which will be provided by the next generation Internet. A range of applications has emerged that take advantage of multicast delivery. However, several factors currently hinder large-scale deployment of terrestrial multicast services. It is particularly difficult to support delivery to large groups of users. Satellites offer a natural way to extend the multicast service to reach this large number of users. They may offer high capacity (especially when using next generation satellite systems) and also eliminate the need for a large number of intermediate routing hops. There are important differences in the way multicast applications operate over satellite. This paper therefore reviews the key design paradigm and offers a critical comparison between different reliable multicast protocol techniques based on a taxonomy. The suitability of a set of the most common reliable multicast protocols is assessed within a satellite environment and conclusions are presented. Copyright © 2001 John Wiley & Sons, Ltd. [source]
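
Many of the protocols such surveys compare are NACK-based, which suits the satellite case: receivers report sequence-number gaps, and a single retransmission over the broadcast channel repairs every receiver. The sketch below shows only the receiver-side gap detection in a transport-agnostic form; it is an illustrative toy, not a protocol from the paper, and it omits the randomized NACK-suppression timers real schemes use.

```python
class NackReceiver:
    """Receiver-side gap detection for a NACK-based reliable multicast scheme."""

    def __init__(self):
        self.next_expected = 0
        self.out_of_order = set()

    def on_packet(self, seq):
        """Process a data packet; return the sequence numbers to NACK.

        Real protocols delay NACKs behind randomized timers so that mere
        reordering is not reported and duplicate NACKs are suppressed;
        that machinery is omitted here.
        """
        if seq < self.next_expected or seq in self.out_of_order:
            return []                                    # duplicate, ignore
        missing = list(range(self.next_expected, seq))   # gap => candidates
        self.out_of_order.add(seq)
        # Advance past any now-contiguous prefix.
        while self.next_expected in self.out_of_order:
            self.out_of_order.remove(self.next_expected)
            self.next_expected += 1
        return [m for m in missing if m not in self.out_of_order]

rx = NackReceiver()
for seq in [0, 1, 4, 2, 3, 5]:
    nacks = rx.on_packet(seq)
    if nacks:
        print(f"packet {seq} arrived -> NACK {nacks}")
```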


Analyzing experiments with degradation data for improving reliability and for achieving robust reliability

QUALITY AND RELIABILITY ENGINEERING INTERNATIONAL, Issue 5 2001
Chih-Hua Chiao
Abstract Statistically designed experiments provide a proactive means for improving reliability; moreover, they can be used to design products that are robust to noise factors which are hard or impossible to control. Traditionally, failure-time data have been collected; for high-reliability products, it is unlikely that failures will occur in a reasonable testing period, so the experiment will be uninformative. An alternative, however, is to collect degradation data. Take, for example, fluorescent lamps whose light intensity decreases over time. Observation of light-intensity degradation paths, given that they are smooth, provides information about the reliability of the lamp, and does not require the lamps to fail. This paper considers experiments with such data for 'reliability improvement', as well as for 'robust reliability achievement' using Taguchi's robust design paradigm. A two-stage maximum-likelihood analysis based on a nonlinear random-effects model is proposed and illustrated with data from two experiments. One experiment considers the reliability improvement of fluorescent lamps. The other experiment focuses on robust reliability improvement of light-emitting diodes. Copyright © 2001 John Wiley & Sons, Ltd. [source]
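
A generic version of the two-stage idea: fit a parametric degradation path to each unit, then treat the fitted unit-level parameters as random effects and read off pseudo failure times where paths cross a threshold. The exponential-decay model, the simulated lamp data, and the 60% failure threshold below are placeholders, not the paper's model or data.

```python
import numpy as np
from scipy.optimize import curve_fit

def decay(t, beta0, beta1):
    """Placeholder degradation path: intensity = beta0 * exp(-beta1 * t)."""
    return beta0 * np.exp(-beta1 * t)

rng = np.random.default_rng(1)
t = np.linspace(0, 2000, 21)            # hours

# Stage 0: simulate smooth degradation paths for a few lamps (invented data).
units = []
for _ in range(8):
    b0 = rng.normal(100.0, 3.0)         # unit-specific initial intensity
    b1 = rng.normal(2e-4, 3e-5)         # unit-specific decay rate
    units.append(decay(t, b0, b1) + rng.normal(0, 0.5, t.size))

# Stage 1: per-unit nonlinear least-squares fits.
params = np.array([curve_fit(decay, t, y, p0=(100.0, 1e-4))[0] for y in units])

# Stage 2: treat the fitted (beta0, beta1) as random effects across units.
mean, cov = params.mean(axis=0), np.cov(params.T)
print("mean (beta0, beta1):", mean)

# Pseudo failure time: first crossing of 60% of initial intensity,
# i.e. t_f = ln(1/0.6) / beta1 for this model.
t_fail = np.log(1 / 0.6) / params[:, 1]
print("pseudo failure times (h):", np.round(t_fail))
```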


Design of experiments with unknown parameters in variance

APPLIED STOCHASTIC MODELS IN BUSINESS AND INDUSTRY, Issue 3 2002
Valerii V. Fedorov
Abstract Model fitting when the variance function depends on unknown parameters is a common problem in many areas of research. Iterated estimators which are asymptotically equivalent to maximum likelihood estimators are proposed and their convergence is discussed. From a computational point of view, these estimators are very close to the iteratively reweighted least-squares methods. The additive structure of the corresponding information matrices allows us to apply convex design theory, which leads to optimal design algorithms. We conclude with examples which illustrate how to bridge our general results with specific applied needs. In particular, a model with experimental costs is introduced and is studied within the normalized design paradigm. Copyright © 2002 John Wiley & Sons, Ltd. [source]
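
To make the iteratively-reweighted-least-squares connection concrete, the sketch below fits a linear model with a power-of-the-mean variance, Var(y) = sigma^2 * mu^theta, alternating a weighted least-squares update for the mean parameters with a crude moment-based update for theta. It is a sketch of the general idea only; the variance model, data, and update rule are invented and simpler than the paper's ML iteration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Invented heteroscedastic linear model: Var(y) = sigma^2 * mu^theta.
n = 400
x = rng.uniform(1.0, 10.0, n)
X = np.column_stack([np.ones(n), x])
beta_true, theta_true = np.array([2.0, 1.5]), 1.2
mu = X @ beta_true
y = mu + rng.normal(0, 0.3 * mu ** (theta_true / 2))

beta = np.linalg.lstsq(X, y, rcond=None)[0]   # start from OLS
for _ in range(20):
    mu_hat = np.clip(X @ beta, 1e-8, None)
    # Update theta: log e_i^2 ~ const + theta * log(mu_i) under this model.
    r2 = np.clip((y - mu_hat) ** 2, 1e-12, None)
    A = np.column_stack([np.ones(n), np.log(mu_hat)])
    theta = np.linalg.lstsq(A, np.log(r2), rcond=None)[0][1]
    # Reweight and refit beta by weighted least squares.
    w = 1.0 / mu_hat ** theta
    Xw, yw = X * np.sqrt(w)[:, None], y * np.sqrt(w)
    beta = np.linalg.lstsq(Xw, yw, rcond=None)[0]

print("beta estimate: ", beta)     # roughly recovers (2.0, 1.5)
print("theta estimate:", theta)    # roughly recovers 1.2
```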