Artificial Intelligence
Selected Abstracts

A case study of a cooperative learning experiment in artificial intelligence
COMPUTER APPLICATIONS IN ENGINEERING EDUCATION, Issue 4 2007. Fernando Díez
Abstract This article describes an innovative teaching experiment (part of a project for Innovation in Teaching at the Universidad Autónoma de Madrid) undertaken by the authors during the first semester of the academic year 2004/2005. The experiment was evaluated by the students as part of their coursework and consisted of the use of the groupware system KnowCat, with which the students built a repository of documents on topics and themes associated with the subject matter (Artificial Intelligence). During the process of elaboration, both the votes for the best documents and the annotations made about them play an essential role. The documents are produced exclusively by the students, and it is they who decide, through their own activity, which of the submitted documents are chosen as representative of the entire collection. © 2007 Wiley Periodicals, Inc. Comput Appl Eng Educ 15: 308-316, 2007; Published online in Wiley InterScience (www.interscience.wiley.com); DOI 10.1002/cae.20114 [source]

Knowledge-based system for structured examination, diagnosis and therapy in treatment of traumatised teeth
DENTAL TRAUMATOLOGY, Issue 1 2001. A. Robertson
Abstract Dental trauma in children and adolescents is a common problem, and the prevalence of these injuries has increased in the last 10-20 years. A dental injury should always be considered an emergency and, thus, be treated immediately to relieve pain, facilitate reduction of displaced teeth, reconstruct lost hard tissue, and improve prognosis. Rational therapy depends upon a correct diagnosis, which can be achieved with the aid of various examination techniques. It must be understood that an incomplete examination can lead to inaccurate diagnosis and less successful treatment. Good knowledge of traumatology and models of treatment can also reduce stress and anxiety for both the patient and the dental team. Knowledge-based systems (KBS) are a practical implementation of artificial intelligence. In complex domains which humans find difficult to understand, KBS can assist in making decisions and can also add knowledge. The aim of this paper is to describe the structure of a knowledge-based system for structured examination, diagnosis and therapy for traumatised primary and permanent teeth. A commercially available program (XpertRule, Attar, London, UK) was used as the development tool for the programming. The paper presents a model for a computerised decision support system for traumatology. [source]

Predicting Spray Processing Parameters from Required Coating Structural Attributes by Artificial Intelligence
ADVANCED ENGINEERING MATERIALS, Issue 7 2006. A.-F. Kanta
Predicting the processing parameters needed to manufacture a coating with the required structural attributes is of prime interest for reducing the associated development costs. Among other advantages, such an approach makes it possible to select the most appropriate scheme among the several that could be implemented. This paper presents such an approach for the specific case of predicting plasma spray process parameters to manufacture a grey alumina (Al2O3-TiO2, 13% by wt.) coating. [source]
Maximum entropy inference for mixed continuous-discrete variables
INTERNATIONAL JOURNAL OF INTELLIGENT SYSTEMS, Issue 4 2010. Hermann Singer
We represent knowledge by probability distributions of mixed continuous and discrete variables. From the joint distribution of all items, one can compute arbitrary conditional distributions, which may be used for prediction. However, in many cases only some marginal distributions, inverse probabilities, or moments are known. Under these conditions, a principle is needed to determine the full joint distribution of all variables. The principle of maximum entropy (Jaynes, Phys Rev 1957;106:620-630 and 1957;108:171-190; Jaynes, Probability Theory: The Logic of Science, Cambridge, UK: Cambridge University Press, 2003; Haken, Synergetics, Berlin: Springer-Verlag, 1977; Guiasu and Shenitzer, Math Intell 1985;117:83-106) ensures an unbiased estimation of the full multivariate relationships by using only known facts. For the case of discrete variables, the expert shell SPIRIT implements this approach (cf. Rödder, Artif Intell 2000;117:83-106; Rödder and Meyer, in Proceedings of the 12th Conference on Uncertainty in Artificial Intelligence, San Francisco, CA, 2006; Rödder et al., Logical J IGPL 2006;14(3):483-500). In this paper, the approach is generalized to continuous and mixed continuous-discrete distributions and applied to the problem of credit scoring. © 2010 Wiley Periodicals, Inc. [source]
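To make the principle concrete, here is a minimal sketch that finds the maximum entropy distribution of a single discrete variable subject to a known mean. The support values and the moment constraint are invented for illustration; this is not the SPIRIT shell, and it does not cover the paper's mixed continuous-discrete generalization.

```python
# A minimal sketch of maximum entropy inference for one discrete variable:
# find the distribution that maximizes entropy subject to known moments.
# Support values and the known mean are illustrative assumptions.
import numpy as np
from scipy.optimize import minimize

values = np.array([1.0, 2.0, 3.0, 4.0])    # hypothetical support of the variable
known_mean = 2.1                            # the only "fact" assumed to be known

def neg_entropy(p):
    p = np.clip(p, 1e-12, 1.0)              # avoid log(0)
    return np.sum(p * np.log(p))            # minimizing this maximizes entropy

constraints = [
    {"type": "eq", "fun": lambda p: p.sum() - 1.0},           # normalization
    {"type": "eq", "fun": lambda p: p @ values - known_mean},  # mean constraint
]
p0 = np.full(len(values), 1.0 / len(values))
res = minimize(neg_entropy, p0, bounds=[(0.0, 1.0)] * len(values),
               constraints=constraints, method="SLSQP")
print("maximum entropy distribution:", res.x.round(4))
```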
The Role of Statistics in the Data Revolution?
INTERNATIONAL STATISTICAL REVIEW, Issue 1 2001. Jerome H. Friedman
Summary The nature of data is rapidly changing. Data sets are becoming increasingly large and complex. Modern methodology for analyzing these new types of data is emerging from the fields of Database Management, Artificial Intelligence, Machine Learning, Pattern Recognition, and Data Visualization. So far Statistics as a field has played a minor role. This paper explores some of the reasons for this, and why statisticians should have an interest in participating in the development of new methods for large and complex data sets. [source]

Implementing a CMC tutor group for an existing distance education course
JOURNAL OF COMPUTER ASSISTED LEARNING, Issue 3 2000. M. Weller
Abstract 'Artificial Intelligence for Technology' (T396) is a distance learning course provided by the Open University of the UK using face-to-face tutorials. In 1997 a pilot study was undertaken of a computer-mediated communication (CMC) tutor group which consisted of volunteers from around the UK. The student feedback raised a number of issues, including the need for a distinct function for the tutor group conference, the role of and demands on the tutor, and the benefits perceived by students. It is suggested that some issues arise from a conflict of cultures, each with its own implicit assumptions. The traditional face-to-face tutorial model is sometimes at variance with the demands of the new CMC-based tuition. [source]

Book review: Bio-Inspired Artificial Intelligence: Theories, Methods, and Technologies
AMERICAN JOURNAL OF HUMAN BIOLOGY, Issue 5 2009. Jackie Chappell
No abstract is available for this article. [source]

Preference-Based Constrained Optimization with CP-Nets
COMPUTATIONAL INTELLIGENCE, Issue 2 2004. Craig Boutilier
Many artificial intelligence (AI) tasks, such as product configuration, decision support, and the construction of autonomous agents, involve a process of constrained optimization, that is, optimization of behavior or choices subject to given constraints. In this paper we present an approach to constrained optimization based on a set of hard constraints and a preference ordering represented using a CP-network, a graphical model for representing qualitative preference information. This approach offers both pragmatic and computational advantages. First, it provides a convenient and intuitive tool for specifying the problem, and in particular, the decision maker's preferences. Second, it admits an algorithm for finding the most preferred feasible (Pareto-optimal) outcomes with the following anytime property: the preferred feasible outcomes are enumerated without backtracking. In particular, the first feasible solution generated by this algorithm is Pareto optimal. [source]
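A toy sketch of the anytime idea described above: outcomes are generated in an order consistent with the CP-net's conditional preferences, and feasible ones are emitted as they are found, so the first feasible outcome returned is the most preferred one under this search order. The variables ('dish', 'wine'), the preference tables, and the hard constraint are invented for illustration, not taken from the paper.

```python
# A toy sketch: walk outcomes of a CP-net in an order consistent with the
# qualitative preferences, yielding feasible outcomes as they are found
# (anytime, no backtracking over yielded solutions). All names are invented.
def preferred_order(var, assignment):
    """Return this variable's values, most preferred first, given its parents."""
    if var == "wine":
        # Conditional preference: red with meat, otherwise white first.
        return ["red", "white"] if assignment.get("dish") == "meat" else ["white", "red"]
    if var == "dish":
        return ["meat", "fish"]   # unconditional preference

def outcomes(variables, assignment=None):
    assignment = assignment or {}
    if not variables:
        yield dict(assignment)    # complete outcome (copy of the assignment)
        return
    var, rest = variables[0], variables[1:]
    for value in preferred_order(var, assignment):
        assignment[var] = value
        yield from outcomes(rest, assignment)
    del assignment[var]

def feasible(o):
    return not (o["dish"] == "fish" and o["wine"] == "red")  # a hard constraint

# The first feasible outcome generated is the preferred feasible one.
for o in outcomes(["dish", "wine"]):
    if feasible(o):
        print(o)
        break
```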
Automated image-based phenotypic analysis in zebrafish embryos
DEVELOPMENTAL DYNAMICS, Issue 3 2009. Andreas Vogt
Abstract Presently, the zebrafish is the only vertebrate model compatible with contemporary paradigms of drug discovery. Zebrafish embryos are amenable to the automation necessary for high-throughput chemical screens, and their optical transparency makes them potentially suited for image-based screening. However, the lack of tools for automated analysis of complex images presents an obstacle to using the zebrafish as a high-throughput screening model. We have developed an automated system for imaging and analyzing zebrafish embryos in multi-well plates regardless of embryo orientation and without user intervention. Images of fluorescent embryos were acquired on a high-content reader and analyzed using an artificial intelligence-based image analysis method termed Cognition Network Technology (CNT). CNT reliably detected transgenic fluorescent embryos (Tg(fli1:EGFP)y1) arrayed in 96-well plates and quantified intersegmental blood vessel development in embryos treated with small molecule inhibitors of angiogenesis. The results demonstrate that it is feasible to adapt image-based high-content screening methodology to measure complex whole-organism phenotypes. Developmental Dynamics 238:656-663, 2009. © 2009 Wiley-Liss, Inc. [source]

Automated generation of new knowledge to support managerial decision-making: case study in forecasting a stock market
EXPERT SYSTEMS, Issue 4 2004. Se-Hak Chun
Abstract: The deluge of data available to managers underscores the need to develop intelligent systems to generate new knowledge. Such tools are available in the form of learning systems from artificial intelligence. This paper explores how these novel tools can support decision-making in the ubiquitous managerial task of forecasting. For concreteness, the methodology is examined in the context of predicting a financial index whose chaotic properties render the time series difficult to predict. The study investigates the circumstances under which enough new knowledge can be extracted from temporal data to overturn the efficient markets hypothesis. The efficient markets hypothesis precludes the possibility of anticipating price movements in financial markets. More precisely, the markets are deemed to be so efficient that the best forecast of the price level for the subsequent period is precisely the current price. Certain anomalies to the efficient market premise have been observed, such as calendar effects. Even so, forecasting techniques have been largely unable to outperform the random walk model, which corresponds to the behavior of prices under the efficient markets hypothesis. This paper tests the validity of the efficient markets hypothesis by developing knowledge-based tools to forecast a market index. The predictions are examined across several horizons: single-period forecasts as well as multiple periods. For multiperiod forecasts, the predictive methodology takes two forms: a single jump from the current period to the end of the forecast horizon, and a multistage web of forecasts which progresses systematically from one period to the next. These models are first evaluated using neural networks and case-based reasoning, and are then compared against a random walk model. The computational models are examined in the context of forecasting a composite index for the Korean stock market. [source]
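The two multiperiod strategies in this abstract can be sketched as follows: a "single jump" model fitted directly at the forecast horizon, and a multistage chain of one-step forecasts, both compared against the random walk benchmark. A simple least-squares AR(1) model stands in for the paper's neural networks and case-based reasoning, and the series is synthetic.

```python
# Sketch: single-jump vs. multistage multiperiod forecasting vs. random walk.
# The data and the AR(1) stand-in model are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
y = np.cumsum(rng.normal(size=300))         # synthetic price-like series
h = 5                                        # forecast horizon
train, test = y[:250], y[250:]

def fit_lag_model(series, lag):
    """Least-squares fit of y[t+lag] = a*y[t] + b."""
    X = np.column_stack([series[:-lag], np.ones(len(series) - lag)])
    coef, *_ = np.linalg.lstsq(X, series[lag:], rcond=None)
    return coef

one_step = fit_lag_model(train, 1)           # reused step by step (multistage)
direct = fit_lag_model(train, h)             # single jump to the horizon

def multistage(last, steps, coef):
    for _ in range(steps):
        last = coef[0] * last + coef[1]      # feed each forecast into the next
    return last

errors = {"random walk": [], "single jump": [], "multistage": []}
for t in range(len(test) - h):
    actual, last = test[t + h], test[t]
    errors["random walk"].append(abs(actual - last))          # forecast = current price
    errors["single jump"].append(abs(actual - (direct[0] * last + direct[1])))
    errors["multistage"].append(abs(actual - multistage(last, h, one_step)))

for name, errs in errors.items():
    print(f"{name}: MAE = {np.mean(errs):.3f}")
```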
Some applications of fuzzy logic in rule-based expert systems
EXPERT SYSTEMS, Issue 4 2002. Trung T. Pham
Fuzzy logic has been used as a means of interpreting vague, incomplete and even contradictory information into a compromised rule base in artificial intelligence, for example in machine decision-making. Within this context, fuzzy logic can be applied in the field of expert systems to provide additional flexibility in constructing a working rule base: different experts' opinions can be incorporated into the same rule base, and each opinion can be modeled in the rather vague notions of human language. As illustrative application examples, this paper describes how fuzzy logic can be used in expert systems. More precisely, it demonstrates the following applications: (i) a healthcare diagnostic system, (ii) an autofocus camera lens system and (iii) a financial decision system. For each application, the basic rules are described, the calculation method is outlined and a numerical simulation is provided. These applications demonstrate the suitability and performance of fuzzy logic in expert systems. [source]
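A minimal Mamdani-style sketch of how vague rules combine into one graded decision, loosely in the spirit of the healthcare diagnostic example. The membership functions, rules, and numbers are invented for illustration, not taken from the paper.

```python
# Sketch of fuzzy rule evaluation: fuzzify an input, fire two vague rules,
# and defuzzify to a single degree of concern. All values are invented.
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def diagnose(temp_c):
    fever_mild = tri(temp_c, 36.5, 37.5, 38.5)
    fever_high = tri(temp_c, 37.5, 39.5, 41.5)
    # Two vague expert rules:
    #   IF fever is mild THEN concern is low
    #   IF fever is high THEN concern is high
    concern_low, concern_high = fever_mild, fever_high
    # Defuzzify by a weighted average of the conclusions (0 = low, 1 = high).
    total = concern_low + concern_high
    return 0.0 if total == 0 else concern_high / total

for t in (37.0, 38.0, 39.2):
    print(f"temperature {t} C -> degree of concern {diagnose(t):.2f}")
```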
Describing generic expertise models as object-oriented analysis patterns: the heuristic multi-attribute decision pattern
EXPERT SYSTEMS, Issue 3 2002. Ángeles Manjarrés
We report on work concerning the use of object-oriented analysis and design (OAD) methods in the development of artificial intelligence (AI) software applications, in which we compare such techniques to software development methods more commonly used in AI, in particular CommonKADS. As a contribution to clarifying the role of OAD methods in AI, in this paper we compare the analysis models of the object-oriented methods and the CommonKADS high-level expertise model. In particular, we study the correspondences between generic tasks, methods and ontologies in methodologies such as CommonKADS and analysis patterns in object-oriented analysis. Our aim in carrying out this study is to explore to what extent, in areas of AI where the object-oriented paradigm may be the most adequate way of conceiving applications, an analysis-level 'pattern language' could play the role of the libraries of generic knowledge models in the more commonly used AI software development methods. As a case study we use the decision task, whose importance arises from its status as the basic task of the intelligent agent, and the associated heuristic multi-attribute decision method, for which we derive a corresponding decision pattern described in the Unified Modelling Language, a de facto standard in OAD. [source]

Selection of knowledge acquisition techniques based upon the problem domain characteristics of production and operations management expert systems
EXPERT SYSTEMS, Issue 2 2001. William P. Wagner
The application of expert systems to various problem domains in business has grown steadily since their introduction. Regardless of the chosen method of development, the most commonly cited problems in developing these systems are the unavailability of both the experts and knowledge engineers and the difficulty of acquiring knowledge from domain experts. Within the field of artificial intelligence, this has been called the 'knowledge acquisition' problem and has been identified as the greatest bottleneck in the expert system development process. Simply stated, the problem is how to acquire the specific knowledge for a well-defined problem domain efficiently from one or more experts and represent it in the appropriate computer format. Given the 'paradox of expertise', experts have often proceduralized their knowledge to the point that they have difficulty explaining exactly what they know and how they know it. However, empirical research in the field of expert systems reveals that certain knowledge acquisition techniques are significantly more efficient than others in helping to extract certain types of knowledge within specific problem domains. In this paper we present a mapping between these empirical studies and a generic taxonomy of expert system problem domains. In so doing, certain knowledge acquisition techniques can be prescribed based on the problem domain characteristics. With the production and operations management (P/OM) field as the pilot area for the current study, we first examine the range of problem domains and suggest a mapping of P/OM tasks to a generic taxonomy of problem domains. We then describe the most prominent knowledge acquisition techniques. Based on an examination of the existing empirical knowledge acquisition research, we show how this work can be used to guide developers of expert systems in the field of P/OM. [source]

Intelligent interaction design: the role of human-computer interaction research in the design of intelligent systems
EXPERT SYSTEMS, Issue 1 2001. Ann Blandford
As more intelligent systems are introduced into the marketplace, it is becoming increasingly urgent to consider usability for such systems. Historically, the two fields of artificial intelligence (AI) and human-computer interaction (HCI) have had little in common. In this paper, we consider how established HCI techniques can usefully be applied to the design and evaluation of intelligent systems, and where there is an urgent need for new approaches. Some techniques, notably those for requirements acquisition and empirical evaluation, can usefully be adopted, and indeed are, within many projects. However, many of the tools and techniques developed within HCI to support design and theory-based evaluation cannot be applied in their present forms to intelligent systems because they are based on inappropriate assumptions; there is consequently a need for new approaches. Conversely, there are approaches that have been developed within AI, e.g. in research on dialogue and on ontologies, that could usefully be adapted and encapsulated to respond to this need. These should form the core of a future research agenda for intelligent interaction design. [source]

Virtual Servants: Stereotyping Female Front-Office Employees on the Internet
GENDER, WORK & ORGANISATION, Issue 5 2005. Eva Gustavsson
This article focuses on the service providers of the future: virtual assistants on the Internet. Recent technological developments, supported by intensive research on artificial intelligence, have enabled corporations to construct 'virtual employees' who can interact with their online customers. The number of virtual assistants on the Internet continues to grow, and most of these new service providers are human-like and female. In this article I profile virtual employees on the Internet: who they are, what they do and how they present themselves. I demonstrate that the Internet suffers from the same gender stereotyping characteristic of customer services in general and that the unreflective choice of female images is, at a minimum, a symbolic reinforcement of the real circumstances of gender divisions in customer service. [source]

Information and Communications Technology and Auditing: Current Implications and Future Directions
INTERNATIONAL JOURNAL OF AUDITING, Issue 2 2010. Kamil Omoteso
This exploratory study assesses, from a structuration theory perspective, the impact that information and communications technology (ICT) tools and techniques are currently having on audit tasks, auditors (internal and external) and the organisations they work for, in terms of coordination, control, authority and structure. Based on a triangulation of interview and questionnaire techniques, the findings indicate that ICT is re-shaping auditors' roles and outputs as well as audit organisations' structures. The findings also project the view that continuous auditing, artificial intelligence and CobiT are expected to gain more prominence, while a need was also seen for new software development to help auditors match the complexity of their clients' information systems.
The study's results reveal the current state of the relationship between ICT and auditing against the backdrop of continuing global ICT sophistication, thereby updating the ICT audit literature and indicating the likely future direction of this relationship. [source]

An intensional approach to qualitative and quantitative periodicity-dependent temporal constraints
INTERNATIONAL JOURNAL OF INTELLIGENT SYSTEMS, Issue 8 2009. Luca Anselma
In this paper, we propose a framework for representing and reasoning about qualitative and quantitative temporal constraints between periodic events. In particular, our contribution is twofold: (i) we provide a formalism to deal with both qualitative and quantitative 'periodicity-dependent' constraints between repeated events, considering user-defined periodicities as well; and (ii) we propose an intensional approach to temporal reasoning, based on the operations of intersection and composition. Such a comprehensive approach, to the best of our knowledge, represents an innovative contribution that integrates and extends results from both the artificial intelligence and the temporal databases literature. © 2009 Wiley Periodicals, Inc. [source]

On bipolarity in argumentation frameworks
INTERNATIONAL JOURNAL OF INTELLIGENT SYSTEMS, Issue 10 2008. L. Amgoud
In this article, we propose a survey of the use of bipolarity in argumentation frameworks. On the one hand, the notion of bipolarity relies on the presence of two kinds of entities that have a diametrically opposed nature and that represent repellent forces (a positive entity and a negative entity). The notion exists in various domains (for example, in the representation of preferences in artificial intelligence, or in cognitive psychology). On the other hand, the argumentation process is a promising approach for reasoning, based on the construction and comparison of arguments. It follows five steps: building the arguments, defining the interactions between these arguments, valuating the arguments, selecting the most acceptable arguments and, finally, drawing a conclusion. Using the nomenclature proposed by Dubois and Prade, this article shows, through various applications and with some formal definitions, that bipolarity appears in argumentation (in some cases, if not always) and can be used in each step of this process under different forms. © 2008 Wiley Periodicals, Inc. [source]

Fuzzy extensions for relationships in a generalized object model
INTERNATIONAL JOURNAL OF INTELLIGENT SYSTEMS, Issue 7 2001. Valerie V. Cross
Numerous approaches for introducing and managing uncertainty in object-oriented models have been proposed. This paper examines various semantics of uncertainty and their interaction with three kinds of relationships inherent to object models: instance-of, a-kind-of, and category. A generalized object model incorporating the perspectives of semantic data modeling, artificial intelligence, and database systems is the basis for the recommended fuzzy extensions to these three kinds of relationships. © 2001 John Wiley & Sons, Inc. [source]

Wavelength selection with Tabu Search
JOURNAL OF CHEMOMETRICS, Issue 8-9 2003. J. A. Hageman
Abstract This paper introduces Tabu Search in analytical chemistry by applying it to wavelength selection. Tabu Search is a deterministic global optimization technique loosely based on concepts from artificial intelligence. Wavelength selection is a method which can be used to improve the quality of calibration models.
Tabu Search uses basic, problem-specific operators to explore a search space, and memory to keep track of parts already visited. Several implementation aspects of wavelength selection with Tabu Search are discussed. Two ways of memorizing the search space are investigated: storing the actual solutions and storing the steps necessary to create them. Parameters associated with Tabu Search are configured with a Plackett-Burman design. In addition, two extension schemes for Tabu Search, intensification and diversification, have been implemented and are applied with good results. Finally, two implementations of wavelength selection with Tabu Search are tested, one which searches for a solution with a constant number of wavelengths and one with a variable number of wavelengths. Both implementations are compared with results obtained by wavelength selection methods based on simulated annealing (SA) and genetic algorithms (GAs). It is demonstrated with three real-world data sets that Tabu Search performs as well as, and can be a valuable alternative to, SA and GAs. The improvements in predictive ability increased by a factor of 20 for data set 1 and by a factor of 2 for data sets 2 and 3. In addition, when the number of wavelengths in a solution is variable, measurements of the coverage of the search space show that the coverage is usually higher for Tabu Search than for SA and GAs. Copyright © 2003 John Wiley & Sons, Ltd. [source]
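The abstract's main ingredients — problem-specific moves, a memory of recently visited parts of the search space, and a record of the best solution found — can be sketched as follows for wavelength selection. The data are synthetic, and the fitness function (least-squares calibration error with a size penalty) is a stand-in for the paper's calibration-model quality criterion.

```python
# Sketch of wavelength selection with Tabu Search: flip one wavelength in or
# out per move, keep recently flipped indices tabu, remember the best subset.
# Data, fitness, and parameter values are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
n_wavelengths, n_samples = 30, 40
X = rng.normal(size=(n_samples, n_wavelengths))
true_coef = np.zeros(n_wavelengths)
true_coef[[3, 7, 11]] = 1.0                  # only a few channels matter
y = X @ true_coef + 0.1 * rng.normal(size=n_samples)

def fitness(mask):
    """Negative residual error of a least-squares fit on the selected channels."""
    if not mask.any():
        return -np.inf
    Xs = X[:, mask]
    coef, *_ = np.linalg.lstsq(Xs, y, rcond=None)
    return -np.mean((y - Xs @ coef) ** 2) - 0.01 * mask.sum()  # penalize size

mask = rng.random(n_wavelengths) < 0.5       # random initial subset
best_mask, best_fit = mask.copy(), fitness(mask)
tabu, tenure = {}, 7                          # index -> iteration it becomes free
for it in range(200):
    candidates = [j for j in range(n_wavelengths) if tabu.get(j, 0) <= it]
    scores = []                               # evaluate all non-tabu single flips
    for j in candidates:
        mask[j] = ~mask[j]
        scores.append((fitness(mask), j))
        mask[j] = ~mask[j]
    fit, j = max(scores)                      # take the best admissible move
    mask[j] = ~mask[j]
    tabu[j] = it + tenure                     # forbid undoing this move for a while
    if fit > best_fit:
        best_fit, best_mask = fit, mask.copy()

print("selected wavelengths:", np.flatnonzero(best_mask))
```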
CONOPS and autonomy recommendations for VTOL small unmanned aerial system based on Hurricane Katrina operations
JOURNAL OF FIELD ROBOTICS (FORMERLY JOURNAL OF ROBOTIC SYSTEMS), Issue 8 2009. Kevin S. Pratt
This field study examines vertical takeoff and landing (VTOL) small unmanned aerial system (SUAS) operations conducted as part of an 8-day structural inspection task following Hurricane Katrina in 2005. From observations of the 32 flights spread over 12 missions, four key findings are identified for concepts of operations (CONOPS) and the next level of artificial intelligence for rotary-wing SUASs operating in cluttered urban environments. These findings are: (1) the minimum useful standoff distance from inspected structures is 2-5 m, (2) omnidirectional sensor capabilities are needed for obstacle avoidance, (3) global positioning system waypoint navigation is unnecessary, and (4) these operations require three operators for one SUAS. Based on the findings and other observations, a crewing organization and a flight operations protocol for SUASs are proposed. Needed directions in research and development are also discussed. These recommendations are expected to contribute to the design of platforms, sensors, and artificial intelligence, as well as to facilitate the acceptance of SUASs in the workplace. © 2009 Wiley Periodicals, Inc. [source]

SOLVING DYNAMIC WILDLIFE RESOURCE OPTIMIZATION PROBLEMS USING REINFORCEMENT LEARNING
NATURAL RESOURCE MODELING, Issue 1 2005. CHRISTOPHER J. FONNESBECK
ABSTRACT. An important technical component of natural resource management, particularly in an adaptive management context, is optimization. This is used to select the most appropriate management strategy, given a model of the system and all relevant available information. For dynamic resource systems, dynamic programming has been the de facto standard for deriving optimal state-specific management strategies. Though effective for small-dimension problems, dynamic programming is incapable of providing solutions to larger problems, even with modern microcomputing technology. Reinforcement learning is an alternative, related procedure for deriving optimal management strategies, based on stochastic approximation. It is an iterative process that improves estimates of the value of state-specific actions based on interactions with a system, or a model thereof. Applications of reinforcement learning in the field of artificial intelligence have illustrated its ability to yield near-optimal strategies for very complex model systems, highlighting the potential utility of this method for ecological and natural resource management problems, which tend to be of high dimension. I describe the concept of reinforcement learning and its approach of estimating optimal strategies by temporal difference learning. I then illustrate the application of this method using a simple, well-known case study of Anderson [1975], and compare the reinforcement learning results with those of dynamic programming. Though a globally optimal strategy is not discovered, it performs very well relative to the dynamic programming strategy, based on simulated cumulative objective return. I suggest that reinforcement learning be applied to relatively complex problems where an approximate solution to a realistic model is preferable to an exact answer to an oversimplified model. [source]
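The temporal difference learning approach mentioned above can be sketched with Q-learning on a tiny made-up harvest problem, where states are discretized population levels and actions are harvest intensities. The dynamics and rewards are invented; this illustrates the structure of the method, not Anderson's [1975] model.

```python
# Sketch of reinforcement learning by temporal-difference (Q-learning) updates
# on a made-up harvest problem. States, actions, and rewards are assumptions.
import random

states = range(5)                  # discretized population levels
actions = (0, 1)                   # 0 = light harvest, 1 = heavy harvest
alpha, gamma, eps = 0.1, 0.95, 0.1
Q = {(s, a): 0.0 for s in states for a in actions}

def step(s, a):
    """Hypothetical dynamics: heavy harvest pays more now but depletes the stock."""
    reward = 2.0 * a * s + (1 - a) * 0.5 * s
    s_next = max(0, min(4, s + (1 if a == 0 else -1) + random.choice((-1, 0, 1))))
    return reward, s_next

s = 2
for _ in range(50000):
    # Epsilon-greedy action selection: mostly exploit, occasionally explore.
    a = random.choice(actions) if random.random() < eps \
        else max(actions, key=lambda a: Q[(s, a)])
    r, s_next = step(s, a)
    # TD update: move Q(s,a) toward the bootstrapped return estimate.
    target = r + gamma * max(Q[(s_next, a2)] for a2 in actions)
    Q[(s, a)] += alpha * (target - Q[(s, a)])
    s = s_next

# Derived state-specific strategy: best action per population level.
print({s: max(actions, key=lambda a: Q[(s, a)]) for s in states})
```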
Two Dogmas of Neo-Empiricism
PHILOSOPHY COMPASS (ELECTRONIC), Issue 4 2006. Edouard Machery
This article critically examines the contemporary resurgence of empiricism (or 'neo-empiricism') in philosophy, psychology, neuropsychology, and artificial intelligence. This resurgence is an important and positive development. It is the first time that this centuries-old empiricist approach to cognition has been precisely formulated in the context of cognitive science and neuroscience. Moreover, neo-empiricists have made several findings that challenge amodal theories of concepts and higher cognition. It is argued, however, that the theoretical foundations of, and the empirical evidence for, neo-empiricism are not as strong as its proponents usually claim. The empirical evidence for and against neo-empiricism is discussed in detail. [source]

Artificial intelligence advancements applied in off-the-shelf controllers
PROCESS SAFETY PROGRESS, Issue 2 2002. Edward M. Marszal, P.E.
Since the earliest process units were built, CPI engineers have employed artificial intelligence to prevent losses. The expanding use of computer-based systems for process control has allowed the amount of intelligence applied in these expert systems to increase drastically. Standard methods for performing expert system tasks are being formalized by numerous researchers in industry and academia. Work products from these groups include designs that present process hazards knowledge in a structured, hierarchical, and modular manner. Advancements in programmable logic controller (PLC) technology have created systems with substantial computing power that are robust and fault-tolerant enough to be used in safety-critical applications. In addition, IEC 1131-3 standardized the programming languages available in virtually every new controller. The function block language defined in IEC 1131-3 is particularly well suited to performing modular tasks, which makes it an ideal platform for representing knowledge. This paper begins by describing some of the advancements in knowledge-based systems for loss prevention applications. It then explores how standard IEC 1131-3 programming techniques can be used to build function blocks that represent knowledge of the hazards posed by equipment items. The paper goes on to develop a sample function block that represents the hazards of a pressure vessel, using knowledge developed in the API 14-C standard. [source]

Integrating artificial intelligence into on-line statistical process control
QUALITY AND RELIABILITY ENGINEERING INTERNATIONAL, Issue 1 2003. Ruey-Shiang Guh
Abstract Statistical process control (SPC) is one of the most effective tools of total quality management, the main function of which is to monitor and minimize process variations. Typically, SPC applications involve three major tasks in sequence: (1) monitoring the process, (2) diagnosing the deviated process and (3) taking corrective action. With the movement towards a computer-integrated manufacturing environment, computer-based applications need to be developed to implement the various SPC tasks automatically. However, the pertinent literature shows that nearly all research in this field has focussed only on automating the monitoring of the process. The remaining two tasks still need to be carried out by quality practitioners. This project applies a hybrid artificial intelligence technique to build a real-time SPC system, in which an artificial neural network based control chart monitoring sub-system and an expert system based control chart alarm interpretation sub-system are integrated to implement the SPC tasks comprehensively. The system was designed to provide the quality practitioner with three kinds of information related to the current status of the process: (1) the status of the process (in-control or out-of-control; if out-of-control, an alarm is signaled), (2) plausible causes for the out-of-control situation and (3) effective actions against the out-of-control situation. An example demonstrates that hybrid intelligence can be usefully applied to solving the problems of a real-time SPC system. Copyright © 2003 John Wiley & Sons, Ltd. [source]
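The monitor-diagnose-advise sequence that the paper automates can be sketched schematically as below. The paper's monitoring sub-system is a neural network; here two classic control chart run rules stand in for it, and a small lookup table plays the role of the expert system. Observations, limits, causes, and advice text are all illustrative assumptions.

```python
# Schematic sketch of an automated SPC pipeline: monitor the chart, label the
# out-of-control pattern, and look up plausible causes and corrective actions.
# Run rules replace the paper's neural network monitor; values are invented.
def monitor(window, mean=0.0, sigma=1.0):
    """Return a pattern label for the latest samples, or None if in control."""
    if abs(window[-1] - mean) > 3 * sigma:
        return "point beyond 3-sigma limits"
    if len(window) >= 7 and all(x > mean for x in window[-7:]):
        return "run of 7 points above the center line"
    return None

DIAGNOSIS = {
    "point beyond 3-sigma limits":
        ("sudden shift (tool breakage, wrong material lot)",
         "stop the process and inspect the last setup change"),
    "run of 7 points above the center line":
        ("sustained mean shift (tool wear, drifting gauge)",
         "recalibrate the gauge and check tooling"),
}

samples = [0.1, -0.3, 0.2, 0.5, 0.6, 0.4, 0.7, 0.9, 0.8, 1.1, 0.6]
for i in range(1, len(samples) + 1):
    pattern = monitor(samples[:i])
    if pattern:
        cause, action = DIAGNOSIS[pattern]
        print(f"sample {i}: ALARM ({pattern}); cause: {cause}; action: {action}")
```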
Process modeling and optimization of industrial ethylene oxide reactor by integrating support vector regression and genetic algorithm
THE CANADIAN JOURNAL OF CHEMICAL ENGINEERING, Issue 1 2009. Sandip Kumar Lahiri
Abstract This article presents an artificial intelligence-based process modeling and optimization strategy, namely support vector regression-genetic algorithm (SVR-GA), for the modeling and optimization of a catalytic industrial ethylene oxide (EO) reactor. In the SVR-GA approach, an SVR model is constructed to correlate process data comprising values of operating and performance variables. Next, the model inputs describing the process operating variables are optimized using genetic algorithms (GAs) with a view to maximizing process performance. GAs possess certain unique advantages over the commonly used gradient-based deterministic optimization algorithms. SVR-GA is a new strategy for chemical process modeling and optimization. Its major advantage is that modeling and optimization can be conducted exclusively from historic process data, so that detailed knowledge of the process phenomenology (reaction mechanism, kinetics, etc.) is not required. Using the SVR-GA strategy, a number of sets of optimized operating conditions leading to maximized EO production and catalyst selectivity were obtained. The optimized solutions, when verified in the actual plant, resulted in a significant improvement in the EO production rate and catalyst selectivity. [source]
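A bare-bones sketch of the SVR-GA idea: fit a support vector regression surrogate on historical operating/performance data, then let a simple genetic algorithm search the operating space for settings the surrogate predicts to perform best. The variables (temperature, pressure), their ranges, the data, and the GA details are invented, and scikit-learn's SVR is assumed to be available; the paper's reactor data and configuration are not reproduced.

```python
# Sketch of SVR-GA: an SVR surrogate fitted on (made-up) plant data, searched
# by a minimal genetic algorithm. All names, ranges, and data are assumptions.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(2)
low, high = np.array([200.0, 1.0]), np.array([300.0, 3.0])   # temp, pressure bounds
X = rng.uniform(low, high, size=(100, 2))                     # historical settings
y = -((X[:, 0] - 260) / 40) ** 2 - ((X[:, 1] - 2.2) / 0.8) ** 2  # pretend plant output
model = SVR(C=10.0).fit(X, y)                                 # surrogate process model

pop = rng.uniform(low, high, size=(40, 2))                    # initial GA population
for _ in range(60):
    fitness = model.predict(pop)                              # score on the surrogate
    parents = pop[np.argsort(fitness)[-20:]]                  # keep the fittest half
    pick = rng.integers(0, 20, size=(40, 2))
    children = np.column_stack([parents[pick[:, 0], 0],       # crossover: mix variables
                                parents[pick[:, 1], 1]])      # from two parents
    children += rng.normal(scale=[1.0, 0.02], size=children.shape)  # mutation
    pop = np.clip(children, low, high)                        # respect operating bounds

best = pop[np.argmax(model.predict(pop))]
print("predicted-best operating point (temp, pressure):", best.round(2))
```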