Computer Science

Selected Abstracts


University timetabling through conceptual modeling

INTERNATIONAL JOURNAL OF INTELLIGENT SYSTEMS, Issue 11 2005
Jonathan Lee
A number of approaches have been proposed for tackling the timetabling problem, such as operational research, human-machine interaction, constraint programming, expert systems, and neural networks. However, several key challenges remain: the ability to reformulate the system easily to support changes, a generalized framework to handle various timetabling problems, and the ability to incorporate knowledge into the timetabling system. In this article, we propose an automatic software engineering approach, called task-based conceptual graphs, to address these challenges. Task-based conceptual graphs provide the automation of software development processes including specification, verification, and automatic programming. Maintenance can be performed directly on the specifications rather than on the source code; moreover, hard and soft constraints can be easily inserted or removed. A university timetabling system in the Department of Computer Science and Information Engineering at National Central University is used as an illustrative example of the proposed approach. © 2005 Wiley Periodicals, Inc. Int J Int Syst 20: 1137–1160, 2005. [source]
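
The separation of hard and soft constraints highlighted above can be illustrated with a small, purely hypothetical Python sketch (the paper itself works with task-based conceptual graphs, not this code); the constraint names, slot numbering, and data are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Session:
    course: str
    room: str
    slot: int
    teacher: str

def no_room_clash(timetable):            # hard constraint: rooms can't be double-booked
    used = set()
    for s in timetable:
        if (s.room, s.slot) in used:
            return False
        used.add((s.room, s.slot))
    return True

def few_late_sessions(timetable):        # soft constraint: penalize late slots (index >= 8)
    return sum(1 for s in timetable if s.slot >= 8)

hard_constraints = [no_room_clash]       # either kind of constraint can be inserted
soft_constraints = [few_late_sessions]   # or removed here without touching the solver

def evaluate(timetable):
    feasible = all(c(timetable) for c in hard_constraints)
    penalty = sum(c(timetable) for c in soft_constraints)
    return feasible, penalty

tt = [Session("CS101", "R1", 3, "Lee"), Session("CS202", "R1", 3, "Wu")]
print(evaluate(tt))    # (False, 0): the room clash violates the hard constraint
```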


Departmental websites and female student recruitment

PROCEEDINGS OF THE AMERICAN SOCIETY FOR INFORMATION SCIENCE & TECHNOLOGY (ELECTRONIC), Issue 1 2008
Kristin Hanks
Female recruitment and retention in technology related fields is still low, despite numerous attempts to reverse this trend. As a recruitment device, a school's webpage may be the only visual representation a student will see before deciding whether or not to apply. Thus, understanding the possible implications of this medium is important within the larger conversation on gender equity and opportunities. This research addresses several questions: At first glance, do websites give gender cues, whether intentional or not? Is there a relationship between certain web content and the number of women recruited into technology related schools and departments? Do applied fields (Informatics, Information Science, Instructional Technology, Information Systems) differ in their online recruitment practices from more traditional Computer Science and Engineering departments? It is important to note that this research is not an attempt to find the best web practices to recruit female students or an attempt to punish or blame specific institutions regarding their recruitment practices. [source]


On ε-biased generators in NC0

RANDOM STRUCTURES AND ALGORITHMS, Issue 1 2006
Elchanan Mossel
Cryan and Miltersen (Proceedings of the 26th Mathematical Foundations of Computer Science, 2001, pp. 272–284) recently considered the question of whether there can be a pseudorandom generator in NC0, that is, a pseudorandom generator that maps n-bit strings to m-bit strings such that every bit of the output depends on a constant number k of bits of the seed. They show that for k = 3, if m ≥ 4n + 1, there is a distinguisher; in fact, they show that in this case it is possible to break the generator with a linear test, that is, there is a subset of bits of the output whose XOR has a noticeable bias. They leave the question open for k ≥ 4. In fact, they ask whether every NC0 generator can be broken by a statistical test that simply XORs some bits of the input. Equivalently, is it the case that no NC0 generator can sample an ε-biased space with negligible ε? We give a generator for k = 5 that maps n bits into cn bits, so that every bit of the output depends on 5 bits of the seed, and the XOR of every subset of the bits of the output has exponentially small bias. For large values of k, we construct generators that map n bits to m bits such that every XOR of output bits has small bias. We also present a polynomial-time distinguisher for k = 4, m ≥ 24n having constant distinguishing probability. For large values of k we show that a linear distinguisher with a constant distinguishing probability exists once m ≥ Ω(2^k·n^⌈k/2⌉). Finally, we consider a variant of the problem where each of the output bits is a degree k polynomial in the inputs. We show there exists a degree k = 2 pseudorandom generator for which the XOR of every subset of the outputs has bias 2^−Ω(n) and which maps n bits to Ω(n²) bits. © 2005 Wiley Periodicals, Inc. Random Struct. Alg., 2006 [source]
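
To make the notions of a linear test and bias concrete, here is a small, self-contained sketch (a toy illustration only, not the authors' construction, and far below the m ≥ 4n + 1 regime): every output bit of a tiny NC0-style generator is the XOR of k = 3 seed bits, and the bias of the XOR of any chosen subset of output bits is measured by enumerating all seeds. A subset with noticeable bias is exactly what a linear distinguisher exploits.

```python
from itertools import product

# Toy NC0-style generator: each output bit is the XOR of 3 seed bits.
# The "wiring" below is arbitrary and purely illustrative.
n = 6                                   # seed length
wiring = [(0, 1, 2), (0, 1, 3), (2, 4, 5),
          (3, 4, 5), (0, 2, 4), (1, 3, 5)]   # m = 6 output bits

def generate(seed):
    """Map an n-bit seed to an m-bit output, each bit an XOR of 3 seed bits."""
    return [seed[a] ^ seed[b] ^ seed[c] for (a, b, c) in wiring]

def bias(subset):
    """Bias of the XOR of the given output positions over a uniform seed:
    |Pr[XOR = 1] - 1/2|."""
    ones = 0
    for seed in product((0, 1), repeat=n):
        out = generate(seed)
        ones += sum(out[i] for i in subset) % 2
    return abs(ones / 2 ** n - 0.5)

print(bias((0,)))           # a single output bit is unbiased: 0.0
print(bias((0, 1, 2, 3)))   # this subset XORs to a constant -> bias 0.5
```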


Randomly coloring graphs with lower bounds on girth and maximum degree

RANDOM STRUCTURES AND ALGORITHMS, Issue 2 2003
Martin Dyer
We consider the problem of generating a random q-coloring of a graph G = (V, E). We consider the simple Glauber dynamics chain. We show that if the maximum degree Δ > c1 ln n and the girth g > c2 ln Δ (n = |V|) for sufficiently large c1, c2, then this chain mixes rapidly provided q/Δ > α, where α ≈ 1.763 is the root of α = e^(1/α). For this class of graphs, this beats the 11Δ/6 bound of Vigoda [E. Vigoda, Improved bounds for sampling colorings, Proc 40th Annu IEEE Symp Foundations of Computer Science, 1999, pp. 51–59] for general graphs. We extend the result to random graphs. © 2003 Wiley Periodicals, Inc. Random Struct. Alg., 23: 167–179, 2003 [source]
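
For reference, a minimal sketch of a (heat-bath) Glauber dynamics chain for proper q-colorings; the graph, number of colors, and step count are illustrative and not tied to the paper's parameter regime.

```python
import random

def glauber_step(graph, coloring, q):
    """One step of heat-bath Glauber dynamics for proper q-colorings.

    graph: dict mapping each vertex to a list of neighbors.
    coloring: dict mapping each vertex to its current color (assumed proper).
    """
    v = random.choice(list(graph))
    forbidden = {coloring[u] for u in graph[v]}
    allowed = [c for c in range(q) if c not in forbidden]
    coloring[v] = random.choice(allowed)   # non-empty whenever q > max degree

def sample_coloring(graph, coloring, q, steps):
    """Run the chain for a fixed number of steps; when the chain mixes,
    the result is an approximately uniform proper q-coloring."""
    for _ in range(steps):
        glauber_step(graph, coloring, q)
    return coloring

# Toy example: a 4-cycle with q = 3 colors.
cycle = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
print(sample_coloring(cycle, {0: 0, 1: 1, 2: 0, 3: 1}, q=3, steps=1000))
```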


The validity of the Computer Science and Applications activity monitor for use in coronary artery disease patients during level walking

CLINICAL PHYSIOLOGY AND FUNCTIONAL IMAGING, Issue 4 2002
Ulf Ekelund
Summary The principal aim of the present study was to examine the validity of the Computer Science and Applications (CSA) activity monitor during level walking in coronary artery disease (CAD) patients. As a secondary aim, we evaluated the usefulness of two previously published energy expenditure (EE) prediction equations. Thirty-four subjects (29 men and five women), all with diagnosed CAD, volunteered to participate. Oxygen uptake (VO2) was measured by indirect calorimetry during walking on a motorized treadmill at three different speeds (3·2, 4·8 and 6·4 km h^-1). Physical activity was measured simultaneously using the CSA activity monitor, secured directly to the skin on the lower back (i.e. lumbar vertebrae 4–5) with an elastic belt. The mean (±SD) activity counts were 1208 ± 429, 3258 ± 753 and 5351 ± 876 counts min^-1, at the three speeds, respectively (P<0·001). Activity counts were significantly correlated to speed (r=0·92; P<0·001), VO2 (ml kg^-1 min^-1; r=0·87; P<0·001) and EE (kcal min^-1; r=0·85, P<0·001). A stepwise linear regression analysis showed that activity counts and body weight together explained 75% of the variation in EE. Predicted EE from previously published equations differed significantly when used in this group of CAD patients. In conclusion, the CSA activity monitor is a valid instrument for assessing the intensity of physical activity during treadmill walking in CAD patients. Energy expenditure can be predicted from body weight and activity counts. [source]
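
A minimal sketch of how such an EE prediction equation could be fitted from activity counts and body weight; the numbers below are synthetic and purely illustrative, not the study's data or its published equations.

```python
import numpy as np

# Synthetic, purely illustrative data (NOT the study's measurements):
# activity counts (counts/min), body weight (kg), measured EE (kcal/min).
counts = np.array([1200, 3250, 5350, 1100, 3400, 5200], dtype=float)
weight = np.array([82, 82, 82, 95, 95, 95], dtype=float)
ee     = np.array([3.1, 4.8, 6.5, 3.6, 5.4, 7.2])

# Ordinary least squares: EE ~ intercept + b1*counts + b2*weight, mirroring
# the finding that counts and weight together explain most of the EE variation.
X = np.column_stack([np.ones_like(counts), counts, weight])
coef, *_ = np.linalg.lstsq(X, ee, rcond=None)

def predict_ee(c, w):
    return coef @ np.array([1.0, c, w])

print(predict_ee(4000, 85))   # predicted kcal/min for a new observation
```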


ON SOCIAL LEARNING AND ROBUST EVOLUTIONARY ALGORITHM DESIGN IN THE COURNOT OLIGOPOLY GAME

COMPUTATIONAL INTELLIGENCE, Issue 2 2007
Floortje Alkemade
Agent-based computational economics (ACE) combines elements from economics and computer science. In this article, the focus is on the relation between the evolutionary technique that is used and the economic problem that is modeled. In the field of ACE, economic simulations often derive parameter settings for the genetic algorithm directly from the values of the economic model parameters. This article compares two important approaches that dominate the field and shows that this practice may hinder the performance of the genetic algorithm and thereby hinder agent learning. More specifically, it is shown that economic model parameters and evolutionary algorithm parameters should be treated separately, by comparing the two widely used approaches to social learning with respect to their convergence properties and robustness. This leads to new considerations for the methodological aspects of evolutionary algorithm design within the field of ACE. [source]


A Design Theory Approach to Building Strategic Network-Based Customer Service Systems

DECISION SCIENCES, Issue 3 2009
M. Kathryn Brohman
ABSTRACT Customer service is a key component of a firm's value proposition and a fundamental driver of differentiation and competitive advantage in nearly every industry. Moreover, the relentless coevolution of service opportunities with novel and more powerful information technologies has made this area exciting for academic researchers who can contribute to shaping the design and management of future customer service systems. We engage in interdisciplinary research across information systems, marketing, and computer science in order to contribute to the service design and service management literature. Grounded in the design-science perspective, our study leverages marketing theory on the service-dominant logic and recent findings pertaining to the evolution of customer service systems. Our theorizing culminates with the articulation of four design principles. These design principles underlie the emerging class of customer service systems that, we believe, will enable firms to better compete in an environment characterized by an increase in customer centricity and in customers' ability to self-serve and dynamically assemble the components of solutions that fit their needs. In this environment, customers retain control over their transactional data, as well as the timing and mode of their interactions with firms, as they increasingly gravitate toward integrated complete customer solutions rather than single products or services. Guided by these design principles, we iterated through, and evaluated, two instantiations of the class of systems we propose, before outlining implications and directions for further cross-disciplinary scholarly research. [source]


The value of environmental modelling languages for building distributed hydrological models

HYDROLOGICAL PROCESSES, Issue 14 2002
Derek Karssenberg
Abstract An evaluation is made of the suitability of programming languages for hydrological modellers to create distributed, process-based hydrological models. Both system programming languages and high-level environmental modelling languages are evaluated against a list of requirements for the optimal programming language for such models. This is illustrated with a case study, implemented using the PCRaster environmental modelling language to create a distributed, process-based hydrological model based on the concepts of KINEROS-EUROSEM. The main conclusion is that system programming languages are not ideal for hydrologists who are not computer programmers, because the level of thinking of these languages is too strongly tied to specialized computer science. A higher-level environmental modelling language is better in the sense that it operates at the conceptual level of the hydrologist: it contains operators that represent hydrological processes operating on hydrological entities, such as two-dimensional maps, three-dimensional blocks and time series. The case study illustrates the advantages of using an environmental modelling language as compared with system programming languages in fulfilling requirements on the level of thinking applied in the language, the reusability of the program code, the lack of technical details in the program, a short model development time, and learnability. The study shows that environmental modelling languages are as good as system programming languages at minimizing programming errors, but are worse in generic applicability and performance. It is expected that environmental modelling languages will be used in future mainly for development of new models that can be tailored to modelling aims and the field data available. Copyright © 2002 John Wiley & Sons, Ltd. [source]
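
To illustrate the difference in "level of thinking" described above, the following hypothetical sketch contrasts a whole-map, map-algebra style update (the style of an environmental modelling language, written here with NumPy rather than PCRaster) with the equivalent explicit per-cell loop of a system programming style. The water-balance rule and the grids are invented for illustration.

```python
import numpy as np

# Hypothetical per-cell water balance on a raster.
rain = np.full((4, 4), 5.0)                     # mm per time step
infilt_cap = np.random.uniform(2, 6, (4, 4))    # infiltration capacity, mm

# "Map algebra" style: one expression per hydrological concept.
infiltration = np.minimum(rain, infilt_cap)
runoff = rain - infiltration

# Equivalent system-programming style: explicit element bookkeeping.
runoff_loop = np.empty_like(rain)
for i in range(rain.shape[0]):
    for j in range(rain.shape[1]):
        runoff_loop[i, j] = rain[i, j] - min(rain[i, j], infilt_cap[i, j])

assert np.allclose(runoff, runoff_loop)
```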


A posteriori error approximation in EFG method

INTERNATIONAL JOURNAL FOR NUMERICAL METHODS IN ENGINEERING, Issue 15 2003
L. Gavete
Abstract Recently, considerable effort has been devoted to the development of the so-called meshless methods. Meshless methods still require considerable improvement before they equal the prominence of finite elements in computer science and engineering. One of the paths in the evolution of meshless methods has been the development of the element free Galerkin (EFG) method. In the EFG method, it is obviously important that the 'a posteriori error' should be approximated. An 'a posteriori error' approximation based on the moving least-squares method is proposed, using the solution computed from the EFG method. The error approximation procedure proposed in this paper is simple to construct and requires, at most, nearest neighbour information from the EFG solution. The formulation is based on employing different moving least-squares approximations. Different selection strategies of the moving least-squares approximations have been used and compared to obtain optimum values of the parameters involved in the approximation of the error. The performance of the developed approximation of the error is illustrated by analysing different examples for two-dimensional (2D) potential and elasticity problems, using regular and irregular clouds of points. The implemented procedure of error approximation allows the global energy norm error to be estimated and also provides a good evaluation of local errors. Copyright © 2003 John Wiley & Sons, Ltd. [source]
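
A generic one-dimensional moving least-squares sketch, showing the building block on which such an error approximation rests; the basis, weight function, and support radius are illustrative choices, not the paper's exact formulation.

```python
import numpy as np

def mls_fit(x_nodes, u_nodes, x_eval, radius=0.3):
    """1D moving least-squares approximation with a linear basis and a
    Gaussian weight centred at each evaluation point."""
    vals = []
    for xe in np.atleast_1d(x_eval):
        w = np.exp(-((x_nodes - xe) / radius) ** 2)       # weights
        P = np.column_stack([np.ones_like(x_nodes), x_nodes - xe])
        A = P.T @ (w[:, None] * P)                        # moment matrix
        b = P.T @ (w * u_nodes)
        coeffs = np.linalg.solve(A, b)
        vals.append(coeffs[0])          # shifted basis at xe is [1, 0]
    return np.array(vals)

x = np.linspace(0, 1, 11)
u = np.sin(2 * np.pi * x)
print(mls_fit(x, u, [0.25, 0.5]))       # close to [1, 0] up to smoothing error
```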


High-confidence control: Ensuring reliability in high-performance real-time systems

INTERNATIONAL JOURNAL OF INTELLIGENT SYSTEMS, Issue 4 2004
Tariq Samad
Technology transfer is an especially difficult proposition for real-time control. To facilitate it, we need to complement the "high-performance" orientation of control research with an emphasis on establishing "high confidence" in real-time implementation. Two particular problems are discussed and recent research directed at their solutions is presented. First, the use of anytime algorithms requires dynamic resource management technology that generally is not available today in real-time systems. Second, complex algorithms have unpredictable computational characteristics that, nevertheless, need to be modeled; statistical verification is suggested as a possible approach. In both cases, a synthesis of control engineering and computer science is required if effective solutions are to be devised. Simulation-based demonstrations with uninhabited aerial vehicles (UAVs) serve to illustrate the research efforts. © 2004 Wiley Periodicals, Inc. [source]


Competitive On-line Statistics

INTERNATIONAL STATISTICAL REVIEW, Issue 2 2001
Volodya Vovk
Summary A radically new approach to statistical modelling, which combines mathematical techniques of Bayesian statistics with the philosophy of the theory of competitive on-line algorithms, has arisen over the last decade in computer science (to a large degree, under the influence of Dawid's prequential statistics). In this approach, which we call "competitive on-line statistics", it is not assumed that data are generated by some stochastic mechanism; the bounds derived for the performance of competitive on-line statistical procedures are guaranteed to hold (and not just hold with high probability or on the average). This paper reviews some results in this area; the new material in it includes the proofs for the performance of the Aggregating Algorithm in the problem of linear regression with square loss. Résumé (translated from the French): This article describes a new approach to statistical modelling combining the mathematical techniques of Bayesian statistics with the philosophy of the theory of competitive on-line algorithms. In this approach, which emerged over the last decade in computer science, it is not assumed that the data are produced by a stochastic mechanism; instead, it is proved that competitive on-line statistical procedures always (and not merely, for example, with high probability) achieve some desirable goal reflecting good performance on the actual data. This article reviews several results in this area; its new material includes the proofs for the performance of the Aggregating Algorithm in the problem of linear regression with square loss. [source]
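
A sketch of the Aggregating Algorithm specialized to linear regression with square loss (the forecaster often attributed to Vovk, Azoury, and Warmuth); the regularization constant and the synthetic data are illustrative assumptions.

```python
import numpy as np

def aar_predictions(X, y, a=1.0):
    """On-line predictions of the Aggregating Algorithm for linear regression
    with square loss.  At step t the learner sees x_t, predicts, then y_t."""
    n, d = X.shape
    A = a * np.eye(d)            # accumulated x x^T plus regularizer
    b = np.zeros(d)              # accumulated y x
    preds = []
    for t in range(n):
        x = X[t]
        A += np.outer(x, x)      # the current x_t enters A before predicting
        preds.append(x @ np.linalg.solve(A, b))
        b += y[t] * x            # y_t only becomes available afterwards
    return np.array(preds)

# Tiny usage example with synthetic data.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=50)
print(np.mean((aar_predictions(X, y) - y) ** 2))   # average square loss
```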


Path finding under uncertainty

JOURNAL OF ADVANCED TRANSPORTATION, Issue 1 2005
Anthony Chen
Path finding problems have many real-world applications in various fields, such as operations research, computer science, telecommunication, transportation, etc. In this paper, we examine three definitions of optimality for finding the optimal path under an uncertain environment. These three stochastic path finding models are formulated as the expected value model, dependent-chance model, and chance-constrained model using different criteria to hedge against the travel time uncertainty. A simulation-based genetic algorithm procedure is developed to solve these path finding models under uncertainties. Numerical results are also presented to demonstrate the features of these stochastic path finding models. [source]
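
A minimal simulation sketch (not the paper's genetic-algorithm procedure) showing how the expected-value and chance-constrained criteria can rank two candidate paths; the link travel-time distributions and the time budget are hypothetical.

```python
import random

def simulate_travel_time(path_links):
    """One realization of a path's travel time; each link's time is drawn
    from a hypothetical triangular distribution given as (min, mode, max)."""
    return sum(random.triangular(lo, hi, mode) for (lo, mode, hi) in path_links)

def evaluate(path_links, time_budget, runs=10_000):
    samples = [simulate_travel_time(path_links) for _ in range(runs)]
    expected = sum(samples) / runs                            # expected-value model
    on_time = sum(t <= time_budget for t in samples) / runs   # chance constraint
    return expected, on_time

# Two hypothetical paths: (min, mode, max) minutes per link.
path_a = [(5, 6, 9), (10, 12, 20)]   # shorter on average, more variable
path_b = [(7, 8, 10), (11, 13, 15)]  # longer on average, more reliable
print(evaluate(path_a, time_budget=25))
print(evaluate(path_b, time_budget=25))
```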


Images of self and others as computer users: the role of gender and experience

JOURNAL OF COMPUTER ASSISTED LEARNING, Issue 5 2006
E. M. Mercier
Abstract Gender differences in the pursuit of technology careers are a current issue of concern. We report on two studies that use surveys, drawings and interviews to examine sixth- and eighth-grade students' perceptions of knowledgeable computer users and their self-perception as a computer-type person. In Study 1, participants were asked to generate representations of computer users in pictures or words. The results indicate that the majority of representations were of male users and they frequently wore glasses. Students of both genders were more likely to draw males. Eighth-grade students' representations included more stereotypical features than those of sixth-grade students. In Study 2, students were asked whether they believed that there was such a thing as a computer-type person and whether they perceived themselves to be one. Eighty per cent of students rejected this characterization. They differed from students who accepted it in their levels of past experience, their confidence, and the probability that they shared their knowledge with others. The results of both studies suggest that while there is a male image of computer science in general, it is not overly negative and students' self-perception is not governed by their own gender as much as by other variables. [source]


An affordable modular mobile robotic platform with fuzzy logic control and evolutionary artificial neural networks

JOURNAL OF FIELD ROBOTICS (FORMERLY JOURNAL OF ROBOTIC SYSTEMS), Issue 8 2004
Maurice Tedder
Autonomous robotics projects encompass the rich nature of integrated systems that includes mechanical, electrical, and computational software components. The availability of smaller and cheaper hardware components has helped make possible a new dimension in operational autonomy. This paper describes a mobile robotic platform consisting of several integrated modules including a laptop computer that serves as the main control module, microcontroller-based motion control module, a vision processing module, a sensor interface module, and a navigation module. The laptop computer module contains the main software development environment with a user interface to access and control all other modules. Programming language independence is achieved by using standard input/output computer interfaces including RS-232 serial port, USB, networking, audio input and output, and parallel port devices. However, with the same hardware technology available to all, the distinguishing factor in most cases for intelligent systems becomes the software design. The software for autonomous robots must intelligently control the hardware so that it functions in unstructured, dynamic, and uncertain environments while maintaining an autonomous adaptability. This paper describes how we introduced fuzzy logic control to one robot platform in order to solve the 2003 Intelligent Ground Vehicle Competition (IGVC) Autonomous Challenge problem. This paper also describes the introduction of hybrid software design that utilizes Fuzzy Evolutionary Artificial Neural Network techniques. In this design, rather than using a control program that is directly coded, the robot's artificial neural net is first trained with a training data set using evolutionary optimization techniques to adjust weight values between neurons. The trained neural network with a weight average defuzzification method was able to make correct decisions to unseen vision patterns for the IGVC Autonomous Challenge. A comparison of the Lawrence Technological University robot designs and the design of the other competing schools shows that our platforms were the most affordable robot systems to use as tools for computer science and engineering education. © 2004 Wiley Periodicals, Inc. [source]
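
The weighted-average defuzzification step mentioned above reduces to a short formula; the sketch below is a generic illustration with invented rule activations, not the competition robot's actual controller.

```python
def weighted_average_defuzzify(memberships, centers):
    """Weighted-average defuzzification: combine rule activations
    (memberships) with the crisp output value each rule recommends (centers)."""
    total = sum(memberships)
    return sum(m * c for m, c in zip(memberships, centers)) / total

# Hypothetical steering rules: activations of "obstacle left", "obstacle ahead",
# "obstacle right" and the steering angle (degrees) each rule suggests.
print(weighted_average_defuzzify([0.2, 0.7, 0.1], [30.0, 0.0, -30.0]))  # 3.0
```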


Expectancy and Risk for Alcoholism: The Unfortunate Exploitation of a Fundamental Characteristic of Neurobehavioral Adaptation

ALCOHOLISM, Issue 5 2002
Mark S. Goldman
Psychological investigations of alcohol expectancies over the last 20 years, using primarily verbal techniques, have strongly supported expectancies as an important mediator of biological and environmental antecedent variables that influence risk for alcohol use and abuse. At the same time, rapid developments in neuroscience, cognitive science, affective science, computer science, and genetics proved to be compatible with the concept of expectancy and, in some cases, used this concept directly. By using four principles that bear on the integration of knowledge in the biological and behavioral sciences (consilience, conservation, contingency, and emergence), these developments are merged into an integrated explanation of alcoholism and other addictions. In this framework, expectancy is seen as a functional approach to adaptation and survival that has been manifested in multiple biological systems with different structures and processes. Understood in this context, addiction is not a unique behavioral problem or special pathology distinct from the neurobehavioral substrate that governs all behavior, but is rather a natural (albeit unfortunate) consequence of these same processes. The ultimate intent is to weave a working heuristic that ties together findings from molecular and molar levels of inquiry and thereby might help direct future research. Such integration is critical in the multifaceted study of addictions. [source]


A cluster analysis of scholar and journal bibliometric indicators

JOURNAL OF THE AMERICAN SOCIETY FOR INFORMATION SCIENCE AND TECHNOLOGY, Issue 10 2009
Massimo Franceschet
We investigate different approaches based on correlation analysis to reduce the complexity of a space of quantitative indicators for the assessment of research performance. The proposed methods group bibliometric indicators into clusters of highly intercorrelated indicators. Each cluster is then associated with a representative indicator. The set of all representatives corresponds to a base of orthogonal metrics capturing independent aspects of research performance and can be exploited to design a composite performance indicator. We apply the devised methodology to isolate orthogonal performance metrics for scholars and journals in the field of computer science and to design a global performance indicator. The methodology is general and can be exploited to design composite indicators that are based on a set of possibly overlapping criteria. [source]
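
A sketch of the general recipe the abstract describes: cluster indicators by correlation and pick one representative per cluster. The distance measure, clustering method, cut threshold, indicator names, and data are illustrative assumptions, not necessarily the authors' choices.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

def cluster_indicators(data, names, threshold=0.3):
    """Group indicators (columns of `data`) into clusters of highly
    intercorrelated ones and pick one representative per cluster.
    Distance between two indicators is 1 - |Pearson correlation|."""
    corr = np.corrcoef(data, rowvar=False)
    dist = 1.0 - np.abs(corr)
    np.fill_diagonal(dist, 0.0)
    labels = fcluster(linkage(squareform(dist, checks=False), method="average"),
                      t=threshold, criterion="distance")
    clusters = {}
    for idx, lab in enumerate(labels):
        clusters.setdefault(lab, []).append(idx)
    # Representative: the indicator most correlated, on average, with its cluster.
    reps = []
    for members in clusters.values():
        best = max(members, key=lambda i: np.abs(corr[i, members]).mean())
        reps.append(names[best])
    return reps

# Hypothetical indicator matrix: rows = scholars, columns = indicators.
rng = np.random.default_rng(1)
base = rng.normal(size=(100, 1))
data = np.hstack([base + 0.1 * rng.normal(size=(100, 2)),   # two correlated metrics
                  rng.normal(size=(100, 1))])               # one unrelated metric
print(cluster_indicators(data, ["h_index", "citations", "coauthors"]))
```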


An exploratory study of Malaysian publication productivity in computer science and information technology

JOURNAL OF THE AMERICAN SOCIETY FOR INFORMATION SCIENCE AND TECHNOLOGY, Issue 12 2002
Yinian Gu
Explores Malaysian computer science and information technology publication productivity. A total of 547 unique Malaysian authors, affiliated with 52 organizations in Malaysia, contributed 461 publications between 1990 and 1999, as indicated by data collected from three Web-based databases. The majority (378, or 69.1%) of authors wrote one publication. The productive authors and the number of their papers, as well as the position of their names in the articles, are listed to indicate their productivity and degree of involvement in their research publications. Researchers from the universities contribute about 428 (92.8%) publications. The three most productive institutions together account for a total of 258 (56.0%) publications. The composition of the publications is 197 (42.7%) journal articles, 263 (57.1%) conference papers, and 1 (0.2%) monograph chapter. The results indicate that the scholars published in a few core proceedings but contributed to a wide variety of journals. Thirty-nine fields of research undertaken by the scholars are also revealed. The possible reasons for the amount and pattern of contributions are related to the size of the researcher population in the country, the availability of refereed scholarly journals, and the total expenditure allocated to information, computers, and communication technology (ICCT) research in Malaysia. [source]


A Parallelised High Performance Monte Carlo Simulation Approach for Complex Polymerisation Kinetics

MACROMOLECULAR THEORY AND SIMULATIONS, Issue 6 2007
Hugh Chaffey-Millar
Abstract A novel, parallelised approach to Monte Carlo simulations for the computation of full molecular weight distributions (MWDs) arising from complex polymerisation reactions is presented. The parallel Monte Carlo method constitutes perhaps the most comprehensive route to the simulation of full MWDs of multiple chain length polymer entities and can also provide detailed microstructural information. New fundamental insights have been developed with regard to the Monte Carlo process in at least three key areas: (i) an insufficient system size is demonstrated to create inaccuracies via poor representation of the most improbable events and least numerous species; (ii) advanced algorithmic principles and compiler technology known to computer science have been used to provide speed improvements and (iii) the parallelisability of the algorithm has been explored and excellent scalability demonstrated. At present, the parallel Monte Carlo method presented herein compares very favourably in speed with the latest developments in the h-p Galerkin method-based PREDICI software package while providing significantly more detailed microstructural information. It seems viable to fuse parallel Monte Carlo methods with those based on the h-p Galerkin methods to achieve an optimum of information depths for the modelling of complex macromolecular kinetics and the resulting microstructural information. [source]
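
For orientation, a serial (not parallel) Gillespie-style Monte Carlo sketch for a toy living polymerisation, showing how such a simulation accumulates a full chain-length distribution; the rate constants and system size are arbitrary illustrative values, far below what the abstract notes is needed to resolve the least numerous species.

```python
import random
from collections import Counter

def simulate_living_polymerisation(n_chains=1000, n_monomer=50_000,
                                   k_p=1.0, k_t=0.001):
    """Toy stochastic simulation with propagation (rate k_p per chain per
    monomer unit) and termination (rate k_t per chain); returns the
    chain-length distribution of all chains at the end."""
    lengths = [1] * n_chains          # live chains, each starting at length 1
    dead = []
    monomer = n_monomer
    while lengths and monomer > 0:
        rate_p = k_p * len(lengths) * monomer
        rate_t = k_t * len(lengths)
        if random.random() < rate_p / (rate_p + rate_t):
            i = random.randrange(len(lengths))
            lengths[i] += 1           # propagation consumes one monomer unit
            monomer -= 1
        else:
            dead.append(lengths.pop(random.randrange(len(lengths))))
    return Counter(dead + lengths)    # full chain-length distribution

mwd = simulate_living_polymerisation()
print(sorted(mwd.items())[:10])       # (chain length, count) pairs
```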


A Tractable and Expressive Class of Marginal Contribution Nets and Its Applications

MLQ- MATHEMATICAL LOGIC QUARTERLY, Issue 4 2009
Edith Elkind
Abstract Coalitional games raise a number of important questions from the point of view of computer science, key among them being how to represent such games compactly and how to efficiently compute solution concepts assuming such representations. Marginal contribution nets (MC-nets), introduced by Ieong and Shoham, are one of the simplest and most influential representation schemes for coalitional games. MC-nets are a rule-based formalism, in which rules take the form pattern → value, where "pattern" is a Boolean condition over agents and "value" is a numeric value. Ieong and Shoham showed that, for a class of what we will call "basic" MC-nets, where patterns are constrained to be a conjunction of literals, marginal contribution nets permit the easy computation of solution concepts such as the Shapley value. However, there are very natural classes of coalitional games that require an exponential number of such basic MC-net rules. We present read-once MC-nets, a new class of MC-nets that is provably more compact than basic MC-nets, while retaining the attractive computational properties of basic MC-nets. We show how the techniques we develop for read-once MC-nets can be applied to other domains, in particular, computing solution concepts in network flow games on series-parallel networks. (© 2009 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim) [source]
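
A small sketch of basic MC-nets: each rule is a conjunction of positive and negative literals with a value, a coalition's worth is the sum over satisfied rules, and the Shapley value is computed here by brute force for a tiny invented game (the paper's point is that for MC-nets it can be computed efficiently from the rules themselves, without this enumeration).

```python
from itertools import permutations

# A basic MC-net: each rule is (positive literals, negative literals, value).
# A coalition satisfies a rule if it contains all positive literals and none
# of the negative ones; its worth is the sum of the values of satisfied rules.
rules = [
    ({"a", "b"}, set(), 5),        # a and b -> 5
    ({"c"}, {"b"}, 2),             # c and not b -> 2
]
agents = ["a", "b", "c"]

def value(coalition, rules):
    coalition = set(coalition)
    return sum(v for pos, neg, v in rules
               if pos <= coalition and not (neg & coalition))

def shapley(agents, rules):
    """Brute-force Shapley value (exponential; fine only for tiny examples)."""
    phi = dict.fromkeys(agents, 0.0)
    perms = list(permutations(agents))
    for order in perms:
        seen = set()
        for agent in order:
            phi[agent] += value(seen | {agent}, rules) - value(seen, rules)
            seen.add(agent)
    return {a: phi[a] / len(perms) for a in agents}

print(value({"a", "b", "c"}, rules))   # 5: the second rule is blocked by b
print(shapley(agents, rules))
```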


Laptops in computer science: Creating the "Learning Studio"

NEW DIRECTIONS FOR TEACHING & LEARNING, Issue 101 2005
Roy P. Pargas
Laptops helped two faculty members adapt a highly challenging, critical computer science course to increasing enrollment. Among the course enhancements the technology made possible were daily quizzes, animation-based demonstrations and exercises, and more authentic assessment. [source]


Craniofacial imaging informatics and technology development

ORTHODONTICS & CRANIOFACIAL RESEARCH, Issue 2003
MW Vannier
Structured Abstract. Author: Vannier MW. Purpose: 'Craniofacial imaging informatics' refers to image and related scientific data from the dentomaxillofacial complex, and application of 'informatics techniques' (derived from disciplines such as applied mathematics, computer science and statistics) to understand and organize the information associated with the data. Method: Major trends in information technology determine the progress made in craniofacial imaging and informatics. These trends include industry consolidation, disruptive technologies, Moore's law, electronic atlases and on-line databases. Each of these trends is explained and documented, relative to their influence on craniofacial imaging. Results: Craniofacial imaging is influenced by major trends that affect all medical imaging and related informatics applications. The introduction of cone beam craniofacial computed tomography scanners is an example of a disruptive technology entering the field. An important opportunity lies in the integration of biologic knowledge repositories with craniofacial images. Conclusion: The progress of craniofacial imaging will continue subject to limitations imposed by the underlying technologies, especially imaging informatics. Disruptive technologies will play a major role in the evolution of this field. [source]


Information behavior in developing countries: Research, issues, and emerging trends

PROCEEDINGS OF THE AMERICAN SOCIETY FOR INFORMATION SCIENCE & TECHNOLOGY (ELECTRONIC), Issue 1 2007
Dania Bilal Moderator
The field of library and information science (LIS) has historically been a leading discipline in studying human information behavior (Spink & Cole, 2006). Information seeking in industrialized nations is grounded in theories and moving towards new directions and evolutionary approaches that often challenge the established paradigms of information behavior studies (see Case, 2007; Spink & Cole, 2006; Fisher, Erdelez, & Mckechnie, 2005; Chelton & Cool, 2004). Information behavior has been conceptualized in a holistic context that draws upon theories from various disciplines such as cognitive science, communication, psychology, and computer science (Nahl & Bilal, 2007; Spink & Cole, 2006). Compared to industrialized nations, most developing countries remain near the bottom of the heap in research on information behavior (Coleman, 2005; Britz, 2005). A panel of researchers, educators, and consultants will address research on information behavior in various contexts in developing countries, particularly in India, South Africa, and the Arab world. Based on their research findings and experiences, the speakers will trace themes, map the intellectual terrain, identify emerging trends and approaches, and frame issues related to information behavior research in these countries. Moreover, they will identify significant knowledge domains, concepts, and topics of application in information behavior research where there can be mutual exchange between developing countries and the industrialized nations (including the United States) to nurture and further growth in this area of study. [source]


A simple and linear time randomized algorithm for computing sparse spanners in weighted graphs

RANDOM STRUCTURES AND ALGORITHMS, Issue 4 2007
Surender Baswana
Abstract Let G = (V, E) be an undirected weighted graph on |V| = n vertices and |E| = m edges. A t-spanner of the graph G, for any t ≥ 1, is a subgraph (V, ES), ES ⊆ E, such that the distance between any pair of vertices in the subgraph is at most t times the distance between them in the graph G. Computing a t-spanner of minimum size (number of edges) has been a widely studied and well-motivated problem in computer science. In this paper we present the first linear time randomized algorithm that computes a t-spanner of a given weighted graph. Moreover, the size of the t-spanner computed essentially matches the worst case lower bound implied by a 43-year-old girth lower bound conjecture made independently by Erdős, Bollobás, and Bondy & Simonovits. Our algorithm uses a novel clustering approach that avoids any distance computation altogether. This feature is somewhat surprising, since all the previously existing algorithms employ computation of some sort of local or global distance information, which involves growing either breadth first search trees up to Θ(t) levels or full shortest path trees on a large fraction of vertices. The truly local approach of our algorithm also leads to equally simple and efficient algorithms for computing spanners in other important computational environments like distributed, parallel, and external memory. © 2006 Wiley Periodicals, Inc. Random Struct. Alg., 2007 [source]
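
To make the t-spanner guarantee concrete, here is the classical greedy construction (in the spirit of Althöfer et al.), which is not the paper's linear-time randomized clustering algorithm; the example graph is invented.

```python
import heapq

def dist(adj, src, dst):
    """Dijkstra shortest-path distance within the partial spanner."""
    pq, seen = [(0.0, src)], {}
    while pq:
        d, u = heapq.heappop(pq)
        if u in seen:
            continue
        seen[u] = d
        if u == dst:
            return d
        for v, w in adj.get(u, []):
            if v not in seen:
                heapq.heappush(pq, (d + w, v))
    return float("inf")

def greedy_spanner(vertices, edges, t):
    """Greedy t-spanner: scan edges by increasing weight and keep (u, v, w)
    only if the spanner's current u-v distance exceeds t * w, so every
    distance in the spanner stays within a factor t of the original."""
    adj, spanner = {v: [] for v in vertices}, []
    for u, v, w in sorted(edges, key=lambda e: e[2]):
        if dist(adj, u, v) > t * w:
            spanner.append((u, v, w))
            adj[u].append((v, w))
            adj[v].append((u, w))
    return spanner

edges = [(0, 1, 1.0), (1, 2, 1.0), (0, 2, 1.9), (2, 3, 2.0), (0, 3, 5.0)]
print(greedy_spanner(range(4), edges, t=2))   # keeps only 3 of the 5 edges
```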


A unified approach to the analysis of Horton-Strahler parameters of binary tree structures

RANDOM STRUCTURES AND ALGORITHMS, Issue 3-4 2002
Markus E. Nebel
Abstract The Horton-Strahler number naturally arose from problems in various fields, e.g., geology, molecular biology and computer science. Consequently, detailed investigations of related parameters for different classes of binary tree structures are of interest. This paper shows one possibility of how to perform a mathematical analysis for parameters related to the Horton-Strahler number in a unified way, such that only a single analysis is needed to obtain results for many different classes of trees. The method is explained by the examples of the expected Horton-Strahler number and the related rth moments, the average number of critical nodes, and the expected distance between critical nodes. © 2002 Wiley Periodicals, Inc. Random Struct. Alg., 21: 252–277, 2002 [source]
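
The Horton-Strahler number itself is defined by a simple recursion; a minimal sketch follows (using the common convention that a leaf has number 1; some texts start at 0).

```python
def strahler(node):
    """Horton-Strahler number of a binary tree.

    A leaf has number 1; an internal node with children of numbers a and b
    has number max(a, b) if a != b, and a + 1 if a == b.  Trees are given as
    nested tuples: an internal node is (left, right), anything else is a leaf."""
    if not isinstance(node, tuple):
        return 1
    left, right = strahler(node[0]), strahler(node[1])
    return max(left, right) if left != right else left + 1

# A complete binary tree with four leaves has Horton-Strahler number 3.
tree = ((None, None), (None, None))
print(strahler(tree))   # 3
```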


Virtobot: a multi-functional robotic system for 3D surface scanning and automatic post mortem biopsy

THE INTERNATIONAL JOURNAL OF MEDICAL ROBOTICS AND COMPUTER ASSISTED SURGERY, Issue 1 2010
Lars Christian Ebert
Abstract Background: The Virtopsy project, a multi-disciplinary project that involves forensic science, diagnostic imaging, computer science, automation technology, telematics and biomechanics, aims to develop new techniques to improve the outcome of forensic investigations. This paper presents a new approach in the field of minimally invasive virtual autopsy for a versatile robotic system that is able to perform three-dimensional (3D) surface scans as well as post mortem image-guided soft tissue biopsies. Methods: The system consists of an industrial six-axis robot with additional extensions (i.e. a linear axis to increase working space, a tool-changing system and a dedicated safety system), a multi-slice CT scanner with equipment for angiography, a digital photogrammetry and 3D optical surface-scanning system, a 3D tracking system, and a biopsy end effector for automatic needle placement. A wax phantom was developed for biopsy accuracy tests. Results: Surface scanning times were significantly reduced (scanning times cut in half, calibration three times faster). The biopsy module worked with an accuracy of 3.2 mm. Discussion: Using the Virtobot, the surface-scanning procedure could be standardized and accelerated. The biopsy module is accurate enough for use in biopsies in a forensic setting. Conclusion: The Virtobot can be utilized for several independent tasks in the field of forensic medicine, and is sufficiently versatile to be adapted to different tasks in the future. Copyright © 2009 John Wiley & Sons, Ltd. [source]


The ED of the Future: an Interdisciplinary Graduate Course in Healthcare Design

ACADEMIC EMERGENCY MEDICINE, Issue 2009
David Cowan
Six faculty members from Georgia Institute of Technology, Emory University School of Medicine, Emory Healthcare, and Perkins + Will created and taught a one-semester course titled "The Emergency Department of the Future". The goals of the course were to provide an environment for students to be exposed to the unique challenges of healthcare design, to experience and learn techniques for successful interdisciplinary design, and to create innovations with impact. Twenty graduate students representing five disciplines (architecture, health systems, human-computer interaction, computer science, and systems engineering) participated in this class. The course included a series of didactic lectures covering a wide range of issues including architectural design of hospitals and emergency departments, observation techniques for working environments, electronic medical records, and patient-centered care. Lecturers included emergency physicians, nurses, architects, human-computer interaction researchers, and design specialists. Students developed problem statements along with prototype design solutions through these lectures, direct observation, and interaction with course faculty. The resulting projects include a mobile triage chair that takes vital signs, equipment sliders for easy functional transformation, an integrated lighting design, as well as patient assistants for self registration, communication, environmental control, and discharge support. The developed projects have generated ideas to improve emergency care that may be implementable commercial products as well as fundable projects for future research. The final presentation open house attracted over a hundred visitors from local and national healthcare facilities and industry. This presentation will highlight the structure and organization of the course as well as the resulting projects. [source]


Modelling skin disease: Lessons from the worlds of mathematics, physics and computer science

AUSTRALASIAN JOURNAL OF DERMATOLOGY, Issue 2 2005
Stephen Gilmore
SUMMARY Theoretical biology is a field that attempts to understand the complex phenomena of life in terms of mathematical and physical principles. Likewise, theoretical medicine employs mathematical arguments and models as a methodology in approaching the complexities of human disease. Naturally, these concepts can be applied to dermatology. There are many possible methods available in the theoretical investigation of skin disease. A number of examples are presented briefly. These include the mathematical modelling of pattern formation in congenital naevi and erythema gyratum repens, an information-theoretic approach to the analysis of genetic networks in autoimmunity, and computer simulations of early melanoma growth. To conclude, an analogy is drawn between the behaviour of well-known physical processes, such as earthquakes, and the spatio-temporal evolution of skin disease. Creating models in skin disease can lead to predictions that can be investigated experimentally or by observation and offer the prospect of unexpected or important insights into pathogenesis. [source]


LambdaXtreme® transport system: R&D of a high capacity system for low cost, ultra long haul DWDM transport

BELL LABS TECHNICAL JOURNAL, Issue 2 2006
Daniel A. Fishman
The LambdaXtreme® Transport System, Lucent Technologies' ultra long haul high-capacity transport product, leverages leading edge innovations in optics and applied physics as well as computational and computer science. In this paper, we provide a detailed view of the research and development efforts that resulted in a lightwave transmission system that is now being used in backbone national networks in the United States. © 2006 Lucent Technologies Inc. [source]


A portable bioinformatics course for upper-division undergraduate curriculum in sciences

BIOCHEMISTRY AND MOLECULAR BIOLOGY EDUCATION, Issue 5 2008
Wely B. Floriano
Abstract This article discusses the challenges that bioinformatics education is facing and describes a bioinformatics course that is successfully taught at the California State Polytechnic University, Pomona, to fourth-year undergraduate students in biological sciences, chemistry, and computer science. Information is presented on lecture and computer practice topics, software that is free for academic use, web links required for the laboratory exercises, and student surveys for two instances of the course. This course emphasizes hands-on experience and focuses on developing practical skills while providing a solid knowledge base for critically applying these skills. It is designed in blocks of 1-hour lecture followed by 2 hours of computer laboratory exercises, both covering the same general topic, for a total of 30 hours of lecture and 60 hours of computer practice. The heavy computational aspect of this course was designed to use a single multiprocessor computer server running Linux, accessible from laptops through Virtual Network Computing sessions. The laptops can be either provided by the institution or owned by the individual students. This configuration avoids the need to install and maintain bioinformatics software on each laptop; only a single installation is required for most bioinformatics software on the Linux server. The content of this course and its software/hardware configuration are well suited for institutions with no dedicated computer laboratory. This author believes that the same model can be successfully implemented in other institutions, especially those that do not have strong instructional computer technology support, such as community colleges and small universities. [source]


Computational Biology: Toward Deciphering Gene Regulatory Information in Mammalian Genomes

BIOMETRICS, Issue 3 2006
Hongkai Ji
Summary Computational biology is a rapidly evolving area where methodologies from computer science, mathematics, and statistics are applied to address fundamental problems in biology. The study of gene regulatory information is a central problem in current computational biology. This article reviews recent development of statistical methods related to this field. Starting from microarray gene selection, we examine methods for finding transcription factor binding motifs and cis-regulatory modules in coregulated genes, and methods for utilizing information from cross-species comparisons and ChIP-chip experiments. The ultimate understanding of cis-regulatory logic in mammalian genomes may require the integration of information collected from all these steps. [source]
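
As a concrete example of the most elementary step in motif finding, the sketch below scans a sequence with a toy position weight matrix; the matrix values and the sequence are invented for illustration.

```python
# Toy position weight matrix (PWM) for a length-4 motif: log-odds scores of
# each base at each position against a uniform background (values illustrative).
pwm = [
    {"A": 1.2, "C": -1.5, "G": -1.5, "T": 0.3},
    {"A": -1.0, "C": 1.4, "G": -0.5, "T": -1.0},
    {"A": 1.1, "C": -1.2, "G": 0.2, "T": -1.2},
    {"A": -1.4, "C": -1.4, "G": 1.3, "T": 0.1},
]

def best_motif_hit(sequence, pwm):
    """Slide the PWM along the sequence and return the best-scoring window,
    the simplest building block of motif-finding pipelines."""
    k = len(pwm)
    best = max(range(len(sequence) - k + 1),
               key=lambda i: sum(pwm[j][sequence[i + j]] for j in range(k)))
    score = sum(pwm[j][sequence[best + j]] for j in range(k))
    return best, sequence[best:best + k], score

print(best_motif_hit("TTACAGGTACGG", pwm))   # (start index, window, score)
```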