Computer
Selected Abstracts

COMPUTER MEDIATED EXPERIENCE AND EDUCATION. EDUCATIONAL THEORY, Issue 4 2001. Leonard J. Waks.
First page of article. [source]

HN09P ASSESSMENT OF FREE FIBULAR BONE IN THE RECONSTRUCTED MANDIBLE USING THREE-DIMENSIONAL COMPUTER GENERATED IMAGES. ANZ JOURNAL OF SURGERY, Issue 2007. H. Nabi.
We report the preliminary results of the Royal Adelaide Hospital experience with multidimensional simulated views of the fibula-flap reconstructed mandible. The free fibular flap is a well recognised option for mandibular reconstruction. What is not well understood, however, is how the fibula behaves in comparison to the dentate mandible. To date, skeletal remodelling and bone atrophy have only been assessed using standard orthopantomogram films. For many years, three-dimensional (3D) computer generated models built from CT data have been used for craniofacial reconstruction. We proposed that such images would enable us to visualise more accurately the integration of the transplanted graft within the mandible. We recalled and CT scanned patients who underwent free fibular flap reconstruction for mandibular malignancy between 2004 and 2006, and performed 3D reconstruction of these images. This is the first reported series of multidimensional computer generated images used to assess bone in the reconstructed mandible. [source]

Computer-based management environment for an assembly language programming laboratory. COMPUTER APPLICATIONS IN ENGINEERING EDUCATION, Issue 1 2007. Santiago Rodríguez.
This article describes the environment used in the Computer Architecture Department of the Technical University of Madrid (UPM) for managing small laboratory work projects, and a specific application for an Assembly Language Programming Laboratory. The approach is based on a chain of tools that a small team of teachers can use to efficiently manage a course with a large number of students (400 per year). Students use this tool chain to complete their assignments on an MC88110 CPU simulator, also developed by the Department. A Delivery Agent tool lets students send files containing their implementations, which are stored on one of the Department servers. Every student laboratory assignment is then tested by an Automatic Project Evaluator that executes a set of previously designed and configured tests. These tools allow teachers to manage mass courses without restricting different students to different assignments. Because this may encourage students to copy others' laboratory work, we have also developed a complementary tool to help teachers find "replicated" laboratory assignment implementations: a plagiarism detection assistant that completes the tool-chain functionality. Jointly, these tools have demonstrated over the last decade that important benefits can be gained from a global laboratory work management system. Some of the benefits may be transferable to an area of growing importance that we have not directly explored, i.e. distance learning environments for technical subjects. © 2007 Wiley Periodicals, Inc. Comput Appl Eng Educ 15: 41-54, 2007; Published online in Wiley InterScience (www.interscience.wiley.com); DOI 10.1002/cae.20094 [source]
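The abstract above does not describe how its plagiarism detection assistant actually works; as a purely illustrative sketch (the tokenisation, the 4-gram window and the example submissions are assumptions, not the authors' method), pairs of submissions could be flagged by the overlap of their token n-grams:

```python
def ngrams(tokens, n=4):
    """All length-n windows of tokens from one submission's source text."""
    return {tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)}

def similarity(src_a, src_b, n=4):
    """Jaccard overlap of token n-grams between two assembly source files;
    values near 1 suggest the pair should be reviewed by a teacher."""
    a, b = ngrams(src_a.split(), n), ngrams(src_b.split(), n)
    return len(a & b) / len(a | b) if (a | b) else 0.0

# Hypothetical submissions: a renamed label barely lowers the score.
sub1 = "start: mov r1, r2 add r3, r1, r4 bsr print_result rts"
sub2 = "begin: mov r1, r2 add r3, r1, r4 bsr print_result rts"
print(similarity(sub1, sub2))   # ~0.78
```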
Interactive Graphics for Computer Adaptive Testing. COMPUTER GRAPHICS FORUM, Issue 8 2009. I. Cheng.
Categories: K.3.1 [Computing Milieux]: Computers and Education - Computer Uses in Education; I.3.8 [Computing Methodologies]: Computer Graphics - Application.
Interactive graphics are commonly used in games and have been shown to be successful in attracting the general audience. Instead of computer games, animations, cartoons and videos being used only for entertainment, there is now an interest in using interactive graphics for 'innovative testing'. Rather than traditional pen-and-paper tests, audio, video and graphics are being conceived as alternative means for more effective testing in the future. In this paper, we review some examples of graphics item types for testing. We also outline how games can be used to test concepts interactively; discuss designing chemistry item types with interactive 3D graphics; suggest approaches for automatically adjusting the difficulty level in interactive graphics-based questions; and propose strategies for giving partial marks for incorrect answers. We study how to test different cognitive skills, such as music, using multimedia interfaces, and evaluate the effectiveness of our model. Methods for estimating the difficulty level of a mathematical item type using Item Response Theory (IRT) and of a molecule construction item type using Graph Edit Distance are discussed. Evaluation of the graphics item types through extensive testing with students is described. We also outline the application of interactive graphics over cell phones. All of the graphics item types used in this paper were developed by members of our research group. [source]
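The abstract mentions estimating item difficulty with Item Response Theory but does not state which model was used; as one hedged illustration, the two-parameter logistic (2PL) model below relates an examinee's ability theta to the probability of a correct response through an item's discrimination a and difficulty b (the parameter values shown are made up):

```python
import math

def p_correct(theta, a, b):
    """2PL item response function: P(correct) = 1 / (1 + exp(-a * (theta - b)))."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# For the same ability, a harder item (larger b) gives a lower probability:
print(p_correct(0.0, a=1.0, b=-1.0))   # easier item, ~0.73
print(p_correct(0.0, a=1.0, b=1.0))    # harder item, ~0.27
```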
Serious Games: Broadening Games Impact Beyond Entertainment. COMPUTER GRAPHICS FORUM, Issue 3 2007. Ben Sawyer.
Computer and video games have for many years been an island of technology and design innovation, largely left to itself as it morphed from a cottage business into a global media and software industry. While there have been pockets of derivative activity related to games and game technology, only in the last half-dozen years has there been a real movement toward exploiting this industry in many new and exciting ways. Today the general use of games and game technologies for purposes beyond entertainment is collectively referred to as serious games. The Serious Games Initiative was formed in 2002 and since its inception has been among a number of critical efforts that have helped open up the world, and many disciplines, to the ideas and innovations that may be sourced from the commercial, independent and academic game fields. This has been a person-by-person, project-by-project effort that has informed us not only about the potential of games but also about how to merge innovation and innovators from one discipline with those in another. In this talk we explore the full gamut of the serious games field, identifying, beyond the obvious, how games and game technologies are being applied to problems in a wide array of areas including healthcare, productivity, visualization, science, and of course training and education. Once a proper definition of serious games is established, the talk focuses on the current state of the field as it relates to the research and infrastructure issues that will make the difference between serious games taking hold as a major new practice and their devolving into another trend of the moment lost to history. [source]

The significance of protective factors in the assessment of risk. CRIMINAL BEHAVIOUR AND MENTAL HEALTH, Issue 1 2010. Charlotte E. Rennie.
Background: Few studies have explored protective factors in the assessment of risk, despite acknowledgement that protective factors may play an important role. Aim: To examine the significance of protective factors in the assessment of risk using the Structured Assessment of Violence Risk in Youth (SAVRY). Method: The SAVRY was completed on 135 male adolescents in custody in the UK. Data on previous offending and childhood psychopathology were collected. Participants were prospectively followed up at 12 months using data from the Home Office Police National Computer (HOPNC). Results: Participants with protective factors were older when first arrested, were less prolific offenders and had fewer psychopathological problems. The number of protective factors present was significantly higher for participants who did not re-offend during the follow-up. The total number of SAVRY protective factors significantly predicted desistance at follow-up, and resilient personality traits constituted the only significant individual protective factor. Conclusions and implications: Protective factors might buffer the effects of risk factors, and a resilient personality may be crucial. Recognition of protective factors should be an essential part of the risk management process and of interventions with high-risk adolescents to reduce re-offending. Copyright © 2010 John Wiley & Sons, Ltd. [source]

Computer-based training with ortho-phonological units in dyslexic children: new investigations. DYSLEXIA, Issue 3 2009. Jean Ecalle.
This study aims to show that training using a computer game incorporating an audio-visual phoneme discrimination task, with phonological units presented simultaneously with orthographic units, might improve literacy skills. Two experiments were conducted, one in secondary schools with dyslexic children (Experiment 1) and the other in a speech-therapy clinic with individual case studies (Experiment 2). A classical pre-test, training, post-test design was used. The main findings indicated an improvement in reading scores after short intensive training (10 h) in Experiment 1, and progress in the reading and spelling scores obtained by the dyslexic children (training for 8 h) in Experiment 2. These results are discussed within the frameworks of both the speech-specific deficit theory of dyslexia and connectionist models of reading development. Copyright © 2008 John Wiley & Sons, Ltd. [source]
The effect of time spent in treatment and dropout status on rates of convictions, cautions and imprisonment over 5 years in a primary care-led methadone maintenance service. ADDICTION, Issue 4 2010. Phillip Oliver.
Background: Methadone maintenance treatment (MMT) in primary care settings is used increasingly as a standard method of delivering treatment for heroin users. It has been shown to reduce criminal activity and incarceration over periods of 12 months or less; however, little is known about the effect of this treatment over longer durations. Aims: To examine the association between treatment status and rates of convictions and cautions (judicial disposals) over a 5-year period in a cohort of heroin users treated in a general practitioner (GP)-led MMT service. Design: Cohort study. Setting: The primary care clinic for drug dependence, Sheffield, 1999-2005. Participants: The cohort comprised 108 consecutive patients who were eligible and entered treatment. Ninety were followed up for the full 5 years. Intervention: MMT provided by GPs in a primary care clinic setting. Measurements: Criminal conviction and caution rates and time spent in prison, derived from Police National Computer (PNC) criminal records. Findings: The overall reduction in the number of convictions and cautions expected for patients entering MMT in similar primary care settings is 10% for each 6 months retained in treatment. Patients in continuous treatment had the greatest reduction in judicial disposal rates, similar to those who were discharged for positive reasons (e.g. drug free). Patients who had more than one treatment episode over the observation period did no better than those who dropped out of treatment. Conclusions: MMT delivered in a primary care clinic setting is effective in reducing convictions, cautions and incarceration over an extended period. Continuous treatment is associated with the greatest reductions. [source]

Modeling Passing Rates on a Computer-Based Medical Licensing Examination: An Application of Survival Data Analysis. EDUCATIONAL MEASUREMENT: ISSUES AND PRACTICE, Issue 3 2004. André F. de Champlain.
The purpose of this article was to model United States Medical Licensing Examination (USMLE) Step 2 passing rates using the Cox Proportional Hazards Model, best known for its application in analyzing clinical trial data. The number of months it took to pass the computer-based Step 2 examination was treated as the dependent variable in the model. Covariates in the model were: (a) medical school location (U.S. and Canadian or other), (b) primary language (English or other), and (c) gender. Preliminary findings indicate that examinees were nearly 2.7 times more likely to experience the event (pass Step 2) if they were U.S. or Canadian trained. Examinees with English as their primary language were 2.1 times more likely to pass Step 2, but gender had little impact. These findings are discussed more fully in light of past research and of broader potential applications of survival analysis in educational measurement. [source]
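An analysis of this shape can be reproduced in outline with any survival-analysis package. The sketch below uses the Python lifelines library on synthetic data whose hazard ratios are chosen to mimic the reported effects; the column names and all data values are invented for illustration and are not taken from the study:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# Synthetic stand-in for the licensing data: months until Step 2 was passed,
# plus the three covariates used in the study (values are hypothetical).
rng = np.random.default_rng(0)
n = 500
us_canadian = rng.integers(0, 2, n)
english = rng.integers(0, 2, n)
female = rng.integers(0, 2, n)
# Cox model: h(t | x) = h0(t) * exp(b1*x1 + b2*x2 + b3*x3); the simulated
# hazard ratios are ~2.7 (US/Canadian) and ~2.1 (English), with no gender effect.
rate = 0.05 * np.exp(np.log(2.7) * us_canadian + np.log(2.1) * english)
months = rng.exponential(1.0 / rate)

df = pd.DataFrame({"months_to_pass": months, "passed": 1,
                   "us_canadian": us_canadian, "english": english, "female": female})
cph = CoxPHFitter()
cph.fit(df, duration_col="months_to_pass", event_col="passed")
cph.print_summary()   # exp(coef) column recovers hazard ratios near 2.7, 2.1 and 1.0
```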
Validity Issues in Computer-Based Testing. EDUCATIONAL MEASUREMENT: ISSUES AND PRACTICE, Issue 3 2001. Kristen L. Huff.
Advances in technology are stimulating the development of complex, computerized assessments. The prevailing rationales for developing computer-based assessments are improved measurement and increased efficiency. In the midst of this measurement revolution, test developers and evaluators must revisit the notion of validity. In this article, we discuss the potential positive and negative effects computer-based testing could have on validity, review the literature regarding validation perspectives in computer-based testing, and provide suggestions regarding how to evaluate the contributions of computer-based testing to more valid measurement practices. We conclude that computer-based testing shows great promise for enhancing validity, but at this juncture, it remains equivocal whether technological innovations in assessment have led to more valid measurement. [source]

Audio Computer-Based Tests (CBTs): An Initial Framework for the Use of Sound in Computerized Tests. EDUCATIONAL MEASUREMENT: ISSUES AND PRACTICE, Issue 2 2001. Cynthia G. Parshall.
Are there important aspects of human ability that we have not been measuring? What are the purposes and types of audio that are possible in computerized tests? Will the use of audio in computer-based tests lead to more valid and reliable measurement? [source]

Computer-based psychological treatment for comorbid depression and problematic alcohol and/or cannabis use: a randomized controlled trial of clinical efficacy. ADDICTION, Issue 3 2009. Frances J. Kay-Lambkin.
Aims: To evaluate computer- versus therapist-delivered psychological treatment for people with comorbid depression and alcohol/cannabis use problems. Design: Randomized controlled trial. Setting: Community-based participants in the Hunter Region of New South Wales, Australia. Participants: Ninety-seven people with comorbid major depression and alcohol/cannabis misuse. Intervention: All participants received a brief intervention (BI) for depressive symptoms and substance misuse, followed by random assignment to no further treatment (BI alone) or nine sessions of motivational interviewing and cognitive behaviour therapy (intensive MI/CBT). Participants allocated to the intensive MI/CBT condition were selected at random to receive their treatment 'live' (i.e. delivered by a psychologist) or via a computer-based program (with brief weekly input from a psychologist). Measurements: Depression, alcohol/cannabis use and hazardous substance use index scores measured at baseline, and 3, 6 and 12 months post-baseline assessment. Findings: (i) Depression responded better to intensive MI/CBT compared to BI alone, with 'live' treatment demonstrating a strong short-term beneficial effect which was matched by computer-based treatment at 12-month follow-up; (ii) problematic alcohol use responded well to BI alone and even better to the intensive MI/CBT intervention; (iii) intensive MI/CBT was significantly better than BI alone in reducing cannabis use and hazardous substance use, with computer-based therapy showing the largest treatment effect. Conclusions: Computer-based treatment, targeting both depression and substance use simultaneously, results in at least equivalent 12-month outcomes relative to a 'live' intervention. For clinicians treating people with comorbid depression and alcohol problems, BIs addressing both issues appear to be an appropriate and efficacious treatment option. Primary care of those with comorbid depression and cannabis use problems could involve computer-based integrated interventions for depression and cannabis use, with brief regular contact with the clinician to check on progress. [source]

Computer and Electronics Product Stewardship: Are We Ready for the Challenge? ENVIRONMENTAL QUALITY MANAGEMENT, Issue 1 2001. Cate Gable.
The PC revolution continues to produce faster and smarter machines at a stunning pace. Almost forgotten in the rush are the millions of nearly new, but suddenly outdated, computers that are abandoned every year. Can product stewardship offer life after end-of-life for this growing mountain of "attic-ware", while averting a costly, and potentially toxic, waste disposal crisis? © 2001 by Cate Gable, Axioun Books. Used with permission. [source]
Quality assurance of specialised treatment of eating disorders using large-scale internet-based collection systems: Methods, results and lessons learned from designing the Stepwise database. EUROPEAN EATING DISORDERS REVIEW, Issue 4 2010. Andreas Birgegård.
Computer-based quality assurance of specialist eating disorder (ED) care is a possible way of meeting demands for evaluating the real-life effectiveness of treatment in a large-scale, cost-effective and highly structured way. The Internet-based Stepwise system combines clinical utility for patients and practitioners with the provision of research-quality naturalistic data. Stepwise was designed to capture relevant variables concerning EDs and general psychiatric status, and the database can be used for both clinical and research purposes. The system comprises semi-structured diagnostic interviews, clinical ratings and self-ratings, automated follow-up schedules, as well as administrative functions to facilitate registration compliance. As of June 2009, the system is in use at 20 treatment units and comprises 2776 patients. Diagnostic distribution (including subcategories of eating disorder not otherwise specified) and clinical characteristics are presented, as well as data on registration compliance. Obstacles and keys to successful implementation of the Stepwise system are discussed, including possible gains and ongoing challenges inherent in large-scale, Internet-based quality assurance. Copyright © 2010 John Wiley & Sons, Ltd and Eating Disorders Association. [source]

SLIC-1/sorting nexin 20: A novel sorting nexin that directs subcellular distribution of PSGL-1. EUROPEAN JOURNAL OF IMMUNOLOGY, Issue 2 2008. Ulrich.
P-selectin glycoprotein ligand-1 (PSGL-1) is a mucin-like glycoprotein expressed on the surface of leukocytes that serves as the major ligand for the selectin family of adhesion molecules and functions in leukocyte tethering and rolling on activated endothelium and platelets. Previous studies have implicated the highly conserved cytoplasmic domain of PSGL-1 in regulating outside-in signaling of integrin activation. However, molecules that physically and functionally interact with this domain are not completely defined. Using a yeast two-hybrid screen with the cytoplasmic domain of PSGL-1 as bait, a novel protein designated selectin ligand interactor cytoplasmic-1 (SLIC-1) was isolated. A computer-based homology search revealed that SLIC-1 is the human orthologue of the previously identified mouse sorting nexin 20. Direct interaction between SLIC-1 and PSGL-1 was specific, as indicated by co-immunoprecipitation and motif mapping. Colocalization experiments demonstrated that SLIC-1 contains a Phox homology domain that binds phosphoinositides and targets the PSGL-1/SLIC-1 complex to endosomes. Deficiency in the murine homologue of SLIC-1 neither modulated PSGL-1-dependent signaling nor altered neutrophil adhesion through PSGL-1. We conclude that SLIC-1 serves as a sorting molecule that cycles PSGL-1 into endosomes with no impact on leukocyte recruitment. [source]

The Joint Research Program "CPR Precipitation": Towards More Powerful Computer Assisted Metallurgy Codes. ADVANCED ENGINEERING MATERIALS, Issue 12 2006. P. Maugis.
Computer Assisted Metallurgy (CAM) is developed and used by both Alcan and Arcelor to predict material properties, optimize processing and accelerate the development of new innovative solutions. These CAM codes describe the microstructure evolution of an alloy from solidification to the final step of the transformation schedule, and predict usage properties from the simulated end-product microstructure. The accuracy of the predictions requires a reliable laboratory or plant experimental database and robust physical laws. [source]

Approaches to locating expertise using corporate knowledge. INTELLIGENT SYSTEMS IN ACCOUNTING, FINANCE & MANAGEMENT, Issue 4 2002. Richard Crowder.
In many organizations people need to locate colleagues with the knowledge and information required to resolve a problem. Computer-based systems that assist users with finding such expertise are increasingly important to industrial organizations. In this paper we discuss the development of Expertise Finders suitable for use within the engineering design environment, as illustrated through the use of a scenario. A key feature of this work is that the Expertise Finder returns both recommended contacts and supporting documentation. The Expertise Finder bases its results on information held within the organization, e.g. on-line publication repositories and human resource records, and not on individually compiled curricula vitae or other forms of user-maintained records. The recommendations are presented to the user with due regard to the social context, and are supported by the documents used to make the recommendation. Copyright © 2003 John Wiley & Sons, Ltd. [source]
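The abstract does not specify how the Expertise Finder ranks candidate contacts. One plausible, deliberately simplified approach, sketched below with invented authors and documents, is to score each person by the TF-IDF cosine similarity between a query and the publications that person has written; this is an assumption for illustration, not the published system:

```python
import math
from collections import Counter

def rank_experts(query, docs_by_author):
    """Rank authors by cosine similarity between the query and a TF-IDF vector
    built from the documents each author has written."""
    texts = {a: " ".join(d).lower().split() for a, d in docs_by_author.items()}
    df = Counter()                      # document frequency of each term
    for tokens in texts.values():
        df.update(set(tokens))
    n = len(texts)

    def tfidf(tokens):
        tf = Counter(tokens)
        return {t: c * math.log(n / df[t]) for t, c in tf.items() if df[t]}

    def cosine(u, v):
        dot = sum(w * v.get(t, 0.0) for t, w in u.items())
        nu = math.sqrt(sum(w * w for w in u.values()))
        nv = math.sqrt(sum(w * w for w in v.values()))
        return dot / (nu * nv) if nu and nv else 0.0

    q = tfidf(query.lower().split())
    return sorted(((cosine(q, tfidf(t)), a) for a, t in texts.items()), reverse=True)

corpus = {"Ada":  ["fatigue analysis of turbine blades", "finite element meshing"],
          "Bram": ["supply chain optimisation", "inventory control heuristics"]}
print(rank_experts("turbine fatigue analysis", corpus))   # Ada ranks first
```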
Automatic CAD model topology generation. INTERNATIONAL JOURNAL FOR NUMERICAL METHODS IN FLUIDS, Issue 8 2006. Paresh S. Patel.
Computer aided design (CAD) models often need to be processed, because of data translation issues and the requirements of downstream applications such as computational field simulation, rapid prototyping, computer graphics, computational manufacturing and real-time rendering, before they can be used. Automatic CAD model processing tools can significantly reduce the time and cost associated with manual processing. A topology generation algorithm, commonly known as CAD repairing/healing, is presented to detect commonly found geometrical and topological issues such as cracks, gaps, overlaps, intersections, T-connections and missing or invalid topology in the model, to process them, and to build correct topological information. The algorithm is based on iterative vertex pair contraction and expansion operations, called stitching and filling respectively, to process the model accurately. Moreover, the topology generation algorithm can process manifold as well as non-manifold models, which makes the procedure more general and flexible. In addition, a spatial data structure is used for searching and neighbour finding, so that large models can be processed efficiently. The combination of generality, accuracy and efficiency of this algorithm appears to be a significant improvement over existing techniques. Results are presented showing the effectiveness of the algorithm in processing two- and three-dimensional configurations. Copyright © 2006 John Wiley & Sons, Ltd. [source]
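The stitching operation described above contracts pairs of nearby vertices. The sketch below illustrates only the simplest version of that idea, snapping vertices that fall in the same tolerance-sized grid cell onto one index so that faces separated by a small crack end up sharing an edge, from which edge-to-face topology can be read off; it is an illustration of the concept, not the published algorithm:

```python
from collections import defaultdict

def weld_vertices(triangles, tol=1e-4):
    """Snap vertices closer than ~tol onto shared indices (a crude 'stitch'),
    then build the edge-to-face adjacency that constitutes the model topology."""
    verts, faces, cache = [], [], {}

    def index_of(p):
        # quantise to a tol-sized grid (pairs straddling a cell boundary would need a second pass)
        key = tuple(round(c / tol) for c in p)
        if key not in cache:
            cache[key] = len(verts)
            verts.append(p)
        return cache[key]

    for tri in triangles:
        faces.append(tuple(index_of(p) for p in tri))

    edge_faces = defaultdict(list)               # which faces share each edge
    for fi, (a, b, c) in enumerate(faces):
        for e in ((a, b), (b, c), (c, a)):
            edge_faces[tuple(sorted(e))].append(fi)
    return verts, faces, edge_faces

# Two triangles whose common edge is offset by a tiny gap get stitched together:
t1 = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
t2 = [(1.0, 0.0, 0.0), (0.0, 1.0, 1e-5), (1.0, 1.0, 0.0)]
verts, faces, edge_faces = weld_vertices([t1, t2])
print(faces)                                               # [(0, 1, 2), (1, 2, 3)]
print([e for e, f in edge_faces.items() if len(f) == 2])   # the shared (stitched) edge
```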
Circuits, computers, and beyond Boolean logic. INTERNATIONAL JOURNAL OF CIRCUIT THEORY AND APPLICATIONS, Issue 5-6 2007. Tamás Roska.
Historically, the invention of the stored-program computer architecture introduced by John von Neumann was influenced by electrical circuit implementation aspects as well as tied to fundamental insights of logical reasoning; it can be considered a mind-inspired machine. Since then, the implementation of logic gates, control and memories has developed independently of the architecture. The Cellular Wave Computer architecture (IEEE Trans. Circuits Syst. II 1993; 40:163-173; Electron. Lett. 2007; 43:427-449; J. Circuits Syst. Comput. 2003; 5(2):539-562), as a spatial-temporal universal machine on flows, has also been influenced by circuit aspects of very large-scale integration (VLSI) technology, as well as by some motivating living neural circuits, via the cellular nonlinear (neural) network (CNN). It might be considered a brain-inspired machine. In this paper, after summarizing the main properties of the Cellular Wave Computer, we highlight a few basic properties of this new kind of computer and computing, in particular phenomena related to (i) the one-pass solution of a set of implicit equations due to real-time spatial array feedback, (ii) true random signal array generation via the insertion of continuous physical noise signals, (iii) the finite synchrony radius due to the functional delay of wires, and (iv) biological relevance. We also show that the Cellular Wave Computer performs spatial-temporal inference that goes beyond Boolean logic, a characteristic of living neural circuits. Copyright © 2007 John Wiley & Sons, Ltd. [source]

Split agent-based routing in interconnected networks. INTERNATIONAL JOURNAL OF COMMUNICATION SYSTEMS, Issue 4 2004. Constandinos X. Mavromoustakis.
The adaptive behaviour of swarm-based agents (BT Technol. J. 1994; 12:104-113; AAMAS Conference '02, Melbourne, Australia, 2002; Softcomput. J. 2001; 5(4):313-317) is studied in this paper with respect to network throughput for a certain amount of data traffic. Algorithmically complex problems such as routing data packets in a network need to be faced with a dynamically adaptive approach such as an agent-based scheme. Particularly in interconnected networks, where multiple networks participate in forming a large-scale network with different QoS levels and heterogeneity in the service of delay-sensitive packets, the routing algorithm must adapt to frequent network changes in order to anticipate such situations. The split agent-based routing technique (SART) is a variant of swarm-based routing (Adapt. Behav. 1997; 5:169-207; Proceedings of the 2003 International Symposium on Performance Evaluation of Computer and Telecommunication Systems, SPECTS, Montreal, Canada, July 20-24, 2003; 240-247) in which agents are split after their departure to the next node on a hop-by-hop basis. Packets that are delay sensitive are marked as prioritized, which agents recognise as part of a packet and use to influence the two-way routing tables. A thorough examination is made of the performance of the proposed algorithm and the QoS it offers in the network, taking into account a number of metrics. It is shown that the split agent routing scheme applied to interconnected networks offers decentralized control of the network and an efficient way to increase overall performance and packet control while reducing packet loss. Copyright © 2004 John Wiley & Sons, Ltd. [source]
Computer-based morphometry of brain. INTERNATIONAL JOURNAL OF IMAGING SYSTEMS AND TECHNOLOGY, Issue 2 2010. Bang-Bon Koo.
Over the past decade, probing the anatomy of the brain has re-emerged as an important field of neuroscience. In combination with functional imaging techniques, the rapid advancement of neuroimaging techniques such as magnetic resonance imaging, and their growing applicability to studying brain morphometry, has led to great advances in neuroscience research. Considering the range of technologies required, from image processing to statistics, it is critical to have an overall understanding of this subject when performing morphometry of the brain. The major objective of this review is to provide a practical introduction to the field. The review starts by covering basic concepts and techniques that are commonly used in morphometry of structural magnetic resonance imaging and then extends to further technical perspectives. © 2010 Wiley Periodicals, Inc. Int J Imaging Syst Technol, 20, 117-125, 2010 [source]

An underwater photomosaic technique using Adobe Photoshop™. INTERNATIONAL JOURNAL OF NAUTICAL ARCHAEOLOGY, Issue 1 2002. Colin J. M. Martin.
A simple technique for taking systematic runs of vertical underwater photographs is described. Computer-based procedures for rectifying, matching and assembling the photographs into mosaics using Adobe Photoshop™ software are then explained. [source]

Computer-based endoscopic image-processing technology for endourology and laparoscopic surgery. INTERNATIONAL JOURNAL OF UROLOGY, Issue 6 2009. Tatsuo Igarashi.
Endourology and laparoscopic surgery are evolving in step with developments in instrumentation and progress in surgical technique. Recent advances in computer and image-processing technology have enabled novel images to be created from conventional endoscopic and laparoscopic video images. Such technology has the potential to advance endourology and laparoscopic surgery by adding new value and function to the endoscope. The panoramic and three-dimensional images created by computer processing are two outstanding features that can address the shortcomings of conventional endoscopy and laparoscopy, such as a narrow field of view, the lack of depth cues and discontinuous information. The wide panoramic images show an anatomical 'map' of the abdominal cavity and hollow organs with high brightness and resolution, as the images are collected from video images taken in a close-up manner. To assist in laparoscopic surgery, especially in suturing, a three-dimensional movie can be obtained by enhancing movement parallax using a conventional monocular laparoscope. In tubular organs such as the prostatic urethra, reconstruction of the three-dimensional structure can be achieved, implying the possibility of a liquid dynamic model for assessing local urethral resistance during urination. Computer-based processing of endoscopic images will establish new tools for endourology and laparoscopic surgery in the near future. [source]

Some characteristics of sperm motility in European hake (Merluccius merluccius, L., 1758). JOURNAL OF APPLIED ICHTHYOLOGY, Issue 5 2010. A.-L. Groison.
The objective of this paper is to characterize some of the sperm motility parameters of European hake (Merluccius merluccius), which is considered a species with aquaculture potential. Total ATP, ADP and AMP concentrations were determined using high-performance liquid chromatography on hake sperm samples collected during winter-early spring in the Bay of Biscay, France (n = 22), and during summer-early autumn in waters off Western Norway (n = 5). The Adenylate Energy Charge (AEC) was deduced from these data. Computer Assisted Sperm Analysis (CASA) was used to measure a series of parameters characterizing motility and sperm swimming performance. Changes in the salinity of the swimming medium affected all the measured motility parameters. Sperm velocity and the straightness of movement were at a maximum when sperm was activated with 100% filtered sea water (100 SW) but decreased sharply afterwards. When sperm was activated in filtered sea water diluted 50% with distilled water (50 SW), the values of these parameters increased (with a lower percentage of active cells) during the first 2.5 min and thereafter decreased slowly. In 50 SW the initial velocity was lower, but the swimming period lasted 4.5 times longer than in 100 SW (again with a lower percentage of actively swimming cells). Initial sperm motility (percentage of swimming cells) in 100 SW was affected by sperm storage duration. Undiluted sperm could be stored at 4°C for 5 days and still show 13 ± 7% motility; the velocity and straightness of movement were at a maximum at the earliest period of measurement (0.5-1 day of storage) and then decreased gradually, reaching their minima after 4 days of storage. Furthermore, both the AEC and the ATP content decreased with storage time, the AEC decreasing from 0.78 ± 0.07 (mean ± SD) at stripping time to 0.20 ± 0.09 after 2 days of storage. Over the same period the ATP content decreased from 85 ± 80 to 5 ± 4 nanomoles per 10^9 spermatozoa, these data showing high variability. [source]
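The abstract reports Adenylate Energy Charge values without restating the formula. Assuming the standard Atkinson definition, AEC = ([ATP] + 0.5[ADP]) / ([ATP] + [ADP] + [AMP]), the computation is direct; the concentrations in the example below are illustrative, not the study's measurements:

```python
def adenylate_energy_charge(atp, adp, amp):
    """Atkinson's adenylate energy charge: (ATP + 0.5*ADP) / (ATP + ADP + AMP).
    The study reports a drop from about 0.78 at stripping to about 0.20 after
    two days of cold storage."""
    return (atp + 0.5 * adp) / (atp + adp + amp)

print(adenylate_energy_charge(atp=85.0, adp=10.0, amp=5.0))   # 0.90 with illustrative values
```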
Computer Mediated Markets: An Introduction and Preliminary Test of Market Structure Impacts. JOURNAL OF COMPUTER-MEDIATED COMMUNICATION, Issue 3 2000. Charles Steinfield.
Electronic commerce may influence the way in which goods are traded between businesses. Many believe that Internet-based business-to-business e-commerce will reduce the extent to which firms buying goods and services are "locked in" to a single supplier. Using a secondary analysis of data collected in late 1996 on firms' use of electronic networks for transactions, we empirically test the effects of Internet use on buyer lock-in. Results are weak, but suggest that using the Internet rather than proprietary computer networks to connect with external trading partners appears to lessen a buying firm's dependence on its primary supplier. The Internet seems to be especially valuable in allowing small firms to connect to external constituents. [source]

The Effect of Computer-Based Tests on Racial-Ethnic and Gender Groups. JOURNAL OF EDUCATIONAL MEASUREMENT, Issue 2 2002. Ann Gallagher.
In this study, data from several national testing programs were examined to determine whether the change from paper-based administration to computer-based tests (CBTs) influences group differences in performance. Performance by gender, racial and ethnic groups on the Graduate Record Examination General Test, the Graduate Management Admission Test, the SAT I: Reasoning Test, and the Praxis: Professional Assessment for Beginning Teachers was analyzed to determine whether the shift from paper-and-pencil tests to CBTs posed a disadvantage to any of these subgroups beyond that already identified for paper-based tests. Although all differences were quite small, some consistent patterns were found for some racial-ethnic and gender groups. African-American examinees and, to a lesser degree, Hispanic examinees appear to benefit from the CBT format. On some tests, female examinees' performance was relatively lower on the CBT version. [source]
Computer aided design for sustainable industrial processes: Specific tools and applications. AICHE JOURNAL, Issue 4 2009. Maurizio Fermeglia.
Chemical process sustainability can be estimated using different sustainability indicators. The quantitative estimation of these indicators is necessary (i) for evaluating the environmental impact of a chemical process and (ii) for choosing the best design among the available alternatives. To accomplish these goals, the computerized calculation of sustainability indicators requires at least three computer tools: (i) process simulation, (ii) molecular modeling and (iii) a software code for the sustainability indicators. In this work, a complete software platform, the Process Sustainability Prediction Framework, integrated with process simulation programs that support the CAPE-OPEN interfaces, is presented and discussed. The article also describes the application of molecular modeling techniques to estimate toxicological data, which are used in the calculation of the sustainability indicators. A representative example of one chemical process, together with the thermo-physical properties used in the toxicological data calculation, is reported to demonstrate the applicability of the software to real cases. © 2009 American Institute of Chemical Engineers AIChE J, 2009 [source]

Enhancing students' understanding of the concept of chemical bonding by using activities provided on an interactive website. JOURNAL OF RESEARCH IN SCIENCE TEACHING, Issue 3 2009. Marcel Frailich.
This study investigated the effectiveness of a web-based learning environment in enhancing 10th-grade high-school students' understanding of the concept of chemical bonding. Two groups participated in this study: an experimental group (N = 161) and a comparison group (N = 93). The teachers in the experimental group were asked to implement four activities taken from a website, all dealing with the concept of chemical bonding. Computer-based visual models are utilized in all the activities in order to demonstrate bonding and the structure of matter, and the activities are based on student-centered learning. The study incorporated both quantitative and qualitative research: the quantitative part consisted of achievement questionnaires administered to both the experimental and comparison groups, while the qualitative part included observations and interviews of students and teachers. Importantly, we found that the experimental group significantly outperformed the comparison group in the achievement post-test, which examines students' understanding of the concept of chemical bonding. These results led us to conclude that the web-based learning activities, which integrated visualization tools with active and cooperative learning strategies, provided students with opportunities to construct their knowledge of the concept of chemical bonding. © 2008 Wiley Periodicals, Inc. J Res Sci Teach 46: 289-310, 2009 [source]
A randomized, controlled, single-blind trial of teaching provided by a computer-based multimedia package versus lecture. MEDICAL EDUCATION, Issue 9 2001. Christopher Williams.
Background: Computer-based teaching may allow effective teaching of important psychiatric knowledge and skills. Aims: To investigate the effectiveness and acceptability of computer-based teaching. Method: A single-blind, randomized, controlled study of 166 undergraduate medical students at the University of Leeds, involving an educational intervention of either a structured lecture or a computer-based teaching package (both of equal duration). Results: There was no difference in knowledge between the groups at baseline or immediately after teaching, and both groups made significant gains in knowledge after teaching. Students who attended the lecture rated their subjective knowledge and skills at a statistically significantly higher level than students who had used the computers, whereas students who had used the computer package scored higher on an objective measure of assessment skills. Students did not perceive the computer package to be as useful as the traditional lecture format, despite finding it easy to use and recommending its use to other students. Conclusions: Medical students rate themselves subjectively as learning less from computer-based than from lecture-based teaching. Objective measures suggest equivalence in knowledge acquisition and significantly greater skills acquisition with computer-based teaching. [source]

A Computer-Based Method for Determination of the Cell-Free Layer Width in Microcirculation. MICROCIRCULATION, Issue 3 2006. Sangho Kim.
Objectives: The cell-free layer between the erythrocyte column and the vessel wall is an important determinant of hydrodynamic resistance in microcirculatory vessels. The authors report a method for continuous measurement of the width of this layer. Methods: The light intensity of a linear array of pixels perpendicular to the vessel axis is continuously determined from a video image of a microcirculatory vessel, and a threshold level based on Otsu's method is used to establish the interface between the cell-free layer and the erythrocyte column. To test the method, video images at 750-4500 frames/s were obtained from venules and arterioles in rat spinotrapezius muscle at normal and reduced arterial pressures, before and after induction of erythrocyte aggregation with Dextran 500. The new measurements were compared with manual measurements of the same images. Results: Values obtained by the manual and the new methods were in agreement within the 95% confidence limit by Bland-Altman analysis and within the 90-95% range by the correlation coefficient (R^2). The more frequent measurements reveal substantial, rapid variations in cell-free layer width and changes in mean values with alteration of arterial pressure and red cell aggregability. Conclusions: A new, computer-based technique has been developed that provides measurements of rapid, time-dependent variations in the width of the cell-free layer in the microcirculation. [source]
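A minimal sketch of the thresholding step is given below: Otsu's threshold is the intensity cut that maximises the between-class variance along the pixel line, and the cell-free layer width is then read off as the distance from the vessel wall to the first pixel assigned to the erythrocyte column. The profile values, wall position and pixel size are invented for illustration; the real method's details (frame rate, wall detection, calibration) are as described in the abstract, not here:

```python
import numpy as np

def otsu_threshold(values, bins=64):
    """Intensity threshold that maximises between-class variance (Otsu's method)."""
    hist, edges = np.histogram(values, bins=bins)
    prob = hist.astype(float) / hist.sum()
    centers = 0.5 * (edges[:-1] + edges[1:])
    best_t, best_var = centers[0], -1.0
    for k in range(1, bins):
        w0, w1 = prob[:k].sum(), prob[k:].sum()
        if w0 == 0.0 or w1 == 0.0:
            continue
        mu0 = (prob[:k] * centers[:k]).sum() / w0
        mu1 = (prob[k:] * centers[k:]).sum() / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, centers[k]
    return best_t

def cell_free_layer_width(profile, wall_index, um_per_pixel):
    """Width of the bright plasma band between the vessel wall and the darker
    erythrocyte column along one pixel line perpendicular to the vessel axis."""
    t = otsu_threshold(profile)
    for i in range(wall_index, len(profile)):
        if profile[i] < t:             # first pixel dark enough to be the RBC column
            return (i - wall_index) * um_per_pixel
    return (len(profile) - wall_index) * um_per_pixel

# Invented profile: a bright plasma gap of 6 pixels, then the dark erythrocyte column.
line = np.array([200, 205, 210, 208, 206, 202, 90, 60, 55, 58, 62, 65], dtype=float)
print(cell_free_layer_width(line, wall_index=0, um_per_pixel=0.8))   # 4.8 micrometres
```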