Developers
Kinds of Developers: Selected Abstracts

IDEOLOGICAL DEVELOPERS AND THE FORMATION OF LOCAL DEVELOPMENT POLICY: THE CASE OF INNER-CITY PRESERVATION IN TEL AVIV
JOURNAL OF URBAN AFFAIRS, Issue 5 2008. NURIT ALFASI
This article studies the role of ideological developers (IDs) in the formation and implementation of local development policy. IDs are developers whose motivation is ideological rather than financial, and they initiate ideas rather than plans and projects. Based on a case study of inner-city preservation, we claim that in Tel Aviv, IDs have considerable leverage over local decision making. IDs are individuals with high personal capital who focus on an issue that is not championed by existing civil groups. As IDs seek out routes of influence to policy makers, they build circumstantial coalitions. Through these limited and conditional partnerships with administrators and other influential actors, IDs apply pressure and advance their specific cause. [source]

CODE IS SPEECH: Legal Tinkering, Expertise, and Protest among Free and Open Source Software Developers
CULTURAL ANTHROPOLOGY, Issue 3 2009. GABRIELLA COLEMAN
In this essay, I examine the channels through which Free and Open Source Software (F/OSS) developers reconfigure central tenets of the liberal tradition, and the meanings of both freedom and speech, to defend against efforts to constrain their productive autonomy. I demonstrate how F/OSS developers contest and specify the meaning of liberal freedom, especially free speech, through the development of legal tools and discourses within the context of the F/OSS project. I highlight how developers concurrently tinker with technology and the law using similar skills, which transform and consolidate ethical precepts among developers. I contrast this legal pedagogy with more extraordinary legal battles over intellectual property, speech, and software. I concentrate on the arrests of two programmers, Jon Johansen and Dmitry Sklyarov, and on the protests they provoked, which unfolded between 1999 and 2003. These events are analytically significant because they dramatized, and thus made visible, tacit social processes. They publicized the challenge that F/OSS represents to the dominant regime of intellectual property (and clarified the democratic stakes involved) and also stabilized a rival liberal legal regime intimately connecting source code to speech. [source]

Inclusive Achievement Testing for Linguistically and Culturally Diverse Test Takers: Essential Considerations for Test Developers and Decision Makers
EDUCATIONAL MEASUREMENT: ISSUES AND PRACTICE, Issue 1 2009. Shelley B. Fairbairn
Substantial growth in the numbers of English language learners (ELLs) in the United States and Canada in recent years has significantly affected the educational systems of both countries. This article focuses on critical issues and concerns related to the assessment of ELLs in U.S. and Canadian schools and emphasizes assessment approaches for test developers and decision makers that will facilitate increased equity, meaningfulness, and accuracy in assessment and accountability efforts. It begins by examining the crucial issue of defining ELLs as a group. Next, it examines the impact of testing originating from the No Child Left Behind Act of 2001 (NCLB) in the U.S. and government-mandated standards-driven testing in Canada by briefly describing each country's legislated testing requirements and outlining their consequences at several levels. Finally, the authors identify key points that test developers and decision makers in both contexts should consider in testing this ever-increasing group of students. [source]

Agile requirements engineering practices and challenges: an empirical study
INFORMATION SYSTEMS JOURNAL, Issue 5 2010. Balasubramaniam Ramesh
This paper describes empirical research into agile requirements engineering (RE) practices. Based on an analysis of data collected in 16 US software development organizations, we identify six agile RE practices. We also identify seven challenges that are created by the use of these practices. We further analyse how this collection of practices helps mitigate some risks in RE while exacerbating others. We provide a framework for evaluating the impact and appropriateness of agile RE practices by relating them to RE risks. Two risks that are intractable by agile RE practices emerge from the analysis. First, problems with customer inability, and a lack of concurrence among customers, significantly impact agile development. Second, risks associated with neglecting non-functional requirements such as security and scalability are a serious concern. Developers should carefully evaluate the risk factors in their project environment to understand whether the benefits of agile RE practices outweigh the costs imposed by the challenges. [source]
Comprehend and analyze knowledge networks to improve software evolution
JOURNAL OF SOFTWARE MAINTENANCE AND EVOLUTION: RESEARCH AND PRACTICE, Issue 3 2009. Christian Del Rosso
When a set of people are connected by a set of meaningful social relationships, we talk of a social network. A social network represents a social structure, and the underlying structural patterns can be used to analyze and comprehend how people relate to each other and their emergent behavior as a group. Developing software is fundamentally a human activity. Developers cooperate and exchange knowledge and information, creating, in fact, a particular type of social network that we call a knowledge network. In this paper we investigate knowledge networks in software development teams by applying social network analysis, using the Apache web server as a case study. By analyzing the structural communication and coordination patterns in Apache, we have been able to identify the Apache knowledge network, highlight potential communication bottlenecks, and find brokers and important coordination points in the software development team. Furthermore, our work enables a software architect to analyze the organization and keep it and the software architecture aligned during software evolution. An important lesson we have learned is that the analysis of knowledge networks constitutes an additional tool to be added to the traditional software architecture assessment methods. Copyright © 2009 John Wiley & Sons, Ltd. [source]
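The kind of analysis this abstract describes can be approximated in a few lines of plain Java. The sketch below is a minimal, hypothetical illustration: it builds an undirected communication graph from (sender, replier) pairs such as might be mined from a project mailing list, then flags the highest-degree developers as candidate brokers or bottlenecks. The data and the degree-centrality measure are assumptions made for the example; the paper's analysis may rely on richer centrality measures.

```java
import java.util.*;

// Minimal sketch (assumed data): degree centrality over a developer
// communication graph, one way to surface coordination points.
public class KnowledgeNetwork {
    public static void main(String[] args) {
        // Hypothetical mailing-list interactions: {sender, replier}
        String[][] messages = {
            {"alice", "bob"}, {"alice", "carol"}, {"bob", "carol"},
            {"alice", "dave"}, {"dave", "erin"}, {"carol", "erin"}
        };

        // Build an undirected adjacency structure.
        Map<String, Set<String>> graph = new HashMap<>();
        for (String[] m : messages) {
            graph.computeIfAbsent(m[0], k -> new TreeSet<>()).add(m[1]);
            graph.computeIfAbsent(m[1], k -> new TreeSet<>()).add(m[0]);
        }

        // Degree centrality: developers with many distinct contacts
        // are candidate brokers (or bottlenecks) in the team.
        graph.entrySet().stream()
             .sorted((a, b) -> b.getValue().size() - a.getValue().size())
             .forEach(e -> System.out.println(
                 e.getKey() + " degree=" + e.getValue().size()));
    }
}
```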
A Complexity Model and a Polynomial Algorithm for Decision-Tree-Based Feature Construction
COMPUTATIONAL INTELLIGENCE, Issue 1 2000. Raymond L. Major
Using decision trees as a concept description language, we examine the time complexity of learning Boolean functions with polynomial-sized disjunctive normal form expressions when feature construction is performed on an initial decision tree containing only primitive attributes. A shortcoming of several feature-construction algorithms found in the literature is that it is difficult to develop time complexity results for them. We illustrate a way to determine a limit on the number of features to use for building more concise trees within a standard amount of time. We introduce a practical algorithm that forms a finite number of features using a decision tree in a polynomial amount of time. We show empirically that our procedure forms many features that subsequently appear in a tree, and that the new features aid in producing simpler trees when concepts are being learned from certain problem domains. Expert systems developers can use a method such as this to create a knowledge base containing specific knowledge in the form of If-Then rules. [source]
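As a rough illustration of the connection between decision trees and If-Then rules that the abstract ends on, the hypothetical sketch below walks a small decision tree and emits one conjunctive rule per root-to-leaf path; each such conjunction is also the kind of compound feature that tree-based feature construction can introduce. The tree structure and attribute names are invented for the example; this is not the paper's algorithm.

```java
import java.util.*;

// Illustrative sketch: each root-to-leaf path of a decision tree is a
// conjunction of attribute tests, i.e. an If-Then rule (and a candidate
// constructed feature). The tree and attributes here are invented.
public class TreeRules {
    // A node is either a test on a Boolean attribute or a leaf label.
    static final class Node {
        final String attribute;   // null for leaves
        final String label;       // null for internal nodes
        final Node ifTrue, ifFalse;
        Node(String attribute, Node t, Node f) {
            this.attribute = attribute; this.label = null;
            this.ifTrue = t; this.ifFalse = f;
        }
        Node(String label) {
            this.attribute = null; this.label = label;
            this.ifTrue = null; this.ifFalse = null;
        }
    }

    static void emitRules(Node n, Deque<String> conditions) {
        if (n.attribute == null) {               // leaf: print one rule
            System.out.println("IF " + String.join(" AND ", conditions)
                               + " THEN class = " + n.label);
            return;
        }
        conditions.addLast(n.attribute);         // branch where test is true
        emitRules(n.ifTrue, conditions);
        conditions.removeLast();
        conditions.addLast("NOT " + n.attribute);
        emitRules(n.ifFalse, conditions);
        conditions.removeLast();
    }

    public static void main(String[] args) {
        Node tree = new Node("x1",
            new Node("x2", new Node("positive"), new Node("negative")),
            new Node("negative"));
        emitRules(tree, new ArrayDeque<>());
    }
}
```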
Behaviour-based multiplayer collaborative interaction management
COMPUTER ANIMATION AND VIRTUAL WORLDS (PREV: JNL OF VISUALISATION & COMPUTER ANIMATION), Issue 1 2006. Qingping Lin
A collaborative virtual environment (CVE) allows geographically dispersed users to interact with each other and with objects in a common virtual environment via network connections. One successful application of CVEs is the multiplayer online role-playing game. To support massive interactions among virtual entities in a large-scale CVE, and to maintain consistent interaction status among users under the constraint of limited network bandwidth, an efficient collaborative interaction management method is required. In this paper, we propose a behaviour-based interaction management framework for supporting multiplayer role-playing CVE applications. It incorporates a two-tiered architecture: high-level role-behaviour-based interaction management and low-level message routing. In the high level, interaction management is achieved by enabling interactions based on collaborative behaviour definitions. In the low level, message routing controls interactions according to the run-time status of the interactive entities. A Collaborative Behaviour Description Language is designed as a scripting interface for application developers to define the collaborative behaviours of interactive entities and the simulation logic/game rules in a CVE. We demonstrate and evaluate the performance of the proposed framework through a prototype system and simulations. Copyright © 2006 John Wiley & Sons, Ltd. [source]
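A toy rendering of the two-tier idea, under assumed names: the high tier holds declarative behaviour definitions saying which role may perform which action on which target role, and the low tier routes a message only if a matching definition exists and the target entity is currently active. The roles, actions, and classes are invented for illustration, standing in for what the paper's Collaborative Behaviour Description Language would express.

```java
import java.util.*;

// Toy two-tier interaction management (names invented for illustration).
// High tier: behaviour definitions enable interactions between roles.
// Low tier: routing honours the run-time status of target entities.
public class InteractionManager {
    record Behaviour(String actorRole, String action, String targetRole) {}
    record Entity(String name, String role, boolean active) {}

    private final Set<Behaviour> definitions = new HashSet<>();

    void define(String actorRole, String action, String targetRole) {
        definitions.add(new Behaviour(actorRole, action, targetRole));
    }

    // Low-level routing: deliver only if the high-level definitions
    // permit the interaction and the target is currently active.
    void route(Entity from, String action, Entity to) {
        boolean permitted =
            definitions.contains(new Behaviour(from.role(), action, to.role()));
        if (permitted && to.active()) {
            System.out.println(from.name() + " -> " + action + " -> " + to.name());
        } else {
            System.out.println("dropped: " + from.name() + " " + action);
        }
    }

    public static void main(String[] args) {
        InteractionManager im = new InteractionManager();
        im.define("healer", "heal", "warrior");           // a game rule
        im.route(new Entity("Mia", "healer", true), "heal",
                 new Entity("Rok", "warrior", true));     // delivered
        im.route(new Entity("Mia", "healer", true), "attack",
                 new Entity("Rok", "warrior", true));     // dropped
    }
}
```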
Kinematics, Dynamics, Biomechanics: Evolution of Autonomy in Game Animation
COMPUTER GRAPHICS FORUM, Issue 3 2005. Steve Collins
The believable portrayal of character performances is critical to engaging the immersed player in interactive entertainment. The story, the emotion, and the relationship between the player and the world they are interacting within are hugely dependent on how appropriately the world's characters look, move, and behave. We are concerned here with the characters' motion; with next-generation game consoles like the Xbox 360™ and PlayStation®3, the graphical representation of characters will take a major step forward, which places even more emphasis on the motion of the character. The behavior of the character is driven by story and design, which are adapted to game context by the game's AI system. The motion of the characters populating the game's world, however, is evolving into an interesting blend of kinematics, dynamics, biomechanics, and AI-driven motion planning. Our goal here is to present the technologies involved in creating what are essentially character automata: emotionless and largely brainless character shells that nevertheless exhibit enough "behavior" to move as directed while adapting to the environment through sensing and actuating responses. This abstracts away the complexities of low-level motion control, dynamics, collision detection, etc., and allows the game's artificial intelligence system to direct these characters at a higher level. While much research has already been conducted in this area and some great results have been published, we will present the particular issues that face game developers working on current and next-generation consoles, and discuss how these technologies may be integrated into game production pipelines so as to facilitate the creation of character performances in games. The challenges posed by limited memory and CPU bandwidth (though this is changing somewhat with the next generation) and the challenges of integrating these solutions with current game design approaches lead to some interesting problems, some of which the industry has solutions for, while others remain largely unsolved. [source]

Novel software architecture for rapid development of magnetic resonance applications
CONCEPTS IN MAGNETIC RESONANCE, Issue 3 2002. Josef Debbins
As the pace of clinical magnetic resonance (MR) procedures grows, the need for an MR scanner software platform on which developers can rapidly prototype, validate, and produce product applications becomes paramount. A software architecture has been developed for a commercial MR scanner that employs state-of-the-art software technologies, including Java, C++, DICOM, and XML. This system permits graphical (drag-and-drop) assembly of applications built on simple processing building blocks, including pulse sequences, a user interface, reconstruction and postprocessing, and database control. The application developer (researcher or commercial) can assemble these building blocks to create custom applications. The developer can also write source code directly to create new building blocks and add these to the collection of components, which can be distributed worldwide over the internet. The application software and its components are developed in Java, which assures platform portability across any host computer that supports a Java Virtual Machine. The downloaded executable portion of the application is executed in compiled C++ code, which assures mission-critical real-time execution during fast MR acquisition and data processing on dedicated embedded hardware that supports C or C++. This combination permits flexible and rapid MR application development across virtually any combination of computer configurations and operating systems, and yet allows very high performance execution on actual scanner hardware. Applications, including prescan, are inherently real-time enabled and can be aggregated and customized to form "superapplications," wherein one or more applications work with another to accomplish the clinical objective with a very fast transition between applications. © 2002 Wiley Periodicals, Inc. Concepts in Magnetic Resonance (Magn Reson Engineering) 15:216-237, 2002 [source]
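The composable building-block idea is easy to picture as an interface plus a pipeline. The sketch below, with invented names, shows the general pattern the abstract gestures at: each stage (acquisition, reconstruction, postprocessing) implements one small interface, and an application is just an ordered composition of stages. It is a pattern sketch, not the scanner platform's actual API.

```java
import java.util.*;
import java.util.function.UnaryOperator;

// Pattern sketch (invented names): applications assembled from simple
// processing building blocks, in the spirit the abstract describes.
public class BlockPipeline {
    // One building block: transforms intermediate data to intermediate data.
    interface Block extends UnaryOperator<double[]> {}

    // An "application" is an ordered list of blocks.
    static double[] run(List<Block> app, double[] raw) {
        double[] data = raw;
        for (Block b : app) data = b.apply(data);
        return data;
    }

    public static void main(String[] args) {
        Block acquire = d -> d;                                  // stand-in pulse sequence
        Block reconstruct = d -> Arrays.stream(d).map(Math::abs).toArray();
        Block normalize = d -> {
            double max = Arrays.stream(d).max().orElse(1.0);
            return Arrays.stream(d).map(v -> v / max).toArray();
        };
        double[] image = run(List.of(acquire, reconstruct, normalize),
                             new double[]{-3.0, 1.5, 2.0});
        System.out.println(Arrays.toString(image));              // [1.0, 0.5, 0.666...]
    }
}
```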
A large-scale monitoring and measurement campaign for web services-based applications
CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 10 2010. Riadh Ben Halima
Web Services (WS) can be considered the most influential enabling technology for the next generation of web applications. WS-based application providers will face challenging features related to non-functional properties in general, and to performance and QoS in particular. Moreover, WS-based developers have to provide solutions that extend such applications with self-healing (SH) mechanisms, as required for autonomic computing, to cope with the complexity of interactions and to improve availability. Such solutions should be applicable whether the components implementing SH mechanisms are deployed on both sides or on only one side (WS provider or requester), depending on the deployment constraints. Associating application-specific performance requirements with monitoring-specific constraints leads to complex configurations in which fine tuning is needed to provide SH solutions. To contribute to enhancing the design and assessment of such solutions for WS technology, we designed and implemented a monitoring and measurement framework, which is part of the larger Self-Healing Architecture (SHA) developed during the European WS-DIAMOND project. We implemented the Conference Management System (CMS), a real WS-based complex application, and carried out a large-scale experimentation campaign by deploying CMS on top of SHA on the French grid Grid5000, approaching the problem as a service provider who has to tune reconfiguration strategies would. Our results are available on the web in a structured database for external use by the WS community. Copyright © 2010 John Wiley & Sons, Ltd. [source]
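To make the monitoring side concrete, here is a small, hypothetical sketch of the kind of component such a framework needs at its base: a wrapper that times each service invocation, keeps a running average, and flags the service for reconfiguration when latency degrades past a threshold. The threshold, the names, and the repair hook are all invented; the WS-DIAMOND framework itself is far more elaborate.

```java
import java.util.function.Supplier;

// Hypothetical latency monitor for a self-healing setup: measure each
// call, track a running mean, and flag degradation for reconfiguration.
public class LatencyMonitor {
    private long calls = 0;
    private double meanMillis = 0.0;
    private final double degradedThresholdMillis;

    LatencyMonitor(double thresholdMillis) {
        this.degradedThresholdMillis = thresholdMillis;
    }

    <T> T invoke(Supplier<T> serviceCall) {
        long start = System.nanoTime();
        T result = serviceCall.get();
        double elapsed = (System.nanoTime() - start) / 1_000_000.0;
        calls++;
        meanMillis += (elapsed - meanMillis) / calls;  // incremental mean
        if (meanMillis > degradedThresholdMillis) {
            // In a real SH architecture this would trigger a
            // reconfiguration strategy (substitute, retry, reroute...).
            System.out.println("degraded: mean=" + meanMillis + " ms");
        }
        return result;
    }

    public static void main(String[] args) {
        LatencyMonitor monitor = new LatencyMonitor(50.0);
        String reply = monitor.invoke(() -> {
            try { Thread.sleep(80); } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
            return "response";                          // stand-in WS call
        });
        System.out.println(reply);
    }
}
```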
Concurrent workload mapping for multicore security systems
CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 10 2009. Benfano Soewito
Multicore-based network processors are promising components for building real-time, scalable security systems to protect networks and systems. The parallel nature of the processing system makes it challenging for application developers to program security systems concurrently for high performance. In this paper we present an automatic programming methodology that considers application complexity, traffic variation, and attack-signature updates. In particular, our mapping algorithm exploits parallelism at the level of tasks, applications, and packets concurrently to achieve optimal performance. We present results that show the effectiveness of the analysis and mapping, and the performance of the methodology. Copyright © 2009 John Wiley & Sons, Ltd. [source]
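Packet-level parallelism, the lowest of the three levels the abstract mentions, can be sketched in a few lines with a thread pool: each arriving packet becomes a task that scans the payload against a signature set, so independent packets are inspected on different cores. The signatures and packets here are made up, and a real system would use far faster matching (e.g. an Aho-Corasick automaton) than String.contains.

```java
import java.util.List;
import java.util.concurrent.*;

// Made-up example of packet-level parallelism for signature matching:
// each packet is inspected as an independent task on a core pool.
public class PacketScanner {
    static final List<String> SIGNATURES = List.of("evil-payload", "overflow");

    public static void main(String[] args) throws InterruptedException {
        ExecutorService pool = Executors.newFixedThreadPool(
            Runtime.getRuntime().availableProcessors());

        List<String> packets = List.of(
            "GET /index.html", "POST evil-payload data", "ping", "overflow!!");

        for (String packet : packets) {
            pool.submit(() -> {
                // Naive scan; a production system would use an automaton.
                for (String sig : SIGNATURES) {
                    if (packet.contains(sig)) {
                        System.out.println("ALERT [" + sig + "]: " + packet);
                    }
                }
            });
        }
        pool.shutdown();
        pool.awaitTermination(10, TimeUnit.SECONDS);
    }
}
```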
Using Web 2.0 for scientific applications and scientific communities
CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 5 2009. Marlon E. Pierce
Web 2.0 approaches are revolutionizing the Internet, blurring lines between developers and users and enabling collaboration and social networks that scale into the millions of users. As discussed in our previous work, the core technologies of Web 2.0 effectively define a comprehensive distributed computing environment that parallels many of the more complicated service-oriented systems such as Web service and Grid service architectures. In this paper we build upon this previous work to discuss the application of Web 2.0 approaches to four different scenarios: client-side JavaScript libraries for building and composing Grid services; integrating server-side portlets with "rich client" AJAX tools and Web services for analyzing Global Positioning System data; building and analyzing folksonomies of scientific user communities through social bookmarking; and applying microformats and GeoRSS to problems in scientific metadata description and delivery. Copyright © 2009 John Wiley & Sons, Ltd. [source]

MyCoG.NET: a multi-language CoG toolkit
CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 14 2007. A. Paventhan
Grid application developers utilize Commodity Grid (CoG) toolkits to access Globus Grid services. Existing CoG toolkits are language-specific and have, for example, been developed for Java, Python, and the Matlab scripting environment. In this paper we describe MyCoG.NET, a CoG toolkit supporting multi-language programmability under the Microsoft .NET framework. MyCoG.NET provides a set of classes and APIs to access Globus Grid services from languages supported by the .NET Common Language Runtime. We demonstrate its programmability using FORTRAN, C++, C#, and Java, and discuss its performance over LAN and WAN infrastructures. We present a Grid application in the field of experimental aerodynamics as a case study to show how MyCoG.NET can be exploited. We demonstrate how scientists and engineers can create and use domain-specific workflow activity sets for rapid application development using Windows Workflow Foundation, and show how users can easily extend and customize these activities. Copyright © 2006 John Wiley & Sons, Ltd. [source]

Plug-and-play remote portlet publishing
CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 12 2007. X. D. Wang
Web Services for Remote Portlets (WSRP) is gaining attention among portal developers and vendors as a way to enable easy development, increased richness in functionality, pluggability, and flexibility of deployment. Whilst currently not supporting all WSRP functionalities, open-source portal frameworks could in future use WSRP Consumers to access remote portlets found via a WSRP Producer registry service. This implies the need for a central registry of remote portlets and a more expressive WSRP Consumer interface to implement the remote portlet functions. This paper reports on an investigation into a new system architecture, which includes a Web Services repository, a registry, and a client interface. The Web Services repository holds portlets as remote resource producers. A new data structure for describing remote portlets is defined and published by populating a Universal Description, Discovery and Integration (UDDI) registry. A remote portlet publish-and-search engine for UDDI has also been developed. Finally, a remote portlet client interface was developed as a Web application. The client interface supports remote portlet features, as well as window status and mode functions. Copyright © 2007 John Wiley & Sons, Ltd. [source]

Towards a framework and a benchmark for testing tools for multi-threaded programs
CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 3 2007. Yaniv Eytani
Multi-threaded code is becoming very common, both on the server side and, very recently, on personal computers as well. Consequently, looking for intermittent bugs is a problem that is receiving more and more attention. As there is no silver bullet, research focuses on a variety of partial solutions. We outline a road map for combining the research within the different disciplines of testing multi-threaded programs and for evaluating the quality of this research. We have three main goals. First, to create a benchmark that can be used to evaluate different solutions. Second, to create a framework with open application programming interfaces that enables the combination of techniques in the multi-threading domain. Third, to create a focus for the research in this area around which a community of people who try to solve similar problems with different techniques can congregate. We have started creating such a benchmark and describe the lessons learned in the process. The framework will enable technology developers, for example developers of race detection algorithms, to concentrate on their components and use other ready-made components (e.g. an instrumentor) to create a testing solution. Copyright © 2006 John Wiley & Sons, Ltd. [source]
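The intermittent bugs such a benchmark collects are typically tiny Java programs with a deliberately seeded concurrency defect. The sketch below, invented here in that spirit, shows the classic lost-update race: two threads increment an unsynchronized counter, so the printed total is often less than 200,000 and varies from run to run, which is exactly the nondeterminism that testing tools must expose.

```java
// Invented example in the spirit of a concurrency-bug benchmark entry:
// a seeded lost-update race. counter++ is read-modify-write, so without
// synchronization two threads can overwrite each other's updates.
public class LostUpdate {
    static int counter = 0;   // shared, intentionally unsynchronized

    public static void main(String[] args) throws InterruptedException {
        Runnable work = () -> {
            for (int i = 0; i < 100_000; i++) {
                counter++;    // BUG: not atomic; interleavings lose updates
            }
        };
        Thread t1 = new Thread(work);
        Thread t2 = new Thread(work);
        t1.start(); t2.start();
        t1.join(); t2.join();
        // Expected 200000; an interleaving-dependent smaller value
        // appears on many runs, making the failure intermittent.
        System.out.println("counter = " + counter);
    }
}
```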
Advanced eager scheduling for Java-based adaptive parallel computing
CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 7-8 2005. Michael O. Neary
Javelin 3 is a software system for developing large-scale, fault-tolerant, adaptively parallel applications. When all or part of their application can be cast as a master-worker or branch-and-bound computation, Javelin 3 frees application developers from concerns about inter-processor communication and fault tolerance among networked hosts, allowing them to focus on the underlying application. The paper describes a fault-tolerant task scheduler and its performance analysis. The task scheduler integrates work stealing with an advanced form of eager scheduling. It enables dynamic task decomposition, which improves host load balancing in the presence of tasks whose non-uniform computational load is evident only at execution time. Speedup measurements of actual performance on up to 1,000 hosts are presented. We analyze the expected performance degradation due to unresponsive hosts, and measure the actual degradation. Copyright © 2005 John Wiley & Sons, Ltd. [source]
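The two scheduling ideas the abstract combines can be shown side by side in a compact, assumption-laden simulation: each worker owns a deque and steals from a victim when idle (work stealing), and an idle worker may also re-execute a task that was handed out but never finished, with the first completed result winning (eager scheduling, which is what masks unresponsive hosts). This toy single-JVM version is a sketch of the idea, not Javelin 3's distributed implementation.

```java
import java.util.*;
import java.util.concurrent.*;
import java.util.concurrent.atomic.AtomicReferenceArray;

// Toy blend of work stealing and eager scheduling (not Javelin 3's code):
// idle workers first steal, then eagerly re-execute tasks that were taken
// but never finished; the first recorded result for a task wins.
public class EagerScheduler {
    static final int TASKS = 8, WORKERS = 3;
    static final AtomicReferenceArray<Integer> results =
            new AtomicReferenceArray<>(TASKS);
    static final List<Deque<Integer>> deques = new ArrayList<>();

    static int compute(int task) { return task * task; }  // stand-in work

    static boolean allDone() {
        for (int t = 0; t < TASKS; t++) if (results.get(t) == null) return false;
        return true;
    }

    static Integer anyUnfinished() {
        for (int t = 0; t < TASKS; t++) if (results.get(t) == null) return t;
        return null;
    }

    public static void main(String[] args) throws InterruptedException {
        for (int w = 0; w < WORKERS; w++) deques.add(new ConcurrentLinkedDeque<>());
        for (int t = 0; t < TASKS; t++) deques.get(t % WORKERS).addLast(t);

        ExecutorService pool = Executors.newFixedThreadPool(WORKERS);
        for (int w = 0; w < WORKERS; w++) {
            final int me = w;
            pool.submit(() -> {
                while (!allDone()) {
                    Integer task = deques.get(me).pollLast();          // own deque, LIFO
                    for (int v = 0; task == null && v < WORKERS; v++)
                        if (v != me) task = deques.get(v).pollFirst(); // steal, FIFO
                    if (task == null) task = anyUnfinished();          // eager re-execution
                    if (task != null)
                        results.compareAndSet(task, null, compute(task)); // first wins
                }
            });
        }
        pool.shutdown();
        pool.awaitTermination(10, TimeUnit.SECONDS);
        for (int t = 0; t < TASKS; t++)
            System.out.println("task " + t + " -> " + results.get(t));
    }
}
```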
Features of the Java Commodity Grid Kit
CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 13-15 2002. Gregor von Laszewski
In this paper we report on the features of the Java Commodity Grid Kit (Java CoG Kit). The Java CoG Kit provides middleware for accessing Grid functionality from the Java framework. Java CoG Kit middleware is general enough to design a variety of advanced Grid applications with quite different user requirements. Access to the Grid is established via Globus Toolkit protocols, allowing the Java CoG Kit to also communicate with the services distributed as part of the C Globus Toolkit reference implementation. Thus, the Java CoG Kit provides Grid developers with the ability to utilize the Grid, as well as numerous additional libraries and frameworks developed by the Java community to enable network, Internet, enterprise, and peer-to-peer computing. A variety of projects have successfully used the client libraries of the Java CoG Kit to access Grids driven by the C Globus Toolkit software. In this paper we also report on efforts to develop server-side Java CoG Kit components. As part of this research we have implemented a prototype pure-Java resource management system that enables one to run Grid jobs on platforms on which a Java virtual machine is supported, including Windows NT machines. Copyright © 2002 John Wiley & Sons, Ltd. [source]

A flexible framework for consistency management
CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 1 2002. S. Weber
Recent distributed shared memory (DSM) systems provide increasingly more support for the sharing of objects rather than portions of memory. However, like earlier DSM systems, these distributed shared object (DSO) systems still force developers to use a single protocol, or a small set of given protocols, for the sharing of application objects. This limitation prevents applications from optimizing their communication behaviour and results in unnecessary overhead. A current general trend in software systems development is towards customizable systems; for example, frameworks, reflection, and aspect-oriented programming all aim to give the developer greater flexibility and control over the functionality and performance of their code. This paper describes a novel object-oriented framework that defines a DSM system in terms of a consistency model and an underlying coherency protocol. Different consistency models and coherency protocols can be used within a single application because they can be customized, by the application programmer, on a per-object basis. This allows application-specific semantics to be exploited at a very fine level of granularity, with a resulting improvement in performance. The framework is implemented in Java, and the speed-up obtained by a number of applications that use the framework is reported. Copyright © 2002 John Wiley & Sons, Ltd. [source]
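The per-object customization the abstract describes boils down to a familiar object-oriented move: make the protocol a pluggable strategy. Below is a skeletal, invented sketch of that shape in Java: each shared object is constructed with its own coherency protocol, so one application can mix, say, write-invalidate for rarely written objects with write-update for hot ones. The framework's real interfaces are certainly richer than this.

```java
// Skeletal sketch (invented API) of per-object protocol customization:
// the coherency protocol is a strategy chosen per shared object.
public class DsoSketch {
    interface CoherencyProtocol {
        void onLocalWrite(String objectId);
    }

    // Invalidate remote replicas; readers re-fetch on next access.
    static class WriteInvalidate implements CoherencyProtocol {
        public void onLocalWrite(String id) {
            System.out.println("broadcast INVALIDATE(" + id + ")");
        }
    }

    // Push the new value to all replicas eagerly.
    static class WriteUpdate implements CoherencyProtocol {
        public void onLocalWrite(String id) {
            System.out.println("broadcast UPDATE(" + id + ", newValue)");
        }
    }

    static class SharedObject {
        private final String id;
        private final CoherencyProtocol protocol;  // per-object choice
        private Object value;

        SharedObject(String id, CoherencyProtocol protocol) {
            this.id = id; this.protocol = protocol;
        }
        void write(Object v) {
            value = v;
            protocol.onLocalWrite(id);   // the protocol decides the traffic
        }
    }

    public static void main(String[] args) {
        // Rarely written config: invalidation keeps traffic low.
        SharedObject config = new SharedObject("config", new WriteInvalidate());
        // Frequently read score: eager updates keep readers fast.
        SharedObject score = new SharedObject("score", new WriteUpdate());
        config.write("v2");
        score.write(42);
    }
}
```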
Managing conflict during an organizational acquisition
CONFLICT RESOLUTION QUARTERLY, Issue 3 2006. Cynthia F. Cohen
Conflict frequently arises during an organizational acquisition, and how a company manages that conflict has an impact on the success of the acquisition. Software developers, testers, and managers of a recently acquired organization reported profound changes in organizational culture and numerous potential sources of conflict. Conflict was generally well managed by effectively handling economic rewards, the balancing of power, cultural changes, and emotional reactions to the acquisition. [source]

Connecting EIA to environmental management systems: lessons from industrial estate developments in England
CORPORATE SOCIAL RESPONSIBILITY AND ENVIRONMENTAL MANAGEMENT, Issue 2 2007. Paul Slinn
This paper concerns the relationship between environmental assessment and environmental management systems in the context of recent industrial estate developments. Drawing on environmental statements and interviews with developers, an examination was carried out of the level of good practice in estate design and operation, and the way in which this was influenced by environmental impact assessment and environmental management systems. The study concludes that the environmental impact assessment system worked well within the context of land-use planning, but that it failed to facilitate the planning of effective environmental management in practice, with the consequence that the projects examined failed to meet many of the good-practice criteria against which they were tested. Finally, several recommendations are made to strengthen continuity between the two. Copyright © 2006 John Wiley & Sons, Ltd and ERP Environment. [source]

Environmental management in large-scale building projects: learning from Hammarby Sjöstad
CORPORATE SOCIAL RESPONSIBILITY AND ENVIRONMENTAL MANAGEMENT, Issue 4 2002. Rolf Johansson
In an old industrial and harbour area of Stockholm, a new city for 30,000 people will be built over the next ten years. The Hammarby Sjöstad project is unique in its size and municipal organization as well as in its ambitious environmental objectives. In a case study based on interviews and document analysis, the environmental management process of this project is researched. The City of Stockholm will follow up compliance with the set goals; our study is a qualitative one focusing on the management process. We develop concepts and models as an aid for municipal management of future construction projects. Many factors outside the formal ones are considered important. Data are structured chronologically as events and from a stakeholder perspective, including the City, the developers, and the contractors. The main focus is, however, on the City's Project Management Team. Collected data are furthermore analysed with the aid of key concepts derived from organization theory, from planning and construction practice, and as suggested by the data. Preliminary results indicate that the continued study of informal means of control is just as important as that of the formal ones, and that identifying key situations and tools for environmental management should be the focus for the rest of the research study. Copyright © 2002 John Wiley & Sons, Ltd. and ERP Environment [source]

Examining the Potential of Indigenous Institutions for Development: A Perspective from Borana, Ethiopia
DEVELOPMENT AND CHANGE, Issue 2 2003. Elizabeth E. Watson
This article examines an institutional approach to development in which indigenous institutions are viewed as a resource for achieving development. It concentrates on indigenous natural resource management (NRM) institutions, which some development agencies have seen as a means to address the needs of people and the environment in a way that is also participatory. Using material from Borana, Ethiopia, the article describes the indigenous NRM institutions and examines the outcome of one attempt to work with them. In the process, it shows that partnerships between development agencies and indigenous NRM institutions are often fragile, and tend to dissolve when they fail to meet the preconceptions of the developers. Through an examination of this approach to development, the article also examines the usefulness of recent broad approaches to institutions. [source]

An "Omics" view of drug development
DRUG DEVELOPMENT RESEARCH, Issue 2 2004. Russ B. Altman
The pharmaceutical industry cannot be blamed for having a love/hate relationship with the fields of pharmacogenetics and pharmacogenomics. At the same time that pharmacogenetics and pharmacogenomics promise to save pipeline drugs by identifying subsets of the population for which they work best, they also threaten to increase the complexity of new drug applications, fragment markets, and create uncertainty for prescribers who simply do not understand, or have no time to master, "personalized medicine." Most importantly, the logical case for genetics-specific drug selection and dosing is much more mature than the practical list of drugs for which outcomes are demonstrably improved. Understandably, pharmaceutical developers and regulators have been careful in creating strategies for using genetics in drug development, and only recently has the FDA begun to establish preliminary rules for pharmacogenetic testing. A growing public academic effort in pharmacogenetics and pharmacogenomics is helping flesh out the basic science underpinnings of the field, and this should combine with the extensive efforts of industry to create a solid foundation for the future use of genetics in drug development. Two grand challenges that would accelerate our capabilities are the characterization of all human genes involved in the basic pharmacokinetics of drugs, and the detailed study of the genes and pathways associated with G-protein-coupled receptors and how they are affected by genetic variation. Drug Dev. Res. 62:81-85, 2004. © 2004 Wiley-Liss, Inc. [source]

A Framework for Evaluating and Planning Assessments Intended to Improve Student Achievement
EDUCATIONAL MEASUREMENT: ISSUES AND PRACTICE, Issue 3 2009. Paul D. Nichols
Assessments labeled as formative have been offered as a means to improve student achievement. But labels can be a powerful way to miscommunicate. For an assessment use to be appropriately labeled "formative," both empirical evidence and reasoned arguments must be offered to support the claim that improvements in student achievement can be linked to the use of assessment information. Our goal in this article is to support the construction of such an argument by offering a framework within which to consider evidence-based claims that assessment information can be used to improve student achievement. We describe this framework and then illustrate its use with an example of one-on-one tutoring. Finally, we explore the framework's implications for understanding when the use of assessment information is likely to improve student achievement, and for advising test developers on how to develop assessments that offer information usable for improving student achievement. [source]
Validity Issues in Computer-Based Testing
EDUCATIONAL MEASUREMENT: ISSUES AND PRACTICE, Issue 3 2001. Kristen L. Huff
Advances in technology are stimulating the development of complex, computerized assessments. The prevailing rationales for developing computer-based assessments are improved measurement and increased efficiency. In the midst of this measurement revolution, test developers and evaluators must revisit the notion of validity. In this article, we discuss the potential positive and negative effects computer-based testing could have on validity, review the literature regarding validation perspectives in computer-based testing, and provide suggestions on how to evaluate the contributions of computer-based testing to more valid measurement practices. We conclude that computer-based testing shows great promise for enhancing validity, but at this juncture it remains equivocal whether technological innovations in assessment have led to more valid measurement. [source]

Model policies for land use and the environment: towards a critical typology?
ENVIRONMENTAL POLICY AND GOVERNANCE, Issue 6 2006. D. Peel
This article considers contemporary debates in Scotland concerned with the design and implementation of land-use development plan "policies that work". The interest in developing a resource bank of model policy texts is illustrative of the wider agenda to modernize the public sector and secure efficiency gains in public policy making. On the one hand, this is presented as strengthening policy makers' ability to achieve stated policy outcomes and to enforce particular policy objectives in the public interest. On the other hand, it is argued that a more uniform and consistent policy context across Scotland would offer a more certain operating environment for developers and users of the planning service. The discussion considers the diversity of land-use planning topics identified as potentially appropriate for formulation as a model policy, and proposes a typology for critically interrogating their suitability in practice. Copyright © 2006 John Wiley & Sons, Ltd and ERP Environment. [source]

Implementing life cycle assessment in product development
ENVIRONMENTAL PROGRESS & SUSTAINABLE ENERGY, Issue 4 2003. Gurbakhash Singh Bhander
The overall aim of this paper is to provide an understanding of the environmental issues involved in the early stages of product development, and of the capacity of Life Cycle Assessment (LCA) techniques to address these issues. The paper outlines the problems the designer faces in evaluating the environmental benignity of a product from the outset, and provides a framework for decision support based on performance evaluation at different stages of the design process. The barriers that prevent product developers from using LCA are presented, as well as opportunities for introducing environmental criteria into the design process by meeting the designer's information requirements at the different life cycle stages. This can lead to an in-depth understanding of the attitudes of product developers towards the subject area, and of possible future directions for product development. The paper introduces an environmentally conscious design method, presents trade-offs between design degrees of freedom and environmental solutions, and addresses life cycle design frameworks and strategies. It collects experiences and ideas around the state of the art in eco-design, from the literature and from personal experience, and provides eco-design life cycle assessment strategies. The end result is to define the requirements for performance measurement techniques, and the environment needed to support life cycle evaluation throughout the early stages of a product system. [source]

Novel regulation of yolk utilization by thyroid hormone in embryos of the direct developing frog Eleutherodactylus coqui
EVOLUTION AND DEVELOPMENT, Issue 5 2010. Srikanth Singamsetty
Thyroid hormone (TH) is required for metamorphosis of the long, coiled tadpole gut into the short frog gut. Eleutherodactylus coqui, a direct developing frog, lacks a tadpole. Its embryonic gut is a miniature adult form with a mass of yolky cells, called nutritional endoderm, attached to the small intestine. We tested the TH requirement for gut development in E. coqui. Inhibition of TH synthesis with methimazole arrested gut development in its embryonic form. Embryos treated with methimazole failed to utilize the yolk in their nutritional endoderm, and survived for weeks without further development. Conversely, methimazole and 3,3′,5-tri-iodo-L-thyronine, the active form of TH, stimulated gut development and the utilization and disappearance of the nutritional endoderm. In Xenopus laevis, TRβ, a receptor for TH, is upregulated in response to TH. Similarly, EcTRβ, the E. coqui ortholog, was upregulated by TH in the gut. EcTRβ expression was high in the nutritional endoderm, suggesting a direct role for TH in yolk utilization by these cells. An initial step in the breakdown of yolk in X. laevis is acidification of the yolk platelet. E. coqui embryos in methimazole failed to acidify their yolk platelets, but acidification was stimulated by TH, indicating its role in an early step of yolk utilization. In addition to a conserved TH role in gut development, a novel regulatory role for TH in yolk utilization has evolved in these direct developers. [source]