Similar Problems (similar + problem)
Selected Abstracts

SIMILAR PROBLEMS, DIFFERENT SOLUTIONS: COMPARING REFUSE COLLECTION IN THE NETHERLANDS AND SPAIN
PUBLIC ADMINISTRATION, Issue 2 2010
GERMÀ BEL
Because of differences in institutional arrangements, public service markets, and national traditions regarding government intervention, local public service provision can vary greatly. In this paper we compare the procedures adopted by the local governments of The Netherlands and Spain in arranging for the provision of solid waste collection. We find that Spain faces a problem of consolidation, opting more frequently to implement policies of privatization and cooperation, at the expense of competition. By contrast, The Netherlands, which has larger municipalities on average, resorts somewhat less to privatization and cooperation, and more to competition. Both options (cooperation and competition) have their merits when striving to strike a balance between transaction costs and scale economies. The choices made in organizational reform seem to be related to several factors, among which the nature of the political system and the size of municipalities appear to be relevant. [source]

A method to predict triaxial residual stresses in plastic pipes
POLYMER ENGINEERING & SCIENCE, Issue 10 2004
Z. W. Guan
Significant hoop and longitudinal stresses are present in medium-density polyethylene (MDPE) pipe, arising from differential cooling from the inner and the outer surfaces of a pipe during production. Owing to the difficulty of directly measuring deformations, these stresses have hitherto been almost exclusively estimated indirectly from deflection measurements on large samples cut from the pipe wall. Furthermore, because of procedural problems, only uniaxial hoop or longitudinal stresses are normally attempted, and these are known to be specimen size-dependent. Similar problems are experienced with other polymeric pipes.
In this paper, based on direct biaxial strain measurements on small samples cut from the pipe wall, a method to predict triaxial residual stress distributions through the pipe wall is presented. Thermal effects that generate residual stresses in plastic pipe were considered in the theory. The analytical solutions satisfy the self-equilibrating conditions for both the hoop and the longitudinal stresses. Also, the radial stress is shown to be insignificant through the wall thickness of a mildly thick pipe. Polym. Eng. Sci. 44:1828-1838, 2004. © 2004 Society of Plastics Engineers. [source]

Impact fracture toughness of polyethylene/polypropylene multilayers
POLYMER ENGINEERING & SCIENCE, Issue 9 2004
Luisa Moreno
In a number of applications, a brittle polymeric surface layer is deliberately molded onto a tough substrate for decorative or protective purposes. This can increase the susceptibility of the tough polymer to premature failure. Similar problems arise when a surface layer becomes embrittled by environmental effects. Choosing a surface material that has good mechanical properties without having this effect can be difficult. In this work the fracture resistances of two polyethylenes and an ethylene/propylene copolymer, and of symmetrical two-component multilayers of these polymers, were determined as a function of temperature, using instrumented impact tests. The law of mixtures accounts adequately for the fracture resistance of multilayer structures where there is no mechanical interaction between skin and core. However, it gave misleading results for a structure in which high skin modulus at low temperatures appeared to influence the fracture resistance of the core through a constraint effect. Polym. Eng. Sci. 44:1627-1635, 2004. © 2004 Society of Plastics Engineers.
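The law of mixtures invoked in the multilayer abstract above is simply a volume-fraction-weighted average of the component properties. A minimal sketch, with invented layer fractions and fracture-resistance values (not data from the paper):

```python
# Rule-of-mixtures estimate of a property of a layered composite.
# All numeric values below are invented for illustration.

def rule_of_mixtures(fractions, properties):
    """Volume-fraction-weighted average of a per-component property."""
    if abs(sum(fractions) - 1.0) > 1e-9:
        raise ValueError("volume fractions must sum to 1")
    return sum(f * p for f, p in zip(fractions, properties))

# Symmetric two-component multilayer: 60% tough core, 40% brittle skin,
# with hypothetical fracture resistances in kJ/m^2.
estimate = rule_of_mixtures([0.6, 0.4], [20.0, 5.0])
print(estimate)  # 14.0
```

The abstract's caveat corresponds to the case where skin-core mechanical interaction (the constraint effect) makes the measured resistance deviate from this simple weighted average.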
[source]

Xeroderma pigmentosum with limited involvement of the UV-exposed areas: a case report
INTERNATIONAL JOURNAL OF DERMATOLOGY, Issue 4 2003
Mostafa Mirshams-Shahshahani MD
A 21-year-old woman with skin type IV, who had developed photophobia and brown, spotty, hyperpigmented lesions on her face from early childhood, presented to our center for treatment of her facial lesions. Examination on admission revealed numerous, freckle-like, hyperpigmented macules and actinic keratoses over the central part of the face, with sparing of the forehead, chin, and peripheral area (Fig. 1). The area involved was approximated to be around 2% of the total body surface. The dorsal parts of the hands showed no lesions (Fig. 2), but guttate hypomelanotic lesions were apparent on both forearms.
Figure 1. Limitation of xeroderma pigmentosum lesions to the center of the face.
Figure 2. Hands are devoid of any lesions.
Histologic examination of biopsies from four different facial lesions revealed them to be keratoacanthoma (1.5 × 2.5 cm ulcerative nodule on the right cheek), sclerosing basal cell epithelioma (nasal lesion), lentigo simplex, and hypertrophic actinic keratosis. Corneal clouding, conjunctival injection, loss of lashes, and atrophy of the lids were apparent on ophthalmologic examination. Other parts of the physical examination, including examination of the oral cavity, were nonsignificant. In addition, except for the presence of mild eczema in a sibling, the patient's family history regarding the presence of any similar problem and also any other important dermatologic or general disorder was negative. [source]

Inference of object-oriented design patterns
JOURNAL OF SOFTWARE MAINTENANCE AND EVOLUTION: RESEARCH AND PRACTICE, Issue 5 2001
Paolo Tonella
When designing a new application, experienced software engineers usually adopt solutions that have proven successful in previous projects. Such reuse of code organizations is seldom made explicit.
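The inference approach this abstract goes on to describe rests on concept analysis: finding maximal groups of objects (here, classes) that share attributes. A minimal sketch of that step with invented class and attribute names; the paper's own algorithm and its encoding of inter-class relations are not reproduced here:

```python
# Formal concept analysis in miniature: a concept is a pair (extent, intent)
# where the extent is the set of objects having every attribute of the
# intent, and the intent is maximal for that extent.
from itertools import combinations

def concepts(context):
    """All concepts of a context given as dict: object -> set of attributes."""
    objs = list(context)
    # Every intent is an intersection of some objects' attribute sets.
    intents = {frozenset.intersection(*(frozenset(context[o]) for o in combo))
               for n in range(1, len(objs) + 1)
               for combo in combinations(objs, n)}
    return {(frozenset(o for o in objs if intent <= context[o]), intent)
            for intent in intents}

# Invented mini-context: classes and structural attributes.
context = {
    "Shape":  {"abstract", "declares_draw"},
    "Circle": {"concrete", "declares_draw"},
    "Square": {"concrete", "declares_draw"},
}
# One resulting concept groups Circle and Square by their shared attributes,
# the kind of repeated class group a pattern-inference tool would report.
```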
Nevertheless, it represents important information, which can be extremely valuable in the maintenance phase by documenting the design choices underlying the implementation. In addition it can be reused whenever a similar problem is encountered. In this paper an approach for the inference of recurrent design patterns directly from the code is proposed. No assumption is made on the availability of any pattern library, and the concept analysis algorithm, adapted for this purpose, is able to infer the presence of class groups which instantiate a common, repeated pattern. In fact, concept analysis provides sets of objects sharing attributes, which, in the case of object-oriented design patterns, become class members or inter-class relations. The approach was applied to three C++ applications for which the structural relations among classes led to the extraction of a set of design patterns, which could be enriched with non-structural information about class members and method invocations. The resulting patterns could be interpreted as meaningful organizations aimed at solving general problems which have several instances in the applications analyzed. Copyright © 2001 John Wiley & Sons, Ltd. [source]

Repair of cyclobutyl pyrimidine dimers in human skin: variability among normal humans in nucleotide excision and in photorepair
PHOTODERMATOLOGY, PHOTOIMMUNOLOGY & PHOTOMEDICINE, Issue 3 2002
Betsy M. Sutherland
Background/Aims: Photoreactivation (PR) of cyclobutyl pyrimidine dimers (CPD) in human skin remains controversial. Recently Whitmore et al. (1) reported negative results of experiments using two photorepair light (PRL) sources on UV-irradiated skin of volunteers. However, their PRL sources induced substantial levels of dimers in skin, suggesting that the additional dimers formed could have obscured PR. We met a similar problem of dimer induction by a PRL source.
We designed and validated a PRL source of sufficient intensity to catalyse PR, but that did not induce CPD, and used it to measure photorepair in human skin. Methods and Results: Using a solar simulator filtered with three types of UV-filters, we found significant dimer formation in skin, quantified by number average length analysis using electrophoretic gels of isolated skin DNA. To prevent scattered UV from reaching the skin, we interposed shields between the filters and skin, and showed that the UV-filtered/shielded solar simulator system did not induce damage in isolated DNA or in human skin. We exposed skin of seven healthy human volunteers to 302 nm radiation, then to the improved PRL source (control skin areas were kept in the dark for measurement of excision repair). Conclusions: Using a high intensity PRL source that did not induce dimers in skin, we found that three of seven subjects carried out rapid photorepair of dimers; two carried out moderate or slow dimer photorepair, and three did not show detectable photorepair. Excision repair was similarly variable in these volunteers. Subjects with slower excision repair showed rapid photorepair, whereas those with rapid excision generally showed little or no photoreactivation. [source]

Competence Models and the Maintenance Problem
COMPUTATIONAL INTELLIGENCE, Issue 2 2001
Barry Smyth
Case-based reasoning (CBR) systems solve problems by retrieving and adapting the solutions to similar problems that have been stored previously as a case base of individual problem solving episodes or cases. The maintenance problem refers to the problem of how to optimize the performance of a CBR system during its operational lifetime. It can have a significant impact on all the knowledge sources associated with a system (the case base, the similarity knowledge, the adaptation knowledge, etc.), and over time, any one, or more, of these knowledge sources may need to be adapted to better fit the current problem-solving environment.
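The retrieve-and-reuse cycle at the heart of CBR, as described above, can be reduced to a few lines; the similarity measure and case features here are invented placeholders, not the competence models the article studies:

```python
# A toy case-based reasoning step: retrieve the stored case whose problem
# description best matches a new query, then reuse its solution.

def retrieve(case_base, query):
    """Return the (problem, solution) pair most similar to the query."""
    def similarity(problem):
        # fraction of query features with matching values (a crude measure)
        return sum(problem.get(k) == v for k, v in query.items()) / len(query)
    return max(case_base, key=lambda case: similarity(case[0]))

case_base = [
    ({"symptom": "no_power", "beeps": 0}, "check power supply"),
    ({"symptom": "no_video", "beeps": 3}, "reseat memory"),
]
problem, solution = retrieve(case_base, {"symptom": "no_video", "beeps": 3})
print(solution)  # reseat memory
```

Maintenance then amounts to adding, deleting, or editing entries of `case_base` over time, which is exactly where the question of preserving system competence arises.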
For example, many maintenance solutions focus on the maintenance of case knowledge by adding, deleting, or editing cases. This has led to a renewed interest in the issue of case competence, since many maintenance solutions must ensure that system competence is not adversely affected by the maintenance process. In fact, we argue that ultimately any generic maintenance solution must explicitly incorporate competence factors into its maintenance policies. For this reason, in our work we have focused on developing explanatory and predictive models of case competence that can provide a sound foundation for future maintenance solutions. In this article we provide a comprehensive survey of this research, and we show how these models have been used to develop a number of innovative and successful maintenance solutions to a variety of different maintenance problems. [source]

A fast triangle to triangle intersection test for collision detection
COMPUTER ANIMATION AND VIRTUAL WORLDS (PREV: JNL OF VISUALISATION & COMPUTER ANIMATION), Issue 5 2006
Oren Tropp
The triangle-to-triangle intersection test is a basic component of all collision detection data structures and algorithms. This paper presents a fast method for testing whether two triangles embedded in three dimensions intersect. Our technique solves the basic sets of linear equations associated with the problem and exploits the strong relations between these sets to speed up their solution. Moreover, unlike previous techniques, with very little additional cost, the exact intersection coordinates can be determined. Finally, our technique uses general principles that can be applied to similar problems such as rectangle-to-rectangle intersection tests, and generally to problems where several equation sets are strongly related. We show that our algorithm saves about 20% of the mathematical operations used by the best previous triangle-to-triangle intersection algorithm.
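The paper's shared-equation-set formulation is not reproduced here, but a baseline test of the kind it improves on can be sketched: two non-coplanar triangles intersect exactly when some edge of one crosses the interior of the other, so six segment-versus-triangle checks (in the style of Möller-Trumbore) suffice. Coplanar triangle pairs are not handled by this simplified sketch.

```python
# Triangle-triangle intersection via six segment-vs-triangle checks.

def sub(a, b): return (a[0] - b[0], a[1] - b[1], a[2] - b[2])
def dot(a, b): return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]
def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def segment_hits_triangle(p, q, tri, eps=1e-9):
    """Does segment p->q pass through triangle tri = (a, b, c)?"""
    a, b, c = tri
    d = sub(q, p)
    e1, e2 = sub(b, a), sub(c, a)
    h = cross(d, e2)
    det = dot(e1, h)
    if abs(det) < eps:            # segment parallel to the triangle's plane
        return False
    f = 1.0 / det
    s = sub(p, a)
    u = f * dot(s, h)             # first barycentric coordinate
    if u < 0.0 or u > 1.0:
        return False
    qv = cross(s, e1)
    v = f * dot(d, qv)            # second barycentric coordinate
    if v < 0.0 or u + v > 1.0:
        return False
    t = f * dot(e2, qv)           # position along the segment
    return 0.0 <= t <= 1.0

def tri_tri_intersect(t1, t2):
    edges = [(t1[i], t1[(i + 1) % 3], t2) for i in range(3)]
    edges += [(t2[i], t2[(i + 1) % 3], t1) for i in range(3)]
    return any(segment_hits_triangle(p, q, tri) for p, q, tri in edges)
```

Each edge check solves one small linear system (the u, v, t computation), and the six systems share coefficients; exploiting that redundancy is the kind of saving the abstract describes.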
Our experiments also show that it runs 18.9% faster than the fastest previous algorithm on average for typical scenarios of collision detection (on Pentium 4). Copyright © 2006 John Wiley & Sons, Ltd. [source]

Towards a framework and a benchmark for testing tools for multi-threaded programs
CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 3 2007
Yaniv Eytani
Multi-threaded code is becoming very common, both on the server side, and very recently for personal computers as well. Consequently, looking for intermittent bugs is a problem that is receiving more and more attention. As there is no silver bullet, research focuses on a variety of partial solutions. We outline a road map for combining the research within the different disciplines of testing multi-threaded programs and for evaluating the quality of this research. We have three main goals. First, to create a benchmark that can be used to evaluate different solutions. Second, to create a framework with open application programming interfaces that enables the combination of techniques in the multi-threading domain. Third, to create a focus for the research in this area around which a community of people who try to solve similar problems with different techniques can congregate. We have started creating such a benchmark and describe the lessons learned in the process. The framework will enable technology developers, for example, developers of race detection algorithms, to concentrate on their components and use other ready made components (e.g. an instrumentor) to create a testing solution. Copyright © 2006 John Wiley & Sons, Ltd. [source]

A Consideration of Museum Education Collections: Theory and Application
CURATOR THE MUSEUM JOURNAL, Issue 2 2001
Shane J. Macfarlan
Museum education collections are used to provide visitors with opportunities to handle museum objects. These collections are primarily composed of objects that are damaged, lack provenance, or do not fit the scope of the collection.
Sometimes, these collections are displayed haphazardly and their interpretation may lack thematic context. Some museum education collections are not being utilized to their fullest educational capacity. The application of cognitive, exhibition, and collections management theories can alleviate some problems with museum education collections. A critique of the education collection at the Lubbock Lake Landmark is presented as a case study of these problems and some of the potential solutions to them. The study can be used as a template by other museums to solve similar problems in their education collections. [source]

A pilot randomized trial in primary care to investigate and improve knowledge, awareness and self-management among South Asians with diabetes in Manchester
DIABETIC MEDICINE, Issue 12 2003
A. Vyas
Aims: To investigate whether a secondary-primary care partnership education package could improve understanding of diabetes care among South Asians. Methods: In a pilot randomized controlled trial, in the setting of eight general practices randomized to intervention or control, patients were invited to four or more rotating visits per year by one of a diabetes specialist nurse, dietician or chiropodist working with general practice staff. Participants were from lists of South Asian patients with known Type 2 diabetes in each (general) practice. Results: Patients and practice scores at baseline and 1-year follow-up, from an interview using a questionnaire on knowledge, awareness and self-management of diabetes. Responses were developed into educational packages used during intervention. Of the 411 patients listed at baseline only 211 were traced for interview (refusal only 4%). Mean age was 55.4 years, age of diabetes onset 47.1 years. Fourteen percent were employed and 35% were able to communicate in English fluently.
Only 118 could be traced and interviewed at 1 year, although there was no significant difference in demography between those who completed the study and those who did not. Despite a mean of four visits/patient, intervention had no impact on scores for diabetes knowledge or awareness [score change 0.14, 95% confidence interval (CI) -0.20, 0.49] or self-management (-0.05, 95% CI -0.48, 0.39) between baseline and 1 year. Conclusions: This form of secondary/primary care support did not transfer information effectively, and we suspect similar problems would arise in other similar communities. Different methods of clinician/patient information exchange need to be developed for diabetes in this South Asian group. [source]

Psychodrama: helping families to adapt to childhood diabetes
EUROPEAN DIABETES NURSING, Issue 3 2006
B Bektas RN
Effective management of diabetes in children requires a holistic approach that takes into account the roles of diabetes education, treatment and disease management, and the integral role of family relationships. Psychodrama is a group-based psychological support technique that aims to improve the acceptance and understanding of diabetes within the families of diagnosed children. Through group improvisation, role plays and feedback sessions, the families of children with diabetes participate in a cathartic process that helps them to share their problems, benefit from others' insight and feedback and to discuss behavioural changes that will avoid similar problems in the future.
Their children benefited indirectly through changes in their parents' behaviour and improved communication within their families. A reduction in the children's HbA1c levels was observed through the course of the study, although this could not be considered a direct result of psychodrama. Copyright © 2006 FEND. [source]

Review of the Integrated Groundwater and Surface-Water Model (IGSM)
GROUND WATER, Issue 2 2003
Eric M. LaBolle
Development of the finite-element-based Integrated Groundwater and Surface-Water Model (IGSM) began in the 1970s. Its popularity grew in the early 1990s with its application to California's Central Valley Groundwater Surface-Water Model in support of the Central Valley Project Improvement Act. Since that time, IGSM has been applied by federal, state, and local agencies to model a number of major basins in California. Our review of the recently released version 5.0 of IGSM reveals a solution methodology that deviates from established solution techniques, potentially compromising its reliability under many circumstances. One difficulty occurs because of the semi-explicit time discretization used. Combined with the fixed monthly time step of IGSM, this approach can prevent applications from accurately converging when using parameter values typically found in nature. Additionally, IGSM fails to properly couple and simultaneously solve ground water and surface water models with appropriate mass balance and head convergence under the reasonable conditions considered herein. As a result, IGSM-predicted streamflow is error prone, and errors could exceed 100%. IGSM does not inform the user that there may be a convergence problem with the solution, but instead generally reports good mass balance. Although our review touches on only a few aspects of the code, which exceeds 17,000 lines, our experience is that similar problems arise in other parts of IGSM.
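The convergence failure attributed above to a semi-explicit scheme with a fixed monthly step can be illustrated generically (this is not IGSM code; the equation and parameters are invented): an explicit update of dy/dt = -k*y blows up once the fixed step exceeds the stability limit 2/k, while an implicit update does not.

```python
# Explicit vs. implicit Euler on dy/dt = -k*y with a fixed step size.
# With dt > 2/k the explicit update multiplies y by (1 - k*dt), whose
# magnitude exceeds 1, so the computed "solution" grows instead of decaying.

def explicit_euler(y, k, dt, steps):
    for _ in range(steps):
        y = y + dt * (-k * y)       # rate evaluated at the old value of y
    return y

def implicit_euler(y, k, dt, steps):
    for _ in range(steps):
        y = y / (1.0 + k * dt)      # solves y_new = y_old - dt * k * y_new
    return y

k, dt = 3.0, 1.0                    # dt = 1 > 2/k: beyond the stability limit
print(explicit_euler(1.0, k, dt, 12))   # 4096.0 (diverges)
print(implicit_euler(1.0, k, dt, 12))   # ~6e-8 (decays, as the true solution does)
```

A code that cannot shrink its time step when the physics demands it faces exactly this trade-off, which is why the reviewers single out the fixed monthly step.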
Review and examples demonstrate the potential consequences of using the solution methods in IGSM for the prediction, planning, and management of water resources, and provide perspective on the roles of standards and code validation in ground water modeling. [source]

Haematological toxicity of drugs used in psychiatry
HUMAN PSYCHOPHARMACOLOGY: CLINICAL AND EXPERIMENTAL, Issue S1 2008
Robert J. Flanagan
Almost all classes of psychotropic agents have been reported to cause blood dyscrasias. Mechanisms include direct toxic effects upon the bone marrow, the formation of antibodies against haematopoietic precursors, and peripheral destruction of cells. Agranulocytosis is probably the most important drug-related blood dyscrasia. The mortality from drug-induced agranulocytosis is 5-10% in Western countries. The manifestations of agranulocytosis are secondary to infection. Aggressive treatment with intravenous broad-spectrum antimicrobials and bone marrow stimulants may be required. Of drugs encountered in psychiatry, antipsychotics including clozapine (risk of agranulocytosis approximately 0.8%, predominantly in the first year of treatment) and phenothiazines (chlorpromazine agranulocytosis risk approximately 0.13%), and antiepileptics (notably carbamazepine, neutropenia risk approximately 0.5%) are the most common causes of drug-related neutropenia/agranulocytosis. Drugs known to cause neutropenia should not be used concomitantly with other drugs known to cause this problem. High temperature and other indicators of possible infection should be looked for routinely during treatment. Clozapine is well known as a drug that can cause blood dyscrasias, but olanzapine and other atypicals may also cause similar problems. In addition to genetic factors, there are likely to be dose-related and immunological components to these phenomena.
Important lessons have been learnt from the haematological monitoring that is necessary with clozapine, and the monitoring has been very successful in preventing deaths related to clozapine-induced agranulocytosis. Continuing research into the mechanisms of drug-induced neutropenia and agranulocytosis may serve to further enhance the safe use not only of clozapine, but also of other agents. Copyright © 2007 John Wiley & Sons, Ltd. [source]

A comparison of boundary element and finite element methods for modeling axisymmetric polymeric drop deformation
INTERNATIONAL JOURNAL FOR NUMERICAL METHODS IN FLUIDS, Issue 7 2001
Russell Hooper
A modified boundary element method (BEM) and the DEVSS-G finite element method (FEM) are applied to model the deformation of a polymeric drop suspended in another fluid subjected to start-up uniaxial extensional flow. The effects of viscoelasticity, via the Oldroyd-B differential model, are considered for the drop phase using both FEM and BEM and for both the drop and matrix phases using FEM. Where possible, results are compared with the linear deformation theory. Consistent predictions are obtained among the BEM, FEM, and linear theory for purely Newtonian systems and between FEM and linear theory for fully viscoelastic systems. FEM and BEM predictions for viscoelastic drops in a Newtonian matrix agree very well at short times but differ at longer times, with worst agreement occurring as critical flow strength is approached. This suggests that the dominant computational advantages held by the BEM over the FEM for this and similar problems may diminish or even disappear when the issue of accuracy is appropriately considered. Fully viscoelastic problems, which are only feasible using the FEM formulation, shed new insight on the role of viscoelasticity of the matrix fluid in drop deformation. Copyright © 2001 John Wiley & Sons, Ltd.
[source]

SmartMetals: a new method for metal identification based on fuzzy logic
JOURNAL OF CHEMOMETRICS, Issue 11 2009
Viktor Pocajt
This paper presents a method of searching, identifying and cross-referencing metal alloys based on their chemical composition and/or mechanical properties, typically obtained by analysis and tests. The method uses a general pattern similar to the approach of a human expert, and relies on a classification of metals based on metallurgical expertise and fuzzy logic for identifying metals and comparing their chemical and mechanical properties. The algorithm has been tested and deployed in real applications for fast metal identification and finding of unknown equivalents, by the leading companies in the field. The same principles can also be used in other domains for similar problems, such as organic and inorganic materials identification and generic drugs comparison. Copyright © 2009 John Wiley & Sons, Ltd. [source]

Handling Mass Death by Integrating the Management of Disasters and Pandemics: Lessons from the Indian Ocean Tsunami, the Spanish Flu and Other Incidents
JOURNAL OF CONTINGENCIES AND CRISIS MANAGEMENT, Issue 2 2007
Joseph Scanlon
At first glance, there appear to be significant differences between mass death from disasters and catastrophes and mass death from pandemics. In a disaster or catastrophe the major problem is identifying the dead and, sometimes, determining cause of death. This can be very frustrating for next of kin. In a pandemic, the identity of the dead is usually known, as is the cause of their death. There is an immediate certainty in pandemic death. Despite these major differences there are many similarities. Because it takes time to identify the dead after a disaster or catastrophe, there is a steady release of bodies for cremation or burial, just as in a pandemic. In both types of incidents, there tends to be a shortage of supplies and personnel and, therefore, a need for use of volunteers.
There are also massive amounts of paper work. This would suggest a need in both cases for stockpiling and for training of volunteers. And, although this does not always happen, both types of incidents tend to strike harder among the poorer elements in cities, yet both create serious economic problems. Despite these many similarities, planning for the first tends to be done by emergency agencies, especially the police; planning for the second by health agencies. Given the many similarities, this separation makes no sense. Since both types of mass death incidents lead to similar problems, it would make sense to take an all-hazards approach to planning for dealing with mass death. [source]

Commonalities in the neurobiology between autism and fragile X
JOURNAL OF INTELLECTUAL DISABILITY RESEARCH, Issue 10 2008
R. Hagerman
There is a close association between autism and fragile X syndrome (FXS), with 30% of males with FXS having autism and 2 to 7% of children with autism having the fragile X mutation. The protein that is missing or deficient in FXS, FMRP, is an RNA binding and transport protein which regulates the translation of many messages important for synaptic plasticity. Typically FMRP inhibits the translation of these messages, such that protein production increases when FMRP is absent. Some of these proteins are known to also cause autism when they are mutated, including neuroligin 3 and 4 and the SHANK protein. Therefore, when FMRP is missing there is dysregulation of other proteins that are known to cause autism. FMRP is an important inhibitor of protein production in the metabotropic glutamate receptor 5 (mGluR5) pathway, which leads to long-term depression (LTD), the weakening of synaptic connections. Therefore, when FMRP is missing there is enhanced mGluR5 activity, leading to enhanced LTD and weak or immature synaptic connections.
The use of mGluR5 antagonists to reverse the LTD in the animal models of FXS has led to reversal of the learning, behaviour and dendritic spine abnormalities in these animals. There are now initial studies taking place in humans regarding the use of mGluR5 antagonists to improve behaviour and cognition in FXS. It is likely that these mGluR5 antagonists will also be helpful in a subgroup of patients with non-fragile X autism who have similar problems with hyperactivity, hyperarousal and anxiety to those seen in FXS. A second cause of autism is the fragile X premutation, but this mechanism of involvement is related to RNA toxicity, which perhaps stimulates neuroimmune problems and may mimic other causes of autism. Neurons with the premutation are more vulnerable to environmental toxicity and oxidative stress, leading to early cell death. [source]

Guaranteed inconsistency avoidance during software evolution
JOURNAL OF SOFTWARE MAINTENANCE AND EVOLUTION: RESEARCH AND PRACTICE, Issue 6 2003
Keith Gallagher
The attempt to design and integrate consistent changes to an existing system is the essence of software maintenance. Software developers also confront similar problems: there are changes during testing and the release of new system builds. Whether in development or maintenance, changes to evolving systems must be made consistently; that is, without damaging correct computations. It is difficult for the programmer to ascertain the complete effect of a code change; the programmer may make a change to a program that is syntactically and semantically legal, but which has ripples into the parts of the program that were intended to remain unchanged. Using the standard denotational semantics for procedural programming languages, this paper formalizes decomposition slicing, which identifies interferences between software components and isolates the components to be changed.
We enumerate the conditions for changing one component in ways that will guarantee that changes to it will not interact inconsistently, and prove that changes made under these conditions are sound. Thus, the programmer can execute changes secure in the knowledge that the semantics of the new system are guaranteed to be consistent with the projection of the semantics of the original for which it behaved correctly. Validating that the changes do not interfere not only guarantees consistency with respect to previous unchanging behaviors, but can also be achieved with a complexity proportional to the size of the change to be made. Copyright © 2003 John Wiley & Sons, Ltd. [source]

The stone forum: Implementing a consensus building methodology to address impacts associated with small mining and quarry operations
NATURAL RESOURCES FORUM, Issue 1 2000
C. Peiter
Small-scale mining, including quarry operations, continues to play an important social and economic role in hundreds of communities throughout Brazil. Because many operations function outside the formal economy, conflicts between the owners of small-scale mining operations, the mineworkers, various government agencies, and other stakeholders have contributed to the progressive degradation of the environment, poor health and safety standards, and low productivity. The Centre for Mineral Technology (CETEM) of Brazil is implementing a consensus building methodology in order to produce dimension stone by small-scale miners on a more sustainable basis in the Pádua region, located in the northwest of the State of Rio de Janeiro. The approach being used by CETEM is based on its own experience in working with gold prospectors in the Amazon, and lessons and experiences learned from Canadian officials and industry representatives. The lessons and insights gained from this project may prove to be useful to those involved in addressing similar problems elsewhere in Brazil, South America and around the world.
[source]

Skin and oral mucosa equivalents: construction and performance
ORTHODONTICS & CRANIOFACIAL RESEARCH, Issue 1 2010
J Liu
To cite this article: Liu J, Bian Z, Kuijpers-Jagtman AM, Von den Hoff JW: Skin and oral mucosa equivalents: construction and performance. Orthod Craniofac Res 2010;13:11-20.
Authors: Liu J, Bian Z, Kuijpers-Jagtman AM, Von den Hoff JW
The skin and the oral mucosa act as a barrier against the external environment. Loss of this barrier function causes dehydration and a high risk of infection. For the treatment of extensive skin wounds such as in severe burns, autologous skin for transplantation is often not available in sufficient amounts. Reconstructions in the oral cavity, as required after tumor resections or cleft palate repair, are often complicated by similar problems. In the last two decades, the field of tissue engineering has provided new solutions to these problems. Techniques have been developed for the culture of epithelial grafts, dermal substitutes, and the combination of these two to a 'functional' skin or mucosa equivalent. The present review focuses on developments in the field of tissue engineering of skin and oral mucosa. The performance of different types of engineered grafts in animal models and clinical studies is discussed. Recent developments such as the use of epithelial stem cells, and gene therapy with transduced skin grafts are also discussed. [source]

Crannogs and Island Duns: Classification, Dating and Function
OXFORD JOURNAL OF ARCHAEOLOGY, Issue 3 2000
D. W. Harding
A recent paper, Islets through Time (OJA 17, 2, 227-44), by Jon Henderson highlighted the fact that the majority of dated crannogs were occupied in the later prehistoric or early historic period, and offered a new classification of artificial islets.
This paper addresses consequential issues of definition and classification and urges that artificial islets, whether classed hitherto as crannogs or island duns, should be seen as complementary elements within a spectrum of settlement types, in particular for the Early Iron Age and the early historic periods. Comparison shows that studies of crannogs and their land-based counterparts have faced similar problems of interpretation, and that typological compartmentalization has acted to the detriment of a proper understanding of both. [source] A Problem-based Learning Model for Teaching the Instructional Design Business Acquisition Process PERFORMANCE IMPROVEMENT QUARTERLY, Issue 1 2002, Karl M. Kapp ABSTRACT There is a growing emphasis on utilizing a problem-based learning (PBL) pedagogy to help instructional design students gain an understanding of the complex forces operating within an actual design environment. However, little literature exists to suggest that PBL is being used to teach the process by which instructional design firms and practitioners secure work: the Instructional Design Business Acquisition Process (IDBAP). This study outlines a conceptual framework for using an adapted problem-based learning model to teach the IDBAP, which consists of writing a response to a request for proposal (RFP), developing a working prototype, and orally presenting the solution. This study also examines the impact of a PBL pedagogy on students' confidence in solving instructional design problems. The results of this empirical research indicate that students who participate in a problem-based learning pedagogy gain confidence in their ability to solve instructional design problems, see themselves as being in emotional control when solving an instructional design problem, and are more inclined to approach similar problems in the future.
[source] From experience: Capturing hard-won NPD lessons in checklists THE JOURNAL OF PRODUCT INNOVATION MANAGEMENT, Issue 5 2001, Raymond F. Riek The application of a good New Product Development (NPD) process is frequently limited by the experience of the user. Avoiding relatively minor errors and omissions that can lead to seriously flawed project results is still an art. Checklists for each stage of a development project can capture this art, and their disciplined use can avoid many potentially critical omissions and errors. The development of checklists frequently comes from the hard experiences many of us have had in bringing new products to market. Consequently, benchmarking "trials and tribulations" rather than success stories can be more appropriate for developing a thoughtful checklist. This article is a partial accumulation of one practitioner's experiences over three decades of executing, managing, directing, and observing such projects. Fifteen NPD case histories are examined to draw lessons from these experiences. The cases are organized around three basic product development issues: managing technical risks, managing commercial risks, and managing NPD personnel. In these examples, NPD project problems share a common theme of poor technical or commercial risk management, as opposed to technical failure. Improved planning and a more disciplined management interface would have avoided many of the problems discussed in these case histories. An analysis of each case history and its lessons is provided, from which suggested checklist items are derived. These checklist additions are presented by development stage to allow use by other NPD teams, with the intention of avoiding the repetition of similar problems. [source] Identifying and overcoming the potential barriers to the adoption of natural orifice transluminal endoscopic surgery ASIAN JOURNAL OF ENDOSCOPIC SURGERY, Issue 2 2010, S. D.
Schwaitzberg Abstract Natural orifice translumenal endoscopic surgery (NOTES) is an emerging innovative approach to performing minimally invasive surgical procedures. At its full potential, the concept of incisionless surgery will have mass appeal to patients. However, the barriers to adopting NOTES will have to be overcome before widespread acceptance of these techniques can occur. These potential barriers include infection, visceral leakage, difficulties in tissue manipulation, and increased cost. The history of surgical innovation has continuously overcome similar problems in other settings, and all of these potential obstacles are likely solvable. Training surgeons will be an additional barrier to overcome, but this obstacle will need to be approached differently than when laparoscopy was introduced, as standards for privileging and credentialing in most hospitals are higher today than they were 20 years ago. Alternative technologies that were not adopted prior to the introduction of NOTES may now appear more viable, making the competitive environment more complex. Increased funding for comparative effectiveness studies and training for competency in innovation will also need original solutions, but both are clearly in our patients' best interest. [source] NATURE, MARKETS AND STATE RESPONSE: THE DROUGHT OF 1939 IN JAPAN AND KOREA AUSTRALIAN ECONOMIC HISTORY REVIEW, Issue 1 2010, Janet Hunter Keywords: drought; hydroelectricity; Japan; Korea; rice production. Large areas of Northeast Asia experienced drought in 1939. Agricultural production in Korea decreased significantly, but the drought did not cause famine in Japan despite its dependence on rice imports from Korea. The paper analyses the impact of the 1939 drought on the markets for rice and electricity in Japan. The authorities were ill-prepared for such a disaster but willing to use it to cover for other problems.
The drought thus accelerated the move of Japan's economic system towards a managed economy. A lower total rainfall in Japan in 1940 did not generate similar problems, suggesting that the broader political, economic, and social context is crucial to the identification of short-term climatic fluctuations as crises. [source]