Traditional Approaches
Selected Abstracts

Good Soldiers, A Traditional Approach
JOURNAL OF APPLIED PHILOSOPHY, Issue 1, 2001. Hilliard Aronovitch.

This article contends that in crucial respects effective soldiers are ethical soldiers, that good soldiers in the military sense are good soldiers in the moral sense, and that this is so for quite traditional reasons. The thesis is defended by identifying and then resolving basic paradoxes regarding what soldiers must be trained to do or be, e.g.: be trained to kill but also not to be brutal; be trained to react in combat situations almost automatically but also to deliberate and decide if a command is unlawful; as peacekeepers, be trained to be impartial but also to know right from wrong and be firmly committed to upholding the former and opposing the latter. It is shown that contradictory things are not really thus being called for. With the aid of a blend of deontology and virtue theory, it is argued that certain standard qualities of effective soldiers have an associated moral dimension. For example, true military courage implies an unwillingness to engage in cruelty; the self-control on which success of missions depends implies eschewing motives of personal vengeance; and the capacity for comprehending complex equipment and data implies a mentality for assessing the validity of orders.

Safety of Minimally Invasive Pituitary Surgery (MIPS) Compared with a Traditional Approach
THE LARYNGOSCOPE, Issue 11, 2004. David R. White, MD.

Introduction: Transsphenoidal hypophysectomy is becoming progressively less invasive. Recent endoscopic techniques avoid nasal or intraoral incisions, use of nasal speculums, and nasal packing. Several case series of endoscopic endonasal pituitary surgery have been reported, but relatively little data exist comparing complication rates to more traditional approaches.
We compare the complications of our first 50 cases of endoscopic, minimally invasive pituitary surgery (MIPS) to our last 50 sublabial transseptal (SLTS) procedures. Study Design: Retrospective case-control study. Methods: Fifty consecutive MIPS procedures and 50 consecutive SLTS procedures were reviewed retrospectively. Complication rates were analyzed and compared. Results: Total complications per patient (P = .005), postoperative epistaxis (P = .031), lip anesthesia (P = .013), and deviated septum (P = .028) occurred more often in the SLTS group. No significant difference was seen in cerebrospinal fluid leak, meningitis, ophthalmoplegia, visual acuity loss, diabetes insipidus, intracranial hemorrhage, or death. In the MIPS group, length of stay (P < .001), use of lumbar drainage (P = .007), and nasal packing (P < .001) were also significantly reduced. Conclusions: Endoscopic endonasal pituitary surgery provides improved complication rates when compared with SLTS approaches. In addition, we note advantages of the MIPS approach, including reduced length of hospital stay and decreased use of lumbar drainage and nasal packing.

Economic Production Lot Sizing with Periodic Costs and Overtime
DECISION SCIENCES, Issue 3, 2001. E. Powell Robinson Jr.

Traditional approaches for modeling economic production lot-sizing problems assume that a single, fixed equipment setup cost is incurred each time a product is run, regardless of the quantity manufactured. This permits multiple days of production from one production setup. In this paper, we extend the model to consider additional fixed charges, such as cleanup or inspection costs, that are associated with each time period's production. This manufacturing cost structure is common in the food, chemical, and pharmaceutical industries, where process equipment must be sanitized between item changeovers and at the end of each day's production.
We propose two mathematical problem formulations and optimization algorithms. The models' unique features include regular-time production constraints, a fixed charge for each time period's production, and the availability of overtime production capacity. Experimental results indicate the conditions under which our algorithms' performance is superior to traditional approaches. We also test the procedures on a set of lot-sizing problems facing a national food processor and document their potential economic benefit.

Challenges in the application of geometric constraint models
GLOBAL ECOLOGY, Issue 3, 2007. Craig R. McClain.

Discerning the processes influencing geographical patterns of species richness remains one of the central goals of modern ecology. Traditional approaches to exploring these patterns have focused on environmental and ecological correlates of observed species richness. Recently, some have suggested these approaches suffer from the lack of an appropriate null model that accounts for species ranges being constrained to occur within a bounded domain. Proponents of these null geometric constraint models (GCMs), and the mid-domain effect these models produce, argue their utility in identifying meaningful gradients in species richness. This idea has generated substantial debate. Here we discuss what we believe are the three major challenges in the application of GCMs. First, we argue that there are actually two equally valid null models for the random placement of species ranges within a domain, one of which actually predicts a uniform distribution of species richness. Second, we highlight the numerous decisions that must be made to implement a GCM that lead to marked differences in the predictions of the null model. Finally, we discuss challenges in evaluating the importance of GCMs once they have been implemented.
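The mid-domain effect at the centre of this debate is easy to reproduce. The sketch below is an illustration of the general idea only, not the authors' implementation: it uses the common two-endpoint null model, in which each species' range is the interval between two points drawn uniformly in a bounded one-dimensional domain, so every range is constrained to lie wholly inside the domain. Counting overlapping ranges at each point yields richness that peaks toward the middle even though placement is random.

```python
import random

def mid_domain_richness(n_species=1000, domain=100, seed=1):
    """Null geometric constraint model (two-endpoint version):
    place each species' range wholly inside [0, domain] and count
    how many ranges overlap each integer point."""
    rng = random.Random(seed)
    richness = [0] * (domain + 1)
    for _ in range(n_species):
        # Two uniform endpoints; the range is the interval between them,
        # so it cannot extend beyond the domain boundary.
        a, b = rng.uniform(0, domain), rng.uniform(0, domain)
        lo, hi = min(a, b), max(a, b)
        for x in range(int(lo), int(hi) + 1):
            richness[x] += 1
    return richness

richness = mid_domain_richness()
# Richness tends to peak near the domain centre and fall toward the edges.
```

Note that this is only one of the two null models the abstract alludes to; a different randomization choice (e.g., placing range midpoints uniformly without the endpoint constraint) predicts a flat richness gradient, which is exactly the implementation sensitivity the authors highlight.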
A sociotechnical approach to achieve zero-defect manufacturing of complex manual assemblies
HUMAN FACTORS AND ERGONOMICS IN MANUFACTURING & SERVICE INDUSTRIES, Issue 2, 2007. Kitty Hong.

Traditional approaches to defect reduction in manufacturing environments rely heavily on the introduction of technology-based detection techniques that require significant investments in equipment and technical skills. In this article, the authors outline a novel, alternative approach that utilizes the largely untapped abilities of assembly-line operators. Targeting zero-defect manufacturing, the SEISMIC (stabilize, evaluate, identify, standardize, monitor, implement, and control) methodology developed herein is a sociotechnical-based system built on the decentralization of technical knowledge and the transfer of responsibility for product quality from technical staff to manual operators. Along with defect reduction, important secondary goals of the SEISMIC methodology are improved operator performance and job satisfaction. The SEISMIC methodology provides a quantitative approach for classifying assembly environments and determining their required skill sets. Effective methods for transferring the identified skills throughout the production team are also provided. A pilot application of the protocol in an automotive assembly environment has achieved promising results in the target areas of defect reduction and operator performance. © 2007 Wiley Periodicals, Inc. Hum Factors Man 17: 137–148, 2007.

The effect of particle shape and grain-scale properties of shale: A micromechanics approach
INTERNATIONAL JOURNAL FOR NUMERICAL AND ANALYTICAL METHODS IN GEOMECHANICS, Issue 11, 2010. J. A. Ortega.

Traditional approaches for modeling the anisotropic elasticity response of the highly heterogeneous clay fabric in shale have mainly resorted to geometric factors such as definitions of particle shapes and orientations.
However, predictive models based on these approaches have been mostly validated using macroscopic elasticity data. The recent implementation of instrumented indentation aimed at probing nano-scale mechanical behaviors has provided a new context for characterizing and modeling the anisotropy of the porous clay in shale. Nanoindentation experimental data revealed the significant contribution of the intrinsic anisotropy of the solid clay to the measured elastic response. In this investigation, we evaluate both the effects of geometric factors and of the intrinsic anisotropic elasticity of the solid clay phase on the observed anisotropy of shale at multiple length scales through the development of a comprehensive theoretical micromechanics approach. It was found that among various combinations of these sources of anisotropy, the elastic response of the clay fabric represented as a granular ensemble of aligned effective clay particles with spherical morphology and anisotropic elasticity compares satisfactorily to nanoindentation and ultrasonic pulse velocity measurements at nano- and macroscopic length scales, respectively. Other combinations of sources of anisotropy could yield comparable predictions, particularly at macroscopic scales, at the expense of requiring additional experimental data to characterize the morphology and orientations of particles. Copyright © 2009 John Wiley & Sons, Ltd.

Segmenting youth voting behaviour through trusting–distrusting relationships: a conceptual approach
INTERNATIONAL JOURNAL OF NONPROFIT & VOLUNTARY SECTOR MARKETING, Issue 3, 2004. Janine Dermody.

This paper reviews current evidence on the declining political engagement of British youth. What emerges is that the causes of their political disaffection are manifold and complex, but trust, distrust and cynicism feature strongly.
Traditional approaches to trust and distrust fail to recognise this complexity; consequently this paper offers a more sophisticated conceptual framework that examines trust and distrust as separate but linked dimensions, as advocated by Lewicki, McAllister and Bies [Lewicki, R. J., McAllister, D. J. and Bies, R. J. (1998) 'Trust and distrust: New relationships and realities', Academy of Management Review, Vol. 23, No. 3, pp. 438–458]. From the analysis, four segments of 'voter' types are identified. By segmenting voters in this way, marketers can design strategies to help increase young people's trust and reduce their distrust, thereby increasing their propensity to vote in future elections. A synopsis of marketing aims to stimulate the 'youth vote' is presented along with areas for further research. Copyright © 2004 Henry Stewart Publications

Non-diagonal controllers in MIMO quantitative feedback design
INTERNATIONAL JOURNAL OF ROBUST AND NONLINEAR CONTROL, Issue 4, 2002. Edward Boje.

This paper discusses multivariable quantitative feedback design through the use of controllers with off-diagonal elements. Controller design for multivariable plants with significant uncertainty is simpler and potentially less conservative if some sort of dominance is achieved (by reducing the interaction effect of off-diagonal plant elements) before a diagonal (decentralized) controller design is attempted. Traditional approaches for achieving dominance are not applicable when plant uncertainty must be considered. This paper discusses parallel and series implementations and, for the latter, a pseudo-Gauss-elimination approach to the design has been developed. The interaction is measured using the Perron–Frobenius root of an interaction matrix. In some applications, it is possible to trade off individual plant cases against each other in order to reduce the worst-case interaction over the entire plant set. Copyright © 2002 John Wiley & Sons, Ltd.
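The Perron–Frobenius root used above as an interaction measure is simply the dominant eigenvalue of a nonnegative matrix, which a few lines of power iteration can compute. The sketch below is illustrative only: the 2x2 matrix is made up for the example, not taken from the paper, and the interpretation comment is a general rule of thumb rather than the authors' criterion.

```python
def perron_frobenius_root(A, iters=200, tol=1e-12):
    """Dominant (Perron-Frobenius) eigenvalue of a nonnegative square
    matrix, via power iteration with max-norm scaling. For an interaction
    matrix, smaller roots indicate weaker worst-case coupling."""
    n = len(A)
    x = [1.0] * n          # positive start vector
    lam = 0.0
    for _ in range(iters):
        y = [sum(A[i][j] * x[j] for j in range(n)) for i in range(n)]
        new_lam = max(abs(v) for v in y)
        if new_lam == 0.0:
            return 0.0     # zero matrix: spectral radius is 0
        x = [v / new_lam for v in y]
        if abs(new_lam - lam) < tol:
            break
        lam = new_lam
    return new_lam

# Hypothetical interaction matrix for illustration; its eigenvalues are
# 0.5 and -0.2, so the Perron-Frobenius root is 0.5.
A = [[0.2, 0.4],
     [0.3, 0.1]]
rho = perron_frobenius_root(A)  # -> 0.5
```

A primitive (e.g., positive-diagonal) matrix is assumed here; for reducible or periodic nonnegative matrices, plain power iteration may oscillate and a shift or a direct eigensolver is safer.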
Value-centric framework and Pareto optimality for design and acquisition of communication satellites
INTERNATIONAL JOURNAL OF SATELLITE COMMUNICATIONS AND NETWORKING, Issue 6, 2009. Joy Brathwaite.

Investments in space systems are substantial, indivisible, and irreversible: the characteristics of high-risk investments. Traditional approaches to system design, acquisition, and risk mitigation are derived from a cost-centric mindset, and as such they incorporate little information about the value of the spacecraft to its stakeholders. These traditional approaches are appropriate in stable environments. However, the current technical and economic conditions are distinctly uncertain and rapidly changing. Consequently, these traditional approaches have to be revisited and adapted to the current context. We propose that in uncertain environments, decision-making with respect to design and acquisition choices should be value-based. We develop a value-centric framework, analytical tools, and an illustrative numerical example for communication satellites. Our two proposed metrics for decision-making are the system's expected value and value uncertainty. Expected value is calculated as the expected NPV of the satellite. The cash inflow is calculated as a function of the satellite loading, its transponder pricing, and market demand. The cash outflows are the various costs for owning and operating the satellite. Value uncertainty emerges due to uncertainties in the various cash flow streams, in particular because of market conditions. We propagate market uncertainty through Monte Carlo simulation and translate it into value uncertainty for the satellite. The end result is a portfolio of Pareto-optimal satellite design alternatives.
By using value and value uncertainty as decision metrics in the down-selection process, decision-makers draw on more information about the system in its environment, and in making value-based design and acquisition choices, they ultimately make more informed and better choices. Copyright © 2009 John Wiley & Sons, Ltd.

Higher Cost, Lower Validity and Higher Utility: Comparing the Utilities of Two Tests that Differ in Validity, Costs and Selectivity
INTERNATIONAL JOURNAL OF SELECTION AND ASSESSMENT, Issue 2, 2000. George C. Thornton.

Traditional approaches to comparing the utility of two tests have not systematically considered the effects of the different levels of selectivity that are feasible and appropriate in various selection situations. For example, employers who hope to avoid adverse impact often find they can be more selective with some tests than with others. We conducted two studies to compare the utilities of two tests that differ in costs, validity, and the feasible levels of selectivity that can be employed. First, an analytical solution was derived starting from a standard formula for utility. This analysis showed that for both fixed and variable hiring costs, a higher-cost, lower-validity procedure can have higher utility than a lower-cost, higher-validity procedure when the selection ratios permissible using the two procedures are sufficiently (yet realistically) different. Second, using a computer simulation method, several combinations of the critical variables were varied systematically to detect the limits of this effect in a finite set of specific selection situations. The results showed that the existence of more severe levels of adverse impact greatly reduced the utility of a written test with relatively high validity and low cost in comparison with an assessment center with lower validity and higher cost. Both studies showed that the consideration of selectivity can yield surprising conclusions about the comparative utility of two tests.
Even if one test has lower validity and higher cost than a second test, the first may yield higher utility if it allows the organization to exercise stricter levels of selectivity.

Clinical networks for nursing research
INTERNATIONAL NURSING REVIEW, Issue 3, 2002. W. P. Gillibrand, MSc.

As a central feature of national research and development strategies, clinical effectiveness emphasizes the importance of rigorous experimental research in nursing. It is naïve to assume that over-worked practitioners, with little research training and supervision, can undertake this type of research. Traditional approaches to research support rely on the practitioner registering for a higher degree and receiving academic supervision. This assumes that the responsibility for research lies with practice, with higher education adopting a reactive stance in supporting research and development in nursing. The literature demonstrates a growing number of innovative models for facilitating nursing research. These, however, tend to focus on single appointments with limited and predefined access to clinical areas and patient populations. This article details a new initiative from the Clinical Nursing Practice Research Unit (CNPRU) that aims to support programmatic research in nursing practice through Clinical Networks for Nursing Research. Our research strategy is to contribute to the development of nursing science by facilitating effective collaboration between clinicians and higher education in core clinical specialties, including stroke rehabilitation, diabetes, mental health and community nursing. Each researcher has developed networks with a number of clinical areas, locally, regionally or nationally, through seminars, conferences or newsletters, to link practitioners and generate answerable research questions. Network communications also rely heavily on the establishment of interactive websites.
This strategy has resulted in a number of collaborative, evaluative studies, including clinical trials in rehabilitation, diabetic nursing and primary care.

Conducting suicide research in naturalistic clinical settings
JOURNAL OF CLINICAL PSYCHOLOGY, Issue 4, 2009. David A. Jobes.

Unique challenges arise for clinical researchers designing studies focused on suicidal behaviors due to the inherently high-risk nature of such research. Traditional approaches to clinical trial design are briefly discussed, highlighting the limitations and obstacles of these approaches when working with suicidal individuals. Using their own personal experiences and setbacks from an ongoing clinical suicidology research program, the authors argue for greater emphasis on effectiveness and translational research designs conducted in naturalistic clinical settings to test the practical utility of empirically supported treatments for suicidal behaviors, to gain new perspectives on suicidal individuals, and to better understand the nature of suicidal risk. © 2009 Wiley Periodicals, Inc. J Clin Psychol 65: 1–14, 2009.

WATER QUALITY MODELING OF ALTERNATIVE AGRICULTURAL SCENARIOS IN THE U.S. CORN BELT
JOURNAL OF THE AMERICAN WATER RESOURCES ASSOCIATION, Issue 3, 2002. Kellie B. Vaché.

Simulated water quality resulting from three alternative future land-use scenarios for two agricultural watersheds in central Iowa was compared to water quality under current and historic land use/land cover, to explore both the potential water quality impact of perpetuating current trends and the potential benefits of major changes in agricultural practices in the U.S. Corn Belt. The Soil Water Assessment Tool (SWAT) was applied to evaluate the effect of management practices on surface water discharge and annual loads of sediment and nitrate in these watersheds.
The agricultural practices comprising Scenario 1, which assumes perpetuation of current trends (conversion to conservation tillage, increase in farm size and land in production, use of currently employed Best Management Practices (BMPs)), result in simulated increased export of nitrate and decreased export of sediment relative to the present. However, simulations indicate that the substantial changes in agricultural practices envisioned in Scenarios 2 and 3 (conversion to conservation tillage, strip intercropping, rotational grazing, conservation set-asides, and greatly extended use of BMPs such as riparian buffers, engineered wetlands, grassed waterways, filter strips and field borders) could potentially reduce current loadings of sediment by 37 to 67 percent and nutrients by 54 to 75 percent. Results from the study indicate that major improvements in water quality in these agricultural watersheds could be achieved if such environmentally targeted agricultural practices were employed. Traditional approaches to water quality improvement through application of traditional BMPs will result in little or no change in nutrient export and minor decreases in sediment export from Corn Belt watersheds.

Integrating positive psychology into schools: Implications for practice
PSYCHOLOGY IN THE SCHOOLS, Issue 1, 2004. Mark D. Terjesen.

Traditional approaches for working with children and families in the schools focus on problems and disturbance. The concept of positive psychology as a way to change this focus is offered through exploration of its integration within school psychology. Specifically, the application of positive psychology can form the basis of preventive practices within the school setting. Examples of this application are provided within common roles of the school psychologist (consultation, direct work, educational assessment and planning). © 2004 Wiley Periodicals, Inc. Psychol Schs 41: 163–172, 2004.
Artificial Manipulation of Voice in the Human by an Implanted Stimulator
THE LARYNGOSCOPE, Issue 10, 2008. Michael Broniatowski, MD, FACS.

Objectives/Hypothesis: Traditional approaches influencing voice quality (e.g., anatomical and chemical denervation for spasmodic dysphonia, surgical medialization for paralysis) have ignored the dynamic nature of the larynx. Study Design: We report here the first attempt to manipulate voice using an implanted stimulator to systematically control vocal fold adduction. Methods: Devices placed for aspiration in three subjects retaining speech after stroke, cerebral palsy, and multiple sclerosis were used to stimulate the recurrent laryngeal nerves with 42-Hz, 52- to 200-microsecond pulses of incremental amplitudes during phonation with the tracheostomy tube occluded. Vocal fold adduction increased with stimulation strength (P < .05). Speech was analyzed with the Vox Metria program. Results: We found highly significant differences in fundamental frequency (P < .007), jitter (P < .004), and shimmer (P < .005) between natural and stimulated voice (aah and eeh) when using higher charges. Conclusions: Dynamic vocal fold manipulation seems promising in terms of a versatility lacking in static approaches to voice control.

Models for individual oral health promotion and their effectiveness: a systematic review
AUSTRALIAN DENTAL JOURNAL, Issue 3, 2009. D. Yevlahova.

Background: There is a recognized need to deliver oral health information to people during clinical encounters to enable them to develop personal skills in managing their own oral health. Traditional approaches to individual oral health education have been shown to be largely ineffective, and new approaches are required to address personal motivations for preventive behaviour. This systematic review aims to identify and assess the effectiveness of behaviour models as a basis for individual oral health promotion.
Methods: Electronic databases were searched for articles evaluating the effectiveness of health behaviour models in oral and general health between 2000 and 2007. Eighty-nine studies were retrieved, and data were extracted from the 32 studies that met the inclusion criteria. Results: Thirty-two studies were identified in the fields of clinical prevention and health education, motivational interviewing (MI), counselling, and models-based interventions. MI interventions were found to be the most effective method for altering health behaviours in a clinical setting. Conclusions: There is a need to develop an effective model for chairside oral health promotion that incorporates this evidence and allows oral health professionals to focus more on the underlying social determinants of oral disease during the clinical encounter. There is potential to further develop the MI approach within the oral health field.

Politics, leadership, and experience in designing Ontario's cabinet
CANADIAN PUBLIC ADMINISTRATION/ADMINISTRATION PUBLIQUE DU CANADA, Issue 2, 2001. Ted Glenn.

Why are cabinet decision-making systems designed the way they are? Traditional approaches to this question stress the importance of representational imperatives (i.e., region, language and gender), the need for managerial capacity and collegiality in complex organizations, or a particular government's fiscal or policy program. While these approaches have merit, they fail to pay sufficient attention to the fact that cabinet decision-making systems are in the first instance very intimate reflections and extensions of the political instincts, personal aptitudes, and governing experience of first ministers.
The author sets out to understand recent reforms to Ontario cabinet decision-making in precisely this way: how did Premier Michael Harris' sense of his government's mandate, his personal approach to decision-making, and the practical lessons learned over the course of his government's first mandate influence the design of Ontario's cabinet decision-making system between 1995 and 1999? This article begins with a short history of Ontario's cabinet decision-making system, focusing on the period from 1968 to 1995. It then provides details of reforms introduced between 1995 and 1999 and concludes with some thoughts on how Premier Harris' political instincts, personal aptitudes, and governing experience influenced these reforms.

Stable stylized wireframe rendering
COMPUTER ANIMATION AND VIRTUAL WORLDS (formerly THE JOURNAL OF VISUALIZATION AND COMPUTER ANIMATION), Issue 3-4, 2010. Chen Tang.

Stylized wireframe rendering of 3D models is widely used in animation software to depict the configuration of deformable models in comprehensible ways. However, because of inherent flaws in traditional depth-test-based rendering technology, the shape of lines cannot be preserved under continuous movement or deformation of models. Severe aliasing, such as flickering artifacts, often appears when objects rendered in line form animate, especially when rendered with thick or dashed lines. To avoid this artifact, unlike the traditional approach, we propose a novel, fast line-drawing method with high visual fidelity for wireframe depiction that depends only on the intrinsic topology of primitives, without any preprocessing step or extra pre-stored adjacency information. In contrast to previous widely used solutions, our method is advantageous in its highly accurate visibility and its clear, stable line appearance without flickering, even for thick and dashed lines, with uniform width and steady configuration as the model moves or animates, so that it is strongly suitable for animation systems. In addition, our approach can be easily implemented and controlled without any additional pre-estimated parameters supplied by users. Copyright © 2010 John Wiley & Sons, Ltd.
Using Quality Management Tools to Enhance Feedback from Student Evaluations
DECISION SCIENCES JOURNAL OF INNOVATIVE EDUCATION, Issue 1, 2005. John B. Jensen.

Statistical tools found in the service quality assessment literature (the T2 statistic combined with factor analysis) can enhance the feedback instructors receive from student ratings. T2 examines variability across multiple sets of ratings to isolate individual respondents with aberrant response patterns (i.e., outliers). Analyzing student responses that are outside the "normal" range of responses can identify aspects of the course that cause pockets of students to be dissatisfied. This fresh insight into sources of student dissatisfaction is particularly valuable for instructors willing to make tactical classroom changes that accommodate individual students, rather than the traditional approach of using student ratings to develop systemwide changes in course delivery. A case study is presented to demonstrate how the recommended procedure minimizes data overload, allows for valid schoolwide and longitudinal comparisons of correlated survey responses, and helps instructors identify priority areas for instructional improvement.

Eastern Donors and Western Soft Law: Towards a DAC Donor Peer Review of China and India?
DEVELOPMENT POLICY REVIEW, Issue 5, 2010. Sebastian Paulo.

The international system is still governed by a normative framework designed mainly by OECD countries, especially with regard to soft-law standards in the field of development co-operation. However, the growing relevance of 'Eastern donors' is weakening its efficiency and raises the question of how compliance with these standards can be assured in a changing donor landscape.
Despite efforts to integrate emerging countries into the traditional approach of the OECD Development Assistance Committee (DAC) to monitoring compliance through peer reviews, the aid architecture of the future might turn out to be a synthesis of established and new approaches.

A universal metric for sequential MIMO detection
EUROPEAN TRANSACTIONS ON TELECOMMUNICATIONS, Issue 8, 2007. Christian Kuhn.

Conventionally, detection in multiple-antenna systems is based on a tree search or a lattice search with a metric that can be computed by recursively accumulating the corresponding metric increments for a given hypothesis. For that purpose, a multiple-antenna detector traditionally applies preprocessing to obtain the search metric in a suitable form. In contrast to that, we present a reformulation of the search metric that directly allows for an appropriate evaluation of the metric on the underlying structure, without the need for a computationally costly preprocessing step. Unlike the traditional approach, the new metric can also be applied when the system has fewer receive antennas than transmit antennas. We present simulation results in which the new metric is applied for turbo detection involving the list-sequential (LISS) detector that was pioneered by Joachim Hagenauer. Copyright © 2007 John Wiley & Sons, Ltd.

ONE CASE, ONE SPECIALIZED JUDGE: WHY COURTS HAVE AN OBLIGATION TO MANAGE ALIENATION AND OTHER HIGH-CONFLICT CASES
FAMILY COURT REVIEW, Issue 1, 2010. Hon. Donna J. Martinson.

This article challenges the traditional approach to alienation and other high-conflict cases, in which many different generalist judges deal with the case. The objectives of the judicial process, dealing with cases in a just, timely, and affordable way that instils confidence in the public and litigants, cannot be met unless high-conflict cases are actively managed by one specialist family law judge.
Allowing parents in high-conflict cases to decide when and how often their case should come before the court exacerbates the negative effects of the litigation on children. This article concludes that, unless the litigation is properly managed by specialist judges, the justice system unintentionally causes harm to children.

Importance of Unsaturated Zone Flow for Simulating Recharge in a Humid Climate
GROUND WATER, Issue 4, 2008. Randall J. Hunt.

Transient recharge to the water table is often not well understood or quantified. Two approaches for simulating transient recharge in a ground water flow model were investigated using the Trout Lake watershed in north-central Wisconsin: (1) a traditional approach of adding recharge directly to the water table, and (2) routing the same volume of water through an unsaturated zone column to the water table. Areas with thin (less than 1 m) unsaturated zones showed little difference in the timing of recharge between the two approaches; when water was routed through the unsaturated zone, however, less recharge was delivered to the water table and more discharge occurred to the surface, because recharge direction and magnitude changed when the water table rose to the land surface. Areas with a thick (15 to 26 m) unsaturated zone were characterized by multimonth lags between infiltration and recharge, and, in some cases, wetting fronts from precipitation events during the fall overtook and mixed with infiltration from the previous spring snowmelt. Thus, in thicker unsaturated zones, the volume of water infiltrated was properly simulated using the traditional approach, but the timing differed from simulations that included unsaturated zone flow. Routing of rejected recharge and ground water discharge at land surface to surface water features also provided a better simulation of the observed flow regime in a stream at the basin outlet.
These results demonstrate that consideration of flow through the unsaturated zone may be important when simulating transient ground water flow in humid climates with shallow water tables. [source] Gravel-Corrected Kd Values GROUND WATER, Issue 6 2000 Daniel I. Kaplan Standard measurements of solute sorption to sediments are typically made on the <2 mm sediment fraction. This fraction is used by researchers to standardize the method and to ease experimental protocol so that large labware is not required to accommodate the gravel fraction (>2 mm particles). Since sorption is a phenomenon directly related to surface area, sorption measurements based on the <2 mm fraction would be expected to overestimate actual whole-sediment values for sediments containing gravel. This inaccuracy is a problem for ground water contaminant transport modelers who use laboratory-derived sorption values, typically expressed as distribution coefficients (Kd), to calculate the retardation factor (Rf), a parameter that accounts for solute-sediment chemical interactions. The objectives of this laboratory study were to quantify the effect of gravel on strontium Kd and Rf values and to develop an empirical method to calculate gravel-corrected Kd values (Kd_gc) for the study site (Hanford Site in Richland, Washington). Three gravel corrections (Kd_gc values) were evaluated: a correction based on the assumption that the gravel simply diluted the Kd_<2mm and had no sorption capacity (Kd_gc,g=0); a correction based on the assumption that the Kd of the intact sediment (Kd_tot) was a composite of the Kd_<2mm and the Kd_>2mm (Kd_gc,g=x); and a correction based on surface area (Kd_gc,surf). On average, Kd_<2mm tended to overestimate Kd_tot by 28% to 47%; Kd_gc,g=x overestimated Kd_tot by only 3% to 5%; and Kd_gc,g=0 and Kd_gc,surf underestimated Kd_tot by 10% to 39%. Kd_gc,g=x provided the best estimate of actual values (Kd_tot); however, Kd_gc,g=0 was appreciably easier to acquire.
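The two simpler gravel corrections described in the Kd abstract above reduce to mass-fraction arithmetic, and the resulting Kd feeds the retardation factor. A sketch under the stated assumptions; all numerical values below are hypothetical, not the Hanford Site measurements:

```python
# Sketch of the two mass-fraction gravel corrections: Kd_gc,g=0 treats gravel
# as a pure diluent with no sorption capacity, while Kd_gc,g=x composites the
# <2 mm and >2 mm fractions. Rf uses the usual linear-sorption form; the
# exact weighting used in the paper may differ.

def kd_gravel_diluent(kd_fines, gravel_frac):
    """Kd_gc,g=0: gravel dilutes the <2 mm Kd and sorbs nothing."""
    return (1.0 - gravel_frac) * kd_fines

def kd_gravel_composite(kd_fines, kd_gravel, gravel_frac):
    """Kd_gc,g=x: mass-weighted composite of the two size fractions."""
    return (1.0 - gravel_frac) * kd_fines + gravel_frac * kd_gravel

def retardation_factor(kd, bulk_density, porosity):
    """Rf = 1 + (rho_b / theta) * Kd, the standard linear-sorption relation."""
    return 1.0 + (bulk_density / porosity) * kd

kd_fines = 20.0   # mL/g, hypothetical Kd measured on the <2 mm fraction
kd_gravel = 5.0   # mL/g, hypothetical Kd of the >2 mm fraction
f_gravel = 0.3    # gravel mass fraction of the whole sediment

kd_g0 = kd_gravel_diluent(kd_fines, f_gravel)
kd_gx = kd_gravel_composite(kd_fines, kd_gravel, f_gravel)
rf = retardation_factor(kd_gx, bulk_density=1.6, porosity=0.3)
```

Because the diluent correction ignores any sorption by the gravel, it always gives a lower Kd than the composite correction, mirroring the abstract's finding that Kd_gc,g=0 underestimates Kd_tot.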
Although other contaminants will likely have different gravel-correction values, these results have important implications regarding the traditional approach to modeling contaminant transport, which uses Kd_<2mm values. Such calculations may overestimate the tendency of gravel-containing sediments to retard contaminant migration. [source] Laparoscopic pancreatic surgery: a review of present results and future prospects HPB, Issue 4 2010 Omer S. Al-Taan Abstract Pancreatic surgery is still associated with relatively high morbidity and mortality compared with other specialties. This is a result of the complex nature of the organ, the difficult access resulting from its retroperitoneal position, and the number of technically challenging anastomoses required. Nevertheless, the past two decades have witnessed a steady improvement in morbidity and a decrease in mortality, achieved through alterations of technique (particularly relating to the pancreatic anastomoses) together with hormonal manipulation to decrease pancreatic secretions. Recently, minimally invasive pancreatic surgery has been attempted by several centres around the world, which has stimulated considerable interest in this approach. The majority of the cases attempted have been distal pancreatectomies, because of the more straightforward nature of the resection and the lack of a pancreatic ductal anastomosis, but more recently reports of laparoscopic pancreaticoduodenectomy have started to appear. The reports of the series to date have been difficult to interpret, and although the results are claimed to be equivalent to or better than those associated with a traditional approach, a careful examination of the literature and comparison with the best results previously reported does not presently support this. In the present review we examined all the reports of pancreatic procedures performed laparoscopically and compared the results with those previously achieved at open surgery.
[source] Floodplain friction parameterization in two-dimensional river flood models using vegetation heights derived from airborne scanning laser altimetryHYDROLOGICAL PROCESSES, Issue 9 2003David C. Mason Abstract Two-dimensional (2-D) hydraulic models are currently at the forefront of research into river flood inundation prediction. Airborne scanning laser altimetry is an important new data source that can provide such models with spatially distributed floodplain topography together with vegetation heights for parameterization of model friction. The paper investigates how vegetation height data can be used to realize the currently unexploited potential of 2-D flood models to specify a friction factor at each node of the finite element model mesh. The only vegetation attribute required in the estimation of floodplain node friction factors is vegetation height. Different sets of flow resistance equations are used to model channel sediment, short vegetation, and tall and intermediate vegetation. The scheme was tested in a modelling study of a flood event that occurred on the River Severn, UK, in October 1998. A synthetic aperture radar image acquired during the flood provided an observed flood extent against which to validate the predicted extent. The modelled flood extent using variable friction was found to agree with the observed extent almost everywhere within the model domain. The variable-friction model has the considerable advantage that it makes unnecessary the unphysical fitting of floodplain and channel friction factors required in the traditional approach to model calibration. Copyright © 2003 John Wiley & Sons, Ltd. [source] Comparison between cohesive zone modelsINTERNATIONAL JOURNAL FOR NUMERICAL METHODS IN BIOMEDICAL ENGINEERING, Issue 11 2004K. Y. Volokh Cohesive zone models (CZMs) are widely used for numerical simulation of the fracture process. Cohesive zones are surfaces of discontinuities where displacements jump. 
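The floodplain friction abstract above assigns a friction factor to each finite-element node from its laser-altimetry vegetation height, with separate resistance treatments for channel sediment, short vegetation, and tall/intermediate vegetation. A toy sketch of such a per-node, height-based assignment; the class thresholds and Manning's n values are invented for illustration, whereas the paper uses dedicated flow-resistance equations for each class:

```python
# Hypothetical sketch: map per-node vegetation heights (e.g. from airborne
# scanning laser altimetry) to a friction value, using three classes as in
# the abstract. All thresholds and n values below are assumptions.

def manning_n_from_height(veg_height_m):
    if veg_height_m < 0.05:      # bare sediment / channel bed
        return 0.025
    if veg_height_m < 1.0:       # short vegetation (grass, crops)
        return 0.035 + 0.02 * veg_height_m
    return 0.08                  # tall or intermediate vegetation

# One friction value per mesh node, from that node's vegetation height.
node_heights = [0.0, 0.3, 0.9, 2.5, 12.0]   # metres, illustrative
node_n = [manning_n_from_height(h) for h in node_heights]
```

The point of the variable-friction scheme is exactly this: friction becomes a spatially distributed input derived from data, so the unphysical calibration of a single floodplain friction factor is no longer needed.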
A specific constitutive law relating the displacement jumps and proper tractions defines the cohesive zone model. Within the cohesive zone approach, crack nucleation, propagation, and arrest are a natural outcome of the theory. The latter is in contrast to the traditional approach of fracture mechanics, where stress analysis is separated from a description of the actual process of material failure. The common wisdom says that only the cohesive strength (the maximum stress on the traction-separation curve) and the separation work (the area under the traction-separation curve) are important in setting a CZM, while the shape of the traction-separation curve is subsidiary. It is shown in our note that this rule may not be correct and that a specific shape of the cohesive zone model can significantly affect the results of the fracture analysis. For this purpose, four different cohesive zone models (bilinear, parabolic, sinusoidal, and exponential) are compared by using a block-peel test, which allows for simple analytical solutions. The numerical performance of the cohesive zone models is considered. It appears that the convergence properties of nonlinear finite element analyses are similar for all four CZMs in the case of the block-peel test. Copyright © 2004 John Wiley & Sons, Ltd. [source] Parallel Delaunay mesh generation kernel INTERNATIONAL JOURNAL FOR NUMERICAL METHODS IN ENGINEERING, Issue 2 2003 Nikos Chrisochoides Abstract We present the results of an evaluation study on the re-structuring of a latency-bound mesh generation algorithm into a latency-tolerant parallel kernel. We use concurrency at a fine-grain level to tolerate long, variable, and unpredictable latencies of remote data gather operations required for parallel guaranteed quality Delaunay triangulations.
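One of the four traction-separation shapes compared in the cohesive zone abstract above, the bilinear law, can be written down directly in terms of the two quantities that the "common wisdom" treats as sufficient: the cohesive strength and the separation work. A sketch under our own assumptions, not Volokh's formulation; the parameter values are hypothetical:

```python
# Bilinear cohesive (traction-separation) law: linear rise to sigma_max at
# delta_0, then linear softening to zero at delta_f, parameterized so that
# the area under the curve equals the separation work. The ratio
# delta0_frac fixing the curve's *shape* is an arbitrary assumption here,
# which is exactly the quantity the paper shows can matter.

def bilinear_traction(delta, sigma_max, work, delta0_frac=0.1):
    delta_f = 2.0 * work / sigma_max        # triangle area = work
    delta_0 = delta0_frac * delta_f
    if delta <= 0.0 or delta >= delta_f:
        return 0.0                          # unloaded or fully separated
    if delta < delta_0:
        return sigma_max * delta / delta_0  # rising branch
    return sigma_max * (delta_f - delta) / (delta_f - delta_0)  # softening

sigma_max, work = 3.0, 0.6                  # hypothetical units: MPa, N/mm
delta_f = 2.0 * work / sigma_max

# Midpoint-rule check that the area under the curve recovers 'work'.
n = 10000
area = sum(bilinear_traction((i + 0.5) * delta_f / n, sigma_max, work)
           for i in range(n)) * delta_f / n
```

Holding sigma_max and work fixed while varying delta0_frac (or swapping in a parabolic, sinusoidal, or exponential shape of equal strength and area) is precisely the comparison the block-peel study performs.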
Our performance data from a 16-node SP2 and a 32-node cluster of Sparc workstations suggest that more than 90% of the latency from remote data gather operations can be masked effectively, at the cost of an increase in communication overhead of between 2% and 20% of the total run time. Despite the increase in communication overhead, the latency-tolerant mesh generation kernel we present in this paper can generate tetrahedral meshes for parallel field solvers eight to nine times faster than the traditional approach. Copyright © 2003 John Wiley & Sons, Ltd. [source] Effect of an interactive computerized psycho-education system on patients suffering from depression JOURNAL OF CLINICAL NURSING, Issue 5 2008 Mei-Feng Lin PhD, MPsychN Aims: The aim of this study was to examine the effect of an Interactive Computerized Psycho-Education System on patients suffering from depression and to compare the use of an Interactive Computerized Psycho-Education System with a traditional pamphlet education approach. Background: Depression management depends on pharmacological treatment and psychotherapy and on appropriate and timely patient education. Whilst multimedia learning concepts have been applied in areas such as education, this approach has not been widely used in psychiatric outpatient departments. Design and method: A preliminary pre- and post-test quasi-experimental design with patients with depression was employed at a hospital. Participants in the experimental group (n = 19) received an Interactive Computerized Psycho-Education System intervention programme (the Interactive Computerized Psycho-Education System and the educational manual). Participants in the control group (n = 13) were exposed only to the traditional pamphlet education approach (consultation from psychiatrists and information sheets). The primary outcome was depression knowledge scores. Secondary outcomes were scores on the Compliance Behaviour Assessment Scale.
Results: In the experimental group (n = 19), the time spent working on the Interactive Computerized Psycho-Education System was about 30-180 minutes per session, with an average of 67 minutes. Participants in the experimental group had a considerably lower incidence of medication non-compliance than participants in the control group. Knowledge scores of the experimental group ranged from 30 to 100, with an average score of 74.7. Conclusion: The Interactive Computerized Psycho-Education System is acceptable and may be more effective than a traditional education approach in achieving adherence to medications for depression. Relevance to clinical practice: Compared with a traditional approach, the combination of the Interactive Computerized Psycho-Education System and a nursing-consulting clinic may assist patients with depression to achieve and maintain better medication compliance in addition to improving their knowledge of depression. [source] Collective Household Models: Principles and Main Results JOURNAL OF ECONOMIC SURVEYS, Issue 4 2002 Frederic Vermeulen In the traditional approach to consumer behaviour, it is assumed that households behave as if they were single decision-making units. This approach has methodological, empirical, and welfare-economic deficiencies. A valuable alternative to the traditional model is the collective approach to household behaviour. The collective approach explicitly takes account of the fact that multi-person households consist of several members who may have different preferences. Among these household members, an intrahousehold bargaining process is assumed to take place. In addition to providing an introduction to the collective approach, this survey intends to show how different collective household models, each with their own aims and assumptions, are connected. [source]