Analysis Framework

Selected Abstracts


A Case Study of a Variance Analysis Framework for Managing Distribution Costs

ACCOUNTING PERSPECTIVES, Issue 2 2007
Kevin Gaffney
ABSTRACT Managing the distribution function as part of an overall supply-chain management strategy has become increasingly important given rising fuel costs in recent years. This paper presents a comprehensive variance analysis framework developed by supply-chain managers at Catalyst Paper Corporation as a tool for reporting and controlling distribution costs. The model decomposes the overall static-budget variance into four primary variance categories: volume, customer mix, distribution mix, and carrier charges. The framework addresses key limitations in the coverage of variance analysis contained in many management accounting textbooks. Specifically, Catalyst's framework incorporates: (a) mix variance calculations where there is more than one mix factor within a single cost element; (b) the impact of unplanned and unrealized activities; and (c) multiple nested mix variance calculations. Although developed in the context of distribution costs, the framework can be applied to the analysis of other manufacturing and non-manufacturing costs where multiple mix factors exist. [source]
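To make the decomposition concrete, here is a minimal numeric sketch of the first split, static-budget variance into a volume component and a residual, using hypothetical figures; the Catalyst framework decomposes the residual further into customer mix, distribution mix, and carrier-charge variances.

```python
# Hypothetical budget vs. actual figures; only the first level of the split.
budget = {"volume": 1000, "cost_per_tonne": 50.0}   # static budget
actual = {"volume": 1200, "cost_per_tonne": 54.0}   # actual results

budget_cost = budget["volume"] * budget["cost_per_tonne"]
actual_cost = actual["volume"] * actual["cost_per_tonne"]

static_budget_variance = actual_cost - budget_cost
# Volume variance: the cost change explained purely by shipping more tonnes.
volume_variance = (actual["volume"] - budget["volume"]) * budget["cost_per_tonne"]
# Residual: a catch-all here for the mix and carrier-charge effects that the
# Catalyst framework splits out explicitly.
mix_and_rate_variance = static_budget_variance - volume_variance

print(f"static-budget variance: {static_budget_variance:+,.0f}")
print(f"  volume variance:      {volume_variance:+,.0f}")
print(f"  mix + carrier rates:  {mix_and_rate_variance:+,.0f}")
```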


Probabilistic Neural Network for Reliability Assessment of Oil and Gas Pipelines

COMPUTER-AIDED CIVIL AND INFRASTRUCTURE ENGINEERING, Issue 5 2002
Sunil K. Sinha
A fuzzy artificial neural network (ANN)-based approach is proposed for reliability assessment of oil and gas pipelines. The proposed ANN model is trained with field observation data collected using magnetic flux leakage (MFL) tools to characterize the actual condition of aging pipelines vulnerable to metal-loss corrosion. The objective of this paper is to develop a simulation-based probabilistic neural network model to estimate the probability of failure of aging pipelines vulnerable to corrosion. The approach is to transform a simulation-based probabilistic analysis framework for estimating pipeline reliability into an adaptable connectionist representation, using supervised training to initialize the weights so that the adaptable neural network predicts the probability of failure for oil and gas pipelines. This ANN model uses eight pipe parameters as input variables. The output variable is the probability of failure. The proposed method is generic, and it can be applied to several decision problems related to the maintenance of aging engineering systems. [source]
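As a flavor of the final mapping only, the sketch below trains a small feed-forward network to emit a failure probability from eight inputs. Everything here is synthetic: the paper trains on MFL field data within a simulation-based probabilistic framework, and the feature semantics in the comments are assumptions.

```python
# Toy stand-in: eight pipe parameters in, probability of failure out.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X = rng.random((500, 8))   # eight pipe parameters (e.g. metal-loss depth, pressure, ...)
# Synthetic labelling rule: deeper corrosion plus higher pressure -> more failures.
y = (0.6 * X[:, 0] + 0.4 * X[:, 1] + 0.1 * rng.standard_normal(500)) > 0.55

net = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X, y)
new_pipe = rng.random((1, 8))
print("estimated probability of failure:", net.predict_proba(new_pipe)[0, 1])
```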


Corn stover feedstock trials to support predictive modeling

GCB BIOENERGY, Issue 5 2010
DOUGLAS L. KARLEN
Abstract To be sustainable, feedstock harvest must neither degrade soil, water, or air resources nor negatively impact productivity or subsequent crop yields. Simulation modeling will help guide the development of sustainable feedstock production practices, but not without field validation. This paper introduces field research being conducted in six states to support Sun Grant Regional Partnership modeling. Our objectives are to (1) provide a fundamental understanding of limiting factor(s) affecting corn (Zea mays L.) stover harvest, (2) develop tools (e.g., equations, models, etc.) that account for those factors, and (3) create a multivariate analysis framework to combine models for all limiting factors. Sun Grant modelers will use this information to improve regional estimates of feedstock availability. A minimum data set, including soil organic carbon (SOC), total N, pH, bulk density (BD), and soil-test phosphorus (P) and potassium (K) concentrations, is being collected. Stover yield for three treatments (0%, 50%, and 90% removal) and concentrations of N, P, and K in the harvested stover are being quantified to assess the impact of stover harvest on soil resources. Grain yield at a moisture content of 155 g kg⁻¹ averaged 9.71 Mg ha⁻¹, matching the 2008 national average. Stover dry matter harvest rates ranged from 0 to 7 Mg ha⁻¹. Harvesting stover increased N, P, and K removal by an average of 42, 5, and 45 kg ha⁻¹, respectively, compared with harvesting only grain. Replacing those three nutrients would cost $53.68 ha⁻¹ based on 2009 fertilizer prices. These first-year data and those collected in subsequent years are being used to develop a residue management tool that will ultimately link multiple feedstock supplies together in a landscape vision to help develop a comprehensive carbon management plan, quantify corn stover harvest effects on soil quality, and predict regional variability in feedstock supplies. [source]
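The fertilizer-cost figure is simple arithmetic once removals and unit prices are known. The sketch below reproduces it with assumed per-kilogram prices; the paper's actual 2009 prices are not given here.

```python
# Reported removals (42, 5 and 45 kg/ha of N, P and K); the prices below are
# hypothetical stand-ins, not the 2009 prices used in the paper.
removal_kg_ha = {"N": 42, "P": 5, "K": 45}
price_per_kg = {"N": 0.66, "P": 1.30, "K": 0.43}   # assumed illustrative prices ($/kg)

cost = sum(removal_kg_ha[n] * price_per_kg[n] for n in removal_kg_ha)
print(f"replacement cost: ${cost:.2f}/ha")   # paper reports $53.68/ha at 2009 prices
```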


Estimating production costs in the economic evaluation of health-care programs

HEALTH ECONOMICS, Issue 1 2009
Carmen Herrero
Abstract We propose a method for calculating the production costs of an intervention in a manner that accounts for differences in productive 'effort'. This method could be used within a cost-effectiveness analysis framework in the evaluation of new medical technologies, pharmaceuticals, treatment programs, or public health interventions. We apply it to show evidence in favor of implementing a newborn screening program to detect congenital hearing impairment. Copyright © 2008 John Wiley & Sons, Ltd. [source]
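The abstract does not reproduce the effort-adjusted formulas, so the sketch below shows only the standard cost-effectiveness comparison such a method would plug into: an incremental cost-effectiveness ratio (ICER) with hypothetical costs and effects.

```python
# Generic cost-effectiveness sketch (not the authors' effort-adjusted method):
# screening vs. no screening, with hypothetical numbers for illustration.
cost_screening, effect_screening = 120_000.0, 95.0   # $, QALYs (hypothetical)
cost_usual,     effect_usual     =  40_000.0, 80.0

icer = (cost_screening - cost_usual) / (effect_screening - effect_usual)
print(f"ICER: ${icer:,.0f} per QALY gained")   # compare against a willingness-to-pay threshold
```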


Detecting the effects of spatial variability of rainfall on hydrological modelling within an uncertainty analysis framework

HYDROLOGICAL PROCESSES, Issue 14 2009
P. M. Younger
Abstract Spatial patterns of rainfall are known to cause differences in observed flow. In this paper, the effects of perturbations in rainfall patterns on changes in parameter sets as well as model output are explored using the hydrological model Dynamic TOPMODEL for the Brue catchment (135 km²) in southwest England. The overall rainfall amount remains the same at each time step, so the perturbations are effectively treated as errors in the spatial pattern. The errors were analysed with particular emphasis on when they could be detected under an uncertainty framework. Higher rainfall perturbations (multipliers of ×4 and greater) in the low-lying and high areas of the catchment resulted in changes to event peaks and accompanying compensation in the baseflow. More significantly, changes in the effective model parameter values required by the best models to take account of the more extreme patterns could be detected by noting when distributions of parameters change under uncertainty. Copyright © 2009 John Wiley & Sons, Ltd. [source]
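The perturbation scheme can be pictured as rescaling: multiply rainfall over part of the catchment, then renormalize so each time step's areal total is preserved. A two-zone toy version of this, with the ×4 multiplier mentioned above (the two-zone split is a simplification of the paper's spatial fields):

```python
import numpy as np

rng = np.random.default_rng(1)
rain = rng.gamma(shape=0.5, scale=2.0, size=(100, 2))  # time steps x 2 zones
area_frac = np.array([0.3, 0.7])                       # zone area fractions

perturbed = rain.copy()
perturbed[:, 0] *= 4.0                                 # x4 multiplier on zone 0
total = rain @ area_frac                               # original areal average
scale = total / (perturbed @ area_frac)
perturbed *= scale[:, None]                            # restore each step's total

assert np.allclose(perturbed @ area_frac, total)       # amount preserved, pattern changed
```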


ParCYCLIC: finite element modelling of earthquake liquefaction response on parallel computers

INTERNATIONAL JOURNAL FOR NUMERICAL AND ANALYTICAL METHODS IN GEOMECHANICS, Issue 12 2004
Jun Peng
Abstract This paper presents the computational procedures and solution strategy employed in ParCYCLIC, a parallel non-linear finite element program developed from the existing serial code CYCLIC for the analysis of cyclic seismically induced liquefaction problems. In ParCYCLIC, finite elements are employed within an incremental plasticity, coupled solid–fluid formulation. A constitutive model developed for simulating liquefaction-induced deformations is a main component of this analysis framework. The elements of the computational strategy, designed for distributed-memory message-passing parallel computer systems, include: (a) an automatic domain decomposer to partition the finite element mesh; (b) nodal ordering strategies to minimize storage space for the matrix coefficients; (c) an efficient scheme for the allocation of sparse matrix coefficients among the processors; and (d) a parallel sparse direct solver. Application of ParCYCLIC to simulate 3-D geotechnical experimental models is demonstrated. The computational results show excellent parallel performance and scalability of ParCYCLIC on parallel computers with a large number of processors. Copyright © 2004 John Wiley & Sons, Ltd. [source]
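The parallel machinery itself (partitioning, coefficient allocation, message passing) is beyond a short sketch, but the serial kernel it distributes is a sparse direct solve of the assembled system, as in this stand-in with a 1-D Poisson matrix in place of a real stiffness matrix:

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

n = 1000                                   # 1-D Poisson stand-in for a FE stiffness matrix
main = 2.0 * np.ones(n)
off = -1.0 * np.ones(n - 1)
K = sp.diags([off, main, off], [-1, 0, 1], format="csc")
f = np.ones(n)

u = spla.spsolve(K, f)                     # sparse direct solve (serial here;
print("max displacement:", u.max())        # ParCYCLIC factorizes in parallel)
```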


New IQC for quasi-concave nonlinearities

INTERNATIONAL JOURNAL OF ROBUST AND NONLINEAR CONTROL, Issue 7 2001
Alexandre Megretski
Abstract A new set of integral quadratic constraints (IQC) is derived for a class of 'rate limiters', modelled as series connections of saturation-like memoryless nonlinearities followed by integrators. The result, when used within the standard IQC framework (in particular, with finite gain/passivity-based arguments, Lyapunov theory, structured singular values, etc.), is expected to be widely useful in nonlinear system analysis. For example, it enables 'discrimination' between 'saturation-like' and 'deadzone-like' nonlinearities and can be used to prove stability of systems with saturation in cases when replacing the saturation block by another memoryless nonlinearity with equivalent slope restrictions makes the whole system unstable. In particular, it is shown that the L2 gain of a unity feedback system with a rate limiter in the forward loop cannot exceed √2. In addition, a new, more flexible version of the general IQC analysis framework is presented, which relaxes the homotopy and boundedness conditions, and is more aligned with the language of the emerging IQC software. Copyright © 2001 John Wiley & Sons, Ltd. [source]
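The √2 bound can be probed numerically. The sketch below simulates a unity-feedback loop whose forward path is a rate limiter (saturation followed by an integrator) and estimates the L2 gain for one arbitrary finite-energy input; any such estimate only lower-bounds the true gain, which the IQC argument proves is at most √2.

```python
import numpy as np

def simulate(r, dt=1e-3, limit=1.0):
    """Unity feedback around a rate limiter: y' = sat(r - y)."""
    y = np.zeros_like(r)
    for k in range(len(r) - 1):
        rate = np.clip(r[k] - y[k], -limit, limit)   # saturation on the error
        y[k + 1] = y[k] + dt * rate                  # integrator
    return y

t = np.arange(0.0, 50.0, 1e-3)
r = np.sin(2.0 * t) * np.exp(-0.05 * t)              # an arbitrary finite-energy input
y = simulate(r)
gain = np.linalg.norm(y) / np.linalg.norm(r)
print(f"estimated L2 gain: {gain:.3f} (bound: {np.sqrt(2):.3f})")
```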


John Heron's six-category intervention analysis: towards understanding interpersonal relations and progressing the delivery of clinical supervision for mental health nursing in the United Kingdom

JOURNAL OF ADVANCED NURSING, Issue 2 2001
Graham Sloan BSc DipN RMN RGN DipCogPsychotherapy
Aims. This paper provides a critique of how Heron's six-category intervention analysis framework has been adopted by nursing in the United Kingdom (UK) as a theoretical framework in nursing research and as a model for clinical supervision. From this, its merits as an analytic framework and model for clinical supervision in nursing are discussed. Background. Heron's six-category intervention analysis has been acknowledged as a means by which nursing could develop its therapeutic integrity. It has also been used as a theoretical framework in nursing research focusing on nurses' perceptions of their interpersonal style. More recently, descriptions of this framework have been proposed as a structure for clinical supervision. However, its use as a theoretical framework to underpin research investigating the interpersonal skills of nurses and as a model of clinical supervision must first be scrutinized. Findings. Returning to Heron's original description and comparing this with its current adoption in the UK, misconceptions of this framework can be identified. Its value as an analytic tool investigating interpersonal relations in nursing has still to be evaluated. Furthermore, nursing's emphasis on certain intervention categories has undermined the potential potency of this framework and its contribution as a model for clinical supervision in nursing. Conclusion. We argue that Heron's six-category intervention analysis as a framework to investigate the interpersonal competence of nurses, particularly mental health nurses, requires investigation. This, in turn, would provide an opportunity to challenge the framework's theoretical standpoint. In addition to its value as an analytic tool, all six categories of Heron's framework have equal relevance to its contribution in nursing as a supervision model. [source]


DECISION SUPPORT FOR ALLOCATION OF WATERSHED POLLUTION LOAD USING GREY FUZZY MULTIOBJECTIVE PROGRAMMING

JOURNAL OF THE AMERICAN WATER RESOURCES ASSOCIATION, Issue 3 2006
Ho-Wen Chen
ABSTRACT: This paper uses grey fuzzy multiobjective programming to aid decision making for the allocation of waste load in a river system under versatile uncertainties and risks. It differs from previous studies by considering a multicriteria objective function with combined grey and fuzzy messages under a cost–benefit analysis framework. Such analysis technically integrates the prior information of water quality models, water quality standards, wastewater treatment costs, and potential benefits gained via in-stream water quality improvement. While fuzzy sets are characterized based on semantic and cognitive vagueness in decision making, grey numbers can delineate measurement errors in data collection. By employing three distinct set-theoretic fuzzy operators, the synergy of grey and fuzzy implications may smoothly characterize the prescribed management complexity. With the aid of a genetic algorithm in the solution procedure, the modeling outputs contribute to the development of an effective waste load allocation and reduction scheme for tributaries in a subwatershed of the lower Tseng-Wen River Basin, South Taiwan. Research findings indicate that the inclusion of three fuzzy set-theoretic operators in decision analysis may delineate different tradeoffs in decision making due to varying changes, transformations, and movements of waste load in association with land use pattern within the watershed. [source]
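Of the ingredients above, the set-theoretic fuzzy operators are the easiest to illustrate. The toy below scores two goals with linear membership functions and aggregates them three ways; the goals, numbers, and choice of operators are illustrative, not the paper's, and grey inputs would replace the crisp numbers with intervals.

```python
def membership(x, worst, best):
    """Linear satisfaction degree in [0, 1] between a worst and a best value."""
    m = (x - worst) / (best - worst)
    return max(0.0, min(1.0, m))

mu_cost = membership(3.2, worst=5.0, best=2.0)   # treatment cost (M$/yr), lower is better
mu_do   = membership(5.1, worst=4.0, best=6.5)   # dissolved oxygen (mg/L), higher is better

print("min operator    :", min(mu_cost, mu_do))          # most conservative aggregation
print("product operator:", mu_cost * mu_do)
print("arithmetic mean :", 0.5 * (mu_cost + mu_do))      # most compensatory
```

In a full model, a genetic algorithm would search over candidate load allocations to maximize the aggregated score.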


Exploring social mobility with latent trajectory groups

JOURNAL OF THE ROYAL STATISTICAL SOCIETY: SERIES A (STATISTICS IN SOCIETY), Issue 1 2008
Patrick Sturgis
Summary. We present a new methodological approach to the study of social mobility. We use a latent class growth analysis framework to identify five qualitatively distinct social class trajectory groups between 1980 and 2000 for male respondents to the 1970 British Cohort Study. We model the antecedents of trajectory group membership via multinomial logistic regression. Non-response, which is a considerable problem in long-term panels and cohort studies, is handled via direct maximum likelihood estimation, which is consistent and efficient when data are missing at random. Our results suggest a combination of meritocratic and ascriptive influences on the probability of membership in the different trajectory groups. [source]
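A rough two-stage analogue on synthetic data conveys the structure; note the paper instead fits the latent classes and growth curves jointly, with direct maximum likelihood handling non-response.

```python
# Stage 1: cluster trajectories into latent groups.
# Stage 2: relate group membership to covariates via multinomial logistic regression.
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n, waves = 300, 5
group = rng.integers(0, 3, n)                      # true latent group (synthetic)
slopes = np.array([0.0, 0.5, -0.5])[group]
traj = slopes[:, None] * np.arange(waves) + rng.standard_normal((n, waves))

gm = GaussianMixture(n_components=3, random_state=0).fit(traj)
labels = gm.predict(traj)

covariates = rng.standard_normal((n, 2))           # stand-ins for antecedent variables
antecedents = LogisticRegression(max_iter=1000).fit(covariates, labels)
print(antecedents.coef_)                           # log-odds of trajectory-group membership
```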


Experiences of intensive care nurses assessing sedation/agitation in critically ill patients

NURSING IN CRITICAL CARE, Issue 4 2008
Stephanie Weir
Abstract Background: Patients admitted to the intensive care unit (ICU) will more often than not require sedative and analgesic drugs to enable them to tolerate the invasive procedures and therapies arising from their underlying condition and/or necessary medical interventions. Aim: This article reports a study exploring the perceptions and experiences of intensive care nurses using a sedation/agitation scoring (SAS) tool to assess and manage sedation and agitation amongst critically ill patients. The principal aims and objectives of this study were to explore nurses' everyday experiences of using a sedation scoring tool, and to explore and understand nurses' attitudes and beliefs about the various components of assessing and managing sedation among critically ill patients. Method: Using a descriptive qualitative approach, semistructured interviews were carried out with a purposive sample of eight ICU nurses within a district general hospital ICU. The interviews focused on nurses' own experiences and perceptions of using a sedation scoring tool in clinical practice. Burnard's 14-stage thematic content analysis framework was employed to assist in the data analysis process. Results: Three key themes emerged that may have implications not only for clinical practice but for further research into the use of the SAS tool: benefits to patient care as a direct result of using a sedation scoring tool; the concerns of nursing staff; and the implications of using such a tool in clinical practice. Conclusion: This paper reinforces the potential benefits to patients as a direct result of implementing the SAS scoring tool and clinical guidelines. Furthermore, it highlights the reluctance of a number of staff to adhere to such guidelines and discusses the concerns regarding less experienced nurses administering sedative agents. Attention was also drawn to the educational requirements of nursing and medical staff when using the SAS scoring tool. [source]


New Worlds in Political Science

POLITICAL STUDIES, Issue 2 2010
Patrick Dunleavy
'Political science' is a 'vanguard' field concerned with advancing generic knowledge of political processes, while a wider 'political scholarship' utilising eclectic approaches has more modest or varied ambitions. Political science nonetheless necessarily depends upon and is epistemologically comparable with political scholarship. I deploy Boyer's distinctions between discovery, integration, application and renewing the profession to show that these connections are close woven. Two sets of key challenges need to be tackled if contemporary political science is to develop positively. The first is to ditch the current unworkable and restrictive comparative politics approach, in favour of a genuinely global analysis framework. Instead of obsessively looking at data on nation states, we need to seek data completeness on the whole (multi-level) world we have. A second cluster of challenges involves looking far more deeply into political phenomena; reaping the benefits of 'digital-era' developments; moving from sample methods to online census methods in organisational analysis; analysing massive transactional databases and real-time political processes (again, instead of depending on surveys); and devising new forms of 'instrumentation', informed by post-rational choice theoretical perspectives. [source]


A comparative analysis of a modified picture frame test for characterization of woven fabrics

POLYMER COMPOSITES, Issue 4 2010
A.S. Milani
An experimental/finite-element analysis framework is utilized to estimate the deformation state in a modified version of the picture frame test. During the analysis, the effect of fiber misalignment and the deformation heterogeneity in the tested fabric, a 2 × 2 PP/E-Glass twill, is accounted for and a force prediction model is presented. Using an equivalent stress–strain normalization scheme, the comparison of the modified test with the conventional (original) picture frame and bias-extension tests is also made, and results reveal similarities and differences that should receive attention in the identification of constitutive models of woven fabrics using these basic tests. Ideally, the trellising behavior should not change from one test to another, but results show that in the presence of fiber misalignment, the modified picture frame test yields a behavior closer to that of the bias-extension test, while the general form of the test's repeatability, measured by a signal-to-noise metric, remains similar to the original picture frame test. POLYM. COMPOS., 2010. © 2009 Society of Plastics Engineers [source]
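The abstract does not define its signal-to-noise metric, so the following is an assumption: a Taguchi-style ratio over repeated force readings at a fixed shear angle, where a higher value means a more repeatable test.

```python
import numpy as np

def sn_ratio_db(replicates):
    """Taguchi-style 'nominal is best' S/N: high mean, low spread across repeats."""
    r = np.asarray(replicates, dtype=float)
    return 10.0 * np.log10(r.mean() ** 2 / r.var(ddof=1))

frame_runs = [41.2, 39.8, 40.5, 42.1]    # hypothetical repeated force readings (N)
print(f"S/N: {sn_ratio_db(frame_runs):.1f} dB")
```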


Random maps, coalescing saddles, singularity analysis, and Airy phenomena

RANDOM STRUCTURES AND ALGORITHMS, Issue 3-4 2001
Cyril Banderier
Abstract A considerable number of asymptotic distributions arising in random combinatorics and analysis of algorithms are of the exponential-quadratic type, that is, Gaussian. We exhibit a class of "universal" phenomena that are of the exponential-cubic type, corresponding to distributions that involve the Airy function. In this article, such Airy phenomena are related to the coalescence of saddle points and the confluence of singularities of generating functions. For about a dozen types of random planar maps, a common Airy distribution (equivalently, a stable law of exponent 3/2) describes the sizes of cores and of largest (multi)connected components. Consequences include the analysis and fine optimization of random generation algorithms for multiply connected planar graphs. Based on an extension of the singularity analysis framework suggested by the Airy case, the article also presents a general classification of compositional schemas in analytic combinatorics. © 2001 John Wiley & Sons, Inc. Random Struct. Alg., 19: 194–246, 2001 [source]
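The singularity analysis framework itself is easiest to see in its classical square-root case, sketched below for the Catalan generating function; the Airy phenomena of the paper arise when saddle points coalesce and laws of this familiar type are replaced by Airy-type ones.

```python
# Square-root singularity demo: C(z) = (1 - sqrt(1 - 4z)) / 2 has its dominant
# singularity at z = 1/4, and singularity analysis predicts the Catalan numbers
# C_n = binom(2n, n)/(n+1) behave like 4^n / (sqrt(pi) * n^{3/2}).
import math

def catalan(n):
    return math.comb(2 * n, n) // (n + 1)

for n in (10, 100, 1000):
    log_exact = math.log(catalan(n))
    log_approx = n * math.log(4.0) - math.log(math.sqrt(math.pi) * n ** 1.5)
    print(n, math.exp(log_exact - log_approx))   # ratio tends to 1 as n grows
```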


Climate drivers of red wine quality in four contrasting Australian wine regions

AUSTRALIAN JOURNAL OF GRAPE AND WINE RESEARCH, Issue 2 2008
C.J. SOAR
Abstract Background and Aims: The understanding of the links between weather and wine quality is fragmented and often qualitative. This study quantified and integrated key weather variables during ripening, and their influence on red wine quality, in the Hunter Valley, Margaret River, Coonawarra and Barossa Valley. Methods and Results: Long-term records of published vintage scores were used as an indicator of wine quality. A χ² analysis was used to compare good (top 25%) versus poor (bottom 25%) vintages in relation to the frequency of defined weather conditions. Using maximum temperature as an example, better quality was associated with temperatures above 34°C throughout most of ripening in the Hunter, below 28°C in early January in the Margaret River, 28–33.9°C towards harvest in Coonawarra, and below 21.9°C in late January and early February and 28–30.9°C towards harvest in the Barossa. Conclusion: Our quantitative assessment captures the timing and magnitude of weather influences on wine quality on a regional basis. Significance of the Study: The improved specificity of the links between weather and wine quality will help in the development of a risk analysis framework for wine quality across Australia. [source]
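The shape of that comparison, with invented counts: cross-tabulate good and poor vintages against whether a candidate weather condition occurred, then test for association.

```python
from scipy.stats import chi2_contingency

#                 condition met   condition not met
table = [[12, 3],    # good vintages (top 25%)    (hypothetical counts)
         [4, 11]]    # poor vintages (bottom 25%)

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.4f}")
```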


Large-scale gasification-based coproduction of fuels and electricity from switchgrass

BIOFUELS, BIOPRODUCTS AND BIOREFINING, Issue 2 2009
Eric D. Larson
Abstract Large-scale gasification-based systems for producing Fischer-Tropsch (F-T) fuels (diesel and gasoline blendstocks), dimethyl ether (DME), or hydrogen from switchgrass, with electricity as a coproduct in each case, are assessed using a self-consistent design, simulation, and cost analysis framework. We provide an overview of alternative process designs for coproducing these fuels and power assuming commercially mature technology performance and discuss the commercial status of key component technologies. Overall efficiencies (lower-heating-value basis) of producing fuels plus electricity in these designs are 57% for F-T fuels, 55–61% for DME, and 58–64% for hydrogen. Detailed capital cost estimates for each design are developed, on the basis of which prospective commercial economics of future large-scale facilities that coproduce fuels and power are evaluated. © 2009 Society of Chemical Industry and John Wiley & Sons, Ltd [source]
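The efficiency figures are straightforward energy ratios. A worked instance with hypothetical flows chosen to land on the 57% low end quoted above:

```python
# Overall (LHV) efficiency: energy out in fuels plus net electricity, divided
# by switchgrass energy in. All flows are hypothetical.
biomass_in_MW  = 1000.0   # switchgrass feed, LHV basis
ft_fuels_MW    = 480.0    # Fischer-Tropsch liquids out
electricity_MW = 90.0     # net power exported

efficiency = (ft_fuels_MW + electricity_MW) / biomass_in_MW
print(f"overall LHV efficiency: {efficiency:.0%}")   # 57%
```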


A Sensitivity Analysis for Shared-Parameter Models for Incomplete Longitudinal Outcomes

BIOMETRICAL JOURNAL, Issue 1 2010
An Creemers
Abstract All models for incomplete data either explicitly make assumptions about aspects of the distribution of the unobserved outcomes, given the observed ones, or at least imply such assumptions. One consequence is that there routinely exists a whole class of models, coinciding in their description of the observed portion of the data but differing with respect to their "predictions" of what is unobserved. Within such a class, there always is a single model corresponding to so-called random missingness, in the sense that the mechanism governing missingness depends on covariates and observed outcomes, but, given these, not further on unobserved outcomes. We employ these results in the context of so-called shared-parameter models, where outcome and missingness models are connected by means of common latent variables or random effects, to devise a sensitivity analysis framework. Precisely, the impact of varying unverifiable assumptions about unobserved measurements on parameters of interest is studied. Apart from analytic considerations, the proposed methodology is applied to assess treatment effect in data from a clinical trial in toenail dermatophyte onychomycosis. While our focus is on longitudinal outcomes with incomplete outcome data, the ideas developed in this paper are of use whenever a shared-parameter model could be considered. [source]
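A much simpler stand-in conveys the spirit; note this uses a delta-adjustment on imputed dropouts, not the paper's shared-parameter construction. The idea in both cases is the same: vary an unverifiable assumption about the unobserved outcomes and watch the parameter of interest move.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 200
treat = rng.integers(0, 2, n)
y = 1.0 * treat + rng.standard_normal(n)       # true treatment effect = 1
observed = rng.random(n) > 0.3                 # ~30% of outcomes go missing

for delta in (-1.0, -0.5, 0.0, 0.5, 1.0):      # assumption about the unobserved
    y_imp = y.copy()
    y_imp[~observed] = y[observed].mean() + delta
    effect = y_imp[treat == 1].mean() - y_imp[treat == 0].mean()
    print(f"delta = {delta:+.1f} -> estimated effect {effect:+.2f}")
```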


Analysis of Single-Molecule Fluorescence Spectroscopic Data with a Markov-Modulated Poisson Process

CHEMPHYSCHEM, Issue 14 2009
Mark Jäger Dr.
Abstract We present a photon-by-photon analysis framework for the evaluation of data from single-molecule fluorescence spectroscopy (SMFS) experiments using a Markov-modulated Poisson process (MMPP). A MMPP combines a discrete (and hidden) Markov process with an additional Poisson process reflecting the observation of individual photons. The algorithmic framework is used to automatically analyze the dynamics of the complex formation and dissociation of Cu²⁺ ions with the bidentate ligand 2,2′-bipyridine-4,4′-dicarboxylic acid (dcbpy) in aqueous media. The process of association and dissociation of Cu²⁺ ions is monitored with SMFS. The dcbpy-DNA conjugate can exist in two or more distinct states which influence the photon emission rates. The advantage of a photon-by-photon analysis is that no information is lost in preprocessing steps. Different model complexities are investigated in order to best describe the recorded data and to determine transition rates on a photon-by-photon basis. The main strength of the method is that it allows the detection of intermittent phenomena that are masked by binning and are difficult to find using correlation techniques when they are short-lived. [source]
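A two-state MMPP is easy to simulate forward, which also shows exactly what the photon-by-photon analysis must invert: a hidden state toggles the emission rate, and photons arrive as a Poisson process at the current rate. All rates below are arbitrary choices, not the paper's fits.

```python
import numpy as np

rng = np.random.default_rng(3)
rates = np.array([2.0, 20.0])        # photon emission rates in states 0 and 1 (photons/s)
switch = np.array([0.5, 1.0])        # state exit rates (1/s)

t, state, t_end, photons = 0.0, 0, 60.0, []
while t < t_end:
    dwell = rng.exponential(1.0 / switch[state])           # time until next state switch
    n = rng.poisson(rates[state] * dwell)                  # photons emitted meanwhile
    photons.extend(np.sort(rng.uniform(t, t + dwell, n)))  # their arrival times
    t += dwell
    state = 1 - state

print(len(photons), "photons; first five:", np.round(photons[:5], 3))
```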


Bans against corporal punishment: a systematic review of the laws, changes in attitudes and behaviours

CHILD ABUSE REVIEW, Issue 4 2010
Adam J. Zolotor
Abstract Twenty-four countries have passed legislative bans on corporal punishment since the passage of the Convention on the Rights of the Child. This systematic review briefly reviews the arguments for corporal punishment bans and the contents and context of the current legal bans. All such bans have occurred in representative governments. Following this background, the paper examines the impacts of the laws on attitudes regarding corporal punishment and on parental discipline behaviours. It is clear from the findings of this systematic review that legal bans on corporal punishment are closely associated with decreases in support for and use of corporal punishment as a child discipline technique. However, it is less clear whether such legislative bans generally precede a decline in popular support for corporal punishment or result from such a decline. The known impact of such bans on child physical abuse is then reviewed. The paper concludes with a policy analysis framework for considering new legislation to ban corporal punishment. Copyright © 2010 John Wiley & Sons, Ltd. [source]