Different Subjects


Selected Abstracts


Discovering hidden knowledge in data classification via multivariate analysis

EXPERT SYSTEMS, Issue 2 2010
Yisong Chen
Abstract: A new classification algorithm based on multivariate analysis is proposed to discover and simulate the grading policy in school transcript data sets. The framework comprises three major steps. First, factor analysis is adopted to separate the scores of several different subjects into grading-related and grading-unrelated ones. Second, multidimensional scaling is employed for dimensionality reduction to facilitate subsequent data visualization and interpretation. Finally, a support vector machine is trained to classify the filtered data into different grades. This work provides an attractive framework for intelligent data analysis and decision making, exhibiting the advantages of high classification accuracy and intuitive data interpretation. [source]
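The three-step pipeline the abstract describes (factor analysis, then multidimensional scaling, then an SVM) can be sketched with scikit-learn. This is a minimal illustration, not the authors' implementation: the transcript data, the grading rule, and all dimensions are invented, and the latent "ability" variable is a hypothetical stand-in for the grading-related factor.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.manifold import MDS
from sklearn.svm import SVC

rng = np.random.default_rng(0)
ability = rng.normal(size=(200, 1))                       # hypothetical latent proficiency
scores = 70 + 10 * ability + rng.normal(0, 5, (200, 8))   # 200 students, 8 subjects
grades = (ability.ravel() > 0).astype(int)                # toy two-grade policy

# Step 1: factor analysis to extract grading-related latent factors
factors = FactorAnalysis(n_components=3, random_state=0).fit_transform(scores)

# Step 2: multidimensional scaling down to 2-D for visualisation
embedding = MDS(n_components=2, random_state=0).fit_transform(factors)

# Step 3: train a support vector machine on the reduced representation
clf = SVC(kernel="rbf").fit(embedding, grades)
print(clf.score(embedding, grades))
```

On data with this much latent structure the reduced 2-D embedding retains the grade signal, so the SVM separates the two toy grades well.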


Molecular characterisation of GSD III subjects and identification of six novel mutations in AGL

HUMAN MUTATION, Issue 6 2002
S. Lucchiari
Abstract Deficiency of the amylo-1,6-glucosidase, 4-α-glucanotransferase enzyme (AGL, or glycogen debranching enzyme) causes Glycogen Storage Disease type III, a rare autosomal recessive disorder of glycogen metabolism. The disease has been shown to exhibit clinical and biochemical heterogeneity, reflecting the genotype-phenotype heterogeneity among different subjects. The aim of this study was the molecular characterisation of eight unrelated patients from an ethnically heterogeneous population (six Italians, one from India and one from Tunisia). We describe six novel mutations responsible for the disease (C234R, R675W, 2547delG, T38A, W1327X, IVS6+3 A>G) and the presence in two Italian subjects of a splice variant (IVS21+1 G>A) already described elsewhere. This variant is confirmed to be the most frequent mutation among the Italian patients who came to our observation, accounting for 28% of the 21 patients. One subject was found to be a compound heterozygote. Our data confirm the substantial genetic heterogeneity of this disease. Consequently, for Italian GSD III patients the strategy of mutation finding based on screening for recurrent common mutations is limited to checking for the presence of IVS21+1 G>A. © 2002 Wiley-Liss, Inc. [source]


Analyzing the 24-hour blood pressure and heart-rate variability with self-organizing feature maps

INTERNATIONAL JOURNAL OF INTELLIGENT SYSTEMS, Issue 1 2002
G. Tambouratzis
In this article, the self-organizing map (SOM) is employed to analyze data describing the 24-hour blood pressure and heart-rate variability of human subjects. The number of observations varies widely over different subjects, and therefore a direct statistical analysis of the data is not feasible without extensive pre-processing and interpolation for normalization purposes. The SOM network operates directly on the data set, without any pre-processing, determines several important data set characteristics, and allows their visualization on a two-dimensional plot. The SOM results are very similar to those obtained using classic statistical methods, indicating the effectiveness of the SOM method in accurately extracting the main characteristics from the data set and displaying them in a readily understandable manner. The article also studies the relation between each subject's representation on the SOM and that subject's blood-pressure and pulse-rate measurements. Finally, some indications are included regarding how the SOM can be used by the medical community to assist in diagnosis tasks. © 2002 John Wiley & Sons, Inc. [source]
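A self-organising map of the kind used here can be written in a few lines of NumPy. The sketch below is generic, not the authors' system: the 24-hour blood-pressure data are not available, so random vectors stand in, and the grid size, learning rate, and neighbourhood width are arbitrary choices.

```python
import numpy as np

def train_som(data, grid=(6, 6), epochs=20, lr=0.5, sigma=2.0, seed=0):
    """Train a rectangular SOM; returns the grid of weight vectors."""
    rng = np.random.default_rng(seed)
    w = rng.normal(size=grid + (data.shape[1],))            # node weight vectors
    coords = np.stack(np.meshgrid(*map(np.arange, grid), indexing="ij"), axis=-1)
    for t in range(epochs):
        decay = np.exp(-t / epochs)                         # shrink lr and sigma over time
        for x in data:
            # best-matching unit: the node whose weights are closest to x
            bmu = np.unravel_index(np.argmin(((w - x) ** 2).sum(-1)), grid)
            # Gaussian neighbourhood pulls nodes near the BMU toward x
            d2 = ((coords - np.array(bmu)) ** 2).sum(-1)
            h = np.exp(-d2 / (2 * (sigma * decay) ** 2))
            w += (lr * decay) * h[..., None] * (x - w)
    return w

# placeholder data, e.g. (systolic, diastolic, pulse) per observation
data = np.random.default_rng(1).normal(size=(100, 3))
weights = train_som(data)
print(weights.shape)
```

Each trained node then serves as a position on the two-dimensional plot onto which subjects can be mapped via their best-matching unit.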


Comparing the Difficulty of Examination Subjects with Item Response Theory

JOURNAL OF EDUCATIONAL MEASUREMENT, Issue 2 2008
Oksana B. Korobko
Methods are presented for comparing grades obtained in a situation where students can choose between different subjects. The comparison between the grades is expected to be complicated by the interaction between the students' pattern and level of proficiency on the one hand, and the choice of subjects on the other. Three methods based on item response theory (IRT) for the estimation of proficiency measures that are comparable over students and subjects are discussed: a method based on a model with a unidimensional representation of proficiency, a method based on a model with a multidimensional representation of proficiency, and a method based on a multidimensional representation of proficiency where the stochastic nature of the choice of examination subjects is explicitly modeled. The methods are compared using data from the Central Examinations in Secondary Education in the Netherlands. The results show that the unidimensional IRT model produces unrealistic results, which do not appear when using the two multidimensional IRT models. Further, it is shown that both multidimensional models produce acceptable model fit. However, the model that explicitly takes the choice process into account produces the best model fit. [source]
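As a toy illustration of the simplest (unidimensional) IRT case, the sketch below fits a Rasch model by joint gradient ascent on a simulated pass/fail matrix. This is not the paper's estimation procedure (which also covers multidimensional and choice-modelling variants); the student counts, subject counts, and learning rate are all invented.

```python
import numpy as np

rng = np.random.default_rng(0)
theta_true = rng.normal(size=50)        # student proficiencies (hypothetical)
beta_true = rng.normal(size=6)          # subject (item) difficulties (hypothetical)
p = 1 / (1 + np.exp(-(theta_true[:, None] - beta_true[None, :])))
X = (rng.random(p.shape) < p).astype(float)   # simulated 0/1 outcomes

theta = np.zeros(50)
beta = np.zeros(6)
for _ in range(500):                    # joint maximum likelihood by gradient ascent
    p_hat = 1 / (1 + np.exp(-(theta[:, None] - beta[None, :])))
    resid = X - p_hat
    theta += 0.05 * resid.sum(axis=1)   # d log-likelihood / d theta_i
    beta -= 0.05 * resid.sum(axis=0)    # d log-likelihood / d beta_j (sign flipped)
    beta -= beta.mean()                 # fix the scale indeterminacy
print(np.corrcoef(beta, beta_true)[0, 1])
```

On the Rasch scale, subject difficulties beta become directly comparable across subjects, which is the basic idea the three IRT methods build on.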


Self-Help Groups in the Welfare State: Treatment Program or Voluntary Action?

NONPROFIT MANAGEMENT & LEADERSHIP, Issue 2 2002
Magnus Karlsson
This article identifies two different perspectives used when studying self-help groups: the professional treatment perspective and the voluntary action perspective. An outline of the perspectives leads to a discussion of their consequences for self-help group research. The authors categorize about five hundred scientific publications from all over the world on the basis of the perspectives they present on self-help groups; the results indicate that different perspectives seem to be preferred in different countries and when discussing different subjects. Finally, the authors suggest questions and concepts that the perspectives generate, and they emphasize the importance of being aware of which perspective is used in the study of self-help groups. [source]


Spectral optical coherence tomography: a new imaging technique in contact lens practice

OPHTHALMIC AND PHYSIOLOGICAL OPTICS, Issue 2 2006
Bartłomiej J. Kałużny
Abstract Purpose: Spectral optical coherence tomography (SOCT) is a new non-invasive, non-contact, high-resolution technique, which provides cross-sectional images of objects that weakly absorb and scatter light. The aim of this article is to demonstrate the application of SOCT to imaging of eyes fitted with contact lenses. Methods: Nine eyes of six different subjects fitted with various contact lenses were examined with a slit-lamp and a prototype SOCT instrument. Results: Our SOCT system provides high-resolution (4–6 μm longitudinal, 10 μm transversal) tomograms composed of 3000–5000 A-scans with an acquisition time of 100–250 ms. The quality of the images is adequate for detailed evaluation of contact lens fit. Design, shape and lens edge position were assessed, and complications of contact lens wear could be visualized. Thickness of the lens, corneal epithelium and stroma, as well as the space between the lens and the eye surface, have been measured. Conclusions: SOCT allows high-resolution, cross-sectional visualization of the eye fitted with a contact lens. The ability to carry out a detailed evaluation of the fitting relationship between the lens and the ocular surface might be useful in research and optometric practice. SOCT can also be helpful in diagnosis, evaluation and documentation of contact lens complications. [source]


Pharmacokinetic parameters estimation using adaptive Bayesian P-splines models

PHARMACEUTICAL STATISTICS: THE JOURNAL OF APPLIED STATISTICS IN THE PHARMACEUTICAL INDUSTRY, Issue 2 2009
Astrid Jullion
Abstract In preclinical and clinical experiments, pharmacokinetic (PK) studies are designed to analyse the evolution of drug concentration in plasma over time, i.e. the PK profile. Several PK parameters are estimated in order to summarize the drug's complete kinetic profile: the area under the curve (AUC), the maximal concentration (Cmax), the time at which the maximal concentration occurs (tmax) and the half-life (t1/2). Several methods have been proposed to estimate these PK parameters. A first method relies on interpolating between observed concentrations; the interpolation is usually chosen to be linear. This method is simple and fast. Another method relies on compartmental modelling, in which nonlinear methods are used to estimate the parameters of a chosen compartmental model. This method generally provides good results. However, if the data are sparse and noisy, two difficulties arise. The first is the choice of a suitable compartmental model, given the small number of data available in preclinical experiments, for instance. Second, nonlinear methods can fail to converge. Much work has been done recently to circumvent these problems (J. Pharmacokinet. Pharmacodyn. 2007; 34:229–249; Stat. Comput., to appear; Biometrical J., to appear; ESAIM P&S 2004; 8:115–131). In this paper, we propose a Bayesian nonparametric model based on P-splines. This method provides good PK parameter estimates whatever the number of available observations and the level of noise in the data. Simulations show that the proposed method provides better PK parameter estimates than the interpolation method, both in terms of bias and precision. The Bayesian nonparametric method also provides better AUC and t1/2 estimates than a correctly specified compartmental model, whereas the latter performs better for tmax and Cmax. We extend the basic model to a hierarchical one that treats the case where concentrations from different subjects are available, and we are then able to obtain individual PK parameter estimates. Finally, with Bayesian methods we can easily obtain uncertainty measures in the form of credibility sets for each PK parameter. Copyright © 2008 John Wiley & Sons, Ltd. [source]
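The interpolation-based baseline the abstract mentions can be sketched directly: AUC by the trapezoidal rule on linearly interpolated concentrations, Cmax and tmax read off the data, and t1/2 from a log-linear fit to the terminal phase. The concentration-time points below are invented for illustration.

```python
import numpy as np

t = np.array([0.25, 0.5, 1.0, 2.0, 4.0, 8.0, 12.0])    # sampling times (hours)
c = np.array([1.8, 3.2, 4.0, 3.1, 1.9, 0.8, 0.35])     # plasma concentrations

# AUC by the trapezoidal rule (linear interpolation between observations)
auc = ((c[1:] + c[:-1]) / 2 * np.diff(t)).sum()

cmax = c.max()                        # maximal concentration
tmax = t[np.argmax(c)]                # time of maximal concentration

# terminal elimination: slope of log-concentration over the last points
slope, _ = np.polyfit(t[-3:], np.log(c[-3:]), 1)
t_half = np.log(2) / -slope

print(auc, cmax, tmax, round(t_half, 2))
```

With sparse, noisy profiles these point estimates degrade, which is exactly the regime where the paper argues for the Bayesian P-spline alternative.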


Non-invasive in vivo determination of UVA efficacy of sunscreens using diffuse reflectance spectroscopy

PHOTODERMATOLOGY, PHOTOIMMUNOLOGY & PHOTOMEDICINE, Issue 4 2003
R. Gillies
Background: Evaluation of sunscreen efficacy is most relevant when measured on the surface it is meant to protect, namely human skin in vivo. Application of any material to the surface of the skin alters its optical properties. Diffuse reflectance spectroscopy (DRS) is a non-invasive technique to measure changes in the optical properties of the skin, decoupled from its biological responses, following sunscreen application. Methods: This study compared measurements of the UVA efficacy of oxybenzone and avobenzone at different concentrations (0–5%) using DRS, human phototest and an in vitro technique. Twenty subjects were enrolled for each product measured by DRS and 10 different subjects were enrolled for each product measured by human phototest. Six areas of 5 cm × 10 cm were outlined on each subject's back. DRS measurements were performed on four subsites within each area before and 20 min after sunscreen application. UVA efficacy for each concentration of product was calculated from the measured transmission spectrum of a given product convolved with the spectrum of a xenon light source, adequately filtered to obtain the UVA spectrum from 320 to 400 nm, and the erythema action spectrum. Phototesting was performed using the same light source, with persistent pigment darkening as the biological endpoint. Measurements were made with a sunscreen coverage of 2 mg/cm2. In vitro measurements were performed using an Optometrics instrument. Results: All three techniques showed a linear response between calculated UVA efficacy and product concentration. Conclusions: This study showed that DRS is a rapid and reproducible method to calculate the UVA efficacy of sunscreen materials and that its results correlate closely with those obtained by human phototesting. [source]
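The spectral weighting step can be illustrated as a ratio of action-spectrum-weighted source irradiance without and with the product's transmission. All three spectra below are made-up placeholders (a flat source and a toy exponential action spectrum), not the measured data of the study.

```python
import numpy as np

wl = np.arange(320, 401)                     # UVA wavelengths, nm
source = np.ones_like(wl, dtype=float)       # toy filtered xenon source (flat)
action = np.exp(-0.015 * (wl - 320))         # toy erythema-like action spectrum
transmission = np.full(wl.shape, 0.25)       # toy product: 25% transmission everywhere

weighted_unprotected = (source * action).sum()
weighted_protected = (source * action * transmission).sum()
pf_uva = weighted_unprotected / weighted_protected   # UVA protection factor
print(round(pf_uva, 1))   # 4.0 for uniform 25% transmission
```

With a wavelength-dependent transmission spectrum, as measured by DRS, the ratio weights protection toward the biologically most effective wavelengths.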


Positron Emission Tomography in Clinical Islet Transplantation

AMERICAN JOURNAL OF TRANSPLANTATION, Issue 12 2009
O. Eriksson
The fate of islets in clinical transplantation is unclear. To elucidate this, positron emission tomography combined with computed tomography (PET/CT) was performed for 60 min during islet transplantation in five patients receiving six transplants. A fraction of the islets (23%) was labeled with 18F-fluorodeoxyglucose ([18F]FDG) and carefully mixed with unlabeled islets just prior to intraportal transplantation. The peak radioactivity concentration in the liver was found at 19 min after the start of islet infusion and corresponded to only 75% of what was expected, indicating that islets are lost during the transplantation procedure. No accumulation of radioactivity was found in the lungs. A nonphysiological peak of C-peptide was found in plasma during and immediately after transplantation in all subjects. Distribution in the liver was heterogeneous, with wide variations in location and concentration. Islets found in areas with concentrations of >400 IEQ/cc liver tissue varied between 1% and 32% of the graft in different subjects. No side effects attributed to the PET/CT procedure were found. Clinical outcome in all patients was comparable to that previously observed, indicating that the [18F]FDG labeling procedure did not harm the islets. The technique has the potential to be used to assess approaches to enhance islet survival and engraftment in clinical transplantation. [source]


Association Models for Clustered Data with Binary and Continuous Responses

BIOMETRICS, Issue 1 2010
Lanjia Lin
Summary We consider the analysis of clustered data with mixed bivariate responses, i.e., where each member of the cluster has a binary and a continuous outcome. We propose a new bivariate random effects model that induces associations among the binary outcomes within a cluster, among the continuous outcomes within a cluster, and between a binary outcome and a continuous outcome from different subjects within a cluster, as well as the direct association between the binary and continuous outcomes within the same subject. For ease of interpretation of the regression effects, the marginal model of the binary response probability integrated over the random effects preserves the logistic form, and the marginal expectation of the continuous response preserves the linear form. We implement maximum likelihood estimation of our model parameters using standard software such as PROC NLMIXED in SAS. Our simulation study demonstrates the robustness of our method with respect to misspecification of the regression model as well as the random effects model. We illustrate our methodology by analyzing a developmental toxicity study of ethylene glycol in mice. [source]
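The cluster structure the model targets can be illustrated by simulation: a shared cluster-level random effect entering both a linear predictor and a logistic one induces association between the continuous and binary outcomes, including between different subjects in the same cluster. All parameter values below are invented; this mimics the data-generating idea, not the paper's estimation procedure.

```python
import numpy as np

rng = np.random.default_rng(0)
n_clusters, size = 200, 5
b = rng.normal(0, 1, n_clusters)                 # shared cluster random effects
u = np.repeat(b, size)                           # one value per cluster member

# continuous outcome: linear in the random effect
y_cont = 1.0 + u + rng.normal(0, 0.5, n_clusters * size)
# binary outcome: logistic in the same random effect
y_bin = (rng.random(n_clusters * size) < 1 / (1 + np.exp(-(-0.5 + u)))).astype(int)

# association between outcome types, seen through cluster means
cont_means = y_cont.reshape(n_clusters, size).mean(axis=1)
bin_means = y_bin.reshape(n_clusters, size).mean(axis=1)
print(np.corrcoef(cont_means, bin_means)[0, 1])
```

The positive correlation between cluster means is exactly the cross-outcome, within-cluster association the proposed random effects model is built to capture.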


Spatial Multistate Transitional Models for Longitudinal Event Data

BIOMETRICS, Issue 1 2008
F. S. Nathoo
Summary Follow-up medical studies often collect longitudinal data on patients. Multistate transitional models are useful for analysis in such studies where, at any point in time, individuals may be said to occupy one of a discrete set of states and interest centers on the transition process between states. For example, states may refer to the number of recurrences of an event, or the stage of a disease. We develop a hierarchical modeling framework for the analysis of such longitudinal data when the processes corresponding to different subjects may be correlated spatially over a region. Continuous-time Markov chains incorporating spatially correlated random effects are introduced. Here, joint modeling of both spatial dependence and dependence between different transition rates is required, and a multivariate spatial approach is employed. A proportional intensities frailty model is developed where baseline intensity functions are modeled using parametric Weibull forms, piecewise-exponential formulations, and flexible representations based on cubic B-splines. The methodology is developed within the context of a study examining invasive cardiac procedures in Quebec. We consider patients admitted for acute coronary syndrome throughout the 139 local health units of the province and examine readmission and mortality rates over a 4-year period. [source]
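The building block of such models is a continuous-time Markov chain over discrete states, defined by an intensity matrix whose rows sum to zero. The sketch below simulates one path by drawing exponential holding times and jump destinations; the three states (e.g. well, readmitted, dead) and all intensity values are invented for illustration.

```python
import numpy as np

Q = np.array([[-0.3, 0.2, 0.1],      # intensity matrix: off-diagonals are
              [0.1, -0.4, 0.3],      # transition rates, rows sum to zero
              [0.0, 0.0, 0.0]])      # state 2 absorbing (e.g. death)

def simulate(Q, state=0, horizon=10.0, seed=0):
    """Simulate one CTMC path as a list of (time, state) pairs."""
    rng = np.random.default_rng(seed)
    t, path = 0.0, [(0.0, state)]
    while t < horizon and -Q[state, state] > 0:
        t += rng.exponential(1 / -Q[state, state])   # exponential holding time
        if t >= horizon:
            break
        probs = Q[state].clip(min=0)                 # jump probabilities ∝ rates
        state = int(rng.choice(len(Q), p=probs / probs.sum()))
        path.append((t, state))
    return path

path = simulate(Q)
print(path[-1])
```

The paper's frailty model then lets the intensities in Q vary with covariates and with spatially correlated subject-level random effects, rather than being constants as here.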


Semantic Interpretation as Computation in Nonmonotonic Logic: The Real Meaning of the Suppression Task

COGNITIVE SCIENCE - A MULTIDISCIPLINARY JOURNAL, Issue 6 2005
Keith Stenning
Abstract Interpretation is the process whereby a hearer reasons to an interpretation of a speaker's discourse. The hearer normally adopts a credulous attitude to the discourse, at least for the purposes of interpreting it. That is to say, the hearer tries to accommodate the truth of all the speaker's utterances in deriving an intended model. We present a nonmonotonic logical model of this process that defines unique minimal preferred models and efficiently simulates a kind of closed-world reasoning of particular interest for human cognition. Byrne's "suppression" data (Byrne, 1989) are used to illustrate how variants on this logic can capture and motivate subtly different interpretative stances which different subjects adopt, thus indicating where more fine-grained empirical data are required to understand what subjects are doing in this task. We then show that this logical competence model can be implemented in spreading activation network models. A one-pass process interprets the textual input by constructing a network which then computes minimal preferred models for (3-valued) valuations of the set of propositions of the text. The neural implementation distinguishes easy forward reasoning from more complex backward reasoning in a way that may be useful in explaining directionality in human reasoning. [source]
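The closed-world flavour of this account can be illustrated with forward chaining to the least model of definite clauses: an atom is true only if derivable, so an inference is "suppressed" when an enabling condition cannot be established. The rule below is an invented stand-in for the suppression-task materials, not the paper's logic (which is 3-valued and implemented in networks).

```python
def least_model(facts, rules):
    """rules: list of (body_atoms, head); returns the minimal set of true atoms."""
    model = set(facts)
    changed = True
    while changed:
        changed = False
        for body, head in rules:
            if head not in model and all(b in model for b in body):
                model.add(head)
                changed = True
    return model

# conditional read credulously with an enabling condition folded into its body
rules = [({"essay", "library_open"}, "study")]
m1 = least_model({"essay", "library_open"}, rules)
m2 = least_model({"essay"}, rules)      # enabling condition not derivable
print("study" in m1, "study" in m2)     # the second inference is 'suppressed'
```

Under closed-world reasoning, adding the library premise changes the interpreted conditional itself, which is the nonmonotonic effect the suppression data are taken to show.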