Homogeneity
Selected Abstracts

The Accuracy of Predicting Cardiac Arrest by Emergency Medical Services Dispatchers: The Calling Party Effect
ACADEMIC EMERGENCY MEDICINE, Issue 9 2003
Alex G. Garza MD
Abstract Objectives: To analyze the accuracy of paramedic emergency medical services (EMS) dispatchers in predicting cardiac arrest and to assess the effect of the calling party on dispatcher accuracy in an advanced life support, public utility model EMS system, with greater than 90,000 calls and greater than 60,000 transports per year. Methods: This was a retrospective analysis, from January 1, 2000, through June 30, 2000, of 911 calls with dispatcher-assigned presumptive patient condition (PPC) or field diagnosis of cardiac arrest. Sensitivity and positive predictive value (PPV) of the PPC code for cardiac arrest by calling party were calculated. Homogeneity of sensitivity and PPV of the PPC code for cardiac arrest across calling parties was studied with chi-square analysis. Relevant proportions, relative risk ratios, and associated 95% confidence intervals (95% CIs) were calculated. Student's t-test was used to compare quality assurance scores between calling parties. Results: There were 506 patients included in the study. Overall sensitivity for dispatcher-assigned PPC of cardiac arrest was 68.3% (95% CI = 63.3% to 73.0%), with a PPV of 65.0% (95% CI = 60.0% to 69.7%). There was a significant difference in the PPV for the EMS dispatcher diagnosis of cardiac arrest depending on the type of caller (χ2 = 17.34, p < 0.001). Conclusions: A higher level of medical training may improve dispatch accuracy for predicting cardiac arrest. The type of calling party influenced the PPV of the dispatcher-assigned condition.

The technical quality of nonsurgical root canal treatment performed by a selected cohort of Australian endodontists
INTERNATIONAL ENDODONTIC JOURNAL, Issue 7 2008
D. E.
Bierenkrant
Abstract Aim: To investigate the technical quality of nonsurgical root canal treatment performed by endodontists in Melbourne, Australia. Methodology: Clinical and radiographic records of 100 sequential nonsurgical patients were obtained from each of six endodontists working in private practice. The following variables were analysed: proximity of root filling to radiographic apex; homogeneity and radiodensity of root filling; lateral adaptation of the root filling to the canal walls; taper; extrusion of material; small, appropriate or excessive apical enlargement; presence of lateral canals; transportation; procedural errors. The radiographs were assessed by three independent evaluators. Exploratory data analysis was undertaken using simple frequencies and cross-tabulations. A generalised linear mixed model (GLMM) was used for the formal statistical modelling. Results: Of the 1351 canals that were examined, 91.7% were filled within 2 mm of the radiographic apex and 74% were within 1 mm. Homogeneity and adequate density were found along the entire length of the canal in 86.1% and 88.6% of cases, respectively. Lateral adaptation was adequate in 95.6% of cases and the taper was 'smooth and continuous' in 83.8% of roots. No or only small extrusion of sealer was noted in 98.3% of cases. Apical enlargement was 'appropriate' in 85% of roots. Both transportation (1.1%) and procedural errors (1.3%) were rare occurrences. Conclusions: The technical quality of root fillings performed by endodontists in Melbourne, Australia complied with current guidelines in 77.4–91.0% of roots. All variables examined confirmed high levels of technical proficiency. There were very few instances of canal transportation and/or procedural errors.
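The dispatcher study above tests homogeneity of sensitivity and PPV across calling parties with chi-square analysis. As a rough sketch of that kind of test (the 2×3 counts below are invented for illustration, not the study's data; scipy is assumed to be available):

```python
from scipy.stats import chi2_contingency

# Hypothetical dispatcher-assigned cardiac-arrest calls: rows are outcome
# (confirmed arrest / other condition), columns are three caller types.
table = [
    [120, 45, 40],  # confirmed cardiac arrest
    [40, 35, 50],   # other condition
]

# Chi-square test of homogeneity of the outcome distribution across callers.
chi2, p, dof, expected = chi2_contingency(table)

# Positive predictive value per caller type (column-wise proportion correct).
ppv = [table[0][j] / (table[0][j] + table[1][j]) for j in range(3)]
```

A small p-value here would indicate, as in the abstract, that dispatcher PPV is not homogeneous across caller types.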
A new instrumental precipitation dataset for the greater alpine region for the period 1800–2002
INTERNATIONAL JOURNAL OF CLIMATOLOGY, Issue 2 2005
Ingeborg Auer
Abstract The paper describes the development of a dataset of 192 monthly precipitation series covering the greater alpine region (GAR, 4–18°E by 43–49°N). A few of the time series extend back to 1800. A description is provided of the sometimes laborious processes that were involved in this work: from locating the original sources of the data to homogenizing the records and eliminating as many of the outliers as possible. Locating the records required exhaustive searches of archives currently held in yearbooks and other sources of the states, countries and smaller regional authorities that existed at various times during the last 200 years. Homogeneity of each record was assessed by comparison with neighbouring series, although this becomes difficult when the density of stations reduces in the earliest years. An additional 47 series were used, but the density of the sites in Austria and Switzerland was reduced to maintain an even coverage in space across the whole of the GAR. We are confident of the series back to 1840, but the quality of data before this date must be considered poorer. Of all of the issues involved in homogenizing these data, perhaps the most serious problem is associated with the differences in the height above ground of the precipitation gauges, in particular the general lowering of gauge heights in the late 19th century for all countries, with the exception of Italy. The standard gauge height in the early-to-mid 19th century was 15–30 m above the ground, with gauges being generally sited on rooftops. Adjustments to some series of the order of 30–50% are necessary for compatibility with the near-ground location of gauges during much of the 20th century. Adjustments are sometimes larger in the winter, when catching snowfall presents serious problems.
Data from mountain-top observatories have not been included in this compilation (because of the problem of measuring snowfall), so the highest gauge sites are at elevations of 1600–1900 m in high alpine valley locations. Two subsequent papers will analyse the dataset. The first will compare the series with other large-scale precipitation datasets for this region, and the second will describe the major modes of temporal variability of precipitation totals in different seasons and determine coherent regions of spatial variability. Copyright © 2005 Royal Meteorological Society

A Case for Homogeneity of Personality at the Occupational Level
INTERNATIONAL JOURNAL OF SELECTION AND ASSESSMENT, Issue 2 2009
Robert C. Satterwhite
The forces of attraction–selection–attrition have been hypothesized to create homogeneity of personality within organizations, and vocational choice theory predicts that these forces lead to a 'modal personality' within given occupations. This study compared the homogeneity of a set of personality characteristics for 6582 incumbents from eight organizations in eight occupations. The results indicated that (1) the homogeneity hypothesis was supported both within organizations as well as within occupations; and (2) the homogeneity within occupations was higher than that found within organizations.

Field evaluation of the efficacy of a probiotic containing Bacillus licheniformis and Bacillus subtilis spores, on the health status and performance of sows and their litters
JOURNAL OF ANIMAL PHYSIOLOGY AND NUTRITION, Issue 11-12 2004
C. Alexopoulos
Summary The aim of this study was to assess the efficacy of BioPlus 2B, a probiotic containing Bacillus licheniformis and Bacillus subtilis spores, on the health status and productivity of sows and their litters.
A total of 109 gilts and sows were allocated into two experimental groups, as follows: untreated controls (UC) and BioPlus 2B (same feeding as the UC group plus BioPlus 2B) at a dose of 400 g/ton of feed (equal to 1.28 × 10⁶ viable spores/g of feed). Treatment started from the day of allocation (14 days prior to the expected farrowing) up to the weaning day. Homogeneity of the groups was satisfied with regard to parity. From the results it was evident that BioPlus 2B supplementation of the feed improved gilt/sow performance as shown by: (i) the increase of sow feed consumption during the first 14 days postpartum and (ii) the decrease of sow weight loss during the suckling period. Certain blood and milk parameters were significantly improved, as shown by higher serum cholesterol and total lipids concentrations and higher milk fat and protein content at mid-suckling period. As a consequence, a positive effect was also noticed as regards litter health and performance characteristics in terms of: (i) decrease in piglet diarrhoea score, (ii) decrease in pre-weaning mortality, leading to an increase in the number of weaned piglets per litter, and (iii) increase in piglet body weight at weaning. Moreover, BioPlus 2B tended to improve the health status and fertility of sows, demonstrating: (i) a tendency to a lower proportion of sows with Mastitis-Metritis-Agalactia (MMA) problems and (ii) a lower proportion of sows returning to oestrus.

Increasing the Homogeneity of CAT's Item-Exposure Rates by Minimizing or Maximizing Varied Target Functions While Assembling Shadow Tests
JOURNAL OF EDUCATIONAL MEASUREMENT, Issue 3 2005
Yuan H. Li
A computerized adaptive testing (CAT) algorithm that has the potential to increase the homogeneity of CAT's item-exposure rates without significantly sacrificing the precision of ability estimates was proposed and assessed in the shadow-test (van der Linden & Reese, 1998) CAT context.
This CAT algorithm was formed by a combination of maximizing or minimizing varied target functions while assembling shadow tests. Four target functions were used separately in the first, second, third, and fourth quarters of the CAT. The elements used in the four functions were associated with (a) a random number assigned to each item, (b) the absolute difference between an examinee's current ability estimate and an item difficulty, (c) the absolute difference between an examinee's current ability estimate and an optimum item difficulty, and (d) item information. The results indicated that this combined CAT fully utilized all the items in the pool, reduced the maximum exposure rates, and achieved more homogeneous exposure rates. Moreover, its precision in recovering ability estimates was similar to that of the maximum item-information method. The combined CAT method produced the best overall results compared with the other individual CAT item-selection methods. The findings from the combined CAT are encouraging. Future uses are discussed.

Fast spin-echo triple-echo Dixon: Initial clinical experience with a novel pulse sequence for fat-suppressed T2-weighted abdominal MR imaging
JOURNAL OF MAGNETIC RESONANCE IMAGING, Issue 3 2009
Russell N. Low MD
Abstract Purpose: To evaluate a prototype fast spin echo (FSE) triple-echo Dixon (fTED) technique for breath-hold, fat-suppressed, T2-weighted abdominal imaging. Materials and Methods: Forty patients underwent breath-hold T2-weighted abdominal imaging with fTED and conventional fast recovery (FR) FSE with chemical shift-selective saturation (CHESS). FRFSE and fTED images were compared for overall image quality, homogeneity of fat suppression, image sharpness, anatomic detail, and phase artifact. Depiction of disease was recorded separately for FRFSE and fTED images. Results: fTED successfully reconstructed water-only and fat-only images from source images in all 40 cases.
Water and fat separation was perfect in 36 (0.90) patients. Homogeneity of fat suppression was superior on the fTED images in 38 (0.95) of 40 cases. fTED images showed better anatomic detail in 27 (0.68), and less susceptibility artifact in 20 (0.50). FRFSE images showed less vascular pulsation artifact in 30 (0.75) cases, and less phase artifact in 21 (0.53) cases. There was no difference in depiction of disease for FRFSE and fTED images. Conclusion: fTED is a robust sequence providing breath-hold T2-weighted images with superior fat suppression, excellent image quality, and at least equal depiction of disease compared to conventional breath-hold T2-weighted FRFSE imaging. J. Magn. Reson. Imaging 2009;30:569–577. © 2009 Wiley-Liss, Inc.

Pressure Effect on the Homogeneity of Spark Plasma-Sintered Tungsten Carbide Powder
JOURNAL OF THE AMERICAN CERAMIC SOCIETY, Issue 10 2009
Salvatore Grasso
A combined experimental/numerical methodology was developed to aid full densification of pure ultrafine tungsten carbide powder by means of Spark Plasma Sintering (SPS) operating in Current Control mode. Applied pressure ranged from 5 to 80 MPa, while the current intensity was set and held constant at 1400 A. The developed SPS model used a moving-mesh technique to account for the electrothermal contact resistance change during both shrinkage and punch sliding follow-up. The pressure dependence of the electrothermal contact resistance was also taken into account by the model. The experimental and numerical results showed the effects of pressure on grain growth, residual porosity, and hardness observed along the sample radius. Upon increasing sintering pressure, complete densification was obtained by reducing the peak temperature measured at the die surface. By combining experimental and modeling results, a direct correlation between compact microstructure homogeneity and sintering parameters (i.e., temperature and applied pressure) was established.
Homogeneity of fossil assemblages extracted from mine dumps: an analysis of Plio-Pleistocene fauna from South African caves
LETHAIA, Issue 4 2005
FRANK SÉNÉGAS
Mine dumps associated with limestone cave deposits are common in dolomitic areas of southern Africa. The dumps often contain blocks of breccia, which are rich in micro-mammalian fossils (especially rodents, shrews and bats). Unfortunately, these fossiliferous breccia blocks are out of geological and stratigraphic context. Nevertheless, they provide a large amount of palaeontological material of great interest. In order to use this kind of material, a first approach is to test for homogeneity of the fossil assemblages extracted from the breccia blocks. Fisher's exact test can be used. Two analyses were undertaken. The first was performed on block samples taken in situ from breccia at the Drimolen hominid site. The results indicated that the samples were homogeneous, as expected. The second analysis was carried out on different samples extracted from blocks of breccia collected from a dump at the Gondolin site. The results show that it is possible to group several samples into a single representative assemblage. Some blocks could be grouped together and then used to address taphonomic issues. Once these problems are solved, the data set can be used with greater confidence to address matters concerning palaeoenvironmental reconstructions associated with Plio-Pleistocene hominids.

The Division of Labor Under Homogeneity: A Critique of Mises and Rothbard
AMERICAN JOURNAL OF ECONOMICS AND SOCIOLOGY, Issue 2 2007
Walter Block
Even the most passionate defenders of free trade, such as Mises and Rothbard, claim that trade cannot occur under conditions of strict homogeneity of land, labor, and capital. We show that specialization, trade, and the division of labor can emerge even when resources are initially homogeneous, due to "natural heterogeneity," economies of scale, and learning.
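The fossil-assemblage study above checks the homogeneity of breccia-block samples with Fisher's exact test. A minimal sketch of that kind of comparison (the 2×2 taxon-by-block counts are made up for illustration, not taken from the study):

```python
from scipy.stats import fisher_exact

# Hypothetical counts of two rodent taxa recovered from two breccia blocks.
#            block A  block B
counts = [[18, 22],   # taxon 1
          [12, 9]]    # taxon 2

# Two-sided Fisher's exact test of homogeneity of the two blocks.
odds_ratio, p = fisher_exact(counts)

# A large p-value gives no evidence that the blocks differ, so they could
# be pooled into a single representative assemblage.
poolable = p > 0.05
```

With real data the table would have one row per taxon and one column per block; blocks that test homogeneous can be grouped, as described in the abstract.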
Order and Disorder in Powder Mixtures: Spatial Distribution Functions as Tools to Assess Powder Homogeneity
PARTICLE & PARTICLE SYSTEMS CHARACTERIZATION, Issue 5-6 2008
Albert Mihranyan
Abstract In interactive mixtures with small carrier particles, the content variability is often higher than predicted by available models, despite the significant degree of interaction visualized with Scanning Electron Microscopy (SEM). The present work details how pair-correlation functions can be used to reveal information about the spatial distribution of mixture constituents and their interactions. SEM pictures of a 2% w/w oxazepam/sodium starch glycolate (SSG) mixture were recorded (n = 14). The constituent coordinates were extracted and pair-correlation functions as well as the cross-correlation function were calculated. A significant degree of interaction was observed between the constituents in the experimental mixture, compared to a randomized control system. In particular, the probability of finding an oxazepam particle was especially high inside the perimeter of the carrier particle and along its edges. The observed cross-correlation between oxazepam and SSG particles was periodic and repeated at distances corresponding to 1–1.5 carrier diameters. It was concluded that interactive mixtures of powders can be compared to disordered/amorphous solids, since both exhibit short-range order whilst lacking long-range translational periodicity.

Homogeneity of multilayers produced with a static mixer
POLYMER ENGINEERING & SCIENCE, Issue 1 2001
J. C. Van Der Hoeven
A multiflux static mixer can be used to produce multilayered structures. The flow is repeatedly cut, stretched and stacked by mixing elements in the channel of such a device. In the standard design, however, the obtained layer thicknesses are inhomogeneous.
The causes of the multiflux static mixer's deviation from ideal behavior are identified by 3D numerical simulations as unequal pressure drops in the separating flows. Changes in the arrangement of the elements are proposed and their effects are verified by simulations and experiments. A significant improvement of the layer homogeneity is achieved by introducing additional elements with separating walls at the inlets and at the outlets of the mixing elements.

Homogeneity of active demyelinating lesions in established multiple sclerosis
ANNALS OF NEUROLOGY, Issue 1 2008
Esther C. W. Breij PhD
Objective: Four different patterns of demyelination have been described in active demyelinating lesions of multiple sclerosis (MS) patients that were biopsied shortly after disease onset. These patterns were suggested to represent heterogeneity of the underlying pathogenesis. The aim of this study was to determine whether lesion heterogeneity also exists in an unselected collection of autopsy material from patients with established MS. Methods: All MS brain tissue available in the VU Medical Center was assessed for the presence of active demyelinating lesions using magnetic resonance imaging-guided sampling and immunohistochemistry. Tissue blocks containing active demyelinating lesions were evaluated for the presence of complement and antibody deposition, oligodendrocyte apoptosis, differential loss of myelin proteins, and hypoxia-like damage using histology, immunohistochemistry, and confocal microscopy. Blocks with active demyelinating lesions were compared with blocks with active (nondemyelinating) and inactive lesions. Results: Complement and antibodies were consistently associated with macrophages in areas of active demyelination. Preferential loss of myelin proteins, extensive hypoxia-like damage, and oligodendrocyte apoptosis were absent or rare.
This pattern was observed in all tissue blocks containing active demyelinating lesions; lesion heterogeneity between patients was not found. Interpretation: The immunopathological appearance of active demyelinating lesions in established MS is uniform. Initial heterogeneity of demyelinating lesions in the earliest phase of MS lesion formation may disappear over time as different pathways converge in one general mechanism of demyelination. Consistent presence of complement, antibodies, and Fcγ receptors in phagocytic macrophages suggests that antibody- and complement-mediated myelin phagocytosis is the dominant mechanism of demyelination in established MS. Ann Neurol 2008;63:16–25

A semi-Markov model of disease recurrence in insured dogs
APPLIED STOCHASTIC MODELS IN BUSINESS AND INDUSTRY, Issue 5 2007
Xikui Wang
Abstract We use a semi-Markov model to analyse the stochastic dynamics of disease occurrence in dogs insured in Canada from 1990 to 1999, and the probability pattern of death from illness. After statistically justifying the use of a stochastic model, we demonstrate that a stationary first-order semi-Markov process is appropriate for analysing the available data set. The probability transition function is estimated and its stationarity is tested statistically. Homogeneity of the semi-Markov model with respect to important covariates (such as geographic location, insurance plan, breed and age) is also statistically examined. We conclude with discussions and implications of our results in veterinary contexts. Copyright © 2007 John Wiley & Sons, Ltd.

Testing Homogeneity of Two Zero-inflated Poisson Populations
BIOMETRICAL JOURNAL, Issue 1 2009
Siu Keung Tse
Abstract The problem of testing treatment difference in the occurrence of a safety parameter in a randomized parallel-group comparative clinical trial, under the assumption that the number of occurrences follows a zero-inflated Poisson (ZIP) distribution, is considered.
Likelihood ratio tests (LRT) for homogeneity of two ZIP populations are derived under the hypotheses that (i) there is no difference in inflation parameters; (ii) there is no difference in non-zero means; and (iii) there is no difference in either inflation parameters or non-zero means. Approximate formulas for sample size calculation are also obtained for achieving a desired power for detecting a clinically meaningful difference under the corresponding alternative hypotheses. An example concerning the assessment of gastrointestinal (GI) safety, in terms of the number of erosion counts of a newly developed compound for the treatment of osteoarthritis and rheumatoid arthritis, is given for illustration purposes. (© 2009 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)

Testing Marginal Homogeneity Against Stochastic Order in Multivariate Ordinal Data
BIOMETRICS, Issue 2 2009
B. Klingenberg
Summary Many assessment instruments used in the evaluation of toxicity, safety, pain, or disease progression consider multiple ordinal endpoints to fully capture the presence and severity of treatment effects. Contingency tables underlying these correlated responses are often sparse and imbalanced, rendering asymptotic results unreliable or model fitting prohibitively complex without overly simplistic assumptions on the marginal and joint distribution. Instead of a modeling approach, we look at stochastic order and marginal inhomogeneity as an expression or manifestation of a treatment effect under much weaker assumptions. Often, endpoints are grouped together into physiological domains or by the body function they describe. We derive tests based on these subgroups, which might supplement or replace the individual endpoint analysis because they are more powerful. The permutation or bootstrap distribution is used throughout to obtain global, subgroup, and individual significance levels, as they naturally incorporate the correlation among endpoints.
We provide a theorem that establishes a connection between marginal homogeneity and the stronger exchangeability assumption under the permutation approach. Multiplicity adjustments for the individual endpoints are obtained via stepdown procedures, while subgroup significance levels are adjusted via the full closed testing procedure. The proposed methodology is illustrated using a collection of 25 correlated ordinal endpoints, grouped into six domains, to evaluate toxicity of a chemical compound.

Levene Tests of Homogeneity of Variance for General Block and Treatment Designs
BIOMETRICS, Issue 1 2002
Michael E. O'Neill
Summary. This article develops a weighted least squares version of Levene's test of homogeneity of variance for a general design, available both for univariate and multivariate situations. When the design is balanced, the univariate and two common multivariate test statistics turn out to be proportional to the corresponding ordinary least squares test statistics obtained from an analysis of variance of the absolute values of the standardized mean-based residuals from the original analysis of the data. The constant of proportionality is simply a design-dependent multiplier (which does not necessarily tend to unity). Explicit results are presented for randomized block and Latin square designs and are illustrated for factorial treatment designs and split-plot experiments. The distribution of the univariate test statistic is close to a standard F-distribution, although it can be slightly underdispersed. For a complex design, the test assesses homogeneity of variance across blocks, treatments, or treatment factors and offers an objective interpretation of residual plots.

ChemInform Abstract: Investigation of Optical and Structural Homogeneity of Ca4GdO(BO3)3 Single Crystals
CHEMINFORM, Issue 1 2002
A.
Klos
Abstract ChemInform is a weekly Abstracting Service, delivering concise information at a glance that was extracted from about 100 leading journals. To access a ChemInform Abstract of an article which was published elsewhere, please select a "Full Text" option. The original article is trackable via the "References" option.

The North American Industry Classification System and Its Implications for Accounting Research
CONTEMPORARY ACCOUNTING RESEARCH, Issue 4 2003
Jayanthi Krishnan
Abstract Industry classification is an important component of the methodological infrastructure of accounting research. Researchers have generally used the Standard Industrial Classification (SIC) system for assigning firms to industries. In 1999, the major statistical agencies of Canada, Mexico, and the United States began implementing the North American Industry Classification System (NAICS). The new scheme changes industry classification by introducing production as the basis for grouping firms, creating 358 new industries, extensively rearranging SIC categories, and establishing uniformity across all NAFTA nations. We examine the implications of the change for accounting research. We first assess NAICS's effectiveness in forming industry groups. Following Guenther and Rosman 1994, we use financial ratio variances to measure intra-industry homogeneity and find that NAICS offers some improvement over the SIC system in defining manufacturing, transportation, and service industries. We also evaluate whether NAICS might have an impact on empirical research by reproducing part of Lang and Lundholm's 1996 study of information-transfer and industry effects. Using SIC delineations, they focus on whether industry conditions or the level of competition is the main source of uncertainty resolved by earnings announcements. Across all levels of aggregation, we find inferences are similar using either SIC or NAICS.
However, we also observe that the regression coefficients in Lang and Lundholm's model show smaller intra-industry dispersion for NAICS, relative to SIC, definitions. Overall, the results suggest that NAICS definitions lead to more cohesive industries. Because of this, researchers may encounter some differences in using NAICS-industry definitions rather than SIC, but these will depend on research design and industry composition of the sample.

Aerosols and gaseous contrast agents for magnetic resonance imaging of the lung
CONTRAST MEDIA & MOLECULAR IMAGING, Issue 5 2008
Karim Mosbah
Abstract Magnetic resonance imaging of the lungs and the investigation of pulmonary pathologies with this technique are limited by low proton spin density, degraded magnetic homogeneity and motion. Inhaled contrast agents (gases or aerosols) can improve the diagnostic value of MRI for the lung. Paramagnetic contrast agents such as gadolinium chelate aerosols or dioxygen gas increase the relaxivity of protons in lung parenchyma and can be used to assess the ventilated fraction of the bronchoalveolar space. Similarly, inhalation of non-proton MRI nuclei such as perfluorinated gas or hyperpolarized gases (3He or 129Xe) can provide functional ventilation images. In this review paper, the principles, the practical implementation, the limitations and possible safety issues of these different techniques are summarized. The main pre-clinical and clinical applications of these approaches based on oral contrast agents are reviewed and illustrated with cutting-edge lung MRI studies. Copyright © 2008 John Wiley & Sons, Ltd.
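The NAICS study above judges a classification scheme by intra-industry homogeneity of financial ratios: the scheme whose groups show lower within-group variance defines the more cohesive industries. A toy sketch of that comparison (the firm ratios and group labels are synthetic, invented only to illustrate the calculation):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical profit-margin ratios for 12 firms, grouped two different
# ways (e.g. SIC-like vs NAICS-like labels; both label sets are invented).
ratios = rng.normal(loc=0.10, scale=0.05, size=12)
grouping_a = np.array([0, 0, 0, 0, 1, 1, 1, 1, 2, 2, 2, 2])
grouping_b = np.array([0, 0, 1, 1, 0, 2, 1, 2, 2, 0, 1, 2])

def mean_within_group_variance(values, labels):
    """Average the sample variance of the ratio within each industry group."""
    return float(np.mean([values[labels == g].var(ddof=1)
                          for g in np.unique(labels)]))

va = mean_within_group_variance(ratios, grouping_a)
vb = mean_within_group_variance(ratios, grouping_b)
# The grouping with the smaller value defines the more homogeneous industries.
```

With real data, the comparison would be run per ratio and per industry across many firms, as in the study.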
An HPLC/mass spectrometry platform for the development of multimodality contrast agents and targeted therapeutics: prostate-specific membrane antigen small molecule derivatives
CONTRAST MEDIA & MOLECULAR IMAGING, Issue 5 2006
Valerie Humblet
Abstract The production of disease-targeted agents requires the covalent conjugation of a targeting molecule with a contrast agent or therapeutic, followed by purification of the product to homogeneity. Typical targeting molecules, such as small molecules and peptides, often have high charge-to-mass ratios and/or hydrophobicity. Contrast agents and therapeutics themselves are also diverse, and include lanthanide chelates for MRI, 99mTc chelates for SPECT, 90Y chelates for radiotherapy, 18F derivatives for PET, and heptamethine indocyanines for near-infrared fluorescent optical imaging. We have constructed a general-purpose HPLC/mass spectrometry platform capable of purifying virtually any targeted agent for any modality. The analytical sub-system is composed of a single dual-head pump that directs mobile phase either to a hot cell for the purification of radioactive agents or to an ES-TOF MS for the purification of nonradioactive agents. Nonradioactive agents are also monitored during purification by ELSD, absorbance and fluorescence. The preparative sub-system is composed of columns and procedures that permit rapid scaling from the analytical system. To demonstrate the platform's utility, we describe the preparation of five small molecule derivatives specific for prostate-specific membrane antigen (PSMA): a gadolinium derivative for MRI, indium, rhenium and technetium derivatives for SPECT, and an yttrium derivative for radiotherapy. All five compounds are derived from a highly anionic targeting ligand engineered to have a single nucleophile for N-hydroxysuccinimide-based conjugation.
We also describe optimized column/mobile phase combinations and mass spectrometry settings for each class of agent, and discuss strategies for purifying molecules with extreme charge and/or hydrophobicity. Taken together, our study should expedite the development of disease-targeted, multimodality diagnostic and therapeutic agents. Copyright © 2006 John Wiley & Sons, Ltd.

Crystal growth, optical and luminescence properties of (Ce,Sr)-doped PrAlO3 single crystals
CRYSTAL RESEARCH AND TECHNOLOGY, Issue 12 2007
A. Novoselov
Abstract Using the micro-pulling-down method, (Ce,Sr)-doped PrAlO3 square-shaped single crystals (4×4×12 mm) were grown. Structural parameters studied by X-ray powder diffraction were consistent with the R3m space group. Compositional homogeneity was checked with electron probe micro-analysis and found to be quite uniform. Absorption spectra and luminescence characteristics under UV and X-ray excitation were measured at room temperature; no Ce-related emission appeared. (© 2007 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)

Influence of the starting powders on the synthesis of nickel ferrite
CRYSTAL RESEARCH AND TECHNOLOGY, Issue 8 2006
F. Kenfack
Abstract The thermal decomposition of freeze-dried nickel(II)-iron(III) formate was investigated by means of DTA, TG, mass spectrometry and X-ray powder diffractometry. For the preparation of homogeneous freeze-dried nickel(II)-iron(III) formate precursors, rigorous control of the nickel ion concentration in the precursor solution was required. The decomposition of the reactive nickel(II)-iron(III) formate does not only reflect aspects of the single formates, but also an interaction between components which lowers the decomposition temperature. Crystalline nickel ferrite powders were obtained at 600-800°C. This temperature is considerably lower than the 1100°C employed for the ceramic method. In the presence of air, the regeneration of nickel ferrite from the taenite phase γ(Ni,Fe) is accomplished at 800°C.
This temperature is also 300°C below the temperature employed when the mixtures NiO:α-Fe2O3 or Ni:2Fe are the starting powders. The main reason for the high reactivity of the freeze-dried formates and the taenite alloy is the high homogeneity of these precursors. (© 2006 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)

[source] Inhomogeneity of composition in near-stoichiometric LiNbO3 single crystal grown from Li-rich melt. CRYSTAL RESEARCH AND TECHNOLOGY, Issue 4 2006. L. Gao.

Abstract: A near-stoichiometric LiNbO3 single crystal has been grown by the Czochralski technique from a melt of 58.5 mol% Li2O. Its composition homogeneity was assessed by measuring the UV absorption edge. It was found that the maximum composition difference is about 0.03 mol% in the radial direction and 0.05 mol% in the axial direction. Differential scanning calorimetry (DSC) analysis was performed on powder from the synthesized raw material and from the melt frozen after crystal growth. The analytical results indicate that, during crystal growth, the magnitude of lithium volatilization from the melt surface exceeds the degree of segregation from the crystal. The volatilized lithium diffuses into the crystal to compensate for the lithium segregation in the LiNbO3 crystal. (© 2006 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)

[source] Preparation and investigation of (CuInSe2)x(2ZnSe)1-x and (CuInTe2)x(2ZnTe)1-x solid solution crystals. CRYSTAL RESEARCH AND TECHNOLOGY, Issue 4 2004. I. V. Bodnar.

Abstract: The (CuInSe2)x(2ZnSe)1-x and (CuInTe2)x(2ZnTe)1-x solid solution crystals prepared by the Bridgman method and by chemical vapor transport have been studied. The nature of the crystalline phases, the local structure homogeneity, and the composition of these materials have been investigated by X-ray diffraction (XRD) and electron probe microanalysis (EPMA). The analysis revealed the presence of a chalcopyrite-sphalerite phase transition in the range 0.6 ≤ X ≤ 0.7.
The lattice constants, the anion position parameter, and the bond lengths between atoms were also calculated. It was found that the lattice parameters exhibit a linear dependence on composition. The transmission spectra of the solid solution crystals in the region of the main absorption edge were studied. It was established that the optical band gap of these materials changes non-linearly with the composition X. (© 2004 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)

[source] How to read and understand and use systematic reviews and meta-analyses. ACTA PSYCHIATRICA SCANDINAVICA, Issue 6 2009. S. Leucht.

Objective: Systematic reviews and meta-analyses are increasingly frequently used in the evaluation of medical treatments. This review explains the principles of the methodology, significance, and limitations of systematic reviews. Method: Short review article. Results: In contrast to conventional reviews, systematic reviews use a structured approach in retrieving, analyzing, and interpreting the evidence. A comprehensive search strategy is applied to identify all relevant trials. Study selection and data extraction are performed independently by at least two reviewers. The data are usually synthesized in a meta-analysis applying statistical methods to examine homogeneity. Funnel plots and other statistical methods are applied to detect publication bias. Conclusion: Due to the enormous amount of scientific information published every year, systematic reviews and meta-analyses have become indispensable methods for the evaluation of medical treatments.

[source] A validation analysis of two self-reported HAM-D6 versions. ACTA PSYCHIATRICA SCANDINAVICA, Issue 4 2009. P. Bech.

Objective: The six items of the clinician-administered Hamilton Depression Scale (HAM-D6) cover the core items of depressive states reflecting the antidepressive effect of medication.
In this study, the two self-reported versions of the HAM-D6 were psychometrically validated to ensure the unidimensionality of this administration form in patients with mild-to-moderate depression. Method: The item response theory analysis of Mokken was used to test the unidimensionality of both the Interactive Voice Recording System (IVRS) version of the HAM-D6 and a paper-and-pencil self-reported version (S-HAM-D6). Patients with typical major depression and with seasonal affective disorder were included. Results: The Mokken analysis showed that the two self-reported versions of the HAM-D6 obtained coefficients of homogeneity above 0.40, similar to the clinician-rated HAM-D6 and thus implying unidimensionality. By contrast, the full HAM-D17 versions (self-reported as well as clinician-rated) obtained coefficients of homogeneity below 0.40, implying that the HAM-D17 is a multidimensional scale. Conclusion: The analyses show that both the IVRS version and the S-HAM-D6 version are unidimensional self-rating scales for the measurement of depressive states.

[source] Structural and biophysical simulation of angiogenesis and vascular remodeling. DEVELOPMENTAL DYNAMICS, Issue 4 2001. Ralf Gödde.

Abstract: The purpose of this report is to introduce a new computer model for the simulation of microvascular growth and remodeling into arteries and veins that imitates angiogenesis and blood flow in real vascular plexuses. A C++ computer program was developed based on geometric and biophysical initial and boundary conditions. Geometry was defined on a two-dimensional isometric grid by using defined sources and drains and elementary bifurcations that were able to proliferate or to regress under the influence of random and deterministic processes.
Biophysics was defined by the pressure, flow, and velocity distributions in the network, computed with the nodal admittance matrix method and accounting for hemodynamic peculiarities such as the Fahraeus-Lindqvist effect and exchange with extravascular tissue. The proposed model is the first to simulate interdigitation between the terminal branches of arterial and venous trees. This was achieved by inclusion of vessel regression and anastomosis in the capillary plexus and by hemodynamics-dependent remodeling. The choice of regulatory properties influences the resulting vascular patterns: the model predicts interdigitating arteriovenous patterning if shear stress-dependent, but not pressure-dependent, remodeling is applied. By approximating the variability of natural vascular patterns, we hope to better understand homogeneity of transport, the spatial distribution of hemodynamic properties, and biomass allocation to the vascular wall or blood during development, or during the evolution of circulatory systems. © 2001 Wiley-Liss, Inc.

[source] Multiple causality in developmental disorders: methodological implications from computational modelling. DEVELOPMENTAL SCIENCE, Issue 5 2003. Michael S.C. Thomas.

When developmental disorders are defined on the basis of behavioural impairments alone, there is a risk that individuals with different underlying cognitive deficits will be grouped together on the basis that they happen to share a certain impairment. This phenomenon is labelled multiple causality. In contrast, a developmental disorder generated by a single underlying cognitive deficit may nevertheless show variable patterns of impairment due to individual differences. Connectionist computational models of development are used to investigate whether there may be ways to distinguish disorder groups with a single underlying cause (homogeneous disorder groups) from disorder groups with multiple underlying causes (heterogeneous disorder groups) on the basis of behavioural measures alone.
A heuristic is proposed to assess the underlying causal homogeneity of a disorder group based on the variability of different behavioural measures from the target domain. Heterogeneous disorder groups are likely to show smaller variability on the measure used to define the disorder than on subsequent behavioural measures, while homogeneous groups should show approximately equivalent variability. Homogeneous disorder groups should show reductions in the variability of behavioural measures over time, while heterogeneous groups may not. It is demonstrated how these predictions arise from computational assumptions, and their use is illustrated with reference to behavioural data on naming skills from two developmental disorder groups: Williams syndrome and children with Word Finding Difficulties.

[source] Low-grade urothelial carcinoma: Reappraisal of the cytologic criteria on ThinPrep®. DIAGNOSTIC CYTOPATHOLOGY, Issue 3 2003. Wei Xin, M.D., Ph.D.

Abstract: The diagnostic criteria for low-grade urothelial lesions described in the past were based on urinary specimens prepared by the cytospin method. Recognizing the recent popularity of the ThinPrep® methodology and the cytologic alterations it introduces to the cellular features, we sought to evaluate the reproducibility of these criteria in ThinPrep urinary samples. One hundred twenty-six ThinPrep urinary specimens with a tissue diagnosis of low-grade urothelial carcinoma (LGUC) and 45 negative controls were evaluated. Three pathologists blindly reviewed the slides separately, and the consensus on each feature was used in the study. Logistic regression analysis was used to determine which criteria in combination were most predictive of low-grade urothelial carcinoma.
All specimens were evaluated for the following 18 features: nucleus/cytoplasm ratio, irregular nuclear border, cytoplasm homogeneity, cell clusters, high cellularity, prominent nucleoli, granular nuclear chromatin, hyperchromasia, acute inflammation, vesicular chromatin, nuclear molding, nuclear eccentricity, elongated nuclei, necrosis, anisonucleosis, irregularly bordered fragments, absent cytoplasmic collar, and peripheral palisading. A high nucleus-to-cytoplasm ratio, irregular nuclear borders, and homogeneous cytoplasm (combined sensitivity of 59% and specificity of 100%) were the features most predictive of LGUC. Minor predictive criteria were eccentric nuclei and nuclear molding. ThinPrep provides well-preserved, cleaner specimens without significantly altering the morphology. The three key criteria applied in cytospin specimens to diagnose LGUC were reproducible in ThinPrep specimens. Diagn. Cytopathol. 2003;29:125-129. © 2003 Wiley-Liss, Inc. [source]
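Several of the abstracts above (the dispatcher study and the ThinPrep study) summarize diagnostic performance as sensitivity, specificity, and positive predictive value with 95% confidence intervals. As a minimal illustration of how such figures are derived from a 2×2 confusion matrix, here is a sketch using Wilson score intervals; the counts are hypothetical and not taken from any of the cited studies:

```python
import math

def diagnostic_stats(tp, fp, fn, tn, z=1.96):
    """Sensitivity, specificity, and PPV from a 2x2 confusion matrix,
    each returned as (point estimate, CI lower, CI upper) using an
    approximate 95% Wilson score interval."""
    def wilson(successes, n):
        if n == 0:
            return (float("nan"), float("nan"), float("nan"))
        p = successes / n
        denom = 1 + z**2 / n
        center = (p + z**2 / (2 * n)) / denom
        half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
        return (p, center - half, center + half)
    return {
        "sensitivity": wilson(tp, tp + fn),   # positives detected among diseased
        "specificity": wilson(tn, tn + fp),   # negatives detected among healthy
        "ppv":         wilson(tp, tp + fp),   # true positives among test-positives
    }

# Hypothetical counts for illustration only
stats = diagnostic_stats(tp=59, fp=0, fn=41, tn=45)
for name, (p, lo, hi) in stats.items():
    print(f"{name}: {p:.1%} (95% CI {lo:.1%} to {hi:.1%})")
```

With these made-up counts the point estimates mirror the kind of result reported above (59% sensitivity, 100% specificity); the Wilson interval is used here because, unlike the simple normal approximation, it behaves sensibly when a proportion is at or near 0% or 100%.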