Hard

Selected Abstracts

Effect of reline material and denture base surface treatment on the impact strength of a denture base acrylic resin
GERODONTOLOGY, Issue 1 2010, Luciano Elias Da Cruz Perez
doi:10.1111/j.1741-2358.2009.00292.x
Objective: In this study, the effect of relining and surface treatment on the impact strength (IS) of a heat-polymerising denture base acrylic resin (Lucitone 550; L) was evaluated. Materials and methods: Rectangular bars of L were made (60 × 6 × 2 mm) and relined (2 mm) with the relining resins Ufi Gel Hard (UH) and Tokuso Rebase Fast (TR). Specimens relined with L, as well as intact L, TR and UH specimens (60 × 6 × 4 mm), were also made for comparison. Before relining, the L surface was left untreated or was wetted with methyl methacrylate monomer and/or the bonding agents (BA) supplied by the manufacturers of the reline resins. V-notches were machined at the midpoint of the length of all specimens, either across the width (Nw) or across the thickness (Nth). The Charpy impact test was performed using a specially designed and constructed 0.5-J pendulum. Data were analysed separately for each notch position using one-way analysis of variance and the Tukey honestly significant difference post hoc test (p = 0.05). Results: The IS of L was similar to that of L/L. For the Nw notch, treating the denture base L with TR BA and relining with the TR reline material produced the highest IS. Conclusion: The IS of specimens made from the heat-polymerising acrylic resin Lucitone 550 was increased after relining with the hard chairside reline resin TR and its proprietary BA.
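The statistical procedure named in this abstract (one-way analysis of variance with a Tukey HSD post hoc test at p = 0.05) can be sketched from first principles for the F-statistic part. The impact-strength values below are invented placeholders, not data from the study:

```python
# One-way ANOVA F statistic, computed from first principles.
# The three groups are hypothetical impact-strength readings (kJ/m^2)
# for three surface treatments; they are NOT values from the study.

def one_way_anova_f(groups):
    """Return (F, df_between, df_within) for a list of samples."""
    k = len(groups)                              # number of groups
    n = sum(len(g) for g in groups)              # total observations
    grand_mean = sum(sum(g) for g in groups) / n
    # Between-group sum of squares: spread of group means around the grand mean
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares: spread of observations around their group mean
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    df_between, df_within = k - 1, n - k
    f_stat = (ss_between / df_between) / (ss_within / df_within)
    return f_stat, df_between, df_within

untreated = [4.1, 4.3, 3.9, 4.0]
monomer   = [4.8, 5.0, 4.7, 4.9]
bonding   = [5.6, 5.9, 5.7, 5.8]
f, dfb, dfw = one_way_anova_f([untreated, monomer, bonding])
print(f, dfb, dfw)
```

In practice one would compare F against the critical value of the F(df_between, df_within) distribution at α = 0.05, and follow a significant result with pairwise Tukey HSD comparisons (e.g. via `scipy.stats.f_oneway` and a Tukey HSD routine) rather than hand-rolling the sums.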
[source] Polymerization of linseed oil with phenolic resins
JOURNAL OF APPLIED POLYMER SCIENCE, Issue 2 2010, Gökhan Çaylı
Abstract: In this study, linseed oil was directly polymerized with different oil-soluble resoles. p-Ethyl (PEP), p-tertiary butyl (PTB), p-tertiary octyl (PTO), and p-phenyl (PPP) phenols were separately reacted with formaldehyde to give linseed-oil-soluble resoles. These were then reacted with linseed oil to give transparent rubbery polymers. A model reaction was performed with methyl oleate and the PTB phenol resole to clarify the reaction mechanism. Reaction products were characterized with 1H-NMR and IR techniques. Spectral examination of the model reaction showed that the polymerization reaction proceeded via an ene reaction of the quinone methide, formed at the end group of the resole, with the allylic positions of the fatty ester. Rubbery polymers were obtained with linseed oil using 10 to 40% of the different resoles. Hard, load-bearing and tough materials were obtained at 40% phenolic resin loading. Mechanical properties of the materials were characterized by a dynamic mechanical analyzer (DMA) and stress-strain tests. The best mechanical and thermal properties were obtained with the PEP resole, which showed a storage modulus of 350 MPa and a tan δ peak at 65°C. Storage moduli of 275, 250, and 30 MPa were obtained for the PPP, PTB, and PTO resole-linseed oil polymers, respectively. At the same phenolic resin loading, elongation at break and swelling ratios in CH2Cl2 increased with longer alkyl substitution on the resole resins. The highest thermal stability was observed for the PEP resole-linseed oil polymer, which has a 5% weight-loss temperature of 340°C as determined by TGA. The lowest thermal stability was observed for the PTB resole-linseed oil polymer, at 220°C. © 2010 Wiley Periodicals, Inc.
J Appl Polym Sci, 2010 [source] Effect of disinfection by microwave irradiation on the strength of intact and relined denture bases and the water sorption and solubility of denture base and reline materials
JOURNAL OF APPLIED POLYMER SCIENCE, Issue 1 2008, Rosangela Seiko Seó
Abstract: This study evaluated the influence of microwave disinfection on the strength of intact and relined denture bases. Water sorption and solubility were also evaluated. A heat-polymerized acrylic resin (Lucitone 550) was used to construct 4-mm-thick (n = 40) and 2-mm-thick (n = 160) denture bases. The 2-mm denture bases were relined with an autopolymerizing resin (Tokuso Rebase Fast, Ufi Gel Hard, Kooliner, or New Truliner). Specimens were divided into four groups (n = 10): without treatment, one or seven cycles of microwave disinfection (650 W for 6 min), and water storage at 37°C for 7 days. Specimens were vertically loaded (5 mm/min) until failure. Disc-shaped specimens (50 mm × 0.5 mm) were fabricated (n = 10) to evaluate water sorption and solubility. Data on maximum fracture load (N), deflection at fracture (mm), fracture energy (N mm), water sorption (%), and solubility (%) were analyzed by two-way analysis of variance and Student-Newman-Keuls tests (α = 0.05). One cycle of microwave disinfection decreased the deflection at fracture and the fracture energy of Tokuso Rebase Fast and New Truliner specimens. The strength of denture bases microwaved daily for 7 days was similar to the strength of those immersed in water for 7 days. Microwave disinfection increased the water sorption of all materials and affected the solubility of the reline materials. © 2007 Wiley Periodicals, Inc. J Appl Polym Sci, 2008 [source] Old Habits Die Hard: The Quest for Correct(ed) QT Interval Measurements
JOURNAL OF CARDIOVASCULAR ELECTROPHYSIOLOGY, Issue 8 2010, DANIELA HUSSER M.D.
No abstract is available for this article.
[source] Influence of Matrix Type on Surface Roughness of Three Resins for Provisional Crowns and Fixed Partial Dentures
JOURNAL OF PROSTHODONTICS, Issue 2 2009, Raul Ayuso-Montero DDS
Abstract: Purpose: This study evaluated the effect of matrix type on the surface roughness of resins for provisional crowns and fixed partial dentures. Materials and Methods: Ninety specimens of two acrylic resins (Trim II, Tab2000) and one bis-acryl composite (Protemp II Garant) were fabricated using one of three matrices: irreversible hydrocolloid (Cavex CA37), poly(vinyl siloxane) (Aquasil), or a vacuum-formed matrix (Bio-flow Hard). The sample size for each resin-matrix combination was 10. The vestibular face of one natural maxillary central incisor was used as a model to fabricate all the specimens, following the custom fabrication technique. The average roughness measurements, Ra (μm), were obtained using a profilometer, and the data were analyzed using Kruskal-Wallis and Mann-Whitney U-tests. The results were contrasted against the surface roughness of the tooth using a one-sample t-test. Results: Aquasil and the vacuum-formed matrix produced a smoother surface than Cavex CA37 regardless of the resin tested (p < 0.05). Protemp II Garant had the smoothest surface regardless of the matrix used, with no significant differences when polymerized against the three different matrices. Trim II polymerized against Cavex CA37 had a rougher surface (p < 0.05) than against Aquasil or the vacuum-formed matrix. Tab2000 had the smoothest surface (p < 0.05) when polymerized against a vacuum-formed matrix. Conclusions: There is no universal matrix that produces the smoothest surface: this depends on the compatibility between the resin and the matrix. Protemp II Garant polymerized against a Cavex CA37 matrix yields a surface that is smooth enough not to require polishing unless this surface is adjusted.
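The roughness parameter reported in the abstract above, Ra, is by definition the arithmetic mean of the absolute deviations of the surface profile from its mean line; a profilometer reports it directly, but the computation is easy to sketch from raw profile heights. The profile values below are made-up illustrations, not measurements from the study:

```python
def average_roughness(profile_um):
    """Ra: arithmetic mean of absolute deviations from the mean line
    (result is in the same units as the input heights)."""
    mean_line = sum(profile_um) / len(profile_um)
    return sum(abs(z - mean_line) for z in profile_um) / len(profile_um)

# Hypothetical profilometer heights (micrometres) for two specimens
smooth = [0.10, 0.12, 0.09, 0.11, 0.10, 0.08]
rough  = [0.10, 0.45, 0.05, 0.60, 0.02, 0.38]
print(average_roughness(smooth), average_roughness(rough))
```

Real profilometry applies filtering (separating waviness from roughness) over a standardized evaluation length before computing Ra; this sketch shows only the defining average.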
[source] Comparison of the Temperature-Dependent Ferroelastic Behavior of Hard and Soft Lead Zirconate Titanate Ceramics
JOURNAL OF THE AMERICAN CERAMIC SOCIETY, Issue 9 2010, Mie Marsilius
The ferroelastic properties of a hard, acceptor-doped lead zirconate titanate (PZT) ceramic are investigated between room temperature and 300°C. Comparison with a soft PZT shows that acceptor doping has a stronger influence on mechanically induced domain switching than on switching caused by electric fields. A quantitative analysis of spontaneous and remanent strain and polarization indicates that poling in the soft material is dominated by 180° domain processes, while non-180° processes dominate the strain behavior. If the mechanical load exceeds a threshold level, the "hardening" effect of the acceptor doping vanishes, and hard and soft materials behave identically. The results are discussed based on the defect dipole model and the charge drift model for hardening and aging in acceptor-doped ferroelectric ceramics. [source] What Is Hard to Learn Is Easy to Forget: The Roles of Word Concreteness, Cognate Status, and Word Frequency in Foreign-Language Vocabulary Learning and Forgetting
LANGUAGE LEARNING, Issue 1 2000, Annette M. B. De Groot
We looked at foreign-language (FL) vocabulary learning and forgetting in experienced FL learners, using a paired-associate training technique in which native-language words were paired with pseudowords. The training involved 6 presentations of the same 60 translation pairs, followed by a test after the 2nd, 4th, and 6th presentation round. A retest followed 1 week after training. The stimulus materials were manipulated on word concreteness, cognate status, and word frequency, and both productive and receptive testing took place. Cognates and concrete words were easier to learn and less susceptible to forgetting than noncognates and abstract words. Word frequency hardly affected performance.
Overall, receptive testing showed better recall than productive testing. Theoretical accounts of these findings are proposed. [source] Old Habits Are Hard to Change: A Case Study of Israeli Real Estate Contracts
LAW & SOCIETY REVIEW, Issue 2 2010, Doron Teichman
This article presents a case study on the persistent dollarization norm in the Israeli real estate market. For many years Israeli real estate contracts have been denominated in American dollars. This contracting norm has remained surprisingly stable despite tremendous changes in the structure of the Israeli foreign currency market that severed the connection between the dollar and local inflation and added significant risks to exchange rates. Using an array of theoretical tools, I explain this puzzling phenomenon and demonstrate the centrality of social norms to the design of high-stakes contracts. Finally, I explore the interaction between social norms and the law and highlight the potential obstacles to regulating contracting norms. [source] The minimum evolution problem: Overview and classification
NETWORKS: AN INTERNATIONAL JOURNAL, Issue 2 2009, Daniele Catanzaro
Abstract: Molecular phylogenetics studies the hierarchical evolutionary relationships among organisms by means of molecular data. These relationships are typically described by means of weighted trees, or phylogenies, whose leaves represent the observed organisms, internal vertices the intermediate ancestors, and edges the evolutionary relationships between pairs of organisms. Molecular phylogenetics provides several criteria for selecting one phylogeny from among plausible alternatives. Usually, such criteria can be expressed in terms of objective functions, and the phylogenies that optimize them are referred to as optimal. One of the most important criteria is the minimum evolution (ME) criterion, which states that the optimal phylogeny for a given set of organisms is the one whose sum of edge weights is minimal.
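Under the ME criterion just stated, scoring a candidate phylogeny reduces to summing its edge weights and keeping the topology with the smallest total. A minimal sketch, with invented tree topologies and edge weights rather than real phylogenetic data:

```python
# Score candidate phylogenies under the minimum evolution (ME) criterion:
# tree length = sum of edge weights; the ME-optimal tree minimises it.
# Trees are hypothetical examples given as {(u, v): weight} edge maps,
# with x and y as internal (ancestral) vertices.

def tree_length(edges):
    """Total edge weight of a phylogeny given as an edge->weight map."""
    return sum(edges.values())

candidates = {
    "topology_A": {("human", "x"): 0.9, ("chimp", "x"): 1.1, ("x", "y"): 0.4,
                   ("gorilla", "y"): 1.6, ("orang", "y"): 2.0},
    "topology_B": {("human", "x"): 1.0, ("gorilla", "x"): 1.5, ("x", "y"): 0.7,
                   ("chimp", "y"): 1.4, ("orang", "y"): 2.1},
}

best = min(candidates, key=lambda t: tree_length(candidates[t]))
print(best, tree_length(candidates[best]))
```

Scoring a single tree is trivial; the hardness discussed next comes from searching the super-exponentially large space of topologies (and estimating edge weights for each) rather than from evaluating any one of them.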
Finding the phylogeny that satisfies the ME criterion involves solving an optimization problem, called the minimum evolution problem (MEP), which is notoriously NP-hard. This article offers an overview of the MEP and discusses the different versions of it that occur in the literature. © 2008 Wiley Periodicals, Inc. NETWORKS, 2009 [source] The linear arrangement problem on recursively constructed graphs
NETWORKS: AN INTERNATIONAL JOURNAL, Issue 3 2003, S. B. Horton
Abstract: The linear arrangement problem on graphs is a relatively old and quite well-known problem. NP-hard in general, it remains open on general recursive graphs (i.e., partial k-trees, etc.), which is somewhat frustrating, since most hard graph problems are easily solved on recursive graphs. In this paper, we examine the linear arrangement problem with respect to these structures. Included are some negative (complexity) results as well as a solvable case. © 2003 Wiley Periodicals, Inc. [source] How Hard Are the Sceptical Paradoxes?
NOUS, Issue 2 2004, Alex Byrne
First page of article [source] Why Racial Profiling Is Hard to Justify: A Response to Risse and Zeckhauser
PHILOSOPHY AND PUBLIC AFFAIRS, Issue 1 2005, ANNABELLE LEVER
First page of article [source] Why do Americans Work so Hard?
PUBLIC POLICY RESEARCH, Issue 3 2005, Alberto Alesina
First page of article [source] Live-in Domestics, Seasonal Workers, and Others Hard to Locate on the Map of Democracy*
THE JOURNAL OF POLITICAL PHILOSOPHY, Issue 4 2008, Joseph H. Carens
First page of article [source] Work Hard, Play Hard?: A Comparison of Male and Female Lawyers' Time in Paid and Unpaid Work and Participation in Leisure Activities
CANADIAN REVIEW OF SOCIOLOGY/REVUE CANADIENNE DE SOCIOLOGIE, Issue 1 2010, JEAN E. WALLACE
There has been a considerable amount of research that documents how women and men spend their time on different work and home tasks. We examine how much time professional women and men spend in paid and unpaid work and how this relates to their participation in different leisure activities. We also explore whether time in paid and unpaid work has gender-specific effects on leisure participation. In examining these issues, we rely on data from lawyers working in different legal settings. Our results show that, as hypothesized, men report more time in paid work and leisure, whereas women devote more time to housework and childcare. An unexpected finding is that the time men spend in housework or childcare is either unrelated or positively related to their leisure participation. These results suggest that men's greater overall opportunities for leisure compared with women's appear to stem from the unanticipated relationships between men's involvement in housework and childcare and their leisure activities. We raise several possible explanations for these findings. [source] IMPSPEC: A Hardware and Software Concept for (Bio-)Impedance Spectroscopy
CHEMIE-INGENIEUR-TECHNIK (CIT), Issue 12 2005, A. Barthel Dipl.-Ing. (FH)
Abstract: Measurements of frequency-dependent dielectric and electrical AC properties, such as impedance, conductivity and permittivity, provide information about the material properties of the objects under investigation. If the electrical properties change as a result of biological or chemical reactions, impedance spectroscopy can deliver qualitative and quantitative statements about these changes. For this reason, impedance spectroscopy is also gaining importance for process measurement technology. Its range of applications covers fundamental investigations in materials analysis as well as applications in chemical process engineering and biomedical engineering. This article presents a new concept that permits high-resolution, fast measurement of the impedance spectrum, both on the laboratory scale and in industrial-scale plants, under economical conditions. [source] Adsorption of Small Molecules in Zeolites: A Local Hard-Soft Acid-Base Approach.
CHEMINFORM, Issue 48 2003, Ramesh Ch.
No abstract is available for this article. [source] Perceived competence and school adjustment of hearing impaired children in mainstream primary school settings
CHILD: CARE, HEALTH AND DEVELOPMENT, Issue 6 2008, N. Hatamizadeh
Abstract: Background: Although educational mainstreaming of children with special needs formally began in Iran in 1992, there is little information on whether hearing impaired children feel competent in regular schools. Methods: To determine the perceived competence and school adjustment of hearing impaired children in mainstream primary school settings, the self-perception profile was administered to 60 mainstreamed hard-of-hearing children and 60 classmates with normal hearing, matched for gender, by a single interviewer.
The instrument comprised 28 items, 23 of which were similar to those of the 'adapted test Image for children with cochlear implants', asking children about their feelings about their own cognitive, physical, socio-emotional and communication competence and school adjustment. The Cronbach alpha coefficient for the instrument was 0.93. Results: Hard-of-hearing children rated their competence significantly poorer than their hearing classmates for all domains. Mean differences for the five domains ranged from 0.48 (for physical competence) to 0.90 (for school adjustment) on a scale of 1-4. There were no significant differences between girls' and boys' competence, in either the hearing or the hearing impaired group. Classifying overall scores for perceived competence into four groups ('poor competence', 'low competence', 'moderate competence' and 'high competence'), 23.4% of hearing impaired children but none of the hearing classmates rated themselves as having low or poor competence. On the other hand, 85% of hearing children and only 18.3% of hearing impaired children rated themselves as highly competent. Conclusion: We suggest that periodical assessments of mainstreamed children might help to identify those children who are having difficulty adapting to their environment. [source] Hard to swallow: emerging and re-emerging issues in foodborne infection
CLINICAL MICROBIOLOGY AND INFECTION, Issue 1 2010, P. T. Tassios
No abstract is available for this article. [source] Strategies in Human Nonmonotonic Reasoning
COMPUTATIONAL INTELLIGENCE, Issue 3 2000, Marilyn Ford
Although humans seem adept at drawing nonmonotonic conclusions, the nonmonotonic reasoning systems that researchers develop are complex and do not function with such ease. This paper explores people's reasoning processes in nonmonotonic problems. To avoid the problem of people's conclusions being based on knowledge rather than on some reasoning process, we developed a scenario about life on another planet.
Problems were chosen to allow the systematic study of people's understanding of strict and nonstrict rules and their interactions. We found that people had great difficulty reasoning, and we identified a number of negative factors influencing their reasoning. We also identified three positive factors which, if used consistently, would yield rational and coherent reasoning, but no subject achieved total consistency. (Another possible positive factor, specificity, was considered, but we found no evidence for its use.) It is concluded that nonmonotonic reasoning is hard. When people need to reason in a domain where they have no preconceived ideas, the foundation for their reasoning is neither coherent nor rational. They do not use a nonmonotonic reasoning system that would work regardless of content. Thus, nonmonotonic reasoning systems that researchers develop are expected to do more reasoning than humans actually do! [source] Guidelines for treatment of neonatal jaundice. Is there a place for evidence-based medicine?
ACTA PAEDIATRICA, Issue 3 2001
Treatment of neonatal jaundice continues to be a controversial issue. Arguments that traditional practice results in over-treatment have led to the adoption of more liberal guidelines in some countries. The importation of liberal guidelines from one country to the next, however, is fraught with danger, because differences in epidemiology, sociology and healthcare delivery systems between countries may not be adequately reflected. The unreflected extension of liberalization to non-target groups of patients can expose the latter to significant risk. It is not clear that the evidence on which guidelines for treatment of neonatal jaundice are based satisfies the requirements for evidence-based medicine. Evidence of adequate quality may be hard to obtain.
Conclusions: Introduction of more liberal guidelines for the treatment of neonatal jaundice, if at all contemplated, must be adapted to local circumstances, and any available evidence pertaining to local epidemiology, sociology and healthcare organization has to be carefully weighed and incorporated. The time is ripe for a joint international effort to secure adequate funding for basic and applied research into the mechanisms of bilirubin encephalopathy in the newborn. [source] A web-based tool for teaching neural network concepts
COMPUTER APPLICATIONS IN ENGINEERING EDUCATION, Issue 3 2010, Aybars Ugur
Abstract: Although neural networks (NN) are important, especially for engineers, scientists, mathematicians and statisticians, they may also be hard to understand. In this article, application areas of NN are discussed, basic NN components are described, and it is explained how an NN works. A web-based simulation and visualization tool (EasyLearnNN) is developed using Java and Java 2D for teaching NN concepts. Perceptron, ADALINE, Multilayer Perceptron, LVQ and SOM models and related training algorithms are implemented. As a result, a comparison with other teaching methods of NN concepts is presented and discussed. © 2009 Wiley Periodicals, Inc. Comput Appl Eng Educ 18: 449-457, 2010; View this article online at wileyonlinelibrary.com; DOI 10.1002/cae.20184 [source] Centrality Based Visualization of Small World Graphs
COMPUTER GRAPHICS FORUM, Issue 3 2008, F. Van Ham
Abstract: Current graph drawing algorithms enable the creation of two-dimensional node-link diagrams of huge graphs. However, for graphs with low diameter (of which "small world" graphs are a subset), these techniques begin to break down visually even when the graph has only a few hundred nodes. Typical algorithms produce images where nodes clump together in the center of the screen, making it hard to discern structure and follow paths.
This paper describes a solution to this problem, which uses a global edge metric to determine a subset of edges that capture the graph's intrinsic clustering structure. This structure is then used to create an embedding of the graph, after which the remaining edges are added back in. We demonstrate applications of this technique to a number of real-world examples. [source] The Synthesis of Rock Textures in Chinese Landscape Painting
COMPUTER GRAPHICS FORUM, Issue 3 2001, Der-Lor Way
In Chinese landscape painting, rock textures portray the orientation of mountains and contribute to the atmosphere. Many landscape-painting skills are required according to the type of rock. Landscape painting is the major theme of Chinese painting. Over the centuries, masters of Chinese landscape painting developed various texture strokes. Hemp-fiber and axe-cut are two major types of texture strokes. A slightly sinuous and seemingly broken line, the hemp-fiber stroke is used for describing the gentle slopes of rock formations, whereas the axe-cut stroke best depicts hard, rocky surfaces. This paper presents a novel method of synthesizing rock textures in Chinese landscape painting, useful not only to artists who want to paint interactively, but also in automated rendering of natural scenes. The method proposed underwrites the complete painting process after users have specified only the contour and parameters. [source] Tour Into the Picture using a Vanishing Line and its Extension to Panoramic Images
COMPUTER GRAPHICS FORUM, Issue 3 2001, Hyung Woo Kang
Tour into the picture (TIP), proposed by Horry et al. [13], is a method for generating a sequence of walk-through images from a single reference picture (or image). By navigating a 3D scene model constructed from the picture, TIP produces convincing 3D effects. Assuming that the picture has one vanishing point, they proposed the scene modeling scheme called spidery mesh.
However, this scheme requires major modification when the picture contains multiple vanishing points or does not have any well-defined vanishing point. Moreover, the spidery mesh is hard to generalize to other types of images, such as panoramic images. In this paper, we propose a new scheme for TIP which is based on a single vanishing line instead of a vanishing point. Based on projective geometry, our scheme is simple and yet general enough to address the problems faced by the previous method. We also show that our scheme can be naturally extended to a panoramic image. [source] A method for verifying concurrent Java components based on an analysis of concurrency failures
CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 3 2007, Brad Long
Abstract: The Java programming language supports concurrency. Concurrent programs are harder to verify than their sequential counterparts due to their inherent non-determinism and a number of specific concurrency problems, such as interference and deadlock. In previous work, we have developed the ConAn testing tool for the testing of concurrent Java components. ConAn has been found to be effective at testing a large number of components, but there are certain classes of failures that are hard to detect using ConAn. Although a variety of other verification tools and techniques have been proposed for the verification of concurrent software, they each have their strengths and weaknesses. In this paper, we propose a method for verifying concurrent Java components that includes ConAn and complements it with other static and dynamic verification tools and techniques. The proposal is based on an analysis of common concurrency problems and concurrency failures in Java components. As a starting point for determining the concurrency failures in Java components, a Petri-net model of Java concurrency is used. By systematically analysing the model, we come up with a complete classification of concurrency failures.
The classification and analysis are then used to determine suitable tools and techniques for detecting each of the failures. Finally, we propose to combine these tools and techniques into a method for verifying concurrent Java components. Copyright © 2006 John Wiley & Sons, Ltd. [source] JAC: declarative Java concurrency
CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 5 2006, Max Haustein
Abstract: The Java programming language has a low-level concurrency model which is hard to use and does not blend well with inheritance. JAC is an extension of Java that introduces a higher level of concurrency, hiding threads and separating thread synchronization from application logic in a declarative fashion. The emphasis is on limiting the differences between sequential and concurrent code, thus furthering code reuse, and on avoiding inheritance anomalies. This is achieved by taking a middle road between concurrent code on the one hand and complete separation of sequential application logic from concurrency mechanisms on the other. An extensive comparison with related approaches is given to motivate our design decisions. Copyright © 2005 John Wiley & Sons, Ltd. [source] The Sieve Model: An innovative process for identifying alternatives to custody evaluations
CONFLICT RESOLUTION QUARTERLY, Issue 3 2009, Robert B. Silver
This article reviews the development of the Sieve Model, conceived from dissatisfaction with adversarial processes that encouraged endless destructive fighting and the depletion of financial and emotional family resources. Adversarial approaches discourage constructive problem solving and cooperation and are very hard on children. Rather than a piecemeal approach toward divorce, a systemic model was conceived. The Sieve Model is being implemented in the 20th Judicial Circuit of the State of Florida through differentiated case management, after a study revealed that protracted cases primarily involved disputes over children.
Families are invited to use pertinent elements in an individualized fashion. Family law professionals are challenged to develop other solution-based efforts, akin to mediation, to assist families of divorce. The Sieve Model encourages participants to practice solving problems rather than creating them, decreasing divorce brutality and postjudgment conflicts. [source] Why Adopt Codes of Good Governance? A Comparison of Institutional and Efficiency Perspectives
CORPORATE GOVERNANCE, Issue 1 2008
ABSTRACT: Manuscript Type: Empirical. Research Question/Issue: Given the global diffusion and the relevance of codes of good governance, the aim of this article is to investigate whether the main reason behind their proliferation in civil law countries is: (i) the determination to improve the efficiency of the national governance system; or (ii) the will to "legitimize" domestic companies in the global financial market without radically improving the governance practices. Research Findings/Insights: We collected corporate governance codes developed worldwide at the end of 2005, and classified them according to the country's legal system (common or civil law). Then, we made a comparative analysis of the scope, coverage, and strictness of recommendations of the codes. We tested differences between common law and civil law countries using t-tests and probit models. Our findings suggest that the issuance of codes in civil law countries is prompted more by legitimation reasons than by the determination to improve the governance practices of national companies. Theoretical/Academic Implications: The study contributes to enriching our knowledge of the process of reinvention characterizing the diffusion of new practices. Our results are consistent with a symbolic perspective on corporate governance, and support the view that diffusing practices are usually modified or "reinvented" by adopters.
Practitioner/Policy Implications: Our results support the idea that the characteristics of the national corporate governance system and law explain the main differences among the coverage of codes. This conclusion indicates the existence of a strong interplay between hard and soft law. [source] Oral trauma, mouthguard awareness, and use in two contact sports in Turkey
DENTAL TRAUMATOLOGY, Issue 5 2006, Ibrahim Tulunoglu
Abstract: The purpose of the present study was to evaluate the occurrence of dental hard and soft tissue injuries during participation in contact sports, and the awareness and use of mouthguards, in a young adult sample of semi-professional or amateur boxers and tae kwon do participants in Turkey. The sample consisted of 274 young adults [174 male (63.5%) and 100 female (36.5%)] aged between 17 and 27 years, of which 185 (67.5%) were tae kwon do practitioners and 89 (32.5%) were boxers. The participants answered a standard questionnaire. All answers were evaluated and then statistical analyses were performed. Of the total sample, 61 of the subjects (22.3%) suffered dental trauma. Of these sufferers, 32 (17.3%) were boxers and 29 (32.6%) were tae kwon do practitioners. It was found that 19 (6.9%) athletes lost teeth post-trauma. Of the 54 subjects (19.7%) suffering soft tissue injuries, 44 were female (81.5%), while only 10 were male (18.5%); of these, 40 (74.1%) were tae kwon do practitioners and 14 (25.9%) were boxers. Of the total sample of 274 subjects, 228 (83.2%) were well informed about mouthguard usage. Of the total sample, 153 (55.8%) of the subjects used mouthguards, all of which were of the boil-and-bite type. The results of our study indicate that dentists and sports authorities in Turkey should promote the use of mouthguards in contact sports such as tae kwon do and boxing, which carry a serious risk of dental and oral soft tissue trauma and tooth loss. [source]