Additional Problems (additional + problem)
Selected Abstracts

Performance of an Autonomous Telemonitoring System in Children and Young Adults with Congenital Heart Diseases
PACING AND CLINICAL ELECTROPHYSIOLOGY, Issue 10 2008
PETER ZARTNER M.D.
Background: Integrated telemonitoring systems controlling circulatory and electrical parameters in adults with an implanted pacemaker have been shown to be advantageous during follow-up of this patient group. In children and young adults with a congenital heart disease (CHD), these systems have to cope with a diversity of varying arrhythmias and a broad range of intrinsic cardiac parameters. Additional problems arise from the patients' growth and anatomic anomalies. Methods: Since 2005, eight young patients (age 4.1–37 years, mean 15.5 years) with a CHD received a pacemaker or implantable cardioverter defibrillator with an autonomous telemonitoring system at our clinic. The mean follow-up time was 395 days (range 106–834 days; 8.7 patient-years). Results: In seven of eight patients the system transmitted information that led to beneficial modifications of the current antiarrhythmic therapy. In three patients the reported events were of a critical nature. One patient remained event-free for 192 days after implantation. During follow-up, 96% of the days were covered. The system also transferred additional information on the effectiveness of antiarrhythmic medication and the impact of physical activity. Conclusions: Young patients with an insufficient intrinsic heart rate or progressing arrhythmia, in addition to the conventional indications for pacemaker or defibrillator implantation, appear to benefit substantially from a telemetric surveillance system. The fully automated procedure of device interrogation and information transmission gives a daily overview of system function and specific arrhythmic events, especially in children who are unaware of any symptoms. [source]

Beyond treatment of individual behavior problems: an effective residential continuum of care for individuals with severe behavior problems
BEHAVIORAL INTERVENTIONS, Issue 1 2007
Terry J. Page
We report on one component of a residential continuum of care designed to integrate children and adolescents with severe behavior problems into community residential and educational placements. The continuum featured three components: a behavior stabilization unit, campus apartments, and community group homes. Data are reported for a 5-year period, during which 116 children and adolescents were admitted to a behavior stabilization program for treatment of severe self-injury, aggression, and/or property destruction, and non-compliance. Additional problems for some admissions included stereotypy, feeding disorders, sleeping disorders, seizure disorders, and other medical conditions. Archival data were collected retrospectively on age, gender, length of stay, prescribed medications, function of problem behaviors, acuity level of behavioral interventions, and discharge site. Analysis of the data indicated that (1) the behavior stabilization unit was successful in reducing the occurrence of severe behavior problems and increasing adaptive behaviors, (2) seventy-three individuals successfully transitioned to a campus apartment program designed as a step-down from the behavior stabilization unit, and (3) sixty-seven individuals transitioned to community group homes. The benefits of a residential continuum of care for individuals with severe behavior problems are discussed. Copyright © 2007 John Wiley & Sons, Ltd. [source]
Characterizing closely spaced, complex disulfide bond patterns in peptides and proteins by liquid chromatography/electrospray ionization tandem mass spectrometry
JOURNAL OF MASS SPECTROMETRY (INCORP BIOLOGICAL MASS SPECTROMETRY), Issue 1 2002
Ten-Yang Yen
Abstract: Identifying the Cys residues involved in disulfide linkages of peptides and proteins that contain complex disulfide bond patterns is a significant analytical challenge. This is especially true when the Cys residues involved in the disulfide bonds are closely spaced in the primary sequence. Peptides and proteins that contain free Cys residues located near disulfide bonds present the additional problem of disulfide shuffling via the thiol–disulfide exchange reaction. In this paper, we report a convenient method to identify complex disulfide patterns in peptides and proteins using liquid chromatography/electrospray ionization tandem mass spectrometry (LC/ESI-MS/MS) in combination with partial reduction by tris(2-carboxyethyl)phosphine (TCEP). The method was validated using well-characterized peptides and proteins including endothelin, insulin, α-conotoxin SI, and immunoglobulin G (IgG2a, mouse). Peptide or protein digests were treated with TCEP in the presence of an alkylation reagent, maleimide-biotin (M-biotin) or N-ethylmaleimide (NEM), followed by complete reduction with dithiothreitol and alkylation by iodoacetamide (IAM). Subsequently, peptides that contained alkylated Cys were analyzed by capillary LC/ESI-MS/MS to determine which Cys residues were modified with M-biotin/NEM or IAM. The presence of the alkylating reagent (M-biotin or NEM) during TCEP reduction was found to minimize the occurrence of the thiol–disulfide exchange reaction. A critical feature of the method is the stepwise reduction of the disulfide bonds and the orderly, sequential use of specific alkylating reagents. Copyright © 2001 John Wiley & Sons, Ltd. [source]

The Search for the Source of Epistemic Good
METAPHILOSOPHY, Issue 1-2 2003
Linda Zagzebski
Knowledge has almost always been treated as good, better than mere true belief, but it is remarkably difficult to explain what it is about knowledge that makes it better. I call this "the value problem." I have previously argued that most forms of reliabilism cannot handle the value problem. In this article I argue that the value problem is more general than a problem for reliabilism, infecting a host of different theories, including some that are internalist. An additional problem is that not all instances of true belief seem to be good on balance, so even if a given instance of knowing p is better than merely truly believing p, not all instances of knowing will be good enough to explain why knowledge has received so much attention in the history of philosophy. The article aims to answer two questions: (1) What makes knowing p better than merely truly believing p? The answer involves an exploration of the connection between believing and the agency of the knower. Knowing is an act in which the knower gets credit for achieving truth. (2) What makes some instances of knowing good enough to make the investigation of knowledge worthy of so much attention? The answer involves the connection between the good of believing truths of certain kinds and a good life. In the best kinds of knowing, the knower not only gets credit for getting the truth but also gets credit for getting a desirable truth. The kind of value that makes knowledge a fitting object of extensive philosophical inquiry is not independent of moral value and the wider values of a good life. [source]
Understanding the patterns and distribution of opioid analgesic dependence symptoms using a latent empirical approach
INTERNATIONAL JOURNAL OF METHODS IN PSYCHIATRIC RESEARCH, Issue 2 2008
L.A. Ghandour
Abstract: Prevalence of extramedical opioid analgesic use in the US is rising, yet little is known about the nature and extent of problems of dependence related to the use of these drugs. This study uses Latent Class Analysis to empirically define classes of past-year extramedical opioid analgesic users based on observed clustering of DSM-IV defined clinical dependence features; multinomial logistic regression is used to describe differences across these groups. The 2002–2003 public data files of the National Survey on Drug Use and Health were used to identify 7810 extramedical opioid analgesic users in the past year. The best-fitting four-class model identified classes that differed quantitatively and qualitatively, with 2% of the users in Class 4 (most severe) and 84% in Class 1 (least severe). Classes 2 and 3 had parallel symptom profiles, but those in Class 3 reported additional problems. Adolescents (12–17 year olds) were at higher odds of being in Class 3 than older age groups; females were twice as likely to be in Classes 2 and 4, and those with mental health problems were at higher odds of belonging to the more severe classes. Differences by type of past-year opioid user were also detected. This study sheds light on the classification and distribution of extramedical opioid analgesic dependence symptoms in the US general population, identifying subgroups that warrant immediate attention. Copyright © 2008 John Wiley & Sons, Ltd. [source]
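The abstract above names the methods but not the model details. As a brief illustrative aside, the sketch below shows how a four-class latent class model over binary dependence-symptom indicators can be fitted with a basic EM algorithm; the symptom count, simulated data, and all names are assumptions for demonstration, not the study's NSDUH variables or its actual specification.

```python
# Toy latent class analysis (LCA) for binary symptom indicators via EM.
# Illustrative only: the data here are simulated, not the NSDUH files used in the study.
import numpy as np

def fit_lca(y, n_classes=4, n_iter=200, seed=0):
    """y: (n_subjects, n_symptoms) 0/1 array. Returns (class prevalences, item probabilities, responsibilities)."""
    rng = np.random.default_rng(seed)
    n, _ = y.shape
    prevalence = np.full(n_classes, 1.0 / n_classes)               # P(class c)
    item_probs = rng.uniform(0.25, 0.75, (n_classes, y.shape[1]))  # P(symptom j = 1 | class c)

    for _ in range(n_iter):
        # E-step: posterior class-membership probabilities per subject
        log_post = (y @ np.log(item_probs).T
                    + (1 - y) @ np.log(1 - item_probs).T
                    + np.log(prevalence))
        log_post -= log_post.max(axis=1, keepdims=True)            # numerical stability
        resp = np.exp(log_post)
        resp /= resp.sum(axis=1, keepdims=True)

        # M-step: update class prevalences and item-response probabilities
        class_mass = resp.sum(axis=0) + 1e-12
        prevalence = class_mass / n
        item_probs = np.clip((resp.T @ y) / class_mass[:, None], 1e-6, 1 - 1e-6)

    return prevalence, item_probs, resp

# Simulated example: 7810 "users" and 7 binary DSM-IV-style indicators
y_sim = (np.random.default_rng(1).random((7810, 7)) < 0.2).astype(int)
prev, probs, resp = fit_lca(y_sim)
print("estimated class prevalences:", np.round(prev, 3))
```

Subjects would then be assigned to their most probable class (e.g. resp.argmax(axis=1)) before a follow-up step such as the multinomial logistic regression mentioned in the abstract.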
Impact of periodontal preventive programmes on the data from epidemiologic studies
JOURNAL OF CLINICAL PERIODONTOLOGY, Issue 2005
Per E. Gjermo
Abstract: This report provides only circumstantial evidence for the impact of programmes on periodontal epidemiology. The prerequisites for programmes and campaigns are described, and epidemiologic data on periodontal disease are compared with known changes in factors that may be affected by such activities. Unfortunately, parameters for periodontal disease as a process are not available. Only variables indicating irreversible effects on the periodontal status can be obtained. A lack of appropriate studies creates additional problems. This review indicates that preventive programmes and campaigns to improve oral hygiene have affected periodontal epidemiologic data concerning gingivitis and mild/moderate periodontitis favourably. Severe periodontitis seems not to have been influenced by such activities. Smoking is strongly associated with the severity of periodontitis. Therefore, a positive effect may be anticipated following the smoking cessation campaigns currently introduced worldwide. However, because of the irreversible nature of our epidemiologic parameters, it will take decades before any effect may be evident. It is recommended that periodontal epidemiology should be revitalized by introducing a nominalistic categorization instead of the changing essentialistic approaches used so far, in order to facilitate the interpretation of data. [source]

A publication power approach for identifying premier information systems journals
JOURNAL OF THE AMERICAN SOCIETY FOR INFORMATION SCIENCE AND TECHNOLOGY, Issue 2 2008
Clyde W. Holsapple
Stressing that some universities have adopted unrealistic requirements for tenure of information systems (IS) faculty members, a recent editorial in MIS Quarterly contends that the group of premier IS journals needs to be generally recognized as having more than just two members. This article introduces the publication power approach to identifying the premier IS journals, and it does indeed find that there are more than two. A journal's publication power is calculated from the actual publishing behaviors of full-time, tenured IS faculty members at a sizable set of leading research universities. The underlying premise is that these researchers produce excellent work, collectively spanning the IS field's subject matter, and that the greatest concentrations of their collective work appear in the highest-visibility, most important journals suitable for its subject matter. The new empirically based approach to identifying premier IS journals (and, more broadly, identifying journals that figure most prominently in the publishing activity of tenured IS researchers) offers an attractive alternative to promulgations by individuals or cliques (possibly based on outdated tradition or vested interests), to opinion surveys (subjective, possibly ill-informed, vague about rating criteria, and/or biased in various ways), and to citation analyses (which ignore the semantics of references and, in the case of ISI impact factors, have additional problems that cast considerable doubt on their meaningfulness within the IS field and its subdisciplines). Results of the publication power approach can be applied and supplemented according to the needs of a particular university in setting its evaluation standards for IS tenure, promotion, and merit decisions. [source]
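The abstract does not spell out how publication power is computed; as a rough illustration of the general idea (ranking journals by the publishing behaviour of a benchmark group of tenured IS faculty), the sketch below assumes hypothetical publication records and a simple count-based score.

```python
# Minimal sketch with hypothetical data; not the paper's actual formula.
from collections import defaultdict

# Hypothetical (author, journal) records gathered from the CVs of tenured IS
# faculty at a benchmark set of research universities.
publications = [
    ("author_a", "MIS Quarterly"),
    ("author_a", "Information Systems Research"),
    ("author_b", "MIS Quarterly"),
    ("author_c", "Journal of Management Information Systems"),
    ("author_c", "MIS Quarterly"),
]

article_count = defaultdict(int)        # total benchmark-group articles per journal
authors_per_journal = defaultdict(set)  # distinct benchmark authors per journal

for author, journal in publications:
    article_count[journal] += 1
    authors_per_journal[journal].add(author)

# One plausible aggregate: article volume weighted by breadth of authorship.
power = {j: article_count[j] * len(authors_per_journal[j]) for j in article_count}
for journal, score in sorted(power.items(), key=lambda kv: -kv[1]):
    print(f"{journal}: {score}")
```

The journal names are examples only; any real ranking would depend on the actual faculty set and the paper's own definition of publication power.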
Teamwork for innovation – the 'troika' of promotors
R & D MANAGEMENT, Issue 1 2001
Jürgen Hauschildt
The management of innovation requires 'champions' or 'promotors' who commit with enthusiasm to the new product or new process idea. More complex innovations will require more than one promotor; division of labour becomes an essential success factor. According to the promotor model, at least a dyad of a 'power promotor' and a 'technology promotor' is necessary to overcome the barriers of unwillingness and of ignorance. With growing complexity, additional problems of communication and process management will occur. This will demand a third team member, the 'process promotor', who is needed to overcome the barriers of non-responsibility and non-communication between the organisational units involved and to act as navigator of the process. In this article, we present an empirical investigation of 133 innovations in the German plant construction and engineering industry. The results strengthen the hypothesis that the level of success of an innovation depends on the existence of a 'troika' of promotors. [source]

ARCHAEOMETRY AND MUSEUMS: FIFTY YEARS OF CURIOSITY AND WONDER
ARCHAEOMETRY, Issue 6 2008
M. F. GUERRA
Artefacts and works of art kept in museum collections in many cases originated from ancient private collections. In such cases, a partial or total absence of historical information may create additional problems concerning their authenticity. The study of museum collections and their preservation requires the use of analytical techniques, but also combined examination techniques not commonly necessary for the study of archaeological objects. This paper gives an overview of the importance of museum items for the understanding of the past, the difficulties relating to their authentication, and the significant advances brought about by science-based techniques, in particular those cases discussed in Archaeometry during the past 50 years. [source]

Social democracy and globalisation: the limits of social democracy in historical perspective
BRITISH JOURNAL OF POLITICS & INTERNATIONAL RELATIONS, Issue 3 2002
John Callaghan
This article argues that social democratic governments throughout the 20th century faced internal and international constraints arising from the operation of capitalist economies, and that the evidence for a qualitative deepening of such constraints since the collapse of the Bretton Woods system is far from unequivocal. Financial markets were already big enough and fast enough in the interwar years to deter such governments from the pursuit of egalitarian policies, or to destabilise them if they ignored the warning signs. This article also shows that the efficacy of Keynesian macroeconomic policy in the Golden Age has been exaggerated and that the problem of short-term movements of speculative capital persisted throughout this era in a country such as Britain. Keynesianism never worked in the face of mass unemployment, and it is misleading to suggest that its breakdown in the 1970s somehow robbed social democracy of the policy tools that had maintained full employment in the 1950s and 1960s. A host of additional problems have indeed beset social democratic governments since 1973, but the analysis of such problems is hindered rather than helped by much of the literature that invokes economic globalisation. Globalisation theory is in need of further specification before it can be useful, and arguments about the economic consequences of globalisation since 1973 need to distinguish its effects from those of the many conjunctural problems of the period, as well as from the policies that important agencies have pursued in search of solutions to them. [source]

Should You Spin Off Your Internet Business?
BUSINESS STRATEGY REVIEW, Issue 2 2000
Rick Chavez
The purpose of most corporate spin-offs is to unlock the shareholder value of a business unit that is not critical to the parent company's success. Internet spin-offs raise additional problems, partly because they are so new. This article sets out a multi-dimensional framework to help managers decide how to structure their internet businesses: whether to keep them integrated into the parent company, to establish them as wholly owned subsidiaries, or to spin them off 'wholly or partially'. The authors argue that companies must weigh the trade-offs between what they call the 'three Cs': control, currency and culture. The collapse of internet stock prices in April/May 2000 reduced but did not eliminate the 'currency' attraction of the spin-off option. But issues of control and culture were always just as important. Above all, the decision must be made in the context of a company's total 'digital agenda': that is, as part of the company's overall strategy for creating and sustaining value in the new economy. [source]