Inherent Problems
Selected Abstracts

The embedded ion method: A new approach to the electrostatic description of crystal lattice effects in chemical shielding calculations
CONCEPTS IN MAGNETIC RESONANCE, Issue 5 2006
Dirk Stueber

Abstract The nuclear magnetic shielding anisotropy of NMR-active nuclei is highly sensitive to the nuclear electronic environment. Hence, measurements of the nuclear magnetic shielding anisotropy represent a powerful tool in the elucidation of molecular structure for a wide variety of materials. Quantum mechanical ab initio nuclear magnetic shielding calculations effectively complement the experimental NMR data by revealing additional structural information. The accuracy and capacity of these calculations have improved considerably in recent years. However, an inherent problem remains: the relatively demanding computational requirements limit the size of the systems that can be studied. Accordingly, ab initio shielding calculations have been performed predominantly on isolated molecules, neglecting the molecular environment. This approach is sufficient for neutral nonpolar systems, but it leads to serious errors in shielding calculations on polar and ionic systems. Conducting ab initio shielding calculations on clusters of molecules (i.e., including nearest-neighbor interactions) has improved the accuracy of the calculations in many cases. Other methods developed to simulate crystal lattice effects in shielding calculations include the electrostatic representation of the crystal lattice using point charge arrays, full ab initio methods, ab initio methods under periodic boundary conditions, and hybrid ab initio/molecular dynamics methods. The embedded ion method (EIM) discussed here follows the electrostatic approach.
The method mimics, in quantum mechanical shielding calculations, the intermolecular and interionic interactions experienced by a subject molecule or cluster in a given crystal, using a large, finite, periodic, and self-consistent array of point charges. The point charge arrays in the EIM are generated using the Ewald summation method and embed the molecule or ion of interest for which the ab initio shielding calculations are performed. The accuracy with which the EIM reproduces experimental nuclear magnetic shift tensor principal values, the sensitivity of the EIM to the parameters defining the point charge arrays, and the strengths and limitations of the EIM in comparison with other methods that include crystal lattice effects in chemical shielding calculations are presented. © 2006 Wiley Periodicals, Inc. Concepts Magn Reson Part A 28A: 347-368, 2006 [source]

Understanding and reducing web delays
INTERNATIONAL JOURNAL OF NETWORK MANAGEMENT, Issue 2 2005
Kevin Curran

Over the years, the number of web users has increased dramatically, unfortunately leading to the inherent problem of congestion, which can degrade each user's surfing experience. This paper investigates the download times associated with a web request, identifies where delays occur, and provides guidelines that web developers can follow to deliver a faster and more efficient download and service for their users. Copyright © 2005 John Wiley & Sons, Ltd. [source]

Using habitat distribution models to evaluate large-scale landscape priorities for spatially dynamic species
JOURNAL OF APPLIED ECOLOGY, Issue 1 2008
Regan Early

Summary
1. Large-scale conservation planning requires the identification of priority areas in which species have a high likelihood of long-term persistence. This typically requires high spatial resolution data on species and their habitat. Such data are rarely available at a large geographical scale, so distribution modelling is often required to identify the locations of priority areas.
However, distribution modelling may be difficult when a species is either not recorded, or not present, at many of the locations that are actually suitable for it. This is an inherent problem for species that exhibit metapopulation dynamics.
2. Rather than basing species distribution models on species locations, we investigated the consequences of predicting the distribution of suitable habitat, and thus inferring species presence/absence. We used habitat surveys to define a vegetation category suitable for a threatened species with spatially dynamic populations (the butterfly Euphydryas aurinia), and used this as the response variable in distribution models. Thus, we developed a practical strategy for obtaining high-resolution (1 ha), large-scale conservation solutions for E. aurinia in Wales, UK.
3. Habitat-based distribution models had high discriminatory power. They could generalize over a large spatial extent and on average predicted 86% of the current distribution of E. aurinia in Wales. Models based on species locations had lower discriminatory power and were poorer at generalizing throughout Wales.
4. Surfaces depicting the connectivity of each grid cell were calculated for the predicted distribution of E. aurinia habitat. Connectivity surfaces provided a distance-weighted measure of the concentration of habitat in the surrounding landscape, and helped identify areas where the persistence of E. aurinia populations is expected to be highest. These successfully identified known areas of high conservation priority for E. aurinia. Such connectivity surfaces allow conservation planning to take long-term spatial population dynamics into account, which would be impossible without being able to predict the species' distribution over a large spatial extent.
5. Synthesis and applications. Where species location data are unsuitable for building high-resolution predictive habitat distribution models, habitat data of sufficient quality can be easier to collect.
We show that they can perform as well as or better than species data as a response variable. When coupled with a technique to translate distribution model predictions into landscape priority (such as connectivity calculations), we believe this approach will be a powerful tool for large-scale conservation planning. [source]

Physical and chemical considerations of damage induced in protein crystals by synchrotron radiation: a radiation chemical perspective
JOURNAL OF SYNCHROTRON RADIATION, Issue 6 2002
Peter O'Neill

Radiation-induced degradation of protein or DNA samples by synchrotron radiation is an inherent problem in X-ray crystallography, especially at the 'brighter' light sources. This short review gives a radiation chemical perspective on some of the physical and chemical processes that need to be considered in understanding potential pathways leading to the gradual degradation of the samples. Under the conditions used for X-ray crystallography at a temperature of <100 K in the presence of cryoprotectant agents, the majority of radiation damage to the protein samples arises from direct ionization of the amino acid residues and their associated water molecules. Some of the chemical processes that may occur at these protein centres, such as bond scission, are discussed. Several approaches that may reduce radiation damage are discussed, using agents known from radiation chemistry to minimize radical-induced degradation of the sample. [source]

A public-key based authentication and key establishment protocol coupled with a client puzzle
JOURNAL OF THE AMERICAN SOCIETY FOR INFORMATION SCIENCE AND TECHNOLOGY, Issue 9 2003
M.C. Lee

Network Denial-of-Service (DoS) attacks, which exhaust server resources and network bandwidth, can leave target servers unable to provide proper services to legitimate users and, in some cases, render the target systems inoperable and/or the target networks inaccessible.
DoS attacks have now become a serious and common security threat to the Internet community. Public Key Infrastructure (PKI) has long been incorporated in various authentication protocols to facilitate verifying the identities of the communicating parties. The use of PKI has, however, an inherent problem: it involves expensive computational operations such as modular exponentiation. An improper deployment of the public-key operations in a protocol could create an opportunity for DoS attackers to exhaust the server's resources. This paper presents a public-key based authentication and key establishment protocol coupled with a sophisticated client puzzle, which together provide a versatile solution for possible DoS attacks and various other common attacks during an authentication process. Besides authentication, the protocol also supports joint establishment of a session key by both the client and the server, which protects the session communications after the mutual authentication. The proposed protocol has been validated using a formal logic theory and has been shown, through security analysis, to resist not only DoS attacks but also various other common attacks. [source]

Can Nutritional Label Use Influence Body Weight Outcomes?
KYKLOS INTERNATIONAL REVIEW OF SOCIAL SCIENCES, Issue 4 2009
Andreas C. Drichoutis

SUMMARY Many countries around the world have already mandated, or plan to mandate, the presence of nutrition-related information on most pre-packaged food products. Health advocates and lobbyists would like to see similar laws mandating nutrition information in the restaurant and fast-food market as well. In fact, New York City has already taken a step forward and now requires all chain restaurants with 15 or more establishments anywhere in the US to show calorie information on their menus and menu boards. The benefits were estimated to be as much as 150,000 fewer obese New Yorkers over the next five years.
The implied benefit of providing nutrition information is that consumers will be able to observe it and then make informed (and, hopefully, healthier) food choices. In this study, we use the latest available dataset from the US National Health and Nutrition Examination Survey (2005-2006) to explore whether reading such nutrition information really has an effect on body weight outcomes. To deal with the inherent problems of cross-sectional datasets, namely self-selection and the possible occurrence of reverse causality, we use a propensity score matching approach to estimate causal treatment effects. We conducted a series of tests related to the variable choice of the propensity score specification, the quality of matching indicators, robustness checks, and sensitivity to unobserved heterogeneity, using Rosenbaum bounds to validate our propensity score exercise. Our results generally suggest that reading nutrition information does not affect body mass index. The implications of our findings are also discussed. [source]

Soft Governance, Hard Consequences: The Ambiguous Status of Unofficial Guidelines
PUBLIC ADMINISTRATION REVIEW, Issue 4 2006
Taco Brandsen

Soft governance is an approach to policy implementation in which the central government relies less on hierarchy than on information to steer local organizations. This allows for a combination of formal accountability and professional autonomy that improves the quality of public services in both the short and the long term. Guidelines of an advisory, unofficial status are one tool that central government can use for this purpose. However, an inherent problem with this approach is that even though guidelines have no official legal status, in practice they can take on the character of formal regulation when local organizations suspect that they cannot choose alternative courses of action, however well reasoned, without being sanctioned.
This situation encourages conformist behavior and diminishes the long-term potential for innovation. The phenomenon is illustrated with an analysis of disaster management in the Netherlands. [source]

THE INFLUENCE OF EXPERT REVIEWS ON CONSUMER DEMAND FOR EXPERIENCE GOODS: A CASE STUDY OF MOVIE CRITICS
THE JOURNAL OF INDUSTRIAL ECONOMICS, Issue 1 2005
David A. Reinstein

An inherent problem in measuring the influence of expert reviews on the demand for experience goods is that a correlation between good reviews and high demand may be spurious, induced by an underlying correlation with unobservable quality signals. Using the timing of the reviews by two popular movie critics, Siskel and Ebert, relative to opening weekend box office revenue, we apply a difference-in-differences approach to circumvent the problem of spurious correlation. After purging the spurious correlation, the measured influence effect is smaller, though still detectable. Positive reviews have a particularly large influence on the demand for dramas and narrowly released movies. [source]

An Advanced Physiological Controller Design for a Left Ventricular Assist Device to Prevent Left Ventricular Collapse
ARTIFICIAL ORGANS, Issue 10 2003
Yi Wu

Abstract: A continuous flow left ventricular assist device (LVAD), composed mainly of a continuous flow blood pump and a physiological controller, has only one control input, the rotational speed of the pump, but at least three performance criteria to meet. The challenge for the physiological controller of a long-term continuous flow LVAD is adaptability to different cardiovascular loading situations and the ability to handle systemic and parametric uncertainties with only one control input.
The physiological LVAD controller presented in this article exhibits good performance on the three performance criteria under different physiological loading conditions, such as disturbance, rest, and moderate exercise, for a patient with congestive heart failure. The collapse of the left ventricle, an inherent problem for a continuous flow LVAD, is prevented by the design of the control algorithm. [source]

News and Nuances of the Entrepreneurial Myth and Metaphor: Linguistic Games in Entrepreneurial Sense-Making and Sense-Giving
ENTREPRENEURSHIP THEORY AND PRACTICE, Issue 2 2005
Louise Nicholson

This article describes a social construction of entrepreneurship by exploring the constructionist building blocks of communication, myth, and metaphor presented in a major British middle-range broadsheet newspaper with no particular party political allegiance. We argue that the sense-making role of figurative language is important because of the inherent problems in defining and describing entrepreneurial phenomena. Myth and metaphor in newspapers create an entrepreneurial appreciation that helps define our understanding of the world around us. The content analysis of articles published in this newspaper revealed images of male entrepreneurs as dynamic wolfish charmers, supernatural gurus, successful skyrockets, or community saviors and corrupters. Finally, this article relates the temporal construction of myth and metaphor to the dynamics of enterprise culture. [source]

Public Provision for Urban Water: Getting Prices and Governance Right
GOVERNANCE, Issue 4 2008
EDUARDO ARARAL

Public sector monopolies are often associated with inefficiencies and an inability to meet rising demand.
Scholars attribute this to fundamental problems associated with public provision: (1) a tradition of below-cost pricing due to populist pressures, (2) owner-regulator conflicts of interest, and (3) perverse organizational incentives arising from a non-credible threat of bankruptcy, weak competition, rigidities, and agency and performance measurement problems. Many governments worldwide have shifted to private provision, but recent experience with urban water utilities in developing countries has shown their limitations because of weak regulatory regimes compounded by inherent problems of information, incentives, and commitment. This article examines the paradoxical case of the Phnom Penh Water Supply in Cambodia to illustrate how public provision of urban water can be substantially improved by getting prices and governance right. The findings have implications for the search for solutions to provide one billion people worldwide with better access to potable water. [source]

Directional asymmetry of long-distance dispersal and colonization could mislead reconstructions of biogeography
JOURNAL OF BIOGEOGRAPHY, Issue 5 2005
Lyn G. Cook

Abstract
Aim: Phylogenies are increasingly being used to attempt to answer biogeographical questions. However, a reliance on tree topology alone has emerged without consideration of earth processes or the biology of the organisms in question. Most ancestral-state optimization methods have inherent problems, including failure to take account of asymmetry, such as unequal probabilities of losses and gains, and the lack of use of independent cost estimates. Here we discuss what we perceive as shortcomings in most current tree-based biogeography interpretation methods and show that consideration of processes and their likelihoods can turn the conventional biogeographical interpretation on its head.
Location: Southern hemisphere focus, but applicable world-wide.
Methods: The logic of existing methods is reviewed with respect to their adequacy in modelling processes such as the geographical mode of speciation and the likelihood of dispersal, including directional bias. Published reconstructions of dispersal of three plant taxa between Australia and New Zealand were re-analysed using standard parsimony and maximum likelihood (ML) methods with rate matrices to model the expected asymmetry of dispersal.
Results: Few studies to date incorporate asymmetric dispersal rate matrices or question the simplistic assumption of equal costs. Even when they do, cost matrices typically are not derived independently of tree topology. Asymmetrical dispersal between Australia and New Zealand could be reconstructed using parsimony but not with ML.
Main conclusions: The inadequacy of current models has important consequences for our interpretation of southern hemisphere biogeography, particularly in relation to dispersal. For example, if repeated directional dispersals and colonization in the direction of prevailing winds have occurred, with intervening periods of speciation, then there is no need to infer dispersals against those winds. Failure to take account of directionality and other biases in reconstruction methods has implications beyond the simple misinterpretation of the biogeography of a taxonomic group, such as the calibration of molecular clocks, the dating of vicariance events, and the prioritization of areas for conservation. [source]

RAFT polymerization of styrenic-based phosphonium monomers and a new family of well-defined statistical and block polyampholytes
JOURNAL OF POLYMER SCIENCE (IN TWO SECTIONS), Issue 12 2007
Ran Wang

Abstract We describe herein the first example of the controlled reversible addition-fragmentation chain transfer (RAFT) radical homo- and copolymerization of phosphonium-based styrenic monomers mediated by a trithiocarbonate-based RAFT chain transfer agent (CTA) directly in aqueous media.
In the case of homopolymer syntheses, the polymerizations proceed in a controlled fashion, yielding materials with predetermined molecular characteristics, as evidenced by the narrow molecular mass distributions (MMD) and the excellent agreement between the theoretical and experimentally determined molecular masses (MM). We also demonstrate the controlled nature of the homopolymerization of 4-vinylbenzoic acid with the same CTA in DMSO. We subsequently prepared both statistical and block copolymers from the phosphonium/4-vinylbenzoic acid monomers to yield the first examples of polyampholytes in which the cationic functional group is a quaternary phosphonium species. We show that the kinetic characteristics of the statistical copolymerizations differ from those of the homopolymerizations and generally proceed at a significantly faster rate, although there appears to be a composition dependence on the rate. Given the inherent problems in characterizing such polyampholytic copolymers via aqueous size exclusion chromatography, we have qualitatively proved their successful formation via FTIR spectroscopy. Finally, in a preliminary experiment, we qualitatively demonstrate the ability of such pH-responsive block copolymers to undergo supramolecular self-assembly. © 2007 Wiley Periodicals, Inc. J Polym Sci Part A: Polym Chem 45: 2468-2483, 2007 [source]

Does topic metadata help with Web search?
JOURNAL OF THE AMERICAN SOCIETY FOR INFORMATION SCIENCE AND TECHNOLOGY, Issue 5 2007
David Hawking

It has been claimed that topic metadata can be used to improve the accuracy of text searches. Here, we test this claim by examining the contribution of metadata to effective searching within Web sites published by a university with a strong commitment to, and substantial investment in, metadata. The authors use four sets of queries, a total of 463, extracted from the university's official query logs and from the university's site map.
The results are clear: the available metadata is of little value in ranking answers to those queries. A follow-up experiment with the Web sites published in a particular government jurisdiction confirms that this conclusion is not specific to the particular university. Examination of the metadata present at the university reveals that, in addition to implementation deficiencies, there are inherent problems in trying to use subject and description metadata to enhance the searchability of Web sites. Our experiments show that link anchor text, which can be regarded as metadata created by others, is much more effective in identifying best answers to queries than other textual evidence. Furthermore, query-independent evidence such as link counts and uniform resource locator (URL) length, unlike subject and description metadata, can substantially improve baseline performance. [source]

Researching emotion: the need for coherence between focus, theory and methodology
NURSING INQUIRY, Issue 1 2004
Jan Savage

There is a longstanding awareness of the significance of emotion in nursing, and yet it remains one of the more elusive areas of practice. Surprisingly, there has been little discussion in the nursing literature of how the phenomenon of emotion might be understood or studied. This paper gives an overview of theoretical and methodological approaches to emotion, and of how the researcher's emotions may inform the research process. In addition, it draws on ethnographic research exploring the role of emotion in the practice and clinical supervision of a group of psychosexual nurses undergoing Balint seminar training to help highlight some of the inherent problems of researching emotion. The paper argues that these sorts of problems may be avoided or reduced by ensuring coherence between the research focus, the way emotion is theorised, and the methodological approach of the study.
[source]

Residual Claims in Co-operatives: Design Issues
ANNALS OF PUBLIC AND COOPERATIVE ECONOMICS, Issue 3 2003
R. Srinivasan

This paper uses organizational economics to examine issues in the design of a co-operative member's contractual relationship with the other agents (including the remaining members). The paper assumes that the central defining characteristic of a co-op is the specification of its residual claims. Agency theory identifies certain inherent problems of the co-op form: the horizon problem, the common property problem, and non-transferability. Non-transferability both reduces the incentive to monitor and imposes limits on portfolio diversification. This paper argues that features such as claim incompleteness and non-transferability are not inherent to the co-op but may be transaction-cost economizing. The paper also argues that the pre-emptive payoff feature, by which the residual claimants (the co-op members) also become fixed-payoff agents, can affect the risk of other agents and is an important determinant of co-op risk. A co-op may have more than one potential residual claim base. Five generic design choices are available for handling possible multiple claim bases: battleground, pre-specified allocation, limited return, alignment, and fixed payoff. The paper uses the design of residual claims in sugar co-ops to show how a co-op can partly overcome some of the problems identified by agency theory. This illustration ties together the issues of claim incompleteness and non-transferability, pre-emptive payoff, and multiple claim bases. [source]

The dilemma of conducting research back in your own country as a returning student - reflections of research fieldwork in Zimbabwe
AREA, Issue 1 2009
David Mandiyanike

The research process is like finding one's way through a complex maze. 'Home is where the heart is', but foreign students face a number of problems upon their return home to do research.
This paper chronicles the dilemmas of a Zimbabwean student conducting fieldwork in his own country for his UK-based doctoral studies. The dilemmas were acute because the fieldwork was undertaken during the 'Zimbabwe crisis' and involved the inherent problems of researching government-related organisations. This has a bearing on any research process and invokes the etic/emic dilemma. The paper contributes to the growing literature on methods and techniques for conducting qualitative research in human geography. [source]

Comparative studies of brain evolution: a critical insight from the Chiroptera
BIOLOGICAL REVIEWS, Issue 1 2009
Dina K. N. Dechmann

Abstract Comparative studies of brain size have a long history and have contributed much to our understanding of the evolution and function of the brain and its parts. Recently, bats have been used increasingly as model organisms for such studies because of their large number of species, high diversity of life-history strategies, and comparatively well-described neuroanatomy. Here, we draw attention to inherent problems of comparative brain size studies, highlighting limitations but also suggesting alternative approaches. We argue that the complexity and diversity of neurological tasks that the brain and its functional regions (subdivisions) must solve cannot be explained by a single variable, or a few variables, representing selective pressures. Using an example, we show that adding a single relevant variable, morphological adaptation to foraging strategy, to a previous analysis makes a correlation between brain and testes mass disappear completely and entirely changes the interpretation of the study. Future studies should not only look for novel determinants of brain size but also include known correlates in order to add to our current knowledge.
We believe that comparisons at more detailed anatomical, taxonomic, and geographical levels will continue to contribute to our understanding of the function and evolution of mammalian brains. [source]

Double-Observer Line Transect Methods: Levels of Independence
BIOMETRICS, Issue 1 2010
Stephen T. Buckland

Summary Double-observer line transect methods are becoming increasingly widespread, especially for the estimation of marine mammal abundance from aerial and shipboard surveys when detection of animals on the line is uncertain. The resulting data supplement conventional distance sampling data with two-sample mark-recapture data. Like conventional mark-recapture data, these have inherent problems for estimating abundance in the presence of heterogeneity. Unlike conventional mark-recapture methods, line transect methods use, in inference, knowledge of the distribution of a covariate that affects detection probability (namely, distance from the transect line). This knowledge can be used to diagnose unmodeled heterogeneity in the mark-recapture component of the data. By modeling the covariance in detection probabilities with distance, we show how the estimation problem can be formulated in terms of different levels of independence. At one extreme, full independence is assumed, as in the Petersen estimator (which does not use distance data); at the other extreme, independence only occurs in the limit as detection probability tends to one. Between the two extremes there is a range of models, including those currently in common use, which have intermediate levels of independence. We show how this framework can be used to provide more reliable analysis of double-observer line transect data. We test the methods by simulation, and by analysis of a dataset for which true abundance is known. We illustrate the approach through analysis of minke whale sightings data from the North Sea and adjacent waters.
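The Petersen estimator named in the abstract above can be sketched in a few lines. This is an illustration only, not code from the paper; the counts are hypothetical, and the estimator assumes full independence between the two observers, which is precisely the assumption that unmodeled heterogeneity violates:

```python
def petersen_estimate(n1: int, n2: int, m: int) -> float:
    """Lincoln-Petersen abundance estimate from double-observer counts.

    n1, n2 -- animals detected by observer 1 and observer 2
    m      -- animals detected by both (the duplicates, or 'recaptures')
    Assumes detections by the two observers are fully independent.
    """
    if m == 0:
        raise ValueError("need at least one duplicate detection")
    return n1 * n2 / m

# Hypothetical survey: 40 and 35 detections, with 20 animals seen by both.
print(petersen_estimate(40, 35, 20))  # 70.0
```

When detection probabilities vary among animals, duplicates are over-represented among the easily seen individuals, inflating m and biasing the estimate downward; the intermediate-independence models the abstract describes are one way to address this.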
[source]

From 3D to 2D: A Review of the Molecular Imprinting of Proteins
BIOTECHNOLOGY PROGRESS, Issue 6 2006
Nicholas W. Turner

Molecular imprinting is a generic technology that allows the introduction of sites of specific molecular affinity into otherwise homogeneous polymeric matrices. This technique has commonly been shown to be effective when targeting small molecules of molecular weight <1500, while extending it to larger molecules such as proteins has proven difficult. A number of key inherent problems in protein imprinting have been identified, including permanent entrapment, poor mass transfer, denaturation, and heterogeneity in binding pocket affinity, which have been addressed using a variety of approaches. This review focuses on protein imprinting in its various forms, ranging from conventional bulk techniques to novel thin film and monolayer surface imprinting approaches. [source]

Cars before Kids: Automobility and the Illusion of School Traffic Safety
CANADIAN REVIEW OF SOCIOLOGY/REVUE CANADIENNE DE SOCIOLOGIE, Issue 2 2010
SYLVIA PARUSEL

Traffic safety is a contested public issue and a highly negotiated practice that requires sociological analysis and systematic public policy attention. In our case study, we examine elementary school traffic safety programs in Vancouver, British Columbia. We illustrate how such programs assume a politics of responsibility that largely targets children and parents for traffic safekeeping within an institutional environment that gives programs only sporadic support and funding to manage traffic risks. While this context of school traffic safety programs helps to maintain an "illusion of safety," it does not challenge the current auto-dominant mobility structure and its inherent problems. [source]

Variations in the evaluation of colorectal cancer risk
COLORECTAL DISEASE, Issue 3 2005
R. J. Hodder

Abstract
Objectives: To test the variability in estimating cancer risk and to demonstrate the consequences that subjectivity has for patient care.
Subjects and methods: Forty-three clinicians were each asked to assess 40 symptomatic colorectal referrals. Each clinician was provided with a comprehensive history of the 40 patients. The clinicians graded each referral according to a malignancy risk score and decided on the required first-line investigation and the priority of that investigation. The main outcome measures were accuracy in cancer detection and appropriateness of the investigations selected.
Results: There was a wide degree of variation among all clinicians in grading both benign and malignant disease, with an overall correct classification of 54% (P-value <0.001). On average, the clinicians correctly diagnosed 71.3% of the cancer patients, compared with 44% of the benign patients.
Of the cancer patients, 47% were correctly classified as urgent referrals, whilst 52% of the benign patients were over-classified and graded as urgent referrals. The mean number of patients for whom clinicians chose flexible sigmoidoscopy as the appropriate first investigation was 13 (of 40), despite the diagnosis being possible in all cases with flexible sigmoidoscopy. The choice of full colonic investigation was seen across all disciplines. Junior doctors showed the highest tendency, choosing full colonic investigation in 92.3% of cases. Consultants and senior grades showed the least tendency to choose full colonic imaging, although even here colonoscopy or barium enema represented 48.5%.
Conclusion: Subjective assessment of cancer referrals is a significant problem that needs to be confronted. Improvements are needed to resolve the inherent problems of subjectivity and operator bias if uniform quality of patient care and best use of resources is to be achieved. [source]
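The classification rates reported in the abstract above are, in effect, estimates of sensitivity (cancers correctly graded malignant) and specificity (benign cases correctly graded benign). As a minimal sketch of how such figures come out of a 2x2 confusion table; the counts below are hypothetical, chosen only to mirror the reported 71.3% and 44% rates, and are not taken from the study:

```python
def sensitivity_specificity(tp: int, fn: int, tn: int, fp: int):
    """Compute (sensitivity, specificity) from a 2x2 confusion table."""
    sensitivity = tp / (tp + fn)  # proportion of malignant cases correctly identified
    specificity = tn / (tn + fp)  # proportion of benign cases correctly identified
    return sensitivity, specificity

# Hypothetical pooled assessments: 713 of 1000 cancer assessments correct,
# 440 of 1000 benign assessments correct.
sens, spec = sensitivity_specificity(tp=713, fn=287, tn=440, fp=560)
print(f"sensitivity = {sens:.1%}, specificity = {spec:.1%}")  # 71.3%, 44.0%
```

The trade-off the abstract describes, over-grading benign patients as urgent, is the familiar consequence of pushing sensitivity up at the cost of specificity.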