Logic
Selected Abstracts

THE LOGIC OF AUTHORITARIAN BARGAINS
ECONOMICS & POLITICS, Issue 1 2009
RAJ M. DESAI
Dictatorships do not survive by repression alone. Rather, dictatorial rule is often explained as an "authoritarian bargain" by which citizens relinquish political rights for economic security. The applicability of the authoritarian bargain to decision-making in non-democratic states, however, has not been thoroughly examined. We conceptualize this bargain as a simple game between a representative citizen and an autocrat who faces the threat of insurrection, and where economic transfers and political influence are simultaneously determined. Our model yields implications for empirical patterns that are expected to exist. Tests of a system of equations with panel data comprising 80 non-democratic states between 1975 and 1999 generally confirm the predictions of the authoritarian-bargain thesis, with some variation across different categories of dictatorship.

WELL LOG CALIBRATION OF KOHONEN-CLASSIFIED SEISMIC ATTRIBUTES USING BAYESIAN LOGIC
JOURNAL OF PETROLEUM GEOLOGY, Issue 4 2001
M. T. Taner
We present a new method for calibrating a classified 3D seismic volume. The classification process employs a Kohonen self-organizing map, a type of unsupervised artificial neural network; the subsequent calibration is performed using one or more suites of well logs. Kohonen self-organizing maps and other unsupervised clustering methods generate classes of data based on the identification of various discriminating features. These methods seek an organization in a dataset and form relational organized clusters. However, these clusters may or may not have any physical analogues in the real world. In order to relate them to the real world, we must develop a calibration method that not only defines the relationship between the clusters and real physical properties, but also provides an estimate of the validity of these relationships. With the development of this relationship, the whole dataset can then be calibrated. The clustering step reduces the multi-dimensional data into logically smaller groups. Each original data point defined by multiple attributes is reduced to a one- or two-dimensional relational group. This establishes some logical clustering and reduces the complexity of the classification problem. Furthermore, calibration should be more successful since it will have to consider less variability in the data. In this paper, we present a simple calibration method that employs Bayesian logic to provide the relationship between cluster centres and the real world. The output will give the most probable calibration between each self-organized map node and wellbore-measured parameters such as lithology, porosity and fluid saturation. The second part of the output comprises the calibration probability. The method is described in detail, and a case study is briefly presented using data acquired in the Orange River Basin, South Africa. The method shows promise as an alternative to current techniques for integrating seismic and log data during reservoir characterization.
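As an illustration of the calibration idea described in this abstract: for each self-organizing-map node, one can estimate from the samples co-located with well logs the most probable rock class and the probability of that assignment. The Python sketch below is only a rough, hypothetical rendering of such a count-based Bayesian calibration; the arrays, class labels and helper names are invented, not taken from the paper.

```python
from collections import Counter, defaultdict

def calibrate_nodes(node_of_sample, litho_of_sample):
    """Relate SOM nodes to well-log classes with a simple count-based Bayesian model.

    node_of_sample : list of SOM node ids (one per co-located seismic/well-log sample)
    litho_of_sample: list of lithology labels (same length)
    Returns {node: (most_probable_class, calibration_probability)}.
    """
    counts = defaultdict(Counter)
    for node, litho in zip(node_of_sample, litho_of_sample):
        counts[node][litho] += 1

    calibration = {}
    for node, ctr in counts.items():
        total = sum(ctr.values())
        best_class, best_count = ctr.most_common(1)[0]
        # P(class | node) estimated from relative frequency at the wells
        calibration[node] = (best_class, best_count / total)
    return calibration

# Hypothetical example: three SOM nodes, lithologies observed at the wellbore
nodes  = [0, 0, 0, 1, 1, 2, 2, 2, 2]
lithos = ["sand", "sand", "shale", "shale", "shale", "sand", "sand", "shale", "sand"]
print(calibrate_nodes(nodes, lithos))
# node 0 -> ('sand', ~0.67): the most probable class and its calibration probability
```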
LOGIC AND THEORY IN ARISTOTLE, STOICISM, HEGEL
PHILOSOPHICAL FORUM, Issue 3 2006
KENLEY R. DOVE
First page of article

LOGIC'S CARETAKER: WITTGENSTEIN, LOGIC, AND THE VANISHMENT OF RUSSELL'S PARADOX
PHILOSOPHICAL FORUM, Issue 3 2004
KELLY DEAN JOLLEY
First page of article

On Resolving Conflicts Between Arguments
COMPUTATIONAL INTELLIGENCE, Issue 3 2000
Nico Roos
Argument systems are based on the idea that one can construct arguments for propositions: structured reasons justifying the belief in a proposition. Using defeasible rules, arguments need not be valid in all circumstances; therefore, it might be possible to construct an argument for a proposition as well as its negation. When arguments support conflicting propositions, one of the arguments must be defeated, which raises the question of which (sub-)arguments can be subject to defeat. In legal argumentation, metarules determine the valid arguments by considering the last defeasible rule of each argument involved in a conflict. Since it is easier to evaluate arguments using their last rules, can a conflict be resolved by considering only the last defeasible rules of the arguments involved? We propose a new argument system where, instead of deriving a defeat relation between arguments, arguments for the defeat of defeasible rules are constructed. This system allows us to determine a set of valid (undefeated) arguments in linear time using an algorithm based on a JTMS, allows conflicts to be resolved using only the last rules of the arguments, allows us to establish a relation with Default Logic, and allows closure properties such as cumulativity to be proved. We propose an extension of the argument system based on a proposal for reasoning by cases in default logic.
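As a toy illustration of the "last defeasible rule" idea mentioned in this abstract (not a rendering of the system Roos actually proposes): a conflict between two arguments with complementary conclusions can be resolved by comparing assumed priorities of the last defeasible rules used to build them. All rule names and priorities below are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Argument:
    conclusion: str
    rules: list = field(default_factory=list)  # defeasible rules, in order of use

    @property
    def last_rule(self):
        return self.rules[-1] if self.rules else None

# Hypothetical priorities over defeasible rules (higher wins)
priority = {"r_contract": 2, "r_statute": 3, "r_custom": 1}

def resolve(a, b):
    """Resolve a conflict between arguments for p and not-p by their last rules only."""
    pa, pb = priority[a.last_rule], priority[b.last_rule]
    if pa == pb:
        return None             # neither argument defeats the other
    return a if pa > pb else b  # the argument whose last rule has higher priority survives

arg_for = Argument("liable", ["r_custom", "r_contract"])
arg_against = Argument("not liable", ["r_statute"])
print(resolve(arg_for, arg_against).conclusion)  # "not liable": r_statute outranks r_contract
```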
A Neuro-Fuzzy Logic for ATIS Stand-Alone Control Systems: Structure, Calibration, and Analyses
COMPUTER-AIDED CIVIL AND INFRASTRUCTURE ENGINEERING, Issue 3 2004
Yaser E. Hawas
The scheme logic attempts to optimize the network's overall travel time by adjusting the path proportions while guessing the signal phase split decisions. An approximate simulation-based optimization algorithm is devised as an example of the logic operating this scheme. The logic is then replicated by a fuzzy-logic control system. Neural nets are utilized to develop the knowledge base of the fuzzy system and to calibrate the fuzzy set parameters. The neural nets utilize data replicates generated by the approximate simulation-based optimization algorithm. The calibration and effectiveness results of the fuzzy control system are presented.

Towards a New Logic for Front End Management: From Drug Discovery to Drug Design in Pharmaceutical R&D
CREATIVITY AND INNOVATION MANAGEMENT, Issue 2 2007
Maria Elmquist
Under pressure to innovate and be cost-effective at the same time, R&D departments are being challenged to develop new organizations and processes for Front End activities. This is especially true in the pharmaceutical industry. As drug development becomes more risky and costly, the discovery departments of pharmaceutical companies are increasingly being compelled to provide strong drug candidates for efficient development processes and quick market launches. It is argued that the Fuzzy Front End consists less of the discovery or recognition of opportunities than of the building of expanded concepts: the notion of concept generation is revisited, suggesting the need for a new logic for organizing Front End activities in order to support sustainable innovative product development. Based on an in-depth empirical study at a European pharmaceutical company, this paper contributes to improved understanding of the actual management practices used in the Front End. Using a design reasoning model (the C-K model), it also adds to the growing body of literature on the management of Front End activities in new product development processes.

Modelling Product Innovation Processes, from Linear Logic to Circular Chaos
CREATIVITY AND INNOVATION MANAGEMENT, Issue 2 2003
Jan Buijs
Product innovation is the focal point of the Delft Design School in the Netherlands. During its more than thirty years of existence, different models of the product innovation process have been and are used for education and for research. This paper describes the development of these models. The first models tried to describe the product innovation process in a logical linear order, but recently this logical order has come under discussion. The most recent models try to show the more chaotic character of product innovation processes in real corporate life. Although this chaotic model better reflects product innovation practice, for educational purposes it seems to be less useful than the original logical ones. For our teaching we propose the two versions (logic and chaos) of our innovation model as two sides of one coin. This innovation coin is without proper value with one side left blank.

Exploring the Link between Dominant Logic and Company Performance
CREATIVITY AND INNOVATION MANAGEMENT, Issue 2 2000
Georg Von Krogh
To revitalize the discussion on dominant logic, our paper aims to establish the forgotten link between dominant logic and firm performance. To do so, the concept is enhanced conceptually and operationalized by developing a framework including firms' conceptualization of the business (external environment) and of themselves (internal environment) and performance. The framework is applied to a longitudinal study of two consumer electronics firms. The empirical evidence shows that differences in dominant logic lead to different strategic reactions to developments in the industry, and thus result in performance differences.

Was Mancur a Maoist? An Essay on Kleptocracy and Political Stability
ECONOMICS & POLITICS, Issue 2 2003
Much of Mancur Olson's work explored the link between government structure and economic development. This paper provides a framework for thinking about this link that exposes both the powerful insights and the deep tensions in Olson's work. In The Rise and Decline of Nations, Olson argued that instability was good for democratic accountability because it upset entrenched interests. In contrast, after the fall of the socialist regimes in Europe and the Soviet Union, Olson argued that the stability of a single autocrat or "stationary bandit" was superior to the competitive rent seeking of competing "roving bandits." I argue that there is a real inconsistency in Olson's thinking on the role of stability and change in political life; I do this by developing the connections between Olson's classic Logic of Collective Action and his subsequent writing. The paper concludes by building on Olson's insights to point the way to a more complete analysis of democracy and transition.
Entrepreneurs, Effectual Logic, and Over-Trust
ENTREPRENEURSHIP THEORY AND PRACTICE, Issue 4 2006
Sanjay Goel
This article complements extant literature on entrepreneurship and trust by proposing a model of over-trust (the tendency to trust more than what is warranted) using entrepreneurial characteristics and effectual logic. We trace how entrepreneurs following effectual processes may tend to over-trust. More formally, we propose that specific personality characteristics of the entrepreneur interact with effectual logic to make the entrepreneur more susceptible to over-trust. The proposed model is value neutral in that we do not imply that over-trust has negative consequences for entrepreneurs. In fact, it may be part of the overall risk that entrepreneurs assume in a new venture creation.

Gender Logic and (Un)doing Gender at Work
GENDER, WORK & ORGANISATION, Issue 2 2010
Elisabeth K. Kelan
Doing gender is a popular concept in studies on work and organizations that is used to show how gender is constructed through interactions in organizations. Recently researchers have also started looking at how gender can be undone. This article elucidates two understandings of doing gender based on ethnomethodological and poststructural and discursive approaches and shows how these theoretical approaches lead to diverging ways of undoing gender. These two approaches are critically explored by drawing on qualitative research with information communication technology workers. The article thereby examines how gender might be undone within both ethnomethodological and poststructural and discursive traditions. It makes a contribution towards understanding (un)doing gender approaches at work by highlighting the implications for research on gender, work and organization.

Frege and the Surprising History of Logic: Introduction to Claude Imbert, "Gottlob Frege, One More Time"
HYPATIA, Issue 4 2000
EMILY GROSHOLZ
Convinced that logic has a history and that its history always manages to surprise the philosophers, Claude Imbert has devoted much of her work to the study of the Stoic school and of the late-nineteenth-century German logician Gottlob Frege. In the fifth chapter of her book Pour une histoire de la logique, she examines the trajectory of Frege's awareness of what his new logic entails, in particular the way it subverts the project of Kant.

Modelling and design considerations on CML gates under high-current effects
INTERNATIONAL JOURNAL OF CIRCUIT THEORY AND APPLICATIONS, Issue 6 2005
M. Alioto
In this paper, the effect of the transit time degradation of bipolar transistors on the power-delay trade-off in CML gates and their design is dealt with. A delay model which accounts for the transit time increase due to the high bias current values used in high-speed applications is derived by generalizing an approach previously proposed by the same authors (IEEE Trans. CAD 1999; 18(9): 1369-1375; Model and Design of Bipolar and MOS Current-Mode Logic (CML, ECL and SCL Digital Circuits), Kluwer Academic Publisher: Dordrecht, 2005). The resulting closed-form delay expression is achieved by properly simplifying the SPICE model, and has an explicit dependence on the bias current which determines the power consumption of CML gates. Accordingly, the delay model is used to gain insight into the power-delay trade-off by considering the effect of the transit time degradation in high-speed designs. In particular, the cases where such effects can be neglected are identified, to better understand how the transit time degradation affects the performance of CML gates for current bipolar technologies. The proposed model has a simple and compact expression, thus it turns out to be suitable for pencil-and-paper evaluations, as well as fast timing analysis. Simulations of CML circuits with a 20-GHz bipolar process show that the model has a very good accuracy in a wide range of current and loading conditions. Copyright © 2005 John Wiley & Sons, Ltd.
Continuum Logic: A Chinese Contribution to Knowledge and Understanding in Philosophy and Science
JOURNAL OF CHINESE PHILOSOPHY, Issue 4 2002
Walter Benesch

From National Service to Global Player: Transforming the Organizational Logic of a Public Broadcaster
JOURNAL OF MANAGEMENT STUDIES, Issue 6 2010
André Spicer
We present organizational logics as a meso-level construct that lies between institutional theory's field-level logics and the sense-making activities of individual agents in organizations. We argue that an institutional logic can be operationalized empirically using the concept of a discourse, that is, a coherent symbolic system articulating what constitutes legitimate, reasonable, and effective conduct in, around, and by organizations. An organization may, moreover, be simultaneously exposed to several institutional logics that make up its broader ideational environment. Taking these three observations together enables us to consider an organizational logic as a spatially and temporally localized configuration of diverse discourses. We go on to show how organizational logics were transformed in the Australian Broadcasting Corporation between 1953 and 1999 by examining the changing discourses that appeared in the Corporation's annual reports. We argue that these discourses were modified through three main forms of discursive agency: (1) undertaking acts of ironic accommodation between competing discourses; (2) building chains of equivalence between the potentially contradictory discourses; and (3) reconciling new and old discourses through pragmatic acts of 'bricolage'. We found that, using these forms of discursive agency, a powerful coalition of actors was able to transform the dominant organizational logic of the ABC from one where the Corporation's initial mission was to serve national interests through public service to one that was ultimately focused on participating in a globalized media market. Finally, we note that discursive resources could be used as the basis for resistance by less powerful agents, although further research is necessary to determine exactly how more powerful and less powerful agents interact around the establishment of an organizational logic.

I, Too, Sail Past: Odysseus and the Logic of Self-Control
KYKLOS INTERNATIONAL REVIEW OF SOCIAL SCIENCES, Issue 2 2000
David Sally
First page of article

Psychologism Revisited in Logic, Metaphysics, and Epistemology
METAPHILOSOPHY, Issue 3 2001
Dale Jacquette
Psychologism is a philosophical ideology that seeks to explain the principles of logic, metaphysics, and epistemology as psychological phenomena. Psychologism has been the storm center of concerted criticisms since the nineteenth century, and is thought by many to have been refuted once and for all by Kant, Frege, Husserl, and others. The project of accounting for objective philosophical or mathematical truths in terms of subjective psychological states has been largely discredited in mainstream analytic thought. Ironically, psychologism has resurfaced in unexpected guises in the form of intuitionistic logic and mathematics, cognitivism, and naturalized epistemology. I examine some of the principal objections to psychologism, distinguishing roughly between good and bad or philosophically acceptable versus unacceptable psychologism, and consider the extent to which a new wave of psychologism may be gaining prominence in contemporary philosophy, and the light its successes and failures may shed on the original concept and underlying perspective of classical psychologism.
An infinitary variant of Metric Temporal Logic over dense time domains
MLQ - MATHEMATICAL LOGIC QUARTERLY, Issue 3 2004
Stefano Baratella
We introduce a complete and cut-free proof system for a sufficiently expressive fragment of Metric Temporal Logic over dense time domains in which a schema of induction is provable. So doing, we extend results previously obtained by Montagna et al. to unbounded temporal operators. (© 2004 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)

Sequent systems for compact bilinear logic
MLQ - MATHEMATICAL LOGIC QUARTERLY, Issue 5 2003
Wojciech Buszkowski
Compact Bilinear Logic (CBL), introduced by Lambek [14], arises from the multiplicative fragment of Noncommutative Linear Logic of Abrusci [1] (also called Bilinear Logic in [13]) by identifying times with par and 0 with 1. In this paper, we present two sequent systems for CBL and prove the cut-elimination theorem for them. We also discuss a connection between cut-elimination for CBL and the Switching Lemma from [14].

Supervenience and Infinitary Logic
NOUS, Issue 3 2001
Michael Glanzberg

Intrinsic Motivation and the Logic of Collective Action: The Impact of Selective Incentives
AMERICAN JOURNAL OF ECONOMICS AND SOCIOLOGY, Issue 2 2010
Andreas P. Kyriacou
I integrate the notion of intrinsic motivation, applied to economics most notably by Frey (1997), into the logic of individual contributions toward collective goods as analyzed since Olson ([1965] 1971). This illuminates the many and various ways through which the intrinsic motivation to contribute toward such goods can be crowded out by the application of selective incentives. I suggest that the crowding-out effect increases the cost to society of organizing the provision of collective goods and argue in favor of designing selective incentives that mitigate this effect.

Logic of experiments in ecology: is pseudoreplication a pseudoissue?
OIKOS, Issue 1 2001
Lauri Oksanen
Hurlbert divides experimental ecologists into 'those who do not see any need for dispersion (of replicated treatments and controls), and those who do recognize its importance and take whatever measures are necessary to achieve a good dose of it'. Experimental ecologists could also be divided into those who do not see any problems with sacrificing spatial and temporal scales in order to obtain replication, and those who understand that appropriate scale must always have priority over replication. If an experiment is conducted in a spatial or temporal scale where the predictions of contesting hypotheses are convergent or ambiguous, no amount of technical impeccability can make the work instructive. Conversely, replication can always be obtained afterwards, by conducting more experiments with basically similar design in different areas and by using meta-analysis. This approach even reduces the sampling bias obtained if resources are allocated to a small number of well-replicated experiments. For a strict advocate of the hypothetico-deductive method, replication is unnecessary even as a matter of principle, unless the predicted response is so weak that random background noise is a plausible excuse for a discrepancy between predictions and results. By definition, a prediction is an 'all-statement', referring to all systems within a well-defined category. What applies to all must apply to any. Hence, choosing two systems and assigning them randomly to a treatment and a control is normally an adequate design for a deductive experiment. The strength of such experiments depends on the firmness of the predictions and their a priori probability of corroboration. Replication is but one of many ways of reducing this probability. Whether the experiment is replicated or not, inferential statistics should always be used, to enable the reader to judge how well the apparent patterns in samples reflect real patterns in statistical populations. The concept 'pseudoreplication' amounts to entirely unwarranted stigmatization of a reasonable way to test predictions referring to large-scale systems.
Models for the Logic of Possible Proofs
PACIFIC PHILOSOPHICAL QUARTERLY, Issue 1 2000
Leon Horsten
First page of article

The Unity of Language and Logic in Wittgenstein's Tractatus
PHILOSOPHICAL INVESTIGATIONS, Issue 1 2006
Leo K. C. Cheung
The purpose of this paper is to offer an interpretation of the Tractatus' proof of the unity of logic and language. The kernel of the proof is the thesis that the sole logical constant is the general propositional form. I argue that the Grundgedanke, the existence of the sole fundamental operation N and the analyticity thesis, together with the fact that the operation NN can always be seen as having no specific formal difference between its result and its base, imply that NN is intrinsic to every elementary proposition. I also argue that the picture theory of proposition is an account of the generation of propositions via naming, and that its crucial idea is that naming is the instantiation of the form of a name, which consists in arbitrarily picking out an object as the meaning of the name from those objects sorted out by the form of the name. It follows that the existential quantifier, that is, NN, is intrinsic to naming (and therefore to every elementary proposition). It is then proven that the sole logical constant is the general propositional form. This, together with the truth-functionality of logical necessity, implies that logic and language are unified via a general rule: logical syntax.

Logic in Action: Wittgenstein's Logical Pragmatism and the Impotence of Scepticism
PHILOSOPHICAL INVESTIGATIONS, Issue 2 2003
Danièle Moyal-Sharrock
First page of article

Frege's Judgement Stroke and the Conception of Logic as the Study of Inference not Consequence
PHILOSOPHY COMPASS (ELECTRONIC), Issue 4 2009
Nicholas J. J. Smith
One of the most striking differences between Frege's Begriffsschrift (logical system) and standard contemporary systems of logic is the inclusion in the former of the judgement stroke: a symbol which marks those propositions which are being asserted, that is, which are being used to express judgements. There has been considerable controversy regarding both the exact purpose of the judgement stroke, and whether a system of logic should include such a symbol. This paper explains the intended role of the judgement stroke in a way that renders it readily comprehensible why Frege insisted that this symbol was an essential part of his logical system. The key point here is that Frege viewed logic as the study of inference relations amongst acts of judgement, rather than, as in the typical contemporary view, of consequence relations amongst certain objects (propositions or well-formed formulae). The paper also explains why Frege's use of the judgement stroke is not in conflict with his avowed anti-psychologism, and why Wittgenstein's criticism of the judgement stroke as 'logically quite meaningless' is unfounded. The key point here is that while the judgement stroke has no content, its use in logic and mathematics is subject to a very stringent norm of assertion.
The structural elucidation of a novel iridoid derivative from Tachiadenus longiflorus (Gentianaceae) using the LSD programme and quantum chemical computations
PHYTOCHEMICAL ANALYSIS, Issue 2 2006
D. A. Mulholland
Oleanolic acid, scoparone, scopoletin and a novel iridoid derivative, angelone, were isolated from Tachiadenus longiflorus (Gentianaceae). The structure of angelone was determined from NMR data, given as input to the Logic for Structure Determination Programme, and was finally confirmed by comparison of experimental 13C-NMR chemical shifts with those obtained by quantum mechanical calculations. Copyright © 2005 John Wiley & Sons, Ltd.

Disagreeing to Agree: Financial Crisis Management within the 'Logic of No Alternative'
POLITICS, Issue 2 2009
Huw Macartney
The article argues that amid a cacophony of analyses of the causes of the current financial crisis, those daring to consider its implications and outcomes are decidedly cautious. Fundamentally, crisis managers appear intent on treating this as a minor glitch in an otherwise functioning market. This is a controversial claim. Nonetheless it is the legacy of the perception that neoliberalism is 'the only alternative'; it emphasises the need, however, for truly alternative voices in the ad hoc settlements and negotiations. The article argues that, through the lenses of historical materialism, this crisis is the inevitable result of the prolonged period of credit expansion and financial market reform in recent decades. With this in mind, it suggests that the economists and state managers who established these conditions are themselves both unlikely to and incapable of reversing them.

Knowledge-based Diagnosis Aiding in Regulation Thermography
PROCEEDINGS IN APPLIED MATHEMATICS & MECHANICS, Issue 1 2003
Hagen Knaf
Regulation Thermography is a diagnostic tool in the medical science based on the measurement of the body's thermoregulation ability, the so-called thermogram. The expert's rules for the interpretation of a thermogram can be modelled using Fuzzy Logic. In the present article this modelling process is briefly explained; it leads to a Fuzzy Inference System capable of evaluating thermograms with respect to, for example, signals for the presence of breast cancer. Some of the main points of a comparison between the expert rules and the result of a stepwise linear discriminant analysis performed on classified thermograms are presented.
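As a purely illustrative sketch of the kind of Fuzzy Inference System this abstract refers to (the membership functions, rules and output levels below are invented, not taken from the article): an expert rule such as "if the temperature response to the stimulus is small, regulation is poor and suspicion is high" can be evaluated as follows.

```python
def tri(x, a, b, c):
    """Triangular membership function with corners a <= b <= c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def evaluate_thermogram(delta_t):
    """Toy fuzzy evaluation of one regulation measurement (degrees C).

    delta_t: temperature change at a measurement point after the stimulus.
    Returns a suspicion score in [0, 1] from two invented expert rules.
    """
    # Invented membership functions for the size of the regulation response
    small  = tri(delta_t, -0.2, 0.0, 0.4)   # hardly any regulation
    normal = tri(delta_t,  0.2, 0.8, 1.6)   # ordinary regulation

    # Rule 1: small response -> high suspicion; Rule 2: normal response -> low suspicion
    firing = {"high": small, "low": normal}

    # Weighted-average (Sugeno-style) defuzzification with invented output levels
    levels = {"high": 1.0, "low": 0.1}
    total = sum(firing.values())
    return sum(firing[r] * levels[r] for r in firing) / total if total else 0.0

print(round(evaluate_thermogram(0.1), 2))  # weak regulation -> score 1.0
print(round(evaluate_thermogram(0.9), 2))  # normal regulation -> score 0.1
```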