Automatic

Distribution by Scientific Domains

Terms modified by Automatic

  • automatic algorithm
  • automatic analysis
  • automatic classification
  • automatic control
  • automatic detection
  • automatic determination
  • automatic evaluation
  • automatic extraction
  • automatic generation
  • automatic identification
  • automatic interpretation
  • automatic measurement
  • automatic method
  • automatic procedure
  • automatic process
  • automatic processing
  • automatic system
  • automatic thought
  • automatic transmission
  • automatic weather stations

Selected Abstracts


    Effect of digoxin on circadian blood pressure values in patients with congestive heart failure

    EUROPEAN JOURNAL OF CLINICAL INVESTIGATION, Issue 4 2000
    Kirch
Background The aim of the study was to investigate the effect of chronic digoxin treatment on the circadian blood pressure profile in normotensive patients with mild congestive heart failure. Methods In a randomized, double-blind, placebo-controlled cross-over protocol, 12 normotensive patients with mild congestive heart failure took digoxin or placebo for a total of 7 days. Automatic 24-h ambulatory blood pressure measurements were carried out on day 7 of either digoxin or placebo treatment. Results Diastolic blood pressure significantly decreased and systolic blood pressure significantly increased during overnight sleep in the digoxin phase compared with placebo. Digoxin had no effect on either systolic or diastolic blood pressure during the daytime. Heart rate decreased in the overnight sleeping phase but did not differ significantly between the placebo and digoxin phases. Conclusions Digoxin significantly decreases diastolic blood pressure during overnight sleep in patients with congestive heart failure. This effect is likely to be caused by a reduction of sympathetic activity or an increase of parasympathetic activity. The increase of systolic blood pressure during sleep is probably caused by the positive inotropic effect of the drug. [source]


    Automatic and controlled processes in behavioural control: Implications for personality psychology

    EUROPEAN JOURNAL OF PERSONALITY, Issue 5 2010
    Philip J. Corr
Abstract This paper highlights a number of unresolved theoretical issues that, it is argued, continue to impede the construction of a viable model of behavioural control in personality psychology. It is contended that, in order to integrate motivation, emotion, cognition and conscious experience within a coherent framework, two major issues need to be recognised: (a) the relationship between automatic (reflexive) and controlled (reflective) processing and (b) the lateness of controlled processing (including the generation of conscious awareness): phenomenally, such processing seems to 'control' behaviour, but experimentally it can be shown to postdate the behaviour it represents. The implications of these two major issues are outlined, centred on the need to integrate theoretical perspectives within personality psychology, as well as the greater unification of personality psychology with general psychology. A model of behavioural control is sketched, formulated around the concept of the behavioural inhibition system (BIS), which accounts for: (a) why certain stimuli are extracted for controlled processing (i.e. those that are not 'going to plan', as detected by an error mechanism) and (b) the function of controlled processing (including conscious awareness) in terms of adjusting the cybernetic weights of automatic processes (which are always in control of immediate behaviour) which, then, influence future automatically controlled behaviour. The relevance of this model is illustrated in relation to a number of topics in personality psychology, as well as related issues of free will and difficult-to-control behaviours. Copyright © 2010 John Wiley & Sons, Ltd. [source]


    On the meaning of meaning when being mean: commentary on Berkowitz's "On the Consideration of Automatic as Well as Controlled Psychological Processes in Aggression"

    AGGRESSIVE BEHAVIOR, Issue 2 2008
    Kenneth A. Dodge
Abstract Berkowitz (this issue) makes a cogent case for his cognitive neo-associationist (CNA) model that some aggressive behaviors occur automatically, emotionally, and through conditioned association with other stimuli. He also proposes that they can occur without "processing," that is, without meaning. He contrasts his position with that of social information processing (SIP) models, which he casts as positing only controlled processing mechanisms for aggressive behavior. However, both CNA and SIP models posit automatic as well as controlled processes in aggressive behavior. Most aggressive behaviors occur through automatic processes, which are nonetheless rule governed. SIP models differ from the CNA model in asserting the essential role of meaning (often through nonconscious, automatic, and emotional processes) in mediating the link between a stimulus and an angry aggressive behavioral response. Aggr. Behav. 34:133-135, 2008. © 2008 Wiley-Liss, Inc. [source]


    Relationship Between Global Myocardial Index and Automatic Left Ventricular Border Detection Pattern to Identify Biventricular Pacing Candidates

    PACING AND CLINICAL ELECTROPHYSIOLOGY, Issue 2007
    DRAGOS COZMA M.D., Ph.D.
Objective of the Study: To evaluate the relation between the global myocardial index (GMI) and the pattern of left ventricular (LV) volume curve variation, using automatic border detection (ABD), and their role in assessing LV asynchrony. Methods: We studied 52 patients (mean age = 55 ± 17 years) with dilated cardiomyopathy. QRS duration (QRSd) and GMI were measured. Currently accepted tissue Doppler imaging (TDI) and M-mode parameters were used to indicate LV dyssynchrony. On-line continuous LV volume changes were recorded using ABD. Ejection time (ETABD) was measured from the ABD waveforms as the time interval between maximal and minimal volume variation during LV electromechanical systole. We derived the ejection time index (ETiABD) as the ratio between ETABD and the RR interval (ETiABD = ET/RR). Results: 31 patients had a QRSd >120 ms and 21 patients had a QRSd <120 ms. Ventricular dyssynchrony was observed in 39 patients (29 patients had a QRSd >120 ms). GMI was significantly higher in patients with, than in patients without, ventricular dyssynchrony (1.06 ± 0.18 vs 0.73 ± 0.13, P = 0.0001), while ETABD was significantly smaller (233 ± 39 ms vs 321 ± 28 ms, P = 0.0001). The corresponding difference for ETiABD was 26.9 ± 6.8% vs 6.3 ± 4%, P < 0.0001. By simple regression analysis an inverse linear correlation was observed between GMI and ETiABD (r2 = −0.51, P < 0.0001). The pattern of ABD waveforms showed increased isovolumic contraction and relaxation times in patients with LV asynchrony, similar to the GMI pattern. Conclusions: Regional delays in ventricular activation cause uncoordinated and prolonged ventricular contractions, with lengthening of the isovolumic contraction and relaxation times and shortening of the time available for filling and ejection. GMI explores these parameters and, together with ABD, might be useful to identify patients with ventricular asynchrony. [source]
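The ejection time index above is a simple ratio of two measured intervals. A minimal sketch of the computation; the ET value matches the reported dyssynchrony-group mean, but the RR interval is a hypothetical illustration, not a value from the study:

```python
# Ejection time index from automatic border detection, as defined in the
# abstract: ETi = ET / RR, expressed here as a percentage.

def ejection_time_index(et_ms: float, rr_ms: float) -> float:
    """Return the ejection time index as a percentage of the cardiac cycle."""
    if rr_ms <= 0:
        raise ValueError("RR interval must be positive")
    return 100.0 * et_ms / rr_ms

# Example: ET = 233 ms (reported mean for the dyssynchrony group) with a
# hypothetical RR interval of 932 ms (about 64 beats per minute).
print(ejection_time_index(233, 932))  # 25.0
```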


    Automatic and controlled attentional processes in startle eyeblink modification: Effects of habituation of the prepulse

    PSYCHOPHYSIOLOGY, Issue 4 2000
    Anne M. Schell
    The effect of prehabituation of the prepulse on startle eyeblink modification was studied in two experiments. In Experiment 1, college student participants were either prehabituated or nonhabituated to a tone that served as a prepulse in a startle modification passive attention paradigm. Neither short lead interval (60 and 120 ms) prepulse inhibition (PPI) nor long lead interval (2,000 ms) prepulse facilitation (PPF) was affected by the prehabituation procedure. In Experiment 2, participants were presented with an active attention paradigm in which one of two tone prepulses was attended while the other was ignored. One group was prehabituated to the prepulses and the other was not. Unlike the results with the passive paradigm in Experiment 1, prehabituation did significantly diminish attentional modulation of PPI and PPF. These results are consistent with the hypothesis that passive PPI and PPF are primarily automatic processes, whereas attentional modulation involves controlled cognitive processing. [source]
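Startle modification in paradigms like this one is conventionally quantified as a percentage change of blink magnitude relative to pulse-alone trials. The formula below is the standard psychophysiology convention, not stated in the abstract, and the magnitudes are hypothetical:

```python
def percent_modification(prepulse_trial: float, pulse_alone: float) -> float:
    """Percent startle modification relative to pulse-alone trials.
    Negative values indicate inhibition (PPI); positive, facilitation (PPF)."""
    return 100.0 * (prepulse_trial - pulse_alone) / pulse_alone

# Hypothetical blink magnitudes (arbitrary EMG units):
print(percent_modification(40.0, 100.0))   # -60.0 -> inhibition (short lead interval)
print(percent_modification(120.0, 100.0))  # 20.0 -> facilitation (long lead interval)
```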


    CASE REPORTS: Abnormal Sexual Behavior During Sleep

    THE JOURNAL OF SEXUAL MEDICINE, Issue 12 2009
    Giacomo Della Marca MD
ABSTRACT Introduction. Automatic, uncontrolled, and unaware sexual behaviors during sleep have occasionally been described. The clinical and polysomnographic features of nocturnal sexual behavior allow it to be considered a distinct parasomnia named "sexsomnia". Recently, abnormal sexual behaviors during sleep have been evaluated in the forensic medical context because violent behaviors can be associated with this parasomnia. Aim. To describe the clinical and polysomnographic findings in three patients who were referred to our sleep laboratory for sleep disorders and who reported episodes of sleep-related sexual activation. Main Outcome Measures. We analyzed video-polysomnographic recordings, sleep structure, sleep microstructure, and sleep-related respiratory events. Methods. The patients were three males aged 42, 32, and 46 years. All had unremarkable medical, neurological, and psychiatric histories. All underwent full-night polysomnography. Results. Each patient presented a distinct sleep disorder: one had severe obstructive sleep apnea syndrome (OSAS), one presented clinical and polysomnographic features of non-rapid eye movement (NREM) sleep parasomnia (somnambulism), and the third presented clinical and polysomnographic features of rapid eye movement behavior disorder. Conclusions. In our patients, the clinical and polysomnographic findings suggest that abnormal nocturnal sexual behavior can occur in association with distinct sleep disorders, characterized by different pathophysiologic mechanisms and distinctive treatments. Abnormal sexual behaviors during sleep should be investigated with polysomnography in order to define their pathophysiology and to establish appropriate treatments. Della Marca G, Dittoni S, Frusciante R, Colicchio S, Losurdo A, Testani E, Buccarella C, Modoni A, Mazza S, Mennuni GF, Mariotti P, and Vollono C. Abnormal sexual behavior during sleep. J Sex Med 2009;6:3490-3495. [source]


    Möbius Transformations For Global Intrinsic Symmetry Analysis

    COMPUTER GRAPHICS FORUM, Issue 5 2010
    Vladimir G. Kim
    The goal of our work is to develop an algorithm for automatic and robust detection of global intrinsic symmetries in 3D surface meshes. Our approach is based on two core observations. First, symmetry invariant point sets can be detected robustly using critical points of the Average Geodesic Distance (AGD) function. Second, intrinsic symmetries are self-isometries of surfaces and as such are contained in the low dimensional group of Möbius transformations. Based on these observations, we propose an algorithm that: 1) generates a set of symmetric points by detecting critical points of the AGD function, 2) enumerates small subsets of those feature points to generate candidate Möbius transformations, and 3) selects among those candidate Möbius transformations the one(s) that best map the surface onto itself. The main advantages of this algorithm stem from the stability of the AGD in predicting potential symmetric point features and the low dimensionality of the Möbius group for enumerating potential self-mappings. During experiments with a benchmark set of meshes augmented with human-specified symmetric correspondences, we find that the algorithm is able to find intrinsic symmetries for a wide variety of object types with moderate deviations from perfect symmetry. [source]
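Step 2 of the algorithm relies on the fact that a Möbius transformation is determined by three point correspondences, which is what keeps the candidate enumeration small. A minimal sketch of that building block in the complex plane (the standard construction via maps to 0, 1, ∞; no mesh processing is attempted here):

```python
# A Mobius transformation T(x) = (a*x + b) / (c*x + d) of the extended
# complex plane is fixed by three point correspondences z[i] -> w[i].

def mobius_from_triples(z, w):
    """Return coefficients (a, b, c, d) of the Mobius map with T(z[i]) = w[i]."""
    z1, z2, z3 = z
    w1, w2, w3 = w
    # F sends z1, z2, z3 to 0, 1, inf; G does the same for w1, w2, w3.
    # Each is stored as a flat 2x2 matrix (f0, f1, f2, f3).
    F = ((z2 - z3), -z1 * (z2 - z3), (z2 - z1), -z3 * (z2 - z1))
    G = ((w2 - w3), -w1 * (w2 - w3), (w2 - w1), -w3 * (w2 - w1))
    # T = G^-1 * F; the adjugate of G suffices, since Mobius coefficients
    # are only defined up to a common scale factor.
    a = G[3] * F[0] - G[1] * F[2]
    b = G[3] * F[1] - G[1] * F[3]
    c = -G[2] * F[0] + G[0] * F[2]
    d = -G[2] * F[1] + G[0] * F[3]
    return a, b, c, d

def apply_mobius(T, x):
    a, b, c, d = T
    return (a * x + b) / (c * x + d)

# Swap 0 and 1 while keeping i fixed:
T = mobius_from_triples([0, 1, 1j], [1, 0, 1j])
print(apply_mobius(T, 0))  # maps 0 onto 1
```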


    Reconstructing head models from photographs for individualized 3D-audio processing

    COMPUTER GRAPHICS FORUM, Issue 7 2008
    M. Dellepiane
Abstract Visual fidelity and interactivity are the main goals in Computer Graphics research, but recently audio has also been assuming an important role. Binaural rendering can provide extremely pleasing and realistic three-dimensional sound, but to achieve the best results it is necessary either to measure or to estimate the individual Head-Related Transfer Function (HRTF). This function is strictly related to the peculiar features of the ears and face of the listener. Recent sound scattering simulation techniques can calculate the HRTF starting from an accurate 3D model of a human head. Hence, the use of binaural rendering on a large scale (i.e. video games, entertainment) could depend on the possibility of producing a sufficiently accurate 3D model of a human head, starting from the smallest possible input. In this paper we present a completely automatic system, which produces a 3D model of a head starting from simple input data (five photos and some key points indicated by the user). The geometry is generated by extracting information from the images and accordingly deforming a 3D dummy to reproduce the user's head features. The system proves to be fast, automatic, robust and reliable: geometric validation and preliminary assessments show that it can be accurate enough for HRTF calculation. [source]


    Hierarchical Convex Approximation of 3D Shapes for Fast Region Selection

    COMPUTER GRAPHICS FORUM, Issue 5 2008
    Marco Attene
    Abstract Given a 3D solid model S represented by a tetrahedral mesh, we describe a novel algorithm to compute a hierarchy of convex polyhedra that tightly enclose S. The hierarchy can be browsed at interactive speed on a modern PC and it is useful for implementing an intuitive feature selection paradigm for 3D editing environments. Convex parts often coincide with perceptually relevant shape components and, for their identification, existing methods rely on the boundary surface only. In contrast, we show that the notion of part concavity can be expressed and implemented more intuitively and efficiently by exploiting a tetrahedrization of the shape volume. The method proposed is completely automatic, and generates a tree of convex polyhedra in which the root is the convex hull of the whole shape, and the leaves are the tetrahedra of the input mesh. The algorithm proceeds bottom-up by hierarchically clustering tetrahedra into nearly convex aggregations, and the whole process is significantly fast. We prove that, in the average case, for a mesh of n tetrahedra O(n log2 n) operations are sufficient to compute the whole tree. [source]
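The paper's volume-based notion of part concavity can be illustrated in 2D, where the analogue of "hull volume minus summed tetrahedra volume" is the area of a polygon's convex hull minus the polygon's own area. The sketch below is a self-contained 2D analogue only, not the authors' 3D algorithm:

```python
# 2D analogue of volume-based concavity: concavity(P) =
# area(convex_hull(P)) - area(P). Zero exactly when P is convex.

def convex_hull(points):
    """Andrew's monotone chain; returns hull vertices in CCW order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    def cross(o, a, b):
        return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def polygon_area(poly):
    """Shoelace formula; vertices must be in boundary order."""
    n = len(poly)
    s = sum(poly[i][0]*poly[(i+1) % n][1] - poly[(i+1) % n][0]*poly[i][1]
            for i in range(n))
    return abs(s) / 2.0

def concavity(poly):
    return polygon_area(convex_hull(poly)) - polygon_area(poly)

# An L-shaped (non-convex) polygon of area 3; its hull has area 3.5.
L_shape = [(0, 0), (2, 0), (2, 1), (1, 1), (1, 2), (0, 2)]
print(concavity(L_shape))  # 0.5: the corner triangle the hull adds
```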


    An Adaptive Method for Indirect Illumination Using Light Vectors

    COMPUTER GRAPHICS FORUM, Issue 3 2001
    Xavier Serpaggi
In computer graphics, several phenomena need to be taken into account in the pursuit of photo-realism. One of the most relevant is the notion of global, and more precisely indirect, illumination. In "classical" ray-tracing, if you are not under the light, then you are in a shadow. A great amount of work has been carried out proposing ray-tracing based solutions that take into account the fact that "there is a certain amount of light in shadows". All of these methods share the same weaknesses: high computation times and many parameters that must be tuned to get something out of the method. This paper proposes a generic method for computing indirect illumination, based on Monte Carlo sampling and on sequential analysis theory, which is faster and more automatic than classical methods. [source]
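The combination the abstract describes, Monte Carlo sampling with a sequential-analysis stopping rule, can be sketched generically: keep drawing samples until a confidence-interval criterion is met, instead of fixing the sample count in advance. The estimator, tolerances and integrand below are illustrative assumptions, not the paper's:

```python
import math
import random

def sequential_mc(f, sample, rel_err=0.01, min_n=100, max_n=1_000_000):
    """Average f over random samples until the 95% confidence interval
    shrinks below rel_err of the running mean (a sequential stopping rule)."""
    n, mean, m2 = 0, 0.0, 0.0
    while n < max_n:
        x = f(sample())
        n += 1
        delta = x - mean          # Welford's online mean/variance update
        mean += delta / n
        m2 += delta * (x - mean)
        if n >= min_n:
            sem = math.sqrt(m2 / (n - 1) / n)
            if 1.96 * sem <= rel_err * abs(mean):
                break
    return mean, n

random.seed(1)
# Toy "irradiance" integral: the mean of x^2 over [0, 1] is 1/3.
est, n = sequential_mc(lambda x: x * x, random.random)
print(est, n)  # estimate close to 1/3; n is chosen by the stopping rule
```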


    CAD-Based Photogrammetry for Reverse Engineering of Industrial Installations

    COMPUTER-AIDED CIVIL AND INFRASTRUCTURE ENGINEERING, Issue 4 2003
    Johan W. H. Tangelder
For instance, in the case of a servicing plant, such a library contains descriptions of simple components such as straight pipes, elbows, and T-junctions. A new installation is constructed by selecting and connecting the appropriate components from the library. This article demonstrates that one can use the same approach for reverse engineering by photogrammetry. In our technique, the operator interprets images and selects the appropriate CAD component from a library. By aligning the edges of the component's wire frame to the visible edges in the images, we implicitly determine the position, orientation, and shape of the real component. For fast object reconstruction, the alignment process has been split into two parts. Initially, the operator approximately aligns a component to the images. In a second step, a fitting algorithm is invoked for an automatic and precise alignment. Further improvement in the efficiency of the reconstruction is obtained by imposing geometric constraints on the CAD components of adjacent object parts. [source]


    Simply and reliably integrating micro heaters/sensors in a monolithic PCR-CE microfluidic genetic analysis system

    ELECTROPHORESIS, Issue 8 2009
    Runtao Zhong
Abstract A novel fabrication process is presented to construct a monolithic integrated PCR-CE microfluidic DNA analysis system as a step toward building a total genetic analysis microsystem. Microfabricated titanium/platinum (Ti/Pt) heaters and resistance temperature detectors (RTDs) were integrated on the backside of a bonded glass chip to provide good thermal transfer and precise temperature detection for the drilled PCR wells. This heater/RTD integration procedure is simple and reliable, and the resulting metal layer can easily be renewed if the Ti/Pt layer is damaged in later use or a new heater/RTD design is desired. A straightforward "RTD-calibration" method was employed to optimize the chip-based thermal cycling conditions. This method is convenient and rapid compared with a conventional RTD-calibration/temperature-adjustment method. The highest ramping rates of 14°C/s for heating and 5°C/s for cooling in a 3-µL reaction volume allow 30 complete PCR cycles in about 33 min. After effectively passivating the PCR-well surface, successful λ-phage DNA amplifications were achieved using a two- or three-temperature cycling protocol. The functionality and performance of the integrated microsystem were demonstrated by successful amplification and subsequent on-line separation/sizing of λ-phage DNA. A rapid assay for Hepatitis B virus, one of the major human pathogens, was performed in less than 45 min, demonstrating that the developed PCR-CE microsystem is capable of performing automatic and high-speed genetic analysis. [source]


    Laser-induced fluorescence detection schemes for the analysis of proteins and peptides using capillary electrophoresis

    ELECTROPHORESIS, Issue 13 2005
    Marlene Lacroix
Abstract Over the past few years, a large number of studies have described the analysis of peptides and proteins using capillary electrophoresis (CE) and laser-induced fluorescence (LIF). These studies have focused on two general goals: (i) development of automatic, selective and quick separation and detection of mixtures of peptides or proteins; (ii) generation of new methods of quantitation for very low concentrations (nanomolar and subnanomolar) of peptides. These two goals are attained with the use of covalent labelling reactions using a variety of dyes that can be readily excited by the radiation from a commonly available laser, or via the use of noncovalent labelling (immunoassay using a labelled antibody or antigen, or noncovalent dye interactions). In this review article, we summarize the work performed on protein and peptide analysis via CE-LIF. [source]


    Automated ultrasound-assisted method for the determination of the oxidative stability of virgin olive oil

    EUROPEAN JOURNAL OF LIPID SCIENCE AND TECHNOLOGY, Issue 2 2007
    José Platero-López
Abstract A fast and automated method is proposed for determining the oxidative stability of virgin olive oil by using ultrasound. The ultrasound microprobe (3 mm in diameter) was directly immersed into the olive oil sample contained in a test tube. The most influential variables in the oxidation process, namely pulse amplitude, duty cycle, irradiation time, and sample amount, were optimized. The oil absorbance at 270 nm was continuously monitored by oil recirculation through a 0.1-mm path length flow cell connected to a fiber optic microspectrometer. This short path length allowed the direct monitoring of absorbance without the need for sample dilution. The ultrasound energy was applied for 35 min, and the resulting increase in absorbance was continuously monitored. The difference between the final and the initial absorbance at 270 nm of a set of virgin olive oil samples was closely correlated with their oxidative stability calculated by the Rancimat method (R2 = 0.9915). The resulting equation enabled the prediction of the oxidative stability of virgin olive oil in a short period of time (35 min), using a simple, inexpensive, automatic and easy-to-use system. [source]
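The final step the abstract describes is a linear calibration from the absorbance increase to Rancimat stability. A sketch of such a calibration fit; the data pairs, slope and intercept below are invented for illustration (only the R² = 0.9915 figure comes from the abstract):

```python
# Ordinary least squares fit y = a*x + b, then prediction for a new sample.

def linfit(xs, ys):
    """Return slope a and intercept b of the least-squares line."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    return a, my - a * mx

# Hypothetical calibration pairs (delta A270 after sonication, Rancimat
# stability in hours); a larger absorbance rise means a less stable oil.
delta_a = [0.10, 0.15, 0.22, 0.30, 0.41]
stability_h = [62.0, 55.1, 44.9, 33.8, 18.2]
a, b = linfit(delta_a, stability_h)

def predict(d):
    return a * d + b

print(predict(0.25))  # predicted stability (h) for a new sample's delta A270
```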


    Speech- and sound-segmentation in dyslexia: evidence for a multiple-level cortical impairment

    EUROPEAN JOURNAL OF NEUROSCIENCE, Issue 8 2006
    T. Kujala
    Abstract Developmental dyslexia involves deficits in the visual and auditory domains, but is primarily characterized by an inability to translate the written linguistic code to the sound structure. Recent research has shown that auditory dysfunctions in dyslexia might originate from impairments in early pre-attentive processes, which affect behavioral discrimination. Previous studies have shown that whereas dyslexic individuals are deficient in discriminating sound distinctions involving consonants or simple pitch changes, discrimination of other sound aspects, such as tone duration, is intact. We hypothesized that such contrasts that can be discriminated by dyslexic individuals when heard in isolation are difficult to identify when occurring within words or structurally similar complex sound patterns. In the current study, we addressed how segments of pseudo-words and their non-speech counterparts are processed in dyslexia. We assessed the detection of long-duration differences in segments of these stimuli and identified the brain processes that could be associated with the behavioral results. Consistent with previous studies, we found no early cortical sound-duration discrimination deficit in dyslexia. However, differences between impaired and non-impaired readers were found in the brain processes associated with sound-change recognition as well as in the behavioral performance. This suggests that even when the early, automatic, sound discrimination processes are intact in dyslexic individuals, deficits in the later, attention-dependent processes may lead to impaired perception of speech and other complex sounds. [source]


    The Contribution of Chemoreflex Drives to Resting Breathing in Man

    EXPERIMENTAL PHYSIOLOGY, Issue 1 2001
    Safraaz Mahamed
The contribution of automatic drives to breathing at rest, relative to behavioural drives such as 'wakefulness', has been a subject of debate. We measured the combined central and peripheral chemoreflex contribution to resting ventilation using a modified rebreathing method that included a prior hyperventilation and addition of oxygen to maintain isoxia at a PET,O2 (end-tidal partial pressure of oxygen) of 100 mmHg. During rebreathing, ventilation was unrelated to PET,CO2 (end-tidal partial pressure of carbon dioxide) in the hypocapnic range, but after a threshold PET,CO2 was exceeded, ventilation increased linearly with PET,CO2. We considered the sub-threshold ventilation to be an estimate of the behavioural drives to breathe (mean ± S.E.M. = 3.1 ± 0.5 l min−1), and compared it to ventilation at rest (mean ± S.E.M. = 9.1 ± 0.7 l min−1). The difference was significant (Student's paired t test, P < 0.001). We also considered the threshold PCO2 observed during rebreathing to be an estimate of the chemoreflex threshold at rest (mean ± S.E.M. = 42.0 ± 0.5 mmHg). However, PET,CO2 during rebreathing estimates mixed venous or tissue PCO2, whereas the resting PET,CO2 during resting breathing estimates Pa,CO2 (arterial partial pressure of carbon dioxide). The chemoreflex threshold measured during rebreathing was therefore reduced by the difference in PET,CO2 at rest and at the start of rebreathing (the plateau estimates the mixed venous PCO2 at rest) in order to make comparisons. The corrected chemoreflex thresholds (mean ± S.E.M. = 26.0 ± 0.9 mmHg) were significantly less (paired Student's t test, P < 0.001) than the resting PET,CO2 values (mean ± S.E.M. = 34.3 ± 0.5 mmHg). We conclude that both the behavioural and chemoreflex drives contribute to resting ventilation. [source]
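The threshold correction described above is simple arithmetic: the rebreathing (mixed-venous) threshold is lowered by the rest-to-rebreathing-onset difference in PET,CO2. A sketch using the reported group means; the plateau value of 50.3 mmHg is not stated in the abstract, it is the hypothetical value implied by the reported 42.0, 26.0 and 34.3 mmHg means (42.0 − 26.0 = 16.0 = 50.3 − 34.3):

```python
def corrected_chemoreflex_threshold(rebreathing_threshold_mmhg: float,
                                    petco2_plateau_mmhg: float,
                                    petco2_rest_mmhg: float) -> float:
    """Shift the rebreathing threshold onto the resting (arterial) scale by
    subtracting the rest-to-rebreathing-onset PETCO2 difference."""
    return rebreathing_threshold_mmhg - (petco2_plateau_mmhg - petco2_rest_mmhg)

# Reported means: threshold 42.0 mmHg, resting PETCO2 34.3 mmHg;
# 50.3 mmHg is the plateau value implied by the corrected mean of 26.0 mmHg.
print(round(corrected_chemoreflex_threshold(42.0, 50.3, 34.3), 1))  # 26.0
```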


    Extraction of media and plaque boundaries in intravascular ultrasound images by level sets and min/max flow

    EXPERT SYSTEMS, Issue 2 2010
    Ali Iskurt
Abstract: Estimation of the plaque area in intravascular ultrasound images after extraction of the media and plaque-lumen interfaces is an important application of computer-aided diagnosis in medical imaging. This paper presents a novel system for fully automatic and fast calculation of plaque quantity by capturing the surrounding ring called the media. The system utilizes an algorithm that consists of an enhanced technique for noise removal and a method of detecting different iso-levels by sinking the image gradually under the zero level. Moreover, an important novelty of this technique is the simultaneous extraction of the media and lumen-plaque interfaces at satisfactory levels. There are no higher-dimensional surfaces or contour evolutions stopping at high image gradients; thus, the system runs very fast, with curvature velocity only, and has low complexity. Experiments also show that this shape-recovering curvature term not only removes the noisy behaviour of ultrasound images but also strengthens very weak boundaries and even completes the missing walls of the media. In addition, the lumen-plaque interface can be detected simultaneously. For validation, a new and very useful algorithm was developed for labelling of intravascular ultrasound images, taken from video sequences of 15 patients, and a comparison-based verification was done between manual contours by experts and the contours extracted by our system. [source]


    Detection and delineation of P and T waves in 12-lead electrocardiograms

    EXPERT SYSTEMS, Issue 1 2009
    Sarabjeet Mehta
    Abstract: This paper presents an efficient method for the detection and delineation of P and T waves in 12-lead electrocardiograms (ECGs) using a support vector machine (SVM). Digital filtering techniques are used to remove power line interference and baseline wander. An SVM is used as a classifier for the detection and delineation of P and T waves. The performance of the algorithm is validated using original simultaneously recorded 12-lead ECG recordings from the standard CSE (Common Standards for Quantitative Electrocardiography) ECG multi-lead measurement library. A significant detection rate of 95.43% is achieved for P wave detection and 96.89% for T wave detection. Delineation performance of the algorithm is validated by calculating the mean and standard deviation of the differences between automatic and manual annotations by the referee cardiologists. The proposed method not only detects all kinds of morphologies of QRS complexes, P and T waves but also delineates them accurately. The onsets and offsets of the detected P and T waves are found to be within the tolerance limits given in the CSE library. [source]
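Of the preprocessing steps named above, baseline-wander removal can be sketched as subtracting a wide moving average from the trace (a crude high-pass filter). The window length and the synthetic signal below are illustrative assumptions, not the authors' filter design:

```python
import math

def moving_average(x, w):
    """Centered moving average of odd width w, with edge padding."""
    half = w // 2
    padded = [x[0]] * half + list(x) + [x[-1]] * half
    csum = [0.0]
    for v in padded:
        csum.append(csum[-1] + v)
    return [(csum[i + w] - csum[i]) / w for i in range(len(x))]

def remove_baseline_wander(signal, window=201):
    """Subtract the slow-moving baseline estimate from the signal."""
    baseline = moving_average(signal, window)
    return [s - b for s, b in zip(signal, baseline)]

# Synthetic trace: slow sinusoidal drift plus a narrow "wave" bump
# around sample 500 standing in for a P or T wave.
sig = [0.5 * math.sin(2 * math.pi * i / 1000) for i in range(1000)]
sig = [s + (1.0 if 495 <= i <= 505 else 0.0) for i, s in enumerate(sig)]
clean = remove_baseline_wander(sig)
print(max(range(1000), key=lambda i: clean[i]))  # bump survives near 500
```

After detrending, the bump remains the dominant feature while the drift is largely gone, which is the point of the preprocessing stage before any classifier sees the signal.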


    Work and Employment in Small Businesses: Perpetuating and Challenging Gender Traditions

    GENDER, WORK & ORGANISATION, Issue 1 2000
    Susan Baines
More and more women and men are becoming dependent on some form of small business activity for all or part of their livelihoods, but there is little research offering insight into gender and working practices in small businesses. In this article we assess some theoretical approaches and discuss these against an empirical investigation of micro-firms run by women, men and mixed-sex partnerships. In the 'entrepreneurship' literature, with its emphasis on the individual business owner, we find little guidance. We argue that in the 'modern' micro-business, family and work are brought into proximity as in the 'in between' organizational form described by Weber. The celebrated 'flexibility' of small firms often involves the reproduction within modernity of seemingly pre-modern practices in household organization and gender divisions of labour. This is true in the Britain of the 1990s in a growing business sector normally associated neither with tradition nor with the family. Tradition, however, is never automatic or uncontested in a 'post-traditional society'. A minority of women and men in micro-enterprises actively resist traditional solutions and even traditional imagery of male and female behaviour. For this small group alone new economic conditions seem to bring new freedom. [source]


    The Hidden Politics of Administrative Reform: Cutting French Civil Service Wages with a Low-Profile Instrument

    GOVERNANCE, Issue 1 2007
    PHILIPPE BEZES
    The article addresses internal and hidden politics of changes in bureaucracies by focusing on the introduction and use of policy instruments as institutional change without radical or explicit shifts in administrative systems. Beneath public administrative reforms, it examines the use of "low-profile instruments" characterized by their technical and goal-oriented dimension but also by their low visibility to external actors due to the high complexity of their commensurating purpose and the automaticity of their use. The core case study of the paper offers a historical sociology of a technique for calculating the growth of the French civil service wage bill from the mid-1960s to the 2000s. The origins, uses, and institutionalisation of this method in the French context are explored to emphasize the important way of governing the bureaucracy at times of crisis through automatic, unobtrusive, incremental, and low-profile mechanisms. While insisting on the salience of techniques for calculating, measuring, classifying, and indexing in the contemporary art of government, it also suggests the need for observing and explaining "everyday forms of retrenchment" in bureaucracies. [source]


    The effect of additives in silages of pure timothy and timothy mixed with red clover on chemical composition and in vitro rumen fermentation characteristics

    GRASS & FORAGE SCIENCE, Issue 3 2003
    M. Hetta
    Abstract The aim was to compare the effects of additives on direct-cut silages of pure timothy and of timothy mixed with tetraploid red clover. First and second growth cuts were ensiled during three consecutive years, 1994, 1995 and 1996, either without any additive or with the addition of formic acid, or of lactic acid bacteria in combination with molasses. Effects of the additives on the degradation characteristics of the herbage and the silages were analysed using an automatic in vitro gas production (GP) technique. At the end of the in vitro procedures, organic matter and neutral-detergent fibre (NDF) degradabilities were determined. The tetraploid red clover persisted in the leys during the 3 years and was the dominant species at the second growth in the mixed leys. The herbage from the mixed crops had lower dry-matter contents, higher crude protein concentrations and higher buffering capacity compared with the pure timothy at both cuts. In general, the additives reduced pH and the concentrations of ammonium-N and acetic acid in the silages. The treated silages had a more rapid GP in both crops. The silages from the mixed crop benefited more from the additives than the grass silages did, and the additives affected the soluble fractions as well as the NDF degradability of the mixed-crop silages more than those of the grass silages. The addition of molasses in combination with a commercial inoculant resulted in increased production of lactic acid and ethanol in silages from both crops. The silages without additives could not meet the requirements for good silage according to the standards of the Swedish dairy industry. [source]


    History of hemodialyzers' designs

    HEMODIALYSIS INTERNATIONAL, Issue 2 2008
    Zbylut J. TWARDOWSKI
    Abstract Accumulation of the knowledge requisite for the development of hemodialysis started in antiquity and continued through the Middle Ages into the 20th century. Firstly, it was determined that the kidneys produce urine containing toxic substances that accumulate in the body if the kidneys fail to function properly; secondly, it was necessary to discover the processes of diffusion and dialysis; thirdly, it was necessary to develop a safe method to prevent clotting in the extracorporeal circulation; and fourthly, it was necessary to develop biocompatible dialyzing membranes. Most of the essential knowledge was acquired by the end of the 19th century. Hemodialysis as a practical means of replacing kidney function started and developed in the 20th century. The original hemodialyzers, using celloidin as a dialyzing membrane and hirudin as an anticoagulant, were used in animal experiments at the beginning of the 20th century, followed by a few attempts in humans in the 1920s. Rapid progress started with the application of cellophane membranes and of heparin as an anticoagulant in the late 1930s and 1940s. The explosion of new dialyzer designs continued in the 1950s and 1960s and ended with the development of capillary dialyzers. Cellophane was replaced by other dialyzing membranes in the 1960s, 1970s, and 1980s. Dialysis solution was originally prepared in a tank from water, electrolytes, and glucose; this solution was recirculated through the dialyzer and back to the tank. In the 1960s, a single-pass dialysis solution preparation and delivery system was designed, which used a large quantity of dialysis solution for a single dialysis. Sorbent systems, using a small volume of regenerated dialysis solution, were developed in the mid-1960s and continue to be used for home hemodialysis and acute renal failure. At the end of the 20th century, a new closed system was developed that prepared and delivered ultrapure dialysis solution, automatically reused lines and dialyzers, and prepared the machine for the next dialysis; it was specifically designed for quotidian home hemodialysis. Another system for frequent home hemodialysis or acute renal failure was developed at the turn of the 21st century; it used premanufactured dialysis solution, delivered to the home or dialysis unit, as is done for peritoneal dialysis. [source]


    The generation of hexahedral meshes for assembly geometry: survey and progress,

    INTERNATIONAL JOURNAL FOR NUMERICAL METHODS IN ENGINEERING, Issue 12 2001
    Timothy J. Tautges
    Abstract The finite element method is being used today to model component assemblies in a wide variety of application areas, including structural mechanics, fluid simulations, and others. Generating hexahedral meshes for these assemblies usually requires the use of geometry decomposition, with different meshing algorithms applied to different regions. While the primary motivation for this approach remains the lack of an automatic, reliable all-hexahedral meshing algorithm, requirements in mesh quality and mesh configuration for typical analyses are also factors. For these reasons, this approach is also sometimes required when producing other types of unstructured meshes. This paper will review progress to date in automating many parts of the hex meshing process, which has halved the time to produce all-hex meshes for large assemblies. Particular issues which have been exposed due to this progress will also be discussed, along with their applicability to the general unstructured meshing problem. Published in 2001 by John Wiley & Sons, Ltd. [source]
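    As an illustration of the kind of primitive that decomposition-based approaches apply region by region, the following sketch builds a structured ("mapped") all-hex mesh on a unit box. The function name and node ordering are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch: structured ("mapped") all-hexahedral meshing of one
# box-shaped region, the simplest primitive applied per sub-volume after
# geometry decomposition. Names and conventions are illustrative only.

def box_hex_mesh(nx, ny, nz):
    """Return (nodes, hexes) for an nx x ny x nz element grid on the unit cube."""
    def nid(i, j, k):
        # Lexicographic node index matching the node-list order below.
        return (k * (ny + 1) + j) * (nx + 1) + i

    nodes = [(i / nx, j / ny, k / nz)
             for k in range(nz + 1)
             for j in range(ny + 1)
             for i in range(nx + 1)]
    hexes = []
    for k in range(nz):
        for j in range(ny):
            for i in range(nx):
                # 8-node hex connectivity: bottom face counter-clockwise,
                # then the top face in the same order.
                hexes.append((nid(i, j, k), nid(i + 1, j, k),
                              nid(i + 1, j + 1, k), nid(i, j + 1, k),
                              nid(i, j, k + 1), nid(i + 1, j, k + 1),
                              nid(i + 1, j + 1, k + 1), nid(i, j + 1, k + 1)))
    return nodes, hexes

nodes, hexes = box_hex_mesh(2, 2, 2)  # 27 nodes, 8 hexahedra
```

The practical difficulty the paper addresses is precisely that most assemblies are not boxes: the geometry must first be decomposed into sub-volumes where a mapped or swept primitive like this one applies.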


    Prevalence, Predictors, and Prognosis of Atrial Fibrillation Early After Pulmonary Vein Isolation: Findings from 3 Months of Continuous Automatic ECG Loop Recordings

    JOURNAL OF CARDIOVASCULAR ELECTROPHYSIOLOGY, Issue 10 2009
    SANDEEP JOSHI M.D.
    Introduction: Following pulmonary vein isolation (PVI) for atrial fibrillation (AF), early recurrences are frequent, benign, and classified as part of a "blanking period." This study characterizes early recurrences and determines the implications of early AF following PVI. Methods and Results: Seventy-two consecutive patients (59.8 ± 10.7 years, 69% male) were studied following PVI for paroxysmal or persistent AF. Subjects were fitted with an external loop recorder for automatic, continuous detection of AF recurrence for 3 months. AF prevalence was highest 2 weeks after PVI (54%) and declined to an eventual low of 22%. A significant number (488, 34%) of recurrences were asymptomatic; however, all patients with ≥1 AF event had ≥1 symptomatic event. No clear predictor of early recurrence was identified. Forty-seven (65%) patients had at least 1 AF episode, predominantly (39 of 47 patients, 83%) within 2 weeks of PVI. Of the 33 patients who did not experience AF within the first 2 weeks, 85% (28/33) were complete responders (P = 0.03) at 12 months. Recurrence at any time within 3 months was not associated with procedural success or failure. Conclusions: Early AF recurrence peaks within the first few weeks after PVI but continues at a lower level until the completion of monitoring. A blanking period of 3 months is justified to identify patients with AF recurrences that do not portend procedure failure. Freedom from AF in the first 2 weeks following ablation significantly predicts long-term freedom from AF. [source]


    Short-term anti-plaque effect of two chlorhexidine varnishes

    JOURNAL OF CLINICAL PERIODONTOLOGY, Issue 8 2005
    Jan Cosyn
    Abstract Background: Chlorhexidine (CHX) varnishes have been used mainly for the prevention of caries in high-risk populations. Reports regarding their anti-plaque effect at a clinical level are limited to non-existent, as opposed to their microbiological impact on plaque formation. Aim: The aim of this preliminary investigation was to evaluate the anti-plaque effect of two CHX varnishes applied on sound enamel in relation to a positive control, a negative control and one another. Methods: Sixteen healthy subjects volunteered for this randomized-controlled, single-blind, four-treatment, four-period crossover clinical trial. A 3-day plaque re-growth model was used to determine de novo plaque accumulation following CHX rinsing, Cervitec® application, EC40® application and no therapy. The amount of plaque was measured using the Quigley and Hein plaque index and "automatic image analysis" (AIA). Results and Conclusions: Varnish treatment resulted in significantly higher plaque levels than CHX rinsing irrespective of the varnish used (p≤0.002), implying that rinsing is likely to remain the gold standard as an anti-plaque measure. However, highly significant differences were also found in favour of both varnish systems when compared with no therapy (p<0.001), which indicates that varnish treatment is an effective means of inhibiting plaque formation over a short time span. Cervitec® exhibited slightly, yet significantly, higher plaque levels in comparison with EC40®, as determined by AIA (p=0.006). Large-scale trials with a longer observation period are necessary to substantiate these results. [source]


    The role of cognition in classical and operant conditioning

    JOURNAL OF CLINICAL PSYCHOLOGY, Issue 4 2004
    Irving Kirsch
    For the past 35 years, learning theorists have been providing models that depend on mental representations, even in their simplest, most deterministic, and most mechanistic approaches. Hence, cognitive involvement (typically thought of as expectancy) is assumed for most instances of classical and operant conditioning, with current theoretical differences concerning the level of cognition that is involved (e.g., simple association vs. rule learning), rather than its presence. Nevertheless, many psychologists not in the mainstream of learning theory continue to think of cognitive and conditioning theories as rival families of hypotheses. In this article, the data pertaining to the role of higher-order cognition in conditioning are reviewed, and a theoretical synthesis is proposed that provides a role for both automatic and cognitively mediated processes. © 2004 Wiley Periodicals, Inc. J Clin Psychol. [source]


    On the meaning of meaning when being mean: commentary on Berkowitz's "On the Consideration of Automatic as Well as Controlled Psychological Processes in Aggression"

    AGGRESSIVE BEHAVIOR, Issue 2 2008
    Kenneth A. Dodge
    Abstract Berkowitz (this issue) makes a cogent case for his cognitive neo-associationist (CNA) model that some aggressive behaviors occur automatically, emotionally, and through conditioned association with other stimuli. He also proposes that they can occur without "processing," that is, without meaning. He contrasts his position with that of social information processing (SIP) models, which he casts as positing only controlled processing mechanisms for aggressive behavior. However, both CNA and SIP models posit automatic as well as controlled processes in aggressive behavior. Most aggressive behaviors occur through automatic processes, which are nonetheless rule governed. SIP models differ from the CNA model in asserting the essential role of meaning (often through nonconscious, automatic, and emotional processes) in mediating the link between a stimulus and an angry aggressive behavioral response. Aggr. Behav. 34:133–135, 2008. © 2008 Wiley-Liss, Inc. [source]


    TGSA-Flex: Extending the capabilities of the Topo-Geometrical superposition algorithm to handle flexible molecules

    JOURNAL OF COMPUTATIONAL CHEMISTRY, Issue 2 2004
    Xavier Gironés
    Abstract In this work, an extension of the already studied Topo-Geometrical Superposition Approach (TGSA) is presented. TGSA, a general-purpose, fast, automatic, and user-intuitive three-dimensional molecular alignment procedure, was originally designed to superpose rigid molecules simply based on atomic numbers, molecular coordinates, and connectivity. The algorithm is further developed to enable handling rotations around single bonds; in this way, common structural features, which were not properly aligned due to conformational causes, can be brought together, thus improving the molecular similarity picture of the final alignment. The present procedure, implemented in Fortran 90 and named TGSA-Flex, is described in detail and tested over four molecular sets: amino acids, nordihydroguaiaretic acid (NDGA) derivatives, HIV-1 protease inhibitors, and 1-[(2-hydroxyethoxy)methyl]-6-(phenylthio)thymine (HEPT) derivatives. TGSA-Flex performance is evaluated by means of computational time, number of superposed atoms (also comparing it with respect to the rigid approach), and index of fit between the compared structures. © 2003 Wiley Periodicals, Inc. J Comput Chem 25: 153–159, 2004 [source]
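    The core geometric operation that the flexible extension adds, rotating the atoms on one side of a single bond about the bond axis, can be sketched as follows. This is an illustrative reimplementation in Python (TGSA-Flex itself is written in Fortran 90), with hypothetical names, using Rodrigues' rotation formula.

```python
import math

# Illustrative sketch (not the TGSA-Flex code): rotate the atoms on one side
# of a single bond about the bond axis, the basic move a flexible alignment
# method needs on top of a rigid superposition.

def rotate_about_bond(coords, a, b, moving, angle):
    """Rotate atoms in `moving` by `angle` radians about the a->b bond axis.

    coords: list of (x, y, z) tuples; a, b: atom indices defining the bond;
    moving: indices of the atoms on b's side of the bond.
    """
    ax, ay, az = coords[a]
    ux, uy, uz = (coords[b][0] - ax, coords[b][1] - ay, coords[b][2] - az)
    n = math.sqrt(ux * ux + uy * uy + uz * uz)
    ux, uy, uz = ux / n, uy / n, uz / n  # unit vector along the bond
    c, s = math.cos(angle), math.sin(angle)
    out = list(coords)
    for i in moving:
        px, py, pz = (coords[i][0] - ax, coords[i][1] - ay, coords[i][2] - az)
        dot = ux * px + uy * py + uz * pz
        # Rodrigues' formula: p' = p c + (u x p) s + u (u . p)(1 - c)
        cx, cy, cz = (uy * pz - uz * py, uz * px - ux * pz, ux * py - uy * px)
        out[i] = (ax + px * c + cx * s + ux * dot * (1 - c),
                  ay + py * c + cy * s + uy * dot * (1 - c),
                  az + pz * c + cz * s + uz * dot * (1 - c))
    return out

# 90-degree twist of atom 2 about the z-aligned bond from atom 0 to atom 1.
coords = [(0.0, 0.0, 0.0), (0.0, 0.0, 1.0), (1.0, 0.0, 1.0)]
new = rotate_about_bond(coords, 0, 1, [2], math.pi / 2)
```

A flexible aligner would scan such dihedral angles (on one molecule, holding the other fixed) and keep the rotation that maximizes the number of superposed atom pairs.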


    SPLITTING AND BREAKING OF PISTACHIO NUTS WITH STRIKING AND HEATING

    JOURNAL OF FOOD PROCESS ENGINEERING, Issue 3 2008
    H.I. CEM BILIM
    ABSTRACT The objective of this study was to investigate the effects of heating and striking on the splitting and breaking of pistachio nuts and on obtaining their kernels without damage. For this purpose, a heating process (350°C) was applied to the pistachio nuts. The heated nuts were dropped onto a rotating disk and thrown against the wall of the container by centrifugal effect. The striking velocity was adjusted via the rotating disk, which was driven by an electric motor. Three disk rotation speeds (400, 500 and 600 1/min) and three pistachio moisture contents (6.5, 22.0 and 42.5%) were evaluated in the experiments. Results indicated that the highest splitting rate, 29.33%, was obtained at 22.0% moisture content with a 400 1/min disk speed. The highest healthy-kernel percentage obtained from unsplit pistachio nuts was 25.76%, at 6.5% moisture content with a 500 1/min disk speed. Additionally, the results showed that heating alone had no effect, while the combination of heating and striking increased both splitting and kernel recovery. PRACTICAL APPLICATIONS Consumption of healthy foods is very important for human health. The Kirmizi variety of pistachio contains a high ratio of unsplit nuts after harvest. These pistachio nuts are either consumed as appetizers or used in the confectionery sector after extraction of the inner part (kernel). The economic value of unsplit pistachio nuts is very low. For this reason, pistachio-processing plants try to split them or extract the kernel without causing any damage. Unsplit pistachio nuts are split by hand or with primitive hand tools, such as a hammer or pliers, and then extracted; this method is neither hygienic nor efficient. This study is one of the few concerning the automatic, quick and economical splitting and kernel extraction of pistachio nuts. Once problems such as splitting and kernel extraction are solved, pistachio-processing plants will achieve hygienic pistachio production. This study offers a new system for such production, with low initial cost, lower labour costs and less processing time. [source]
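    For a rough sense of the striking velocities involved, the rim speed implied by a disk rotation speed follows from v = 2πnr/60. The disk radius below is an assumed value for illustration; the abstract reports rotation speeds only.

```python
import math

# Hypothetical illustration: tangential (striking) velocity at the rim of a
# disk spinning at n rev/min. The 0.15 m radius is an assumption, not a
# value from the study.

def rim_velocity(rpm, radius_m):
    """Tangential velocity (m/s) at the rim of a disk spinning at `rpm` rev/min."""
    return 2.0 * math.pi * rpm / 60.0 * radius_m

for rpm in (400, 500, 600):
    print(rpm, round(rim_velocity(rpm, radius_m=0.15), 2))
```

Under this assumed radius the three rotation speeds span striking velocities in roughly a 3:2 ratio from highest to lowest, which is the lever the experiment varies against moisture content.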