Increasing Sophistication

Selected Abstracts

Mechanobiology and the Microcirculation: Cellular, Nuclear and Fluid Mechanics

Microcirculation (2010) 17, 179–191. doi: 10.1111/j.1549-8719.2009.00016.x Abstract Endothelial cells are stimulated by shear stress throughout the vasculature and respond with changes in gene expression and by morphological reorganization. Mechanical sensors of the cell are varied and include cell surface sensors that activate intracellular chemical signaling pathways. Here, possible mechanical sensors of the cell including reorganization of the cytoskeleton and the nucleus are discussed in relation to shear flow. A mutation in the nuclear structural protein lamin A, related to Hutchinson-Gilford progeria syndrome, is reviewed specifically as the mutation results in altered nuclear structure and stiffer nuclei; animal models also suggest significantly altered vascular structure. Nuclear and cellular deformation of endothelial cells in response to shear stress provides partial understanding of possible mechanical regulation in the microcirculation. Increasing sophistication of fluid flow simulations inside the vessel is also an emerging area relevant to the microcirculation as visualization in situ is difficult. This integrated approach to study, including medicine, molecular and cell biology, biophysics and engineering, provides a unique understanding of multi-scale interactions in the microcirculation. [source]

Maturation of Corporate Governance Research, 1993–2007: An Assessment

Boris Durisin
ABSTRACT Manuscript Type: Review Research Question/Issue: This study seeks to investigate whether governance research is in fact a discipline or whether it is rather the subject of multi-disciplinary research. We map the intellectual structure of corporate governance research and its evolution from 1993–2007. Research Findings/Results: Based on the analysis of more than 1,000 publications and 48,000 citations in Corporate Governance: An International Review (CGIR) and other academic journals, our study identifies the most influential works, the dominant subfields, and their evolution. Our study assesses the maturation of corporate governance research as a discipline; it finds increasing sophistication, depth and rigor, and consistency in its intellectual structure. Theoretical Implications: There is a large body of accumulated corporate governance research in the US, yet there is an empirical gap on cross-national studies in the literature. Furthermore, hardly any of the top cited works undertake their study in a cross-national setting. Thus, corporate governance research and CGIR in its quest to contribute to a global theory of corporate governance might benefit if articles have a cross-national methodological approach and empirical grounding in their research design and if articles explicitly aim at stating the theoretical underpinnings they draw on. Practical Implications: Globalists find in CGIR an outlet addressing economics and finance (e.g., whether and how compensation or dismissal of CEOs is related to board characteristics), management (e.g., whether and how best practice codes adoption is related to board characteristics and performance), and accounting (e.g., whether and how earnings manipulation is related to board characteristics) issues globally. [source]

Critical period: A history of the transition from questions of when, to what, to how

George F. Michel
Abstract Although age appears to be the defining characteristic of the concept of critical period, central to its investigation is the recognition that there are specific events which must occur in a particular order for the typical development of certain characteristics to occur. A brief history of some research on critical periods reveals that our questions have shifted from those of: is there a critical period and, if so, when does it occur; to questions of what contributes to the criticality of the period; and finally to how is criticality controlled during development. Abandoning age as a defining component of development has permitted the discovery of exactly how previous and current events construct subsequent events in the process of development. The shifts in questions about critical periods mark an increasing sophistication in understanding how development can be controlled. © 2005 Wiley Periodicals, Inc. Dev Psychobiol 46: 156–162, 2005. [source]

Long-term landscape evolution: linking tectonics and surface processes

Paul Bishop
Abstract Research in landscape evolution over millions to tens of millions of years slowed considerably in the mid-20th century, when Davisian and other approaches to geomorphology were replaced by functional, morphometric and ultimately process-based approaches. Hack's scheme of dynamic equilibrium in landscape evolution was perhaps the major theoretical contribution to long-term landscape evolution between the 1950s and about 1990, but it essentially 'looked back' to Davis for its springboard to a viewpoint contrary to that of Davis, as did less widely known schemes, such as Crickmay's hypothesis of unequal activity. Since about 1990, the field of long-term landscape evolution has blossomed again, stimulated by the plate tectonics revolution and its re-forging of the link between tectonics and topography, and by the development of numerical models that explore the links between tectonic processes and surface processes. This numerical modelling of landscape evolution has been built around formulation of bedrock river processes and slope processes, and has mostly focused on high-elevation passive continental margins and convergent zones; these models now routinely include flexural and denudational isostasy. Major breakthroughs in analytical and geochronological techniques have been of profound relevance to all of the above. Low-temperature thermochronology, and in particular apatite fission track analysis and (U–Th)/He analysis in apatite, have enabled rates of rock uplift and denudational exhumation from relatively shallow crustal depths (up to about 4 km) to be determined directly from, in effect, rock hand specimens. In a few situations, (U–Th)/He analysis has been used to determine the antiquity of major, long-wavelength topography. Cosmogenic isotope analysis has enabled the determination of the 'ages' of bedrock and sedimentary surfaces, and/or the rates of denudation of these surfaces. 
These latter advances represent in some ways a 'holy grail' in geomorphology in that they enable determination of 'dates and rates' of geomorphological processes directly from rock surfaces. The increasing availability of analytical techniques such as cosmogenic isotope analysis should mean that much larger data sets become possible and lead to more sophisticated analyses, such as probability density functions (PDFs) of cosmogenic ages and even of cosmogenic isotope concentrations (CICs). PDFs of isotope concentrations must be a function of catchment area geomorphology (including tectonics) and it is at least theoretically possible to infer aspects of source area geomorphology and geomorphological processes from PDFs of CICs in sediments ('detrital CICs'). Thus it may be possible to use PDFs of detrital CICs in basin sediments as a tool to infer aspects of the sediments' source area geomorphology and tectonics, complementing the standard sedimentological textural and compositional approaches to such issues. One of the most stimulating of recent conceptual advances has followed the considerations of the relationships between tectonics, climate and surface processes and especially the recognition of the importance of denudational isostasy in driving rock uplift (i.e. in driving tectonics and crustal processes). Attention has been focused very directly on surface processes and on the ways in which they may 'drive' rock uplift and thus even influence sub-surface crustal conditions, such as pressure and temperature. Consequently, the broader geoscience communities are looking to geomorphologists to provide more detailed information on rates and processes of bedrock channel incision, as well as on catchment responses to such bedrock channel processes. More sophisticated numerical models of processes in bedrock channels and on their flanking hillslopes are required. 
In current numerical models of long-term evolution of hillslopes and interfluves, for example, the simple dependency on slope of both the fluvial and hillslope components of these models means that a Davisian-type of landscape evolution characterized by slope lowering is inevitably 'confirmed' by the models. In numerical modelling, the next advances will require better parameterized algorithms for hillslope processes, and more sophisticated formulations of bedrock channel incision processes, incorporating, for example, the effects of sediment shielding of the bed. Such increasing sophistication must be matched by careful assessment and testing of model outputs using pre-established criteria and tests. Confirmation by these more sophisticated Davisian-type numerical models of slope lowering under conditions of tectonic stability (no active rock uplift), and of constant slope angle and steady-state landscape under conditions of ongoing rock uplift, will indicate that the Davis and Hack models are not mutually exclusive. A Hack-type model (or a variant of it, incorporating slope adjustment to rock strength rather than to regolith strength) will apply to active settings where there is sufficient stream power and/or sediment flux for channels to incise at the rate of rock uplift. Post-orogenic settings of decreased (or zero) active rock uplift would be characterized by a Davisian scheme of declining slope angles and non-steady-state (or transient) landscapes. Such post-orogenic landscapes deserve much more attention than they have received of late, not least because the intriguing questions they pose about the preservation of ancient landscapes were hinted at in passing in the 1960s and have recently re-surfaced. 
As we begin to ask again some of the grand questions that lay at the heart of geomorphology in its earliest days, large-scale geomorphology is on the threshold of another 'golden' era to match that of the first half of the 20th century, when cyclical approaches underpinned virtually all geomorphological work. Copyright © 2007 John Wiley & Sons, Ltd. [source]
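The abstract's suggestion of building probability density functions from detrital cosmogenic isotope concentrations can be illustrated with a minimal sketch. This is not from the source: the function name, the bin-based density estimate, and the sample values (nominally atoms of 10Be per gram of quartz) are all illustrative assumptions, standing in for the more sophisticated analyses the author anticipates.

```python
# Hypothetical sketch: estimating a probability density function (PDF)
# of detrital cosmogenic isotope concentrations (CICs) from a sediment
# sample, via a normalized histogram. All names and values are
# illustrative, not from the source abstract.

def cic_pdf(concentrations, n_bins=10):
    """Return (bin_edges, densities) such that the densities integrate
    to 1 over the sampled concentration range."""
    lo, hi = min(concentrations), max(concentrations)
    width = (hi - lo) / n_bins
    counts = [0] * n_bins
    for c in concentrations:
        # Clamp the maximum value into the last bin.
        i = min(int((c - lo) / width), n_bins - 1)
        counts[i] += 1
    total = len(concentrations)
    densities = [count / (total * width) for count in counts]
    edges = [lo + k * width for k in range(n_bins + 1)]
    return edges, densities

# Illustrative sample: 10Be concentrations (atoms per gram of quartz).
sample = [1.2e5, 1.4e5, 1.1e5, 2.0e5, 1.3e5, 1.8e5, 1.5e5, 1.6e5]
edges, dens = cic_pdf(sample, n_bins=4)
```

In the use the abstract envisages, the shape of such a PDF (unimodal, skewed, multimodal) would be compared against catchment-scale expectations to infer source-area geomorphology and tectonics.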

Forensic Risk Assessment in Intellectual Disabilities: The Evidence Base and Current Practice in One English Region

Stephen Turner
The growing interest in forensic risk assessment in intellectual disability services reflects the perception that deinstitutionalization has exposed more people to a greater risk of offending. However, 'risk' and the related idea of 'dangerousness' are problematic concepts because of connotations of dichotomous definition, stability and predictability. Assessment instruments in mainstream forensic psychiatry often combine actuarial and clinical data, and increasingly stress the dynamic nature of risk as well as the importance of situational and accidental triggers. Despite this increasing sophistication of research in mainstream forensic psychiatry, the ability to predict future offending behaviour remains very limited. Furthermore, actuarial predictors developed in studies of psychiatric or prison populations may not be valid for individuals with intellectual disabilities. Offending behaviour among people with intellectual disabilities is also hard to circumscribe because it often does not invoke full legal process or even reporting to the police. In order to discover how such problems were reflected in practice, a survey of providers in the North-west Region of England was undertaken. Seventy out of 106 providers identified as possibly relevant to this inquiry responded to a short postal questionnaire. Twenty-nine (42%) respondents, mainly in the statutory sector, reported operating a risk assessment policy relating to offending. The number of risk assessments completed in the previous year varied from none to 'several hundred'. Providers reported three main kinds of problems: (1) resources or service configuration; (2) interagency or interdisciplinary cooperation or coordination; and (3) issues relating to the effectiveness, design and content of assessment. [source]

Types of software evolution and software maintenance

Ned Chapin
Abstract The past two decades have seen increasing sophistication in software work. Now and in the future, the work of both practitioners and researchers would be helped by a more objective and finer granularity recognition of types of software evolution and software maintenance activities as actually done. To these ends, this paper proposes a clarifying redefinition of the types of software evolution and software maintenance. The paper bases the proposed classification not on people's intentions but upon objective evidence of maintainers' activities ascertainable from observation of activities and artifacts, and/or a before and after comparison of the software documentation. The classification includes taking into account in a semi-hierarchical manner evidence of the change or lack thereof in: (1) the software, (2) the documentation, (3) the properties of the software, and (4) the customer-experienced functionality. A comparison is made with other classifications and typologies. The paper provides a classified list of maintenance activities and a condensed decision tree as a summary guide to the proposed evidence-based classification of the types of software evolution and software maintenance. Copyright © 2001 John Wiley & Sons, Ltd. [source]
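The semi-hierarchical, evidence-based classification the abstract describes can be sketched as a short decision procedure. This is an illustrative reconstruction only: the function name, the evidence ordering, and the cluster labels are assumptions for the sketch, not the paper's actual type names or decision tree.

```python
# Hypothetical sketch of an evidence-based, semi-hierarchical decision
# over the four kinds of evidence the abstract lists: change (or lack
# thereof) in the software, the documentation, the properties of the
# software, and the customer-experienced functionality. Labels are
# placeholders, not the paper's type names.

def classify(software_changed, documentation_changed,
             properties_changed, functionality_changed):
    """Walk the evidence in a fixed order and return a cluster label."""
    if functionality_changed:
        return "business-rules cluster"       # customer-experienced change
    if properties_changed:
        return "software-properties cluster"  # e.g. performance, security
    if software_changed:
        return "code-only change"             # e.g. restructuring
    if documentation_changed:
        return "documentation cluster"
    return "support-interface cluster"        # no artifact changed

print(classify(False, True, False, False))  # → documentation cluster
```

The point of such an ordering is the one the abstract makes: the type assigned depends on observable evidence from activities and artifacts, or a before-and-after comparison of the documentation, rather than on the maintainer's stated intentions.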

Imaging appearance of the symptomatic perforating artery in patients with lacunar infarction: Occlusion or other vascular pathology?

Joanna M. Wardlaw FRCR
Lacunar infarction is associated with distinct clinical features. It is thought to result from occlusion of a deep perforating artery in the basal ganglia, centrum semiovale, or brainstem. However, occluded perforating arteries have only rarely been observed at postmortem in patients with lacunar stroke and have not been noted previously on imaging despite the increasing sophistication of the techniques. We observed nine patients with lacunar stroke imaged with computed tomography and magnetic resonance imaging in whom we observed a linear structure with density or signal features consistent with an occluded (or at least abnormal) perforating artery associated with the relevant lacunar infarct. The appearance might also have been caused by a leak of blood and fluid into the perivascular space around the artery, as in several patients the width of the tubular vessel-like structure (>1 mm in diameter) was greater than the expected width of a perforating artery (<0.8 mm in diameter). This interpretation is supported by the fact that the area of infarction was usually around the abnormal vessel, not at the end of it. We describe the patients' clinical and imaging features, and discuss alternative explanations for the imaging appearance and the implications for gaining insights into the cause of lacunar infarction. [source]

Life, information, entropy, and time: Vehicles for semantic inheritance

COMPLEXITY, Issue 1 2007
Antony R. Crofts
Abstract Attempts to understand how information content can be included in an accounting of the energy flux of the biosphere have led to the conclusion that, in information transmission, one component, the semantic content, or "the meaning of the message," adds no thermodynamic burden over and above costs arising from coding, transmission and translation. In biology, semantic content has two major roles. For all life forms, the message of the genotype encoded in DNA specifies the phenotype, and hence the organism that is tested against the real world through the mechanisms of Darwinian evolution. For human beings, communication through language and similar abstractions provides an additional supra-phenotypic vehicle for semantic inheritance, which supports the cultural heritages around which civilizations revolve. The following three postulates provide the basis for discussion of a number of themes that demonstrate some important consequences. (i) Information transmission through either pathway has thermodynamic components associated with data storage and transmission. (ii) The semantic content adds no additional thermodynamic cost. (iii) For all semantic exchange, meaning is accessible only through translation and interpretation, and has a value only in context. (1) For both pathways of semantic inheritance, translational and copying machineries are imperfect. As a consequence both pathways are subject to mutation and to evolutionary pressure by selection. Recognition of semantic content as a common component allows an understanding of the relationship between genes and memes, and a reformulation of Universal Darwinism. (2) The emergent properties of life are dependent on a processing of semantic content. The translational steps allow amplification in complexity through combinatorial possibilities in space and time. 
Amplification depends on the increased potential for complexity opened by 3D interaction specificity of proteins, and on the selection of useful variants by evolution. The initial interpretational steps include protein synthesis, molecular recognition, and catalytic potential that facilitate structural and functional roles. Combinatorial possibilities are extended through interactions of increasing complexity in the temporal dimension. (3) All living things show a behavior that indicates awareness of time, or chronognosis. The ~4 billion years of biological evolution have given rise to forms with increasing sophistication in sensory adaptation. This has been linked to the development of an increasing chronognostic range, and an associated increase in combinatorial complexity. (4) Development of a modern human phenotype and the ability to communicate through language, led to the development of archival storage, and invention of the basic skills, institutions and mechanisms that allowed the evolution of modern civilizations. Combinatorial amplification at the supra-phenotypical level arose from the invention of syntax, grammar, numbers, and the subsequent developments of abstraction in writing, algorithms, etc. The translational machineries of the human mind, the "mutation" of ideas therein, and the "conversations" of our social intercourse, have allowed a limited set of symbolic descriptors to evolve into an exponentially expanding semantic heritage. (5) The three postulates above open interesting epistemological questions. An understanding of topics such as dualism, the élan vital, the status of hypothesis in science, memetics, the nature of consciousness, the role of semantic processing in the survival of societies, and Popper's three worlds, requires recognition of an insubstantial component. By recognizing a necessary linkage between semantic content and a physical machinery, we can bring these perennial problems into the framework of a realistic philosophy. 
It is suggested, following Popper, that the ~4 billion years of evolution of the biosphere represents an exploration of the nature of reality at the physicochemical level, which, together with the conscious extension of this exploration through science and culture, provides a firm epistemological underpinning for such a philosophy. © 2007 Wiley Periodicals, Inc. Complexity, 2007 [source]