Major Breakthrough
Selected Abstracts

Long-term landscape evolution: linking tectonics and surface processes
EARTH SURFACE PROCESSES AND LANDFORMS, Issue 3 2007
Paul Bishop

Abstract: Research in landscape evolution over millions to tens of millions of years slowed considerably in the mid-20th century, when Davisian and other approaches to geomorphology were replaced by functional, morphometric and ultimately process-based approaches. Hack's scheme of dynamic equilibrium in landscape evolution was perhaps the major theoretical contribution to long-term landscape evolution between the 1950s and about 1990, but it essentially 'looked back' to Davis for its springboard to a viewpoint contrary to that of Davis, as did less widely known schemes, such as Crickmay's hypothesis of unequal activity. Since about 1990, the field of long-term landscape evolution has blossomed again, stimulated by the plate tectonics revolution and its re-forging of the link between tectonics and topography, and by the development of numerical models that explore the links between tectonic processes and surface processes. This numerical modelling of landscape evolution has been built around formulation of bedrock river processes and slope processes, and has mostly focused on high-elevation passive continental margins and convergent zones; these models now routinely include flexural and denudational isostasy. Major breakthroughs in analytical and geochronological techniques have been of profound relevance to all of the above. Low-temperature thermochronology, and in particular apatite fission track analysis and (U,Th)/He analysis in apatite, have enabled rates of rock uplift and denudational exhumation from relatively shallow crustal depths (up to about 4 km) to be determined directly from, in effect, rock hand specimens. In a few situations, (U,Th)/He analysis has been used to determine the antiquity of major, long-wavelength topography. Cosmogenic isotope analysis has enabled the determination of the 'ages' of bedrock and sedimentary surfaces, and/or the rates of denudation of these surfaces. These latter advances represent in some ways a 'holy grail' in geomorphology in that they enable determination of 'dates and rates' of geomorphological processes directly from rock surfaces. The increasing availability of analytical techniques such as cosmogenic isotope analysis should mean that much larger data sets become possible and lead to more sophisticated analyses, such as probability density functions (PDFs) of cosmogenic ages and even of cosmogenic isotope concentrations (CICs). PDFs of isotope concentrations must be a function of catchment area geomorphology (including tectonics) and it is at least theoretically possible to infer aspects of source area geomorphology and geomorphological processes from PDFs of CICs in sediments ('detrital CICs'). Thus it may be possible to use PDFs of detrital CICs in basin sediments as a tool to infer aspects of the sediments' source area geomorphology and tectonics, complementing the standard sedimentological textural and compositional approaches to such issues. One of the most stimulating of recent conceptual advances has followed the considerations of the relationships between tectonics, climate and surface processes, and especially the recognition of the importance of denudational isostasy in driving rock uplift (i.e. in driving tectonics and crustal processes). Attention has been focused very directly on surface processes and on the ways in which they may 'drive' rock uplift and thus even influence sub-surface crustal conditions, such as pressure and temperature. Consequently, the broader geoscience communities are looking to geomorphologists to provide more detailed information on rates and processes of bedrock channel incision, as well as on catchment responses to such bedrock channel processes. More sophisticated numerical models of processes in bedrock channels and on their flanking hillslopes are required. In current numerical models of long-term evolution of hillslopes and interfluves, for example, the simple dependency on slope of both the fluvial and hillslope components of these models means that a Davisian-type of landscape evolution characterized by slope lowering is inevitably 'confirmed' by the models. In numerical modelling, the next advances will require better parameterized algorithms for hillslope processes, and more sophisticated formulations of bedrock channel incision processes, incorporating, for example, the effects of sediment shielding of the bed. Such increasing sophistication must be matched by careful assessment and testing of model outputs using pre-established criteria and tests. Confirmation by these more sophisticated Davisian-type numerical models of slope lowering under conditions of tectonic stability (no active rock uplift), and of constant slope angle and steady-state landscape under conditions of ongoing rock uplift, will indicate that the Davis and Hack models are not mutually exclusive. A Hack-type model (or a variant of it, incorporating slope adjustment to rock strength rather than to regolith strength) will apply to active settings where there is sufficient stream power and/or sediment flux for channels to incise at the rate of rock uplift. Post-orogenic settings of decreased (or zero) active rock uplift would be characterized by a Davisian scheme of declining slope angles and non-steady-state (or transient) landscapes. Such post-orogenic landscapes deserve much more attention than they have received of late, not least because the intriguing questions they pose about the preservation of ancient landscapes were hinted at in passing in the 1960s and have recently re-surfaced. As we begin to ask again some of the grand questions that lay at the heart of geomorphology in its earliest days, large-scale geomorphology is on the threshold of another 'golden' era to match that of the first half of the 20th century, when cyclical approaches underpinned virtually all geomorphological work. Copyright © 2007 John Wiley & Sons, Ltd.
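The abstract's closing point about probability density functions of detrital cosmogenic isotope concentrations can be made concrete with a short numerical sketch. The concentrations below are hypothetical, and the Gaussian kernel density estimate is simply one convenient way to build a PDF; nothing here is taken from the paper itself.

```python
# Minimal sketch: an empirical probability density function (PDF) of detrital
# cosmogenic isotope concentrations (CICs), of the kind the abstract suggests
# could fingerprint source-area geomorphology. All concentrations are
# hypothetical values chosen for illustration.
import numpy as np
from scipy.stats import gaussian_kde

# Hypothetical 10Be concentrations (atoms per gram of quartz) measured on
# aliquots of a basin-mouth sediment sample.
detrital_cic = np.array([1.2e5, 1.5e5, 2.1e5, 0.9e5, 3.4e5, 1.8e5,
                         2.6e5, 1.1e5, 4.0e5, 2.2e5, 1.7e5, 2.9e5])

# Kernel density estimate of the detrital CIC distribution.
kde = gaussian_kde(detrital_cic)
grid = np.linspace(detrital_cic.min(), detrital_cic.max(), 200)
pdf = kde(grid)

# The mean reflects catchment-averaged denudation; the spread of the PDF
# reflects how unevenly different parts of the catchment contribute sediment.
print("mean CIC:", detrital_cic.mean())
print("PDF peak near", grid[np.argmax(pdf)], "atoms/g")
```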
The soft-output principle – reminiscences and new developments
EUROPEAN TRANSACTIONS ON TELECOMMUNICATIONS, Issue 8 2007
Peter A. Hoeher

A major breakthrough in digital communications was the provisioning of 'soft' outputs at each processing stage, with appropriate capabilities to use these as soft inputs in the next processing stage. This allowed for considerably better-performing receivers, especially under difficult mobile radio channel conditions, and set the stage for iterative processing. This article will outline the development of soft-output algorithms over the last two decades, along with associated state-of-the-art applications, and conclude with an outlook towards novel applications of the soft principle. Copyright © 2007 John Wiley & Sons, Ltd.
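As a minimal, generic illustration of what a 'soft' output is, the sketch below computes log-likelihood ratios for BPSK symbols received over an additive white Gaussian noise channel. The modulation, bit mapping and noise level are assumptions chosen for the example, not details taken from the article.

```python
# Minimal sketch of the soft-output idea: instead of a hard 0/1 decision, each
# stage passes on a log-likelihood ratio (LLR) that the next stage can reuse.
import numpy as np

def bpsk_llr(received, noise_var):
    """LLR of bit b given y = (+1 if b=0 else -1) + Gaussian noise.
    For this mapping, LLR = log P(y|b=0)/P(y|b=1) = 2*y / noise_var."""
    return 2.0 * received / noise_var

rng = np.random.default_rng(0)
bits = rng.integers(0, 2, size=8)
symbols = 1.0 - 2.0 * bits            # map bit 0 -> +1, bit 1 -> -1
noise_var = 0.5
received = symbols + rng.normal(scale=np.sqrt(noise_var), size=bits.size)

llrs = bpsk_llr(received, noise_var)
hard = (llrs < 0).astype(int)         # a hard decision is just the LLR's sign
print("bits     :", bits)
print("LLRs     :", np.round(llrs, 2))
print("decisions:", hard)
```

The magnitude of each LLR carries the reliability information that a hard decision throws away, which is what makes iterative ("turbo-style") processing possible.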
Toll-like receptor signalling on Tregs: to suppress or not to suppress?
IMMUNOLOGY, Issue 4 2008
Wendy W. C. Van Maren

Summary: To balance self-tolerance and immunity against pathogens or tumours, the immune system depends on both activation mechanisms and down-regulatory mechanisms. Immunologists have long been focusing on activation mechanisms, and a major breakthrough was the identification of the Toll-like receptor (TLR) family of proteins. TLRs recognize conserved molecular patterns present on pathogens, including bacteria, viruses, fungi and protozoa. Pathogen recognition via TLRs activates the innate as well as the adaptive immune response. The discovery of a suppressive T-cell subset that constitutively expresses the interleukin (IL)-2 receptor α-chain (CD25) has boosted efforts to investigate the negative regulation of immune responses. It is now well appreciated that these regulatory T cells (Tregs) play a pivotal role in controlling immune function. Interestingly, recent studies revealed that TLR2 signalling affects Treg expansion and function. This review will focus on the presence and influence of different TLRs on T lymphocytes, including Tregs, and their role in cancer.

The end-user application toolkit: a QoS portal for the next generation Internet
INTERNATIONAL JOURNAL OF COMMUNICATION SYSTEMS, Issue 7 2003
Charilaos A. Tsetsekas

Abstract: The support of quality of service (QoS) in the Internet has become one of the most important topics within the Internet community. The introduction of the Integrated Services (IntServ) and the Differentiated Services (DiffServ) architectures was a major breakthrough in this direction. Enhanced by the Bandwidth Broker concept, DiffServ aims to provide QoS in the Internet through the prioritization of some IP flows over others. However, up to now the DiffServ architecture lacks a standard mechanism for the interaction between users/applications and the Bandwidth Brokers (BB), so that end-to-end QoS can be achieved. In this paper we present a distributed middleware architecture for the transparent support of QoS in the Internet. The paper focuses on bridging the gap that currently exists between applications and the network and presents the end-user application toolkit (EAT). The EAT middleware provides a framework for the presentation of network services to users, the description and selection of QoS parameters, the forwarding of reservation requests and the verification of the accredited QoS level. Through the concept of application profiles, it aims to support QoS for legacy applications, that is, commercial applications that cannot be modified to support QoS. Copyright © 2003 John Wiley & Sons, Ltd.
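To make the idea of application profiles and reservation requests more tangible, here is a toy sketch of the kind of data a QoS middleware might forward to a Bandwidth Broker on behalf of a legacy application. All class and field names are invented for the example; they are not the actual EAT or Bandwidth Broker interfaces described in the paper.

```python
# Illustrative sketch only: a toy "application profile" and reservation request
# of the general kind the abstract describes the middleware forwarding.
from dataclasses import dataclass

@dataclass
class ApplicationProfile:
    name: str              # e.g. a legacy video-conferencing client
    codec: str
    peak_rate_kbps: int    # bandwidth the flow needs at peak
    max_delay_ms: int      # end-to-end delay bound requested
    dscp: str              # DiffServ code point the flow should be marked with

@dataclass
class ReservationRequest:
    profile: ApplicationProfile
    src: str
    dst: str
    duration_s: int

# A profile lets an unmodified ("legacy") application obtain QoS: the
# middleware looks up its traffic description instead of asking the
# application itself.
profile = ApplicationProfile("legacy-vc", "H.263", peak_rate_kbps=384,
                             max_delay_ms=150, dscp="EF")
request = ReservationRequest(profile, src="10.0.0.5", dst="10.0.1.9",
                             duration_s=3600)
print(request)
```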
Causes and consequences of proteinuria: the kidney filtration barrier and progressive renal failure
JOURNAL OF INTERNAL MEDICINE, Issue 3 2003
K. Tryggvason

Abstract: Tryggvason K, Pettersson E (Karolinska Institute, Stockholm, Sweden). Causes and consequences of proteinuria: the kidney filtration barrier and progressive renal failure (Review). J Intern Med 2003; 254: 216–224. The past few years have witnessed a major breakthrough in the understanding of the molecular mechanisms and ultrastructural changes behind the development of proteinuria. The discovery of several proteins in the glomerular podocyte and slit diaphragm, where mutations lead to disease, has revealed the importance of this cell with its diaphragm as the major filtration barrier, as opposed to the glomerular basement membrane (GBM) previously ascribed this function. Furthermore, accumulating clinical as well as experimental evidence points to the harmful effects of proteinuria, irrespective of the original damage. The purpose of this review is to shed light on what we know today about the two sides of this 'coin', the causes and the consequences of proteinuria.

Coronary Artery Bypass Surgery Versus Percutaneous Coronary Intervention with Drug-Eluting Stent Implantation in Patients with Multivessel Coronary Disease
JOURNAL OF INTERVENTIONAL CARDIOLOGY, Issue 1 2007
ZHEN KUN YANG M.D.

Background: Drug-eluting stents (DES) constitute a major breakthrough in restenosis prevention after percutaneous coronary intervention (PCI). This study compared the clinical outcomes of PCI using DES versus coronary artery bypass graft (CABG) surgery in patients with multivessel coronary artery disease (MVD) in the real world. Methods: From January 2003 to December 2004, 466 consecutive patients with MVD underwent revascularization, 235 by PCI with DES and 231 by CABG. The study end-point was the incidence of major adverse cardiovascular events (MACEs) in the first 30 days after the procedure and during follow-up. Results: Most preoperative characteristics were similar in the two groups, but left main disease (24.7% vs 2.6%, P < 0.001) and three-vessel disease (65% vs 54%, P = 0.02) were more prevalent in the CABG group. The number of coronary lesions was also greater in the CABG group (3.7 ± 1.1 vs 3.3 ± 1.1, P < 0.001). Despite higher early morbidity (3.9% vs 0.8%, P = 0.03) associated with CABG, there were no significant differences in composite MACEs in the first 30 days between the two groups. During follow-up (mean 25 ± 8 months), the incidence of death, myocardial infarction, or cerebrovascular event was similar in both groups (PCI 6.3% vs CABG 5.6%, P = 0.84). However, bypass surgery still afforded a lower need for repeat revascularization (2.8% vs 10.4%, P = 0.001). Consequently, the overall MACE rate (14.5% vs 7.9%, P = 0.03) remained higher after PCI. Conclusion: PCI with DES is a safe and feasible alternative to CABG for selected patients with MVD. The reintervention gap was further narrowed in the era of DES. Aside from restenosis, progression of disease needs to receive substantial emphasis.

Diffuse In-Stent Restenosis
JOURNAL OF INTERVENTIONAL CARDIOLOGY, Issue 6 2001
HANS STÖRGER M.D.

Stent restenosis, especially the diffuse pattern, has developed into a significant clinical and economic problem. It has been estimated that up to 250,000 patients developed in-stent restenosis in 2000 alone; two-thirds of them can be expected to have diffuse in-stent restenosis, which is difficult to treat because of high recurrence rates. None of the conventionally available interventional treatment modalities provides optimal long-term results. Intravascular radiation therapy is currently the only effective percutaneous therapy for combating in-stent restenosis. Late thrombotic complications have largely been eliminated by extended antiplatelet regimens. Geographical miss, a major reason for recurrence of in-stent restenosis after brachytherapy, can be reduced by an improved radiation technique. The first preliminary data on drug-eluting stents, showing only minimal neointimal proliferation at 6 months postimplantation, could represent a major breakthrough in the quest to solve restenosis.
Air pollution: A half century of progress
AICHE JOURNAL, Issue 6 2004
John H. Seinfeld

Abstract: In the 50 years since the air pollution episodes of Donora, PA and London, U.K., a great deal of progress has been made in understanding the nature and sources of air pollution and the atmospheric transport and transformation of pollutants. Also, many significant technological advances in air pollution control equipment, such as the automobile exhaust gas catalytic converter, have led to effective reduction of emissions from a variety of major pollution sources. Finally, remarkable developments in instrumentation for sampling trace species in the atmosphere have been and continue to be made. Relatively less progress has been made in understanding the biological mechanisms by which pollutants lead to human injury and mortality. In this review the focus is on the extraordinary progress that has been made over the last half century in understanding the atmospheric nature and behavior of pollutants, both gaseous and particulate. A major breakthrough was the determination of the gas-phase chemistry of both the natural and polluted atmosphere, chemistry that leads to the formation of ozone and a vast array of oxidized molecules. The mechanisms of the oxidation of atmospheric sulfur dioxide, one of the main primary pollutants, were elucidated. Finally, the chemistry, physics, and optics of atmospheric particulate matter (aerosols) have been laid open by many stunning research achievements. Whereas 50 years ago air pollution was thought to be confined to the area around a city, it is now recognized that species emitted on one continent frequently find their way to other continents. Strategies for dealing with a truly global atmospheric backyard now represent a major challenge. © 2004 American Institute of Chemical Engineers AIChE J, 50: 1096–1108, 2004

The Rights of Children, the Rights of Nations: Developmental Theory and the Politics of Children's Rights
JOURNAL OF SOCIAL ISSUES, Issue 4 2008
Colette Daiute

The Convention on the Rights of the Child (CRC), adopted by the U.N. General Assembly (1989), is a major breakthrough in defining children as fully human and working to ensure them the attendant benefits worldwide. While children's rights as equal human beings may seem obvious in the 21st century, the politics of establishing and ensuring such rights are contentious. The CRC is a brilliant negotiation of conceptions of the child and international relations, yet certain tensions in the children's rights process lead to a lack of clarity in a global situation that continues to leave millions of children at risk. Analyzing the CRC and related practices from a developmental perspective can help identify obstacles to the advancement of children's rights, especially those related to opportunities for rights-based thinking and the exercise of self-determination and societal-determination rights. In this article, I offer a qualitative analysis of children's rights in the context of what I refer to as the CRC activity-meaning system. I present a theoretical framework for considering this system of policy and practice as enacted in the CRC treaty and related monitoring, reporting, qualifying, and implementing documents. A discourse analysis of conceptions of the child and those responsible for ensuring their rights in seven representative documents (including the CRC treaty, a report by the U.N. Committee on the Rights of the Child, minutes of a U.N. Security Council meeting, reports by a State-Party, and a report by a civil society group in that country) reveals tensions inherent in the CRC activity-meaning system. Emerging from this analysis is a tension between children's rights and nations' rights. Created in part via explicit and implicit assumptions about child development in the CRC as these posit responsibilities across actors in the broader CRC system, this tension challenges the implementation of children's rights and the development of children's rights-based understandings. I use this analysis to explain why future research and practice should address the development of children's rights-based understanding not only in terms of maturation or socialization but also as integral to salient conflicts in their everyday lives.
The development of a facet analysis system to identify and measure the dimensions of interaction in online learning
JOURNAL OF THE AMERICAN SOCIETY FOR INFORMATION SCIENCE AND TECHNOLOGY, Issue 11 2007
Shawne D. Miksa

The development of a facet analysis system to code and analyze data in a mixed-method study is discussed. The research goal was to identify the dimensions of interaction that contribute to student satisfaction in online Web-supported courses. The study was conducted between 2000 and 2002 at the Florida State University School of Information Studies. The researchers developed a facet analysis system that meets S. R. Ranganathan's (1967) requirements for articulation on three planes (idea, verbal, and notational). This system includes a codebook (verbal), coding procedures, and formulae (notational) for quantitative analysis of logs of chat sessions and postings to discussion boards for eight master's-level courses taught online during the fall 2000 semester. Focus group interviews were subsequently held with student participants to confirm that the results of the facet analysis reflected their experiences with the courses. The system was developed through a process of emergent coding. The researchers have been unable to identify any prior use of facet analysis for the analysis of research data as in this study. Identifying the facet analysis system was a major breakthrough in the research process, which, in turn, provided the researchers with a lens through which to analyze and interpret the data. In addition, identification of the faceted nature of the system opens up new possibilities for automation of the coding process.
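The following toy sketch illustrates, in a generic way, what facet-style coding and its quantitative summary can look like. The facet names and codes are invented for the example; the study's actual codebook captured dimensions of interaction that this mimics only in spirit.

```python
# Illustrative sketch of facet-style coding of discussion-board postings.
from collections import Counter

# A tiny faceted codebook (verbal plane): each facet has a closed set of codes.
codebook = {
    "participant": {"student", "instructor"},
    "direction":   {"to_class", "to_individual"},
    "purpose":     {"content", "procedure", "social"},
}

# Each coded posting is a mapping facet -> code (one code per facet).
coded_posts = [
    {"participant": "student",    "direction": "to_class",      "purpose": "content"},
    {"participant": "instructor", "direction": "to_individual", "purpose": "procedure"},
    {"participant": "student",    "direction": "to_class",      "purpose": "social"},
]

# Validate codes against the codebook, then tally how often each code occurs,
# the kind of quantitative summary the notational plane makes possible.
for post in coded_posts:
    for facet, code in post.items():
        assert code in codebook[facet], f"unknown code {code!r} for facet {facet}"

for facet in codebook:
    print(facet, Counter(post[facet] for post in coded_posts))
```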
Magnetar oscillations pose challenges for strange stars
MONTHLY NOTICES OF THE ROYAL ASTRONOMICAL SOCIETY: LETTERS (ELECTRONIC), Issue 1 2007
Anna L. Watts

Abstract: Compact relativistic stars allow us to study the nature of matter under extreme conditions, probing regions of parameter space that are otherwise inaccessible. Nuclear theory in this regime is not well constrained: one key issue is whether neutron stars are in fact composed primarily of strange quark matter. Distinguishing the two possibilities, however, has been difficult. The recent detection of seismic vibrations in the aftermath of giant flares from two magnetars (highly magnetized compact stars) is a major breakthrough. The oscillations excited seem likely to involve the stellar crust, the properties of which differ dramatically for strange stars. We show that the resulting mode frequencies cannot be reconciled with the observations for reasonable magnetar parameters. Ruling out strange star models would place a strong constraint on models of dense quark matter.

A Comparative Analysis of President Clinton and Bush's Handling of the North Korean Nuclear Weapons Program: Power and Strategy
PACIFIC FOCUS, Issue 1 2004
Ilsu Kim

The purposes of this paper are: 1) to examine and analyze how the two presidents' policy goals in dealing with North Korea actually materialized; 2) to illustrate how these two presidents implemented their policy goals toward North Korea; 3) to discuss the Congressional responses to the presidents' policy goals toward North Korea; and 4) to provide a comparative analysis of the two presidents' handling of North Korea. This study shows that different presidents have dealt with North Korean issues in different ways. Two such presidents, Bill Clinton and George W. Bush, tried at the beginning of their terms as president to ignore the brewing problems in North Korea. However, both were forced to address the North's nuclear issues early on in their respective administrations. Their decisions in dealing with North Korean nuclear capabilities helped to define their early reputations as foreign policy makers. Yet the domestic as well as international contexts that Presidents Clinton and Bush faced were somewhat different. President Clinton maintained that the North's nuclear crisis arose from North Korea's security fears: abandoned by its two Cold War patrons, economically bankrupt, and internationally isolated, the North Korean government saw the pursuit of nuclear weapons and ballistic missiles as the only path to survival and security for its regime. In this regard, Clinton's actual efforts to resolve the issues surrounding the North's nuclear program appeared ambiguous and inconsistent. This led to the temporary suspension of the North's nuclear ambitions through an Agreed Framework. President Bush, by contrast, stuck to a more hard-nosed approach. He continues to demand a complete, verifiable and irreversible dismantling of the nuclear program before any provision of economic or humanitarian assistance is extended toward North Korea. Bush favors multilateral negotiations, which leads the DPRK to feel more isolated than before. Although the second six-party talks ended without a major breakthrough, it seems that all parties except the North think the meeting was successful in terms of lowering tensions in Korea. This case study demonstrates several observable features that characterize the president's role in shaping North Korean policy. A president who wants to take a new approach to some element of U.S. policy can be caught between the diplomat's desire for flexibility and the power of domestic political forces. The president can achieve success, but only if the new direction in policy finds acceptance on Capitol Hill.
Recent advances in the role and biosynthesis of ascorbic acid in plants
PLANT CELL & ENVIRONMENT, Issue 4 2001
P. L. Conklin

Abstract: The past few years have provided many advances in the role and biosynthesis of L-ascorbic acid (AsA) in plants. There is an increasing body of evidence confirming that AsA plays an important role in the detoxification of reactive oxygen species. The role of AsA in photoprotection has been confirmed in vivo with the use of Arabidopsis mutants. A player in the defence against reactive oxygen species, AsA peroxidase, has been extensively studied at the molecular level, and regulation of this key enzymatic activity appears to occur at several levels. As a cofactor in the hydroxylation of prolyl and lysyl residues by peptidyl-prolyl and -lysyl hydroxylases, AsA plays a part in cell wall synthesis, defence, and possibly cell division. The maintenance of reduced levels of AsA appears to be highly regulated, involving the interplay of both monodehydroascorbate and dehydroascorbate reductases and possibly auxin. A major breakthrough in plant AsA biosynthesis has been made recently, and strong biochemical and genetic evidence suggests that GDP-mannose and L-galactose are key substrates. In addition, evidence for an alternative AsA biosynthetic pathway(s) exists and awaits additional scrutiny. Finally, newly described Arabidopsis mutants deficient in AsA will further increase our understanding of AsA biosynthesis.

A Stem Cell Molecular Signature: Are There Hallmark Properties That Are Shared by All Stem Cells?
CHEMBIOCHEM, Issue 8 2003
Dr. Ute Schepers

Where does the problem stem from? A major breakthrough was made toward the identification of genes that give stem cells their unique properties. This "stem cell molecular signature" or "genetic blueprint" will certainly offer a powerful resource for understanding some of the stem-cell-related diseases and will become a basis for new experiments on the therapeutic application of stem cells.

Epidemiology and diagnosis of Helicobacter pylori infection
HELICOBACTER, Issue 2002
Hazel Mitchell

There have been no major breakthroughs in the field of epidemiology and diagnosis of Helicobacter pylori infection over the last year; for this reason, these two topics will be treated in the same chapter. Information on the incidence of infection, as in the study of Malaty et al., is now being published from long-term cohort follow-ups. The route of transmission of H. pylori remains controversial, with circumstantial evidence for infection via exposure to animals, contaminated water supplies and oral reservoirs being reported. The value of citric acid to improve urea breath test (UBT) results has been documented. A novel stool test has been released on the market and we are awaiting more information, while detection of antibodies in urine gave satisfactory results. However, the most interesting data come from the study of McColl et al., who clearly proved on a large sample with a 1-year follow-up that the 'test and treat' strategy using UBT, as proposed in the Maastricht Consensus Report, is definitely the method to use.
Osteoclastogenesis, Bone Resorption, and Osteoclast-Based Therapeutics
JOURNAL OF BONE AND MINERAL RESEARCH, Issue 4 2003
Mone Zaidi

Abstract: Over the past decade, advances in molecular tools, stem cell differentiation, osteoclast and osteoblast signaling mechanisms, and genetically manipulated mouse models have resulted in major breakthroughs in understanding osteoclast biology. This review focuses on key advances in our understanding of the molecular mechanisms underlying the formation, function, and survival of osteoclasts. These include key signals mediating osteoclast differentiation, including PU.1, RANK, CSF-1/c-fms, and src, and key specializations of the osteoclast, including HCl secretion driven by the H+-ATPase and the secretion of collagenolytic enzymes including cathepsin K and matrix metalloproteinases (MMPs). These pathways and highly expressed proteins provide targets for specific therapies to modify bone degradation. The main outstanding issues, basic and translational, will be considered in relation to the osteoclast as a target for antiresorptive therapies.

Urquhart's and Garfield's Laws: The British controversy over their validity
JOURNAL OF THE AMERICAN SOCIETY FOR INFORMATION SCIENCE AND TECHNOLOGY, Issue 9 2001
Stephen J. Bensman

The British controversy over the validity of Urquhart's and Garfield's Laws during the 1970s constitutes an important episode in the formulation of the probability structure of human knowledge. This controversy took place within the historical context of the convergence of two scientific revolutions, the bibliometric and the biometric, that had been launched in Britain. The preceding decades had witnessed major breakthroughs in understanding the probability distributions underlying the use of human knowledge. Two of the most important of these breakthroughs were the laws posited by Donald J. Urquhart and Eugene Garfield, who played major roles in establishing the institutional bases of the bibliometric revolution. For his part, Urquhart began his realization of S. C. Bradford's concept of a national science library by analyzing the borrowing of journals on interlibrary loan from the Science Museum Library in 1956. He found that 10% of the journals accounted for 80% of the loans and formulated Urquhart's Law, by which the interlibrary use of a journal is a measure of its total use. This law underlay the operations of the National Lending Library for Science and Technology (NLLST), which Urquhart founded. The NLLST became the British Library Lending Division (BLLD) and ultimately the British Library Document Supply Centre (BLDSC). In contrast, Garfield did a study of 1969 journal citations as part of the process of creating the Science Citation Index (SCI), formulating his Law of Concentration, by which the bulk of the information needs in science can be satisfied by a relatively small, multidisciplinary core of journals. This law became the operational principle of the Institute for Scientific Information created by Garfield. A study at the BLLD under Urquhart's successor, Maurice B. Line, found low correlations of NLLST use with SCI citations, and publication of this study started a major controversy, during which both laws were called into question. The study was based on the faulty use of the Spearman rank-correlation coefficient, and the controversy over it was instrumental in causing B. C. Brookes to investigate bibliometric laws as probabilistic phenomena and begin to link the bibliometric with the biometric revolution. This paper concludes with a resolution of the controversy by means of a statistical technique that incorporates Brookes' criticism of the Spearman rank-correlation method and demonstrates the mutual supportiveness of the two laws.
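The concentration effect at the heart of both laws, a small core of journals accounting for most of the use, is easy to illustrate numerically. The sketch below draws hypothetical, heavily skewed journal-use counts and reports the share of loans captured by the top 10% of journals; the distribution and its parameter are assumptions chosen for illustration, not data from the paper.

```python
# Minimal sketch of the concentration effect behind Urquhart's and Garfield's
# Laws, using hypothetical loan counts drawn from a skewed (Zipf) distribution.
import numpy as np

rng = np.random.default_rng(42)
loans = np.sort(rng.zipf(a=2.0, size=1000))[::-1]   # "journal use" counts, descending

top10 = int(0.10 * loans.size)
share = loans[:top10].sum() / loans.sum()
print(f"top 10% of journals account for {share:.0%} of loans")
```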