Pattern Matching (pattern + matching)


Selected Abstracts


Database of queryable gene expression patterns for Xenopus

DEVELOPMENTAL DYNAMICS, Issue 6 2009
Michael J. Gilchrist
Abstract The precise localization of gene expression within the developing embryo, and how it changes over time, is one of the most important sources of information for elucidating gene function. As a searchable resource, this information has up until now been largely inaccessible to the Xenopus community. Here, we present a new database of Xenopus gene expression patterns, queryable by specific location or region in the embryo. Pattern matching can be driven either from an existing in situ image, or from a user-defined pattern based on development stage schematic diagrams. The data are derived from the work of a group of 21 Xenopus researchers over a period of 4 days. We used a novel, rapid manual annotation tool, XenMARK, which exploits the ability of the human brain to make the necessary distortions in transferring data from the in situ images to the standard schematic geometry. Developmental Dynamics 238:1379-1388, 2009. © 2009 Wiley-Liss, Inc. [source]
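The abstract does not describe the matching machinery itself; as a rough illustration of region-based pattern querying of this kind, the sketch below reduces each annotated in situ image to a set of schematic-grid cells and ranks genes by overlap with a user-drawn query region. All names here (ExpressionRecord, overlap_score, the toy records) are hypothetical and are not part of XenMARK or the published database.

```python
# Hypothetical sketch of region-based querying of expression patterns.
# Each annotated in situ image is reduced to the set of schematic-grid
# cells in which staining was marked; a user query is another cell set.

from dataclasses import dataclass

@dataclass
class ExpressionRecord:            # hypothetical record, not the XenMARK schema
    gene: str
    stage: int
    cells: frozenset               # grid cells with annotated expression

def overlap_score(query_cells, record):
    """Jaccard-style overlap between the query region and a record."""
    inter = len(query_cells & record.cells)
    union = len(query_cells | record.cells)
    return inter / union if union else 0.0

def search(database, query_cells, stage=None, top_n=10):
    hits = [r for r in database if stage is None or r.stage == stage]
    hits.sort(key=lambda r: overlap_score(query_cells, r), reverse=True)
    return hits[:top_n]

# Example: find stage-28 genes expressed near a user-selected region.
db = [
    ExpressionRecord("sox2", 28, frozenset({(3, 4), (3, 5), (4, 5)})),
    ExpressionRecord("myod1", 28, frozenset({(7, 2), (8, 2), (8, 3)})),
]
print([r.gene for r in search(db, frozenset({(3, 5), (4, 5)}), stage=28)])
```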


Semantic patterns for user-interactive question answering

CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 7 2008
Tianyong Hao
Abstract A new type of semantic pattern is proposed in this paper, which can be used by users to post questions and answers in user-interactive question answering (QA) systems. The necessary procedures for using semantic patterns in a QA system are also presented, including question structure analysis, pattern matching, pattern generation, pattern classification and answer extraction. Both manual creation and automatic generation methods are proposed for building patterns for different applications. A pattern instantiation level metric is also presented for predicting the quality of generated or learned patterns. We implemented a user interface for using semantic patterns in our QA system, which allows users to post and answer questions effectively. Copyright © 2007 John Wiley & Sons, Ltd. [source]
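The abstract does not reproduce the pattern formalism; as a minimal sketch of the match-and-extract step in such a system, the following uses surface templates with named slots. The pattern set, type labels and example question are invented for illustration and are not the paper's semantic patterns.

```python
import re

# Hypothetical question patterns: a surface template with typed slots.
# Matching a user question against these patterns yields a question type
# plus slot bindings that downstream answer extraction could use.
PATTERNS = [
    (re.compile(r"^who (?:is|was) (?P<person>.+?)\??$", re.I), "PERSON_DEF"),
    (re.compile(r"^when (?:did|was) (?P<event>.+?)\??$", re.I), "EVENT_TIME"),
    (re.compile(r"^where is (?P<place>.+?)\??$", re.I), "LOCATION"),
]

def match_question(question):
    """Return (pattern type, slot bindings) for the first matching pattern."""
    for regex, ptype in PATTERNS:
        m = regex.match(question.strip())
        if m:
            return ptype, m.groupdict()
    return None, {}

print(match_question("Who was Ada Lovelace?"))
# ('PERSON_DEF', {'person': 'Ada Lovelace'})
```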


An astronomical pattern-matching algorithm for computer-aided identification of whale sharks Rhincodon typus

JOURNAL OF APPLIED ECOLOGY, Issue 6 2005
Z. ARZOUMANIAN
Summary 1. The formulation of conservation policy relies heavily on demographic, biological and ecological knowledge that is often elusive for threatened species. Essential estimates of abundance, survival and life-history parameters are accessible through mark and recapture studies given a sufficiently large sample. Photographic identification of individuals is an established mark and recapture technique, but its full potential has rarely been exploited because of the unmanageable task of making visual identifications in large data sets. 2. We describe a novel technique for identifying individual whale sharks Rhincodon typus through numerical pattern matching of their natural surface 'spot' colourations. Together with scarring and other markers, spot patterns captured in photographs of whale shark flanks have been used, in the past, to make identifications by eye. We have automated this process by adapting a computer algorithm originally developed in astronomy for the comparison of star patterns in images of the night sky. 3. In tests using a set of previously identified shark images, our method correctly matched pairs exhibiting the same pattern in more than 90% of cases. From a larger library of previously unidentified images, it has to date produced more than 100 new matches. Our technique is robust in that the incidence of false positives is low, while failure to match images of the same shark is predominantly attributable to foreshortening in photographs obtained at oblique angles of more than 30°. 4. We describe our implementation of the pattern-matching algorithm, estimates of its efficacy, its incorporation into the new ECOCEAN Whale Shark Photo-identification Library, and prospects for its further refinement. We also comment on the biological and conservation implications of the capability of identifying individual sharks across wide geographical and temporal spans. 5. Synthesis and applications. An automated photo-identification technique has been developed that allows for efficient 'virtual tagging' of spotted animals. The pattern-matching software has been implemented within a Web-based library created for the management of generic encounter photographs and derived data. The combined capabilities have demonstrated the reliability of whale shark spot patterns for long-term identifications, and promise new ecological insights. Extension of the technique to other species is anticipated, with attendant benefits to management and conservation through improved understanding of life histories, population trends and migration routes, as well as ecological factors such as exploitation impact and the effectiveness of wildlife reserves. [source]
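The astronomical technique referred to is point-pattern matching of the kind used to align star fields: triangles built from point triplets are compared via side-length ratios, which are invariant to translation, rotation and overall scale. The sketch below illustrates that idea on spot coordinates; it is a simplification for exposition, not the published ECOCEAN implementation.

```python
from itertools import combinations
from math import dist

# Sketch of triangle-based point-pattern matching, the idea behind
# star-matching algorithms adapted here for whale shark spot patterns.

def triangle_key(p, q, r, digits=2):
    """Scale/rotation-invariant shape descriptor of a point triplet."""
    a, b, c = sorted([dist(p, q), dist(q, r), dist(r, p)], reverse=True)
    if a == 0:
        return None
    return (round(b / a, digits), round(c / a, digits))

def similarity(spots1, spots2):
    """Fraction of spots1 triangles whose shape also occurs in spots2."""
    keys2 = {triangle_key(*t) for t in combinations(spots2, 3)}
    tris1 = list(combinations(spots1, 3))
    hits = sum(1 for t in tris1 if triangle_key(*t) in keys2)
    return hits / len(tris1) if tris1 else 0.0

# Two photographs of the same flank, differing only by scale and offset.
flank_a = [(0, 0), (2, 1), (3, 4), (5, 2), (1, 5)]
flank_b = [(10 + 2 * x, 20 + 2 * y) for x, y in flank_a]
print(similarity(flank_a, flank_b))   # close to 1.0 for matching patterns
```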


Virtual Electrode Polarization Leads to Reentry in the Far Field

JOURNAL OF CARDIOVASCULAR ELECTROPHYSIOLOGY, Issue 8 2001
ANNETTE E. LINDBLOM M.S.
Virtual Electrode Polarization. Introduction: Our previous article examined cardiac vulnerability to reentry in the near field within the framework of the virtual electrode polarization (VEP) concept. The present study extends this examination to the far field and compares its predictions to the critical point hypothesis. Methods and Results: We simulate the electrical behavior of a sheet of myocardium using a two-dimensional bidomain model. The fiber field is extrapolated from a set of rabbit heart fiber directions obtained experimentally. An S1 stimulus is applied along the top or left border. An extracellular line electrode on the top delivers a cathodal or anodal S2 stimulus. A VEP pattern matching that seen experimentally is observed and covers the entire sheet, thus constituting a far-field effect. Reentry arises from break excitation, make excitation, or a combination of both, and subsequent propagation through deexcited and recovered areas. Reentry occurs in cross-field, parallel-field, and uniform refractoriness protocols. For long coupling intervals (CIs) above CImakemin (defined as the shortest CI at which make excitation can take place), rotors move away from the cathodal electrode and the S1 site for increases in S2 strength and CI, respectively. For cathodal S2 stimuli, findings are consistent with the critical point hypothesis. For CIs below CImakemin, reentry is initiated by break excitation only, and the resulting reentrant patterns are no longer consistent with those predicted by the critical point hypothesis. Conclusion: Shock-induced VEP can explain vulnerability in the far field. The VEP theory of vulnerability encompasses the critical point hypothesis for cathodal S2 shocks at long CIs. [source]
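A bidomain simulation is far beyond a short example, but the S1-S2 vulnerability scan described above can be outlined as a parameter sweep over coupling interval and S2 strength. The stub below only shows how such a scan would be organized; simulate_s1s2 is a placeholder, not the authors' model, and all names and values are assumptions.

```python
# Illustrative outline of an S1-S2 vulnerability scan. simulate_s1s2 is a
# stand-in for a full bidomain tissue simulation; it is NOT the study's
# model, only a stub showing how the (coupling interval, S2 strength)
# plane would be swept and each trial classified for reentry.

def simulate_s1s2(coupling_interval_ms, s2_strength, protocol="cross-field"):
    """Placeholder: run one S1-S2 trial and report whether reentry formed."""
    raise NotImplementedError("requires a bidomain tissue model")

def vulnerability_grid(coupling_intervals, s2_strengths, protocol="cross-field"):
    grid = {}
    for ci in coupling_intervals:
        for s2 in s2_strengths:
            grid[(ci, s2)] = simulate_s1s2(ci, s2, protocol)
    return grid

# e.g. vulnerability_grid(range(120, 321, 20), [5, 10, 20, 40])
```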


Personal Identification Using the Frontal Sinus

JOURNAL OF FORENSIC SCIENCES, Issue 3 2010
Joanna L. Besana M.Sc.
Abstract: The frontal sinuses are known to be unique to each individual; however, no one has tested the independence of the frontal sinus traits to see if probability analysis through trait combination is a viable method of identifying an individual using the frontal sinuses. This research examines the feasibility of probability trait combination, based on criteria recommended in the literature, and examines two other methods of identification using the frontal sinuses: discrete trait combinations and superimposition pattern matching. This research finds that most sinus traits are dependent upon one another and thus cannot be used in probability combinations. When looking at traits that are independent, this research finds that metric methods are too fraught with potential errors to be useful. Discrete trait combinations do not have a high enough discriminating power to be useful. Only superimposition pattern matching is an effective method of identifying an individual using the frontal sinuses. [source]
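As a simplified illustration of superimposition pattern matching, the sketch below places two binarized sinus outlines on a common raster and reports the best Dice overlap over small translations. Real forensic superimposition also controls scale and orientation of the radiographs; the masks and function names here are hypothetical, not the paper's protocol.

```python
import numpy as np

# Simplified superimposition scoring: binary masks of frontal sinus
# outlines (e.g. traced from antemortem and postmortem radiographs) are
# shifted over one another and the best overlap is kept.

def dice(a, b):
    inter = np.logical_and(a, b).sum()
    total = a.sum() + b.sum()
    return 2.0 * inter / total if total else 0.0

def best_superimposition(mask_a, mask_b, max_shift=5):
    """Best Dice overlap of mask_b shifted by up to max_shift pixels."""
    best = 0.0
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(mask_b, dy, axis=0), dx, axis=1)
            best = max(best, dice(mask_a, shifted))
    return best

a = np.zeros((32, 32), bool); a[8:20, 10:22] = True
b = np.roll(a, (2, 3), axis=(0, 1))            # same outline, offset
print(round(best_superimposition(a, b), 3))    # ~1.0 for the same sinus
```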


EquiX: A search and query language for XML

JOURNAL OF THE AMERICAN SOCIETY FOR INFORMATION SCIENCE AND TECHNOLOGY, Issue 6 2002
Sara Cohen
EquiX is a search language for XML that combines the power of querying with the simplicity of searching. Requirements for such languages are discussed, and it is shown that EquiX meets the necessary criteria. Both a graph-based abstract syntax and a formal concrete syntax are presented for EquiX queries. In addition, the semantics is defined and an evaluation algorithm is presented. The evaluation algorithm is polynomial under combined complexity. EquiX combines pattern matching, quantification, and logical expressions to query both the data and meta-data of XML documents. The result of a query in EquiX is a set of XML documents. A DTD describing the result documents is derived automatically from the query. [source]
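EquiX itself is not reproduced here; as a loose illustration of combining structural pattern matching with conditions on data and metadata in XML, the snippet below uses Python's standard ElementTree XPath subset. The document and query are invented, and the syntax is not EquiX.

```python
import xml.etree.ElementTree as ET

# Loose illustration of querying XML by structural pattern plus a
# predicate on content; ordinary ElementTree/XPath, not the EquiX language.

doc = ET.fromstring("""
<library>
  <book year="2002"><title>XML Querying</title><author>Cohen</author></book>
  <book year="1999"><title>Search Basics</title><author>Smith</author></book>
</library>
""")

# Pattern: book elements having both a title and an author child,
# filtered by a condition on the book's own metadata (the year attribute).
for book in doc.findall("./book[title][author]"):
    if int(book.get("year", "0")) >= 2000:
        print(book.findtext("title"), "-", book.findtext("author"))
```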


Identifying residues in natural organic matter through spectral prediction and pattern matching of 2D NMR datasets

MAGNETIC RESONANCE IN CHEMISTRY, Issue 1 2004
Andre J. Simpson
Abstract This paper describes procedures for the generation of 2D NMR databases containing spectra predicted from chemical structures. These databases allow flexible searching via chemical structure, substructure or similarity of structure as well as spectral features. In this paper we use the biopolymer lignin as an example. Lignin is an important and relatively recalcitrant structural biopolymer present in the majority of plant biomass. We demonstrate how an accurate 2D NMR database of ~600 2D spectra of lignin fragments can be easily constructed, in ~2 days, and then subsequently show how some of these fragments can be identified in soil extracts through the use of various search tools and pattern recognition techniques. We demonstrate that once identified in one sample, similar residues are easily determined in other soil extracts. In theory, such an approach can be used for the analysis of any organic mixtures. Copyright © 2003 John Wiley & Sons, Ltd. [source]
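The search tools themselves are not shown in the abstract; as a hypothetical sketch of one underlying operation, the code below scores how many predicted 2D cross-peaks of a candidate fragment are found in an experimentally picked peak list within chemical-shift tolerances. The peak values and tolerances are illustrative only, not taken from the paper or its database.

```python
# Tolerance-based matching of a predicted 2D peak list (e.g. HSQC
# cross-peaks of a candidate lignin fragment) against peaks picked from
# an experimental soil-extract spectrum. Values are illustrative.

def fraction_matched(predicted, observed, tol_h=0.03, tol_c=0.5):
    """Fraction of predicted (1H, 13C) cross-peaks found in the observed list."""
    hits = 0
    for ph, pc in predicted:
        if any(abs(ph - oh) <= tol_h and abs(pc - oc) <= tol_c
               for oh, oc in observed):
            hits += 1
    return hits / len(predicted) if predicted else 0.0

fragment_pred = [(6.95, 111.0), (6.80, 115.5), (6.85, 119.2), (3.80, 55.9)]
soil_observed = [(6.94, 111.2), (6.81, 115.3), (3.79, 56.0), (1.25, 29.7)]
print(round(fraction_matched(fragment_pred, soil_observed), 2))  # 0.75
```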


Improved detection of reactive metabolites with a bromine-containing glutathione analog using mass defect and isotope pattern matching

RAPID COMMUNICATIONS IN MASS SPECTROMETRY, Issue 9 2010
André LeBlanc
Drug bioactivation leading to the formation of reactive species capable of covalent binding to proteins represents an important cause of drug-induced toxicity. Reactive metabolite detection using in vitro microsomal incubations is a crucial step in assessing potential toxicity of pharmaceutical compounds. The most common method for screening the formation of these unstable, electrophilic species is by trapping them with glutathione (GSH) followed by liquid chromatography/mass spectrometry (LC/MS) analysis. The present work describes the use of a brominated analog of glutathione, N-(2-bromocarbobenzyloxy)-GSH (GSH-Br), for the in vitro screening of reactive metabolites by LC/MS. This novel trapping agent was tested with four drug compounds known to form reactive metabolites: acetaminophen, fipexide, trimethoprim and clozapine. In vitro rat microsomal incubations were performed with GSH and GSH-Br for each drug, with subsequent analysis by liquid chromatography/high-resolution mass spectrometry on an electrospray time-of-flight (ESI-TOF) instrument. A generic LC/MS method was used for data acquisition, followed by drug-specific processing of accurate mass data based on mass defect filtering and isotope pattern matching. GSH and GSH-Br incubations were compared to control samples using differential analysis (Mass Profiler) software to identify adducts formed via the formation of reactive metabolites. In all four cases, GSH-Br yielded improved results, with a decreased false positive rate, increased sensitivity and new adducts being identified in contrast to GSH alone. The combination of this novel trapping agent with powerful processing routines for filtering accurate mass data and differential analysis represents a very reliable method for the identification of reactive metabolites formed in microsomal incubations. Copyright © 2010 John Wiley & Sons, Ltd. [source]
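As an illustration of the two data-processing filters mentioned (mass defect filtering and isotope pattern matching for bromine), the sketch below applies a simple mass defect window and checks a centroided peak list for the near 1:1 79Br/81Br doublet spaced by about 1.998 Da. The window widths, tolerances and example masses are assumptions, not values from the paper.

```python
# Illustrative filters for bromine-tagged adducts in accurate-mass data.
# Tolerances, window width and the example peak list are assumptions.

BR_SPACING = 1.9980   # m/z gap between the 79Br and 81Br isotopologues
BR_RATIO = 0.973      # approximate natural 81Br/79Br abundance ratio

def passes_mass_defect(mz, expected_mz, window=0.040):
    """Keep peaks whose mass defect lies near that of the expected adduct."""
    return abs((mz - round(mz)) - (expected_mz - round(expected_mz))) <= window

def has_bromine_doublet(mz, intensity, peak_list, mz_tol=0.01, ratio_tol=0.3):
    """Look for a partner peak ~1.998 Da higher with near 1:1 intensity."""
    for other_mz, other_int in peak_list:
        if abs(other_mz - (mz + BR_SPACING)) <= mz_tol and intensity:
            if abs(other_int / intensity - BR_RATIO) <= ratio_tol:
                return True
    return False

# Example centroided peak list (m/z, intensity): the first peak shows the doublet.
peaks = [(620.1043, 8.0e4), (622.1021, 7.7e4), (523.2001, 1.2e5)]
mz0, i0 = peaks[0]
print(has_bromine_doublet(mz0, i0, peaks))          # True
print(passes_mass_defect(mz0, expected_mz=620.101))  # True
```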


Statistical density modification using local pattern matching

ACTA CRYSTALLOGRAPHICA SECTION D, Issue 10 2003
Thomas C. Terwilliger
A method for improving crystallographic phases is presented that is based on the preferential occurrence of certain local patterns of electron density in macromolecular electron-density maps. The method focuses on the relationship between the value of electron density at a point in the map and the pattern of density surrounding this point. Patterns of density that can be superimposed by rotation about the central point are considered equivalent. Standard templates are created from experimental or model electron-density maps by clustering and averaging local patterns of electron density. The clustering is based on correlation coefficients after rotation to maximize the correlation. Experimental or model maps are also used to create histograms relating the value of electron density at the central point to the correlation coefficient of the density surrounding this point with each member of the set of standard patterns. These histograms are then used to estimate the electron density at each point in a new experimental electron-density map using the pattern of electron density at points surrounding that point and the correlation coefficient of this density to each of the set of standard templates, again after rotation to maximize the correlation. The method is strengthened by excluding any information from the point in question from both the templates and the local pattern of density in the calculation. A function based on the origin of the Patterson function is used to remove information about the electron density at the point in question from nearby electron density. This allows an estimation of the electron density at each point in a map, using only information from other points in the process. The resulting estimates of electron density are shown to have errors that are nearly independent of the errors in the original map using model data and templates calculated at a resolution of 2.6 Å. Owing to this independence of errors, information from the new map can be combined in a simple fashion with information from the original map to create an improved map. An iterative phase-improvement process using this approach and other applications of the image-reconstruction method are described and applied to experimental data at resolutions ranging from 2.4 to 2.8 Å. [source]
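As a much-simplified, two-dimensional sketch of the template-correlation step (the real method works in three dimensions and handles rotation about the central point rigorously), the code below correlates a local density patch with a set of rotated template copies and keeps the best score per template; the histogram lookup that converts such scores into a density estimate is omitted. All names and the toy data are hypothetical, not the published implementation.

```python
import numpy as np

# Correlate a local patch of density with each standard template,
# taking the best correlation over precomputed rotated copies.

def correlation(patch, template):
    a = patch.ravel() - patch.mean()
    b = template.ravel() - template.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom else 0.0

def best_template_match(patch, rotated_template_sets):
    """For each template, keep the best correlation over its rotated copies."""
    return [max(correlation(patch, copy) for copy in copies)
            for copies in rotated_template_sets]

# Toy example: one 'tube of density' template and its 90-degree rotation.
template = np.zeros((5, 5)); template[2, :] = 1.0
copies = [template, np.rot90(template)]
patch = np.rot90(template) + 0.05 * np.random.default_rng(0).normal(size=(5, 5))
print([round(s, 2) for s in best_template_match(patch, [copies])])
```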