Supervised Learning


Selected Abstracts


A brainlike learning system with supervised, unsupervised, and reinforcement learning

ELECTRICAL ENGINEERING IN JAPAN, Issue 1 2008
Takafumi Sasakawa
Abstract According to Hebb's cell assembly theory, the brain has the capability of function localization. It has also been suggested that the brain uses three different learning paradigms: supervised, unsupervised, and reinforcement learning, which are deeply related to three parts of the brain: the cerebellum, cerebral cortex, and basal ganglia, respectively. Inspired by this knowledge of the brain, in this paper we present a brainlike learning system consisting of three parts: a supervised learning (SL) part, an unsupervised learning (UL) part, and a reinforcement learning (RL) part. The SL part is the main part, learning the input-output mapping; the UL part is a competitive network that divides the input space into subspaces and realizes function localization by controlling the firing strengths of neurons in the SL part based on the input patterns; the RL part is a reinforcement learning scheme, which optimizes system performance by adjusting the parameters in the UL part. Numerical simulations were carried out, and the results confirm the effectiveness of the proposed brainlike learning system. © 2007 Wiley Periodicals, Inc. Electr Eng Jpn, 162(1): 32–39, 2008; Published online in Wiley InterScience (www.interscience.wiley.com). DOI 10.1002/eej.20600 [source]
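
As a rough orientation to the architecture described above, the following is a minimal sketch of the idea, not the authors' implementation: a competitive UL layer softly assigns inputs to subspaces and gates several local SL models, while a crude RL-style outer search adjusts a UL parameter to reduce error. All names, dimensions, and hyperparameters are illustrative assumptions.

    # Minimal sketch of the SL/UL/RL decomposition (illustrative only).
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy regression data: y = sin(3x).
    X = rng.uniform(-1, 1, size=(200, 1))
    y = np.sin(3 * X[:, 0])

    K = 4                                      # number of UL subspaces (assumption)
    centers = rng.uniform(-1, 1, size=(K, 1))  # UL competitive units
    width = 0.5                                # UL firing-strength width ("RL"-tuned)
    W = np.zeros((K, 2))                       # SL part: one linear model per subspace

    def firing(X, centers, width):
        # UL part: soft competitive assignment of inputs to subspaces.
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        g = np.exp(-d2 / (2 * width ** 2))
        return g / g.sum(axis=1, keepdims=True)

    def predict(X, W, centers, width):
        # SL part: local linear models weighted by UL firing strengths.
        g = firing(X, centers, width)
        Xb = np.hstack([X, np.ones((len(X), 1))])
        local = Xb @ W.T                       # each column: one sub-model's output
        return (g * local).sum(axis=1)

    def fit_sl(X, y, W, centers, width, lr=0.1, epochs=200):
        Xb = np.hstack([X, np.ones((len(X), 1))])
        for _ in range(epochs):
            g = firing(X, centers, width)
            err = predict(X, W, centers, width) - y
            for k in range(K):
                # Gradient of the squared error w.r.t. each local model's weights.
                W[k] -= lr * ((err * g[:, k])[:, None] * Xb).mean(axis=0)
        return W

    # RL-like outer loop (illustrative): perturb the UL width and keep changes
    # that reduce error, mimicking "adjusting the parameters in the UL part".
    best_err = np.inf
    for _ in range(10):
        trial_width = width * np.exp(rng.normal(scale=0.2))
        W_trial = fit_sl(X, y, np.zeros((K, 2)), centers, trial_width)
        e = np.mean((predict(X, W_trial, centers, trial_width) - y) ** 2)
        if e < best_err:
            best_err, width, W = e, trial_width, W_trial
    print("final MSE:", best_err)

The gating-plus-local-models structure is the point of the sketch; the actual system in the paper uses its own network formulations for each of the three parts.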


Recurrent neural networks with multi-branch structure

ELECTRONICS & COMMUNICATIONS IN JAPAN, Issue 9 2008
Takashi Yamashita
Abstract Universal Learning Networks (ULNs) provide a generalized framework for many kinds of neural network structures with supervised learning. Multi-Branch Neural Networks (MBNNs), which use the framework of ULNs, have already been shown to have better representation ability than feedforward neural networks (FNNs). The multi-branch structure of MBNNs can be easily extended to recurrent neural networks (RNNs) because ULNs allow the connection of multiple branches with arbitrary time delays. In this paper, therefore, RNNs with a multi-branch structure are proposed and shown to have better representation ability than conventional RNNs. RNNs can represent dynamical systems and are useful for time series prediction. The performance of RNNs with a multi-branch structure was evaluated on a time series prediction benchmark. Simulation results showed that RNNs with a multi-branch structure achieve better performance than conventional RNNs and improve representation ability even with smaller network sizes. © 2009 Wiley Periodicals, Inc. Electron Comm Jpn, 91(9): 37–44, 2008; Published online in Wiley InterScience (www.interscience.wiley.com). DOI 10.1002/ecj.10157 [source]
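
To make the multi-branch recurrent structure concrete, here is a minimal sketch under the assumption that each recurrent connection is split into several branches, each with its own weights and time delay. It is not the ULN/MBNN implementation; the delays, sizes, and class name are illustrative.

    # Minimal sketch of a recurrent cell with multiple delayed branches.
    import numpy as np

    rng = np.random.default_rng(1)

    class MultiBranchRNNCell:
        def __init__(self, n_in, n_hidden, delays=(1, 2, 4)):
            self.delays = delays                       # branch time delays (assumption)
            self.n_hidden = n_hidden
            self.W_in = rng.normal(scale=0.3, size=(n_hidden, n_in))
            # One recurrent weight matrix per branch/delay.
            self.W_rec = [rng.normal(scale=0.3, size=(n_hidden, n_hidden))
                          for _ in delays]

        def run(self, xs):
            # xs: sequence of input vectors; returns the sequence of hidden states.
            hs = []
            for t, x in enumerate(xs):
                rec = np.zeros(self.n_hidden)
                for W, d in zip(self.W_rec, self.delays):
                    if t - d >= 0:                     # branch with delay d
                        rec += W @ hs[t - d]
                hs.append(np.tanh(self.W_in @ x + rec))
            return np.array(hs)

    # Toy usage: feed a sine wave one sample at a time.
    xs = [np.array([np.sin(0.2 * t)]) for t in range(50)]
    hs = MultiBranchRNNCell(n_in=1, n_hidden=8).run(xs)
    print(hs.shape)   # (50, 8)

The point of the multiple delays is that a small hidden layer can still see several time scales at once, which is the intuition behind the improved representation ability reported above.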


A connectionist production system which can perform both modus ponens and modus tollens simultaneously

EXPERT SYSTEMS, Issue 1 2000
Minoru Asogawa
Article first published online: 16 DEC 200
Modus ponens is used in forward and backward inference, where the truth of the conclusion is inferred from the truth of the premise. In modus tollens, the falseness of the premise is inferred from the falseness of the conclusion. Although modus ponens is used in connectionist production systems in general, modus tollens is rarely used, except in Quinlan's INFERNO system and in the system proposed by Thornber. A connectionist production system called ConnPS that can perform both modus ponens and modus tollens simultaneously is described. Compared to the INFERNO system, one of the advantages of ConnPS is its supervised learning ability. The rules and examples given as external knowledge are often erroneous and incomplete; in ConnPS, these rules can be refined by using supervised learning. Both positive and negative examples are presented to ConnPS, onto which the external rules and observations are mapped. Moreover, ConnPS's implementations of implications, conjunctions, disjunctions, and negation are intuitively consistent with Boolean logic. [source]
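
The bidirectional use of a rule can be illustrated with a small sketch, which is my own simplification rather than the ConnPS network: each proposition carries separate truth and falsity activations, modus ponens propagates truth from premise to conclusion, and modus tollens propagates falsity from conclusion back to premise. The rule base, activation range, and max-combination are all illustrative assumptions.

    # Minimal sketch of forward (modus ponens) and backward (modus tollens)
    # propagation over simple rules; activations are kept in [0, 1].
    rules = [("A", "B"), ("B", "C")]        # illustrative rule base: A -> B -> C

    def infer(truth, falsity, rules, steps=5):
        truth, falsity = dict(truth), dict(falsity)
        for _ in range(steps):
            for premise, conclusion in rules:
                # Modus ponens: premise true => conclusion true.
                t = truth.get(premise, 0.0)
                truth[conclusion] = max(truth.get(conclusion, 0.0), t)
                # Modus tollens: conclusion false => premise false.
                f = falsity.get(conclusion, 0.0)
                falsity[premise] = max(falsity.get(premise, 0.0), f)
        return truth, falsity

    # Usage: asserting that C is false lets the falsity propagate back,
    # so B and then A are also inferred false.
    t, f = infer(truth={}, falsity={"C": 1.0}, rules=rules)
    print(f)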


A hierarchical Bayesian model for predicting the functional consequences of amino-acid polymorphisms

JOURNAL OF THE ROYAL STATISTICAL SOCIETY: SERIES C (APPLIED STATISTICS), Issue 1 2005
Claudio J. Verzilli
Summary. Genetic polymorphisms in deoxyribonucleic acid coding regions may have a phenotypic effect on the carrier, e.g. by influencing susceptibility to disease. Detection of deleterious mutations via association studies is hampered by the large number of candidate sites; therefore methods are needed to narrow down the search to the most promising sites. For this, a possible approach is to use structural and sequence-based information of the encoded protein to predict whether a mutation at a particular site is likely to disrupt the functionality of the protein itself. We propose a hierarchical Bayesian multivariate adaptive regression spline (BMARS) model for supervised learning in this context and assess its predictive performance by using data from mutagenesis experiments on lac repressor and lysozyme proteins. In these experiments, about 12 amino-acid substitutions were performed at each native amino-acid position and the effect on protein functionality was assessed. The training data thus consist of repeated observations at each position, a clustered structure that the hierarchical framework is needed to account for. The model is trained on the lac repressor data and tested on the lysozyme mutations, and vice versa. In particular, we show that the hierarchical BMARS model, by allowing for the clustered nature of the data, yields lower out-of-sample misclassification rates than both a BMARS and a frequentist MARS model, a support vector machine classifier, and an optimally pruned classification tree. [source]
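
For readers unfamiliar with MARS, the sketch below shows only the reflected hinge basis that (B)MARS models are built from, fitted here by ordinary least squares with hand-picked knots; it does not implement the hierarchical Bayesian model, the knot search, or the clustered-data handling described in the abstract, and the knots and data are purely illustrative.

    # Minimal sketch of the MARS hinge basis on a toy regression problem.
    import numpy as np

    def hinge_pair(x, knot):
        # The two reflected hinge functions MARS builds its basis from.
        return np.maximum(0.0, x - knot), np.maximum(0.0, knot - x)

    rng = np.random.default_rng(2)
    x = rng.uniform(-2, 2, 200)
    y = np.abs(x) + 0.1 * rng.normal(size=200)      # toy target with a kink at 0

    # Fixed, hand-picked knots purely for illustration (MARS would search these).
    knots = [-1.0, 0.0, 1.0]
    cols = [np.ones_like(x)]
    for k in knots:
        cols.extend(hinge_pair(x, k))
    B = np.column_stack(cols)

    coef, *_ = np.linalg.lstsq(B, y, rcond=None)    # ordinary least squares fit
    print("in-sample RMSE:", np.sqrt(np.mean((B @ coef - y) ** 2)))

In the paper, the basis functions are multivariate, the coefficients and knots are given priors in a hierarchical Bayesian setup, and the repeated observations per amino-acid position are modelled explicitly; the sketch only conveys what a spline basis function looks like.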