A new edition of a graduate-level machine learning textbook that focuses on the analysis and theory of algorithms. This book is a general introduction to machine learning that can serve as a textbook for graduate students and a reference for researchers. It covers fundamental modern topics in machine learning while providing the theoretical basis and conceptual tools needed for the discussion and justification of algorithms. It also describes several key aspects of the application of these algorithms. The authors aim to present novel theoretical tools and concepts while giving concise proofs even for relatively advanced topics. Foundations of Machine Learning is unique in its focus on the an...
One of the largest and most active areas of AI, machine learning is of interest to students of psychology, philosophy of science, and education. Although self-contained, volume III follows the tradition of volume I (1983) and volume II (1986).
Computational Learning Theory presents the theoretical issues in machine learning and computational models of learning. This book covers a wide range of problems in concept learning, inductive inference, and pattern recognition. Organized into three parts encompassing 32 chapters, it begins with an overview of the inductive principle based on weak convergence of probability measures. The text then examines the framework for constructing learning algorithms. Other chapters consider the formal theory of learning, that is, learning in the sense of improving computational efficiency as opposed to concept learning. The book also discusses the informed parsimonious (IP) inference that generalizes the compatibility and weighted parsimony techniques, which are most commonly applied in biology. The final chapter deals with the construction of prediction algorithms in a situation in which a learner faces a sequence of trials, with a prediction to be given in each trial, and the goal of the learner is to make as few mistakes as possible. This book is a valuable resource for students and teachers.
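The online prediction setting described in that final chapter can be illustrated with a minimal sketch of the classic halving algorithm over a finite concept class (the concept class and trial data here are hypothetical examples, not drawn from the book):

```python
def halving_algorithm(concepts, trials):
    """Online mistake-bound learning by majority vote.

    concepts: list of hypotheses, each a function x -> {0, 1},
              assumed to contain the true target concept.
    trials:   sequence of (x, true_label) pairs revealed one at a time.
    Returns the number of prediction mistakes made.
    """
    version_space = list(concepts)
    mistakes = 0
    for x, y in trials:
        # Predict by majority vote of the concepts still consistent
        # with everything seen so far (ties broken toward 1).
        votes = sum(c(x) for c in version_space)
        prediction = 1 if 2 * votes >= len(version_space) else 0
        if prediction != y:
            mistakes += 1
        # Every mistake eliminates at least half the version space.
        version_space = [c for c in version_space if c(x) == y]
    return mistakes


# Hypothetical example: threshold concepts on {0, 1, 2, 3},
# with the true concept being "x >= 2".
concepts = [lambda x, t=t: int(x >= t) for t in range(4)]
trials = [(x, int(x >= 2)) for x in range(4)]
m = halving_algorithm(concepts, trials)
```

Because each mistake halves the set of surviving hypotheses, the algorithm makes at most log2(|concepts|) mistakes — here at most 2.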
COLT
COLT '90 covers the proceedings of the Third Annual Workshop on Computational Learning Theory, sponsored by ACM SIGACT/SIGART and held at the University of Rochester, Rochester, New York, on August 6-8, 1990. The book focuses on the processes, methodologies, principles, and approaches involved in computational learning theory. The selection first elaborates on inductive inference of minimal programs, learning switch configurations, the computational complexity of approximating distributions by probabilistic automata, and a learning criterion for stochastic rules. The text then takes a look at inductive identification of pattern languages with restricted substitutions, learning ring-sum-expansions, sample co...
Abstract: "We attempt to determine the theoretical boundaries of the ability of computers to learn. We consider several rigorous models of learning, aimed at addressing types of learning problems excluded from earlier models. In Part I, we consider learning dependencies between real-valued quantities in situations where the environment is assumed to be an adversary, operating within constraints that model the prior knowledge of the learner. While our assumptions as to the form of these dependencies are taken from previous work in statistics, this work is distinguished by the fact that the analysis is worst case. In Part II, we consider learning in situations in which the learner's environment is assumed to be at least partially random. We consider methods for extending the tools for learning [0,1]-valued functions to apply to the learning of many-valued and real-valued functions. We also study the learning of [0,1]-valued functions in situations in which the relationship to be learned is gradually changing as learning is taking place."
This monograph describes results derived from the mathematically oriented framework of computational learning theory.