A new edition of a graduate-level machine learning textbook that focuses on the analysis and theory of algorithms.
This book is a general introduction to machine learning that can serve as a textbook for graduate students and a reference for researchers. It covers fundamental modern topics in machine learning while providing the theoretical basis and conceptual tools needed for the discussion and justification of algorithms. It also describes several key aspects of the application of these algorithms. The authors aim to present novel theoretical tools and concepts while giving concise proofs even for relatively advanced topics.
Foundations of Machine Learning is unique in its focus on the analysis and theory of algorithms. The first four chapters lay the theoretical foundation for what follows; subsequent chapters are mostly self-contained. Topics covered include the Probably Approximately Correct (PAC) learning framework; generalization bounds based on Rademacher complexity and VC-dimension; Support Vector Machines (SVMs); kernel methods; boosting; online learning; multi-class classification; ranking; regression; algorithmic stability; dimensionality reduction; learning automata and languages; and reinforcement learning. Each chapter ends with a set of exercises. Appendixes provide additional material, including a concise probability review.
New material in the appendixes includes a major section on Fenchel duality, expanded coverage of concentration inequalities, and an entirely new entry on information theory. More than half of the exercises are new to this edition.