Overview

Robust statistics is the study of designing estimators that perform well even when the data deviate significantly from the idealized modeling assumptions, for example because of model misspecification or adversarial outliers. The classical statistical theory, dating back to pioneering works by Tukey and Huber, characterizes the information-theoretic limits of robust estimation for most common problems. A recent line of work in computer science gave the first computationally efficient robust estimators in high dimensions for a range of learning tasks. This reference text for graduate students, researchers, and professionals in machine learning theory provides an overview of recent developments in algorithmic high-dimensional robust statistics, presenting the underlying ideas in a clear and unified manner while leveraging new perspectives on the developed techniques to give streamlined proofs of these results. The most basic and illustrative results are analyzed in each chapter, while more tangential developments are explored in the exercises.

Full Product Details

Authors: Ilias Diakonikolas (University of Wisconsin-Madison), Daniel M. Kane (University of California, San Diego)
Publisher: Cambridge University Press
Imprint: Cambridge University Press
ISBN-13: 9781108837811
ISBN-10: 1108837816
Pages: 300
Publication Date: 07 September 2023
Audience: College/higher education, Tertiary & Higher Education
Format: Hardback
Publisher's Status: Active
Availability: Manufactured on demand
Countries Available: All regions

Table of Contents

1. Introduction to robust statistics
2. Efficient high-dimensional robust mean estimation
3. Algorithmic refinements in robust mean estimation
4. Robust covariance estimation
5. List-decodable learning
6. Robust estimation via higher moments
7. Robust supervised learning
8. Information-computation tradeoffs in high-dimensional robust statistics
A. Mathematical background
References
Index

Reviews

'This is a timely book on efficient algorithms for computing robust statistics from noisy data. It presents lucid intuitive descriptions of the algorithms as well as precise statements of results with rigorous proofs - a nice combination indeed. The topic has seen fundamental breakthroughs over the last few years and the authors are among the leading contributors. The reader will get a ringside view of the developments.' Ravi Kannan, Visiting Professor, Indian Institute of Science

Author Information

Ilias Diakonikolas is an associate professor of computer science at the University of Wisconsin-Madison. His current research focuses on the algorithmic foundations of machine learning. Diakonikolas is a recipient of a number of research awards, including the best paper award at NeurIPS 2019.

Daniel M. Kane is an associate professor in the departments of Computer Science and Mathematics at the University of California, San Diego. He is a four-time Putnam Fellow and two-time IMO gold medallist. Kane's research interests include number theory, combinatorics, computational complexity, and computational statistics.