Overview

This book provides an in-depth exploration of nonsmooth optimization, covering foundational algorithms, theoretical insights, and a wide range of applications. Nonsmooth optimization, characterized by nondifferentiable objective functions or constraints, plays a crucial role across various fields, including machine learning, imaging, inverse problems, statistics, optimal control, and engineering. Its scope and relevance continue to expand, as many real-world problems are inherently nonsmooth or benefit significantly from nonsmooth regularization techniques.

The book covers a variety of algorithms, both foundational and recent, for solving nonsmooth optimization problems. It first introduces basic facts from convex analysis and subdifferential calculus; various algorithms are then discussed, including subgradient methods, mirror descent methods, proximal algorithms, the alternating direction method of multipliers, primal-dual splitting methods, and semismooth Newton methods. Moreover, error bound conditions are discussed and the derivation of linear convergence is illustrated. A dedicated chapter covers first-order methods for nonconvex optimization problems satisfying the Kurdyka-Lojasiewicz condition. The book also addresses the rapid evolution of stochastic algorithms for large-scale optimization.

This book is written for a wide-ranging audience, including senior undergraduates, graduate students, researchers, and practitioners interested in gaining a comprehensive understanding of nonsmooth optimization.

Full Product Details

Author: Qinian Jin
Publisher: Springer International Publishing AG
Imprint: Springer International Publishing AG
Volume: 82
ISBN: 9783031914164
ISBN 10: 3031914163
Pages: 475
Publication Date: 14 July 2025
Audience: Professional and scholarly, Professional & Vocational
Format: Hardback
Publisher's Status: Forthcoming
Availability: Not yet available
Table of Contents

- Preface
- Introduction
- Convex sets and convex functions
- Subgradient and mirror descent methods
- Proximal algorithms
- Karush-Kuhn-Tucker theory and Lagrangian duality
- ADMM: alternating direction method of multipliers
- Primal-dual splitting algorithms
- Error bound conditions and linear convergence
- Optimization with Kurdyka-Lojasiewicz property
- Semismooth Newton methods
- Stochastic algorithms
- References
- Index

Author Information

Qinian Jin graduated from Anhui Normal University in China with a bachelor's degree and obtained his PhD from the Department of Mathematics at Rutgers University, New Brunswick, USA. He joined the Mathematical Sciences Institute at the Australian National University in 2011. His research has been supported by the Australian Research Council (ARC), which awarded him a Future Fellowship. His research interests cover inverse problems, numerical analysis, optimization, partial differential equations, and geometric analysis; in particular, his recent research focuses on using nonsmooth optimization techniques to design algorithms for solving ill-posed inverse problems. He has published about 70 papers in international journals.