Lectures on Nonsmooth Optimization
Details
This book provides an in-depth exploration of nonsmooth optimization, covering foundational algorithms, theoretical insights, and a wide range of applications. Nonsmooth optimization, characterized by nondifferentiable objective functions or constraints, plays a crucial role across various fields, including machine learning, imaging, inverse problems, statistics, optimal control, and engineering. Its scope and relevance continue to expand, as many real-world problems are inherently nonsmooth or benefit significantly from nonsmooth regularization techniques. The book covers a variety of algorithms, both foundational and recent, for solving nonsmooth optimization problems. It first introduces basic facts of convex analysis and subdifferential calculus; various algorithms are then discussed, including subgradient methods, mirror descent methods, proximal algorithms, the alternating direction method of multipliers, primal-dual splitting methods, and semismooth Newton methods. Moreover, error bound conditions are discussed and the derivation of linear convergence is illustrated. A dedicated chapter delves into first-order methods for nonconvex optimization problems satisfying the Kurdyka-Łojasiewicz condition. The book also addresses the rapid evolution of stochastic algorithms for large-scale optimization. It is written for a wide-ranging audience, including senior undergraduates, graduate students, researchers, and practitioners who are interested in gaining a comprehensive understanding of nonsmooth optimization.
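To give a concrete flavor of the proximal algorithms mentioned in the description, here is a minimal sketch (not taken from the book) of the proximal gradient method applied to the lasso problem min_x (1/2)||Ax - b||^2 + lam*||x||_1, where the nonsmooth l1 term is handled through its proximal operator, soft-thresholding. All names, data, and parameter values below are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def proximal_gradient_lasso(A, b, lam, step, n_iters=500):
    # Minimize (1/2)||Ax - b||^2 + lam * ||x||_1 by proximal gradient (ISTA).
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        grad = A.T @ (A @ x - b)                          # gradient of the smooth part
        x = soft_threshold(x - step * grad, step * lam)   # prox step on the l1 part
    return x

# Small synthetic example with a sparse ground truth.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 20))
x_true = np.zeros(20)
x_true[:3] = [2.0, -1.5, 1.0]
b = A @ x_true + 0.01 * rng.standard_normal(100)
step = 1.0 / np.linalg.norm(A, 2) ** 2                    # step = 1/L with L = ||A||_2^2
x_hat = proximal_gradient_lasso(A, b, lam=0.1, step=step)
print(np.round(x_hat, 2))                                 # recovers a sparse solution
```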
- Provides a comprehensive review of the recent development of nonsmooth optimization
- Addresses the rapid evolution of stochastic algorithms for large-scale optimization
- Lays out new ideas and concepts gradually in a way that can be understood by mathematically oriented learners
About the Author
Qinian Jin graduated from Anhui Normal University in China with a bachelor's degree and obtained his PhD from the Department of Mathematics at Rutgers University, New Brunswick, USA. He joined the Mathematical Sciences Institute at the Australian National University in 2011. His research has been supported by the Australian Research Council (ARC), which awarded him a Future Fellowship. His research interests cover inverse problems, numerical analysis, optimization, partial differential equations, and geometric analysis. In particular, his recent research focuses on using nonsmooth optimization techniques to design algorithms for solving ill-posed inverse problems. He has published about 70 papers in international journals.
Contents
Preface.- Introduction.- Convex sets and convex functions.- Subgradient and mirror descent methods.- Proximal algorithms.- Karush-Kuhn-Tucker theory and Lagrangian duality.- ADMM: alternating direction method of multipliers.- Primal-dual splitting algorithms.- Error bound conditions and linear convergence.- Optimization with Kurdyka-Łojasiewicz property.- Semismooth Newton methods.- Stochastic algorithms.- References.- Index.
Further Information
- General Information
- GTIN 09783031914164
- Reading Motivation Understanding
- Genre Maths
- Number of Pages 560
- Publisher Springer Nature Switzerland
- Size H235mm x W155mm
- Year 2025
- EAN 9783031914164
- Format Hardcover
- ISBN 978-3-031-91416-4
- Publication Date 04.07.2025
- Title Lectures on Nonsmooth Optimization
- Author Qinian Jin
- Subtitle Texts in Applied Mathematics 82
- Language English