
Lectures on Nonsmooth Optimization / by Qinian Jin

Resource type: Book (online)
Language: English
Series: Texts in Applied Mathematics; 82
Published: Cham: Springer Nature Switzerland, 2025 (Imprint: Springer)
Edition: 1st ed. 2025
Description: 1 online resource (XIII, 560 p.)
ISBN:
  • 9783031914171
Other physical forms: 9783031914164 | 9783031914188
Also published as: 9783031914164 (print edition) | 9783031914188 (print edition)
DDC classification:
  • 519.6 (DDC 23)
DOI: 10.1007/978-3-031-91417-1
Contents: Preface -- Introduction -- Convex sets and convex functions -- Subgradient and mirror descent methods -- Proximal algorithms -- Karush-Kuhn-Tucker theory and Lagrangian duality -- ADMM: alternating direction method of multipliers -- Primal dual splitting algorithms -- Error bound conditions and linear convergence -- Optimization with Kurdyka-Łojasiewicz property -- Semismooth Newton methods -- Stochastic algorithms -- References -- Index.
Summary: This book provides an in-depth exploration of nonsmooth optimization, covering foundational algorithms, theoretical insights, and a wide range of applications. Nonsmooth optimization, characterized by nondifferentiable objective functions or constraints, plays a crucial role across various fields, including machine learning, imaging, inverse problems, statistics, optimal control, and engineering. Its scope and relevance continue to expand, as many real-world problems are inherently nonsmooth or benefit significantly from nonsmooth regularization techniques. This book covers a variety of algorithms, both foundational and recent, for solving nonsmooth optimization problems. It first introduces basic facts on convex analysis and subdifferential calculus; various algorithms are then discussed, including subgradient methods, mirror descent methods, proximal algorithms, the alternating direction method of multipliers, primal-dual splitting methods, and semismooth Newton methods. Moreover, error bound conditions are discussed and the derivation of linear convergence is illustrated. A dedicated chapter delves into first-order methods for nonconvex optimization problems satisfying the Kurdyka-Łojasiewicz condition. The book also addresses the rapid evolution of stochastic algorithms for large-scale optimization.
This book is written for a wide-ranging audience, including senior undergraduates, graduate students, researchers, and practitioners who are interested in gaining a comprehensive understanding of nonsmooth optimization.
PPN: 1929959710
Package identifier: ZDB-2-SEB | ZDB-2-SMA | ZDB-2-SXMS
This title has no holdings.

Accessibility summary: This PDF has been created in accordance with the PDF/UA-1 standard to enhance accessibility, including screen reader support, described non-text content (images, graphs), bookmarks for easy navigation, keyboard-friendly links and forms, and searchable, selectable text. We recognize the importance of accessibility, and we welcome queries about accessibility for any of our products. If you have a question or an access need, please get in touch with us at accessibilitysupport@springernature.com. Please note that a more accessible version of this eBook is available as ePub.