Optimization in Banach Spaces

Β· Springer Nature
Π•-ΠΊΠ½ΠΈΠ³Π°
126
Π‘Ρ‚Ρ€Π°Π½ΠΈΡ†ΠΈ
ΠžΡ†Π΅Π½ΠΈΡ‚Π΅ ΠΈ Ρ€Π΅Ρ†Π΅Π½Π·ΠΈΠΈΡ‚Π΅ Π½Π΅ сС ΠΏΠΎΡ‚Π²Ρ€Π΄Π΅Π½ΠΈ Β Π”ΠΎΠ·Π½Π°Ρ˜Ρ‚Π΅ повСќС

Π—Π° Π΅-ΠΊΠ½ΠΈΠ³Π°Π²Π°

The book is devoted to constrained minimization problems on closed convex sets in Banach spaces with a Fréchet differentiable objective function. Such problems are well studied in finite-dimensional spaces and in infinite-dimensional Hilbert spaces. In the Hilbert-space setting there are many algorithms for solving optimization problems, including the gradient projection algorithm, one of the most important tools in optimization theory, nonlinear analysis, and their applications.

An optimization problem is described by an objective function and a set of feasible points. In the gradient projection algorithm each iteration consists of two steps: the first computes a gradient of the objective function, while the second computes a projection onto the feasible set. Each of these steps carries a computational error. Our recent research shows that the gradient projection algorithm generates a good approximate solution if all the computational errors are bounded from above by a small positive constant. Here the properties of a Hilbert space play an important role. For an optimization problem in a general Banach space the situation becomes more difficult and less well understood; on the other hand, such problems arise in approximation theory.

The book is of interest to mathematicians working in optimization and can also be useful in preparatory courses for graduate students. Its main feature, which appeals specifically to this audience, is the study of algorithms for convex and nonconvex minimization problems in a general Banach space. The book is also of interest to experts in applications of optimization to approximation theory.
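The two-step iteration described above (an inexact gradient evaluation followed by an inexact projection) can be sketched in a simple finite-dimensional setting. The following is a minimal illustration, not the book's algorithm: it assumes a Euclidean space, a unit-ball feasible set, and noise of norm at most `err` injected at both steps to model the computational errors; the function names and parameters are chosen here for illustration only.

```python
import numpy as np

def project_onto_ball(x, radius=1.0):
    # Euclidean projection onto the closed ball of the given radius
    norm = np.linalg.norm(x)
    return x if norm <= radius else (radius / norm) * x

def gradient_projection(grad, project, x0, step=0.1, n_iters=200, err=1e-3, seed=0):
    """Gradient projection under bounded computational errors.

    Each iteration perturbs both the gradient evaluation and the
    projection by a vector of norm at most `err`, modeling the
    computational errors discussed in the text.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        def noise():
            v = rng.standard_normal(x.shape)
            return err * v / max(np.linalg.norm(v), 1e-12)
        g = grad(x) + noise()                 # step 1: inexact gradient
        x = project(x - step * g) + noise()   # step 2: inexact projection
        x = project(x)                        # restore feasibility after the error
    return x

# Example: minimize f(x) = ||x - c||^2 over the unit ball, with c outside it.
# The exact minimizer is the projection of c onto the ball, i.e. (1, 0).
c = np.array([2.0, 0.0])
x_star = gradient_projection(lambda x: 2 * (x - c), project_onto_ball, np.zeros(2))
```

With errors bounded by a small constant, the iterates settle near the true minimizer rather than converging to it exactly, which is precisely the kind of approximate-solution guarantee the book studies.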
The goal of this book is to obtain a good approximate solution of the constrained optimization problem in a general Banach space in the presence of computational errors. It is shown that the algorithm generates a good approximate solution if the sequence of computational errors is bounded from above by a small constant. The book consists of four chapters. The first discusses the algorithms studied in the book and proves a convergence result for an unconstrained problem, which is a prototype of the results for the constrained problem. Chapter 2 analyzes convex optimization problems. Nonconvex optimization problems are studied in Chapter 3. Chapter 4 studies continuous algorithms for minimization problems in the presence of computational errors.



Π—Π° Π°Π²Ρ‚ΠΎΡ€ΠΎΡ‚

Alexander J. Zaslavski is a professor in the Department of Mathematics, Technion-Israel Institute of Technology, Haifa, Israel. He has authored numerous books with Springer, the most recent of which include Turnpike Phenomenon and Symmetric Optimization Problems (978-3-030-96972-1), Turnpike Theory for the Robinson–Solow–Srinivasan Model (978-3-030-60306-9), The Projected Subgradient Algorithm in Convex Optimization (978-3-030-60299-4), Convex Optimization with Computational Errors (978-3-030-37821-9), Turnpike Conditions in Infinite Dimensional Optimal Control (978-3-030-20177-7), and Optimization on Solution Sets of Common Fixed Point Problems (978-3-030-78848-3).

ΠžΡ†Π΅Π½Π΅Ρ‚Π΅ ја Π΅-ΠΊΠ½ΠΈΠ³Π°Π²Π°

ΠšΠ°ΠΆΠ΅Ρ‚Π΅ Π½ΠΈ ΡˆΡ‚ΠΎ мислитС.

Π˜Π½Ρ„ΠΎΡ€ΠΌΠ°Ρ†ΠΈΠΈ Π·Π° Ρ‡ΠΈΡ‚Π°ΡšΠ΅

ΠŸΠ°ΠΌΠ΅Ρ‚Π½ΠΈ Ρ‚Π΅Π»Π΅Ρ„ΠΎΠ½ΠΈ ΠΈ Ρ‚Π°Π±Π»Π΅Ρ‚ΠΈ
Π˜Π½ΡΡ‚Π°Π»ΠΈΡ€Π°Ρ˜Ρ‚Π΅ ја Π°ΠΏΠ»ΠΈΠΊΠ°Ρ†ΠΈΡ˜Π°Ρ‚Π° Google Play Books Π·Π° Android ΠΈ iPad/iPhone. Автоматски сС синхронизира со смСтката ΠΈ Π²ΠΈ ΠΎΠ²ΠΎΠ·ΠΌΠΎΠΆΡƒΠ²Π° Π΄Π° Ρ‡ΠΈΡ‚Π°Ρ‚Π΅ онлајн ΠΈΠ»ΠΈ ΠΎΡ„Π»Π°Ρ˜Π½ ΠΊΠ°Π΄Π΅ ΠΈ Π΄Π° стС.
Π›Π°ΠΏΡ‚ΠΎΠΏΠΈ ΠΈ ΠΊΠΎΠΌΠΏΡ˜ΡƒΡ‚Π΅Ρ€ΠΈ
МоТС Π΄Π° ΡΠ»ΡƒΡˆΠ°Ρ‚Π΅ Π°ΡƒΠ΄ΠΈΠΎΠΊΠ½ΠΈΠ³ΠΈ ΠΊΡƒΠΏΠ΅Π½ΠΈ ΠΎΠ΄ Google Play со ΠΊΠΎΡ€ΠΈΡΡ‚Π΅ΡšΠ΅ Π½Π° Π²Π΅Π±-прСлистувачот Π½Π° ΠΊΠΎΠΌΠΏΡ˜ΡƒΡ‚Π΅Ρ€ΠΎΡ‚.
Π•-Ρ‡ΠΈΡ‚Π°Ρ‡ΠΈ ΠΈ Π΄Ρ€ΡƒΠ³ΠΈ ΡƒΡ€Π΅Π΄ΠΈ
Π—Π° Π΄Π° Ρ‡ΠΈΡ‚Π°Ρ‚Π΅ Π½Π° ΡƒΡ€Π΅Π΄ΠΈ со Π΅-мастило, ΠΊΠ°ΠΊΠΎ ΡˆΡ‚ΠΎ сС Π΅-Ρ‡ΠΈΡ‚Π°Ρ‡ΠΈΡ‚Π΅ Kobo, ќС Ρ‚Ρ€Π΅Π±Π° Π΄Π° ΠΏΡ€Π΅Π·Π΅ΠΌΠ΅Ρ‚Π΅ Π΄Π°Ρ‚ΠΎΡ‚Π΅ΠΊΠ° ΠΈ Π΄Π° ја ΠΏΡ€Π΅Ρ„Ρ€Π»ΠΈΡ‚Π΅ Π½Π° ΡƒΡ€Π΅Π΄ΠΎΡ‚. Π‘Π»Π΅Π΄Π΅Ρ‚Π΅ Π³ΠΈ Π΄Π΅Ρ‚Π°Π»Π½ΠΈΡ‚Π΅ упатства Π²ΠΎ Π¦Π΅Π½Ρ‚Π°Ρ€ΠΎΡ‚ Π·Π° помош Π·Π° ΠΏΡ€Π΅Ρ„Ρ€Π»Π°ΡšΠ΅ Π½Π° Π΄Π°Ρ‚ΠΎΡ‚Π΅ΠΊΠΈΡ‚Π΅ Π½Π° ΠΏΠΎΠ΄Π΄Ρ€ΠΆΠ°Π½ΠΈ Π΅-Ρ‡ΠΈΡ‚Π°Ρ‡ΠΈ.