Combining fast inertial dynamics for convex optimization with Tikhonov regularization
Journal of Mathematical Analysis and Applications, Volume 457, Issue 2, 2018
In a Hilbert space setting H, we study the convergence properties as t → +∞ of the trajectories of the second-order differential equation

(AVD)α,ϵ:  ẍ(t) + (α/t) ẋ(t) + ∇Φ(x(t)) + ϵ(t)x(t) = 0,

where ∇Φ is the gradient of a convex continuously differentiable function Φ : H → R, α is a positive parameter, and ϵ(t)x(t) is a Tikhonov regularization term, with ϵ(t) positive and lim t→∞ ϵ(t) = 0. In this damped inertial system, the damping coefficient α/t vanishes asymptotically, but not too quickly, a key property for obtaining rapid convergence of the values. In the case ϵ(⋅) ≡ 0, this dynamic was recently highlighted by Su, Boyd, and Candès as a continuous version of the Nesterov accelerated gradient method. Depending on the speed of convergence of ϵ(t) to zero, we analyze the convergence properties of the trajectories of (AVD)α,ϵ. The results range from rapid convergence of Φ(x(t)) to min Φ when ϵ(t) decreases rapidly to zero, to strong convergence of the trajectories to the minimum-norm element of the set of minimizers of Φ when ϵ(t) tends slowly to zero. When ϵ(t) = 1/t^r, the critical value of r separating the two cases is r = 2.
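As a rough numerical illustration of the slow-parametrization regime, the sketch below integrates the (AVD)α,ϵ dynamics with a semi-implicit Euler scheme for a toy objective Φ(x, y) = x²/2, whose minimizers form the whole y-axis and whose minimum-norm minimizer is the origin. All numerical choices here (α = 3, ϵ(t) = 1/t, the step size, the horizon) are illustrative assumptions, not taken from the paper; with the slowly vanishing Tikhonov term the trajectory should drift toward the origin rather than an arbitrary minimizer.

```python
def grad_phi(x):
    # Φ(x, y) = x**2 / 2: minimizers are {(0, y)}, minimum-norm minimizer is (0, 0).
    return (x[0], 0.0)

def simulate(alpha=3.0, r=1.0, t0=1.0, T=2000.0, dt=0.01, x0=(1.0, 2.0)):
    """Integrate x''(t) + (alpha/t) x'(t) + grad_phi(x(t)) + eps(t) x(t) = 0
    with semi-implicit Euler; a toy discretization, not the paper's scheme."""
    x = list(x0)
    v = [0.0, 0.0]
    t = t0
    for _ in range(int((T - t0) / dt)):
        g = grad_phi(x)
        eps = t ** (-r)                # Tikhonov parameter eps(t) = 1/t^r
        for i in range(2):
            # acceleration prescribed by the (AVD) dynamics
            a = -(alpha / t) * v[i] - g[i] - eps * x[i]
            v[i] += dt * a
            x[i] += dt * v[i]          # update position with the new velocity
        t += dt
    return x

# slow decay of eps (r = 1 < 2): both coordinates approach 0, the minimum-norm minimizer
x_slow = simulate(r=1.0)
print(x_slow)
```

With r = 1 the Tikhonov term vanishes slowly enough to keep pulling the y-coordinate toward 0, whereas with ϵ ≡ 0 the y-coordinate would simply stay at its initial value 2.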