
Performance estimation of the gradient method with fixed arbitrary step sizes

(2019)

Files

Daccache_63601400_2019.pdf
  • Open access
  • Adobe PDF
  • 2.89 MB

Details

Abstract
Based on the recently developed performance estimation framework (\cite{DT14, ATGH}) and the associated PESTO toolbox \cite{pesto2017}, we attempt to estimate the exact performance of the classical gradient method with fixed but arbitrary step sizes. We provide a new conjecture on the exact worst-case bound of the gradient method with two fixed but arbitrary step sizes, for smooth convex and smooth strongly convex functions. We also provide a partial conjecture for three steps in the case of smooth convex functions. Building on these conjectures, we estimate the shape of the worst-case bound for $N$ variable step sizes.

Based on these exact worst-case bounds, we also estimate the optimal step sizes for two and three arbitrary fixed steps, compare them to the case of two and three constant steps, and infer from this a trend for $N$ steps.

We also focus on obtaining proofs of these bounds based on the interpolation inequalities introduced in \cite{ATGH}. We introduce the concepts of minimal proof and locally minimal proof, corresponding to the minimal set of inequalities needed to provide a proof for all step sizes and for some step sizes, respectively, together with a procedure to obtain an estimate of these minimal proofs. Finally, we apply this procedure to the case of two steps and provide elements of a proof of the conjecture identified previously.
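The method studied in the abstract can be sketched as follows: for an $L$-smooth function $f$, the gradient method with fixed but arbitrary step sizes $h_0, \dots, h_{N-1}$ iterates $x_{i+1} = x_i - (h_i/L)\nabla f(x_i)$. A minimal Python sketch, assuming normalized step sizes $h_i$ (the function `gradient_method` and its parameter names are illustrative, not taken from the thesis or the PESTO toolbox):

```python
import numpy as np

def gradient_method(grad, x0, steps, L=1.0):
    """Gradient method with fixed, arbitrary (normalized) step sizes:
    x_{i+1} = x_i - (h_i / L) * grad(x_i)."""
    x = np.asarray(x0, dtype=float)
    for h in steps:
        x = x - (h / L) * grad(x)
    return x

# Illustrative example: f(x) = (L/2) * ||x||^2 is L-smooth and convex,
# with gradient grad f(x) = L * x.
L = 1.0
grad = lambda x: L * x
x_final = gradient_method(grad, [1.0, -2.0], steps=[1.5, 0.5], L=L)
# Two steps with h = [1.5, 0.5] scale x by (1 - 1.5) * (1 - 0.5) = -0.25,
# so x_final = [-0.25, 0.5].
```

The performance estimation framework then seeks, over all $L$-smooth (strongly) convex functions and starting points, the worst-case value of a chosen performance measure (e.g. $f(x_N) - f^*$) after running exactly these $N$ steps.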