Grants and Contributions:
Grant or scholarship awarded applying to more than one fiscal year. (2017-2018 to 2022-2023)
Many statistical methods and learning algorithms reduce to optimization problems. With the advent of computers and the information age, these problems have exploded in both size and complexity. One of the most important questions today is how to extract trends and substructures from extremely large data sets. In this area of data science, which comprises data mining, machine learning, support vector machines, and signal processing, techniques such as sparse and low-rank optimization and compressed sensing are frequently used.
A common feature of the resulting optimization problems is the nonsmoothness of the functions involved. Hence, there is an increased demand for nonsmooth optimization methods, which rely heavily on solid mathematical foundations in nonsmooth and set-valued analysis. As a long-term goal, this research program therefore aims at developing novel variational tools and bringing them to bear on solution methods for nonsmooth optimization problems arising in a variety of fields, such as data science and learning.
A watershed in optimization, in particular in the nonsmooth setting, is convexity. This is because convex functions exhibit many desirable properties: they admit a powerful subdifferential and duality calculus; their stationary points, local minima, and global minima coincide; and they possess desirable analytical features such as affine minorization, (local) Lipschitz continuity, and prox-regularity.
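As a brief illustration of these properties (a standard fact of convex analysis, stated here for a proper convex function f on R^n), the subgradient inequality is exactly the affine minorization mentioned above and shows at once why stationary points of convex functions are global minima:

```latex
% Subgradient inequality: v is a subgradient of the proper convex
% function f at \bar{x} precisely when the affine function on the
% right minorizes f and touches it at \bar{x}.
\[
  v \in \partial f(\bar{x})
  \quad\Longleftrightarrow\quad
  f(x) \;\ge\; f(\bar{x}) + \langle v, x - \bar{x} \rangle
  \quad \text{for all } x \in \mathbb{R}^n.
\]
% In particular, 0 \in \partial f(\bar{x}) yields f(x) >= f(\bar{x})
% for all x, so every stationary point is a global minimizer.
```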
Although many problems in practice are not fully convex, they often possess convex substructures that can (and should) be exploited. We take this as a guideline for this program, in which we focus on two intimately related topics in nonsmooth optimization with strong connections to statistical and machine learning as well as to other current areas of optimization.
Primarily, we would like to study DC optimization, i.e., minimization problems whose objective function is the difference of two convex functions. This well-established nonconvex problem class covers an abundance of applications and many of the problems of interest to us.
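To make the structure concrete, the following minimal sketch runs the classical DC algorithm (DCA), which linearizes the concave part at the current iterate and solves the resulting convex subproblem, on a hypothetical toy problem of our own choosing (f(x) = x^4 - x^2 with g(x) = x^4, h(x) = x^2); it is an illustration only, not the methods developed in this program:

```python
import numpy as np

# Toy DC program: minimize f(x) = g(x) - h(x) with
#   g(x) = x**4 (convex) and h(x) = x**2 (convex).
# DCA replaces h by its linearization at the current iterate and
# solves the convex subproblem
#   x_{k+1} = argmin_x  g(x) - h'(x_k) * x,
# whose optimality condition 4 x**3 = h'(x_k) has a closed form here.

def dca(x0: float, iters: int = 50) -> float:
    x = x0
    for _ in range(iters):
        grad_h = 2.0 * x               # h'(x_k) = 2 x_k
        x = np.cbrt(grad_h / 4.0)      # exact minimizer of the subproblem
    return x

if __name__ == "__main__":
    # From x0 = 1 the iterates converge (linearly) to the critical
    # point x = 1/sqrt(2) of f, which here is a global minimizer.
    print(dca(1.0))                    # ~0.7071
```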
We also place a focus on the variational analysis of some of the concrete nonsmooth, convex functions occurring in various applications, such as the matrix-fractional function, which seems to be ubiquitous in the area of data science and connects different topics such as quadratic optimization, multitask learning, and nuclear-norm smoothing.
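For concreteness, one standard form of the matrix-fractional function, as found in the convex-analysis literature (the precise variant studied in this program may differ, e.g., a generalized version involving a pseudoinverse), is the jointly convex function

```latex
% Matrix-fractional function (one standard form): jointly convex in
% (x, V) over R^n times the positive definite matrices S^n_{++}.
\[
  \varphi(x, V) \;=\; \tfrac{1}{2}\, x^{\top} V^{-1} x,
  \qquad (x, V) \in \mathbb{R}^n \times \mathbb{S}^n_{++}.
\]
% Joint convexity follows, e.g., from the Schur-complement
% characterization of its epigraph:
%   phi(x, V) <= t  <=>  [[V, x], [x^T, 2t]] is positive semidefinite.
```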