Grants and Contributions:

Title:
Efficient methods for large-scale structural optimization with application to machine learning
Agreement number:
RGPIN
Agreement value:
$140,000.00
Agreement date:
May 10, 2017 -
Organization:
Natural Sciences and Engineering Research Council of Canada
Location:
British Columbia, Other, CA
Reference number:
GC-2017-Q1-01791
Agreement type:
grant
Report type:
Grants and Contributions
Additional information:

Grant or award applying to more than one fiscal year (2017-2018 to 2022-2023).

Recipient legal name:
Lu, Zhaosong (Simon Fraser University)
Program:
Discovery Grants Program - Individual
Program purpose:

Structural optimization problems now arise frequently in engineering and the sciences, especially in data analytics, machine learning, and statistics. In the big-data era, these problems are often high-dimensional and involve a large number of functions in their objectives and/or constraints, which poses unprecedented challenges to traditional optimization methods. The objectives of this proposal are to develop efficient methods and software for solving such problems and to explore their applications in data analytics and machine learning.
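The kind of structured objective described above can be illustrated with a small sketch (an illustrative example of this writer's own construction, not a method or data set from the proposal): minimizing a finite sum of least-squares losses plus an l1 regularizer via proximal gradient descent, the classic lasso problem.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_grad_lasso(A, b, lam, step, iters=500):
    """Proximal gradient (ISTA) for
        min_x  (1/2n) * ||A x - b||^2 + lam * ||x||_1,
    a finite sum of n least-squares terms plus a sparsity-inducing regularizer."""
    n, d = A.shape
    x = np.zeros(d)
    for _ in range(iters):
        grad = A.T @ (A @ x - b) / n              # gradient of the smooth part
        x = soft_threshold(x - step * grad, step * lam)
    return x

# Synthetic sparse-recovery instance (made-up data, purely for illustration).
rng = np.random.default_rng(0)
n, d = 200, 50
A = rng.standard_normal((n, d))
x_true = np.zeros(d)
x_true[:5] = 1.0                                  # 5 nonzero coefficients
b = A @ x_true + 0.01 * rng.standard_normal(n)

L = np.linalg.norm(A, 2) ** 2 / n                 # Lipschitz constant of the gradient
x_hat = prox_grad_lasso(A, b, lam=0.05, step=1.0 / L)
```

With a step size of 1/L the iterates converge to a sparse solution close to `x_true`; the l1 term is what makes the objective "structured" rather than smooth.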

The proposed research consists of three parts: 1) developing efficient methods for minimizing a finite sum of functions, a problem with numerous applications in machine learning and statistics; 2) exploring block-coordinate update methods for solving large-scale structural optimization problems; and 3) studying efficient methods for solving large-scale constrained structural optimization problems.
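The block-coordinate updates mentioned in part 2 can be sketched as follows (a minimal illustration under assumptions of this writer's choosing, not the proposal's actual algorithms): randomized block-coordinate gradient descent on a ridge-regression objective, where each iteration updates only one randomly chosen block of variables.

```python
import numpy as np

def block_coordinate_ridge(A, b, mu, n_blocks=4, iters=2000, seed=1):
    """Randomized block-coordinate gradient descent for
        min_x  (1/2n) * ||A x - b||^2 + (mu/2) * ||x||^2  (ridge regression).
    Each iteration updates one block of variables, with a step size set by
    that block's own Lipschitz constant."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    blocks = np.array_split(np.arange(d), n_blocks)
    # Per-block Lipschitz constants of the block gradient.
    L = [np.linalg.norm(A[:, blk], 2) ** 2 / n + mu for blk in blocks]
    r = A @ x - b                                 # residual, maintained incrementally
    for _ in range(iters):
        i = rng.integers(n_blocks)
        blk = blocks[i]
        g = A[:, blk].T @ r / n + mu * x[blk]     # gradient w.r.t. this block only
        delta = -g / L[i]
        x[blk] += delta
        r += A[:, blk] @ delta                    # cheap residual update
    return x

# Small made-up instance; compare against the closed-form ridge solution.
rng = np.random.default_rng(2)
n, d = 120, 20
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)
mu = 0.1
x_hat = block_coordinate_ridge(A, b, mu)
x_star = np.linalg.solve(A.T @ A / n + mu * np.eye(d), A.T @ b / n)
```

The appeal for large-scale problems is that each step touches only one block of columns of `A`, so the per-iteration cost is a fraction of a full gradient step.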

The success of this project will provide efficient methods and software for solving large-scale structural optimization problems. These will help governments, financial institutions, businesses, and industry use larger data sets to make better predictions and decisions for the future, producing long-term benefits. The proposed research will also yield new optimization theory and techniques that complement existing knowledge of continuous optimization.