Grants and Contributions:

Title:
Joint Prediction of Multiple Waiting Times with Recurrent Neural Nets
Agreement Number:
EGP
Agreement Value:
$25,000.00
Agreement Date:
March 7, 2018
Organization:
Natural Sciences and Engineering Research Council of Canada
Location:
Ontario, Other, CA
Reference Number:
GC-2017-Q4-00098
Agreement Type:
grant
Report Type:
Grants and Contributions
Additional Information:

Grant or award covering more than one fiscal year (2017-2018 to 2018-2019).

Recipient Legal Name:
Badescu, Andrei Lucian (University of Toronto)
Program:
Engage Grants for Universities
Program Purpose:

Demand forecasting has always been important for retailers, as it drives key decisions regarding business strategy and supply chain management. Traditional approaches model purchase arrival rates using classical statistical models, which were developed without easy access to scalable computing power or to purchase data that is both wide (many types of data) and long (covering many individuals). The benefits brought by the huge influx of data in recent years ought not to be limited to better estimates of the aggregate statistics used by classical statistical models; they should also include more fine-grained modelling based on detailed individual-level purchase information.
Many models have been proposed in classical statistics to describe the behaviour of event arrivals, as such models are useful in many areas, such as monitoring the remaining lifetimes of equipment and forecasting disasters. These models have been meticulously described and their properties rigorously proven. However, they also require a large number of assumptions, which may not necessarily hold in reality. One such assumption is that the dependence between events weakens as they occur further apart in time. We would like to explore a data-driven model that aims to relax these assumptions.

The recent explosion in accessible computational power has re-ignited interest in the Artificial Neural Net, which has shown success in many different areas of application. The Recurrent Neural Net (RNN) modifies this architecture by adding an internal state variable, which allows it to be applied to sequential data as well. The RNN has also been shown to be Turing complete, meaning that it can emulate any computer program given a large enough set of parameters. Applications of the RNN in natural language processing have been rather successful, demonstrating that it captures complex sequential dependence much better than classical statistical models, which require short-range dependence assumptions to work.
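
To illustrate the idea described above, the following is a minimal, hypothetical sketch of an RNN whose internal state summarizes a customer's event history and whose output head jointly predicts the next waiting time for several event types at once. It is not the model developed under this grant; the class name, layer sizes, GRU cell choice, and toy data are all assumptions made purely for illustration.

import torch
import torch.nn as nn

class JointWaitingTimeRNN(nn.Module):
    """Hypothetical sketch: jointly predict the next waiting time per event type."""

    def __init__(self, n_event_types: int, hidden_size: int = 32):
        super().__init__()
        # Each step of the input sequence holds one observed waiting time per event type.
        self.rnn = nn.GRU(input_size=n_event_types,
                          hidden_size=hidden_size,
                          batch_first=True)
        # Softplus keeps the predicted waiting times strictly positive.
        self.head = nn.Sequential(nn.Linear(hidden_size, n_event_types),
                                  nn.Softplus())

    def forward(self, waits: torch.Tensor) -> torch.Tensor:
        # waits: (batch, sequence_length, n_event_types)
        out, _ = self.rnn(waits)          # internal state carries the sequential dependence
        return self.head(out[:, -1, :])   # joint prediction of the next waiting times

if __name__ == "__main__":
    torch.manual_seed(0)
    model = JointWaitingTimeRNN(n_event_types=3)
    history = torch.rand(8, 20, 3)        # 8 customers, 20 past events, 3 event types (toy data)
    target = torch.rand(8, 3)             # next observed waiting times (toy data)
    loss = nn.functional.mse_loss(model(history), target)
    loss.backward()                       # gradients flow back through the recurrence
    print(loss.item())

A squared-error loss is used here only to keep the sketch short; a likelihood-based loss over waiting-time distributions would be a natural alternative for this kind of event-arrival modelling.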