Grants and contributions:
Grant or award applying to more than one fiscal year (2017-2018 to 2018-2019).
The data generated and contributed by today's personal computing devices, ranging from text and voice to pictures
and video, are enormous and contain invaluable information worth discovering. Recent advances in deep neural
networks have shown great potential for uncovering the hidden information therein. Deep learning relies on
strong computation power to process massive amounts of data, typically offered by machine clusters,
or, more generally, modern data centers. While cloud-centric learning works well for data that are already available in
data centers, gathering data from worldwide sources unavoidably incurs high traffic and, more importantly,
latency that challenges such real-time learning applications as face recognition and human tracking in camera
networks. The concept of edge computing has recently been advocated as a complement to cloud computing. It
pushes applications, data, and computation away from the centralized data centers. As such, it can
significantly accelerate the training process by reducing the traffic transferred to the cloud, and it can reduce the inference
latency for a broad spectrum of deep learning applications.
Huawei is an industrial pioneer in building the communication and computation infrastructure for edge
computing, and has started building its public cloud infrastructure as well. In this project, we will work
together to understand the state of the art in collaborative edge and cloud learning. We will identify the
opportunities and challenges in pushing data pre-processing and feature extraction to the network edge.
We will then develop novel solutions that minimize network traffic and inference latency while maintaining the desired
accuracy for deep learning.
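
As a rough illustration of the edge/cloud split this project envisions, the sketch below shows an edge device running a lightweight feature extractor and shipping only a compact feature vector to a heavier model in the cloud. It is a minimal, hypothetical example assuming PyTorch; the module names, layer sizes, and serialization format are illustrative assumptions, not the project's actual design.

```python
# Minimal sketch of edge/cloud split inference (assumes PyTorch and NumPy;
# model structure and payload format are illustrative only).
import numpy as np
import torch
import torch.nn as nn


class EdgeFeatureExtractor(nn.Module):
    """Lightweight convolutional front-end meant to run on the edge device."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),  # collapse spatial dims into a compact vector
        )

    def forward(self, x):
        return self.features(x).flatten(1)  # shape: (batch, 32)


class CloudClassifier(nn.Module):
    """Heavier back-end that stays in the data center."""
    def __init__(self, num_classes=10):
        super().__init__()
        self.head = nn.Sequential(
            nn.Linear(32, 128), nn.ReLU(),
            nn.Linear(128, num_classes),
        )

    def forward(self, feats):
        return self.head(feats)


if __name__ == "__main__":
    frame = torch.randn(1, 3, 224, 224)        # e.g. one camera frame
    edge = EdgeFeatureExtractor()
    cloud = CloudClassifier()

    # Edge side: extract a compact feature vector and serialize it for uplink.
    feats = edge(frame).detach()
    payload = feats.numpy().tobytes()          # 32 floats instead of a raw frame
    print(f"uplink payload: {len(payload)} bytes "
          f"(raw frame: {frame.numel() * 4} bytes)")

    # Cloud side: deserialize the features and finish the inference.
    received = torch.from_numpy(
        np.frombuffer(payload, dtype=np.float32).copy()).reshape(1, 32)
    logits = cloud(received)
    print("cloud prediction:", logits.argmax(dim=1).item())
```

In this kind of split, only the small feature vector crosses the network, which is the mechanism by which the project aims to cut both uplink traffic and end-to-end inference latency; the accuracy question is then where to place the split and how much to compress the features.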