Publication Type

Conference Proceeding Article

Version

Accepted version

Publication Date

July 2018

Abstract

Deep Neural Networks (DNNs) are typically trained by backpropagation in a batch setting, which requires the entire training data to be available before the learning task begins. This is not scalable for many real-world scenarios where new data arrives sequentially in a stream. We aim to address the open challenge of “Online Deep Learning” (ODL): learning DNNs on the fly in an online setting. Unlike traditional online learning, which often optimizes some convex objective function with respect to a shallow model (e.g., a linear/kernel-based hypothesis), ODL is more challenging because the optimization objective is non-convex, and a regular DNN with standard backpropagation does not work well in practice in online settings. We present a new ODL framework that tackles these challenges by learning DNN models that dynamically adapt their depth from a sequence of training data in an online learning setting. Specifically, we propose a novel Hedge Backpropagation (HBP) method for effectively updating the parameters of the DNN online, and validate its efficacy on large data sets covering both stationary and concept-drifting scenarios.
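
The abstract only summarizes how Hedge Backpropagation works. The snippet below is a minimal sketch of the idea in PyTorch, assuming a plain feed-forward network with one classifier head per depth, a training loss that weights each head by its Hedge weight, and a multiplicative Hedge update over depths; all names, layer sizes, and hyperparameters here (beta, the smoothing floor) are illustrative assumptions, not values taken from the paper.

import torch
import torch.nn as nn
import torch.nn.functional as F

class HedgeMLP(nn.Module):
    """Feed-forward net with a classifier at every depth (illustrative)."""
    def __init__(self, in_dim, hidden_dim, n_classes, n_layers=4,
                 beta=0.99, smoothing=0.2):
        super().__init__()
        self.hidden = nn.ModuleList()
        self.heads = nn.ModuleList()  # one output classifier per depth
        d = in_dim
        for _ in range(n_layers):
            self.hidden.append(nn.Linear(d, hidden_dim))
            self.heads.append(nn.Linear(hidden_dim, n_classes))
            d = hidden_dim
        # Hedge weights over depths; kept outside autograd on purpose.
        self.register_buffer("alpha", torch.full((n_layers,), 1.0 / n_layers))
        self.beta = beta            # assumed discount factor in (0, 1)
        self.smoothing = smoothing  # assumed floor keeping every depth alive

    def forward(self, x):
        per_depth = []
        h = x
        for layer, head in zip(self.hidden, self.heads):
            h = torch.relu(layer(h))
            per_depth.append(head(h))
        # Ensemble prediction: alpha-weighted combination of all depths.
        ensemble = sum(a * z for a, z in zip(self.alpha, per_depth))
        return per_depth, ensemble

def online_step(model, optimizer, x, y):
    """One online round: predict, backpropagate, update Hedge weights."""
    per_depth, _ = model(x)
    losses = torch.stack([F.cross_entropy(z, y) for z in per_depth])
    # Backpropagation weighted by the Hedge weights: hidden layer l
    # receives gradients from every head at depth >= l, scaled by alpha.
    (model.alpha * losses).sum().backward()
    optimizer.step()
    optimizer.zero_grad()
    with torch.no_grad():
        # Hedge update: multiplicatively discount depths that incurred
        # loss (clipped into [0, 1] for the exponent), then renormalize
        # with a smoothing floor so deeper depths can still recover.
        model.alpha *= model.beta ** losses.clamp(0.0, 1.0)
        model.alpha.clamp_(min=model.smoothing / len(model.alpha))
        model.alpha /= model.alpha.sum()

With this scheme the shallow classifiers dominate the ensemble early in the stream, and weight shifts toward deeper classifiers once they begin to incur lower loss, which is the dynamic depth adaptation the abstract refers to; a driver loop would call online_step once per arriving (x, y) pair with, e.g., torch.optim.SGD over model.parameters().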

Keywords

Neural Networks, Online Learning, Time-series, Data Streams, Machine Learning, Deep Learning

Discipline

Databases and Information Systems | Numerical Analysis and Scientific Computing

Research Areas

Data Science and Engineering

Publication

Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence (IJCAI 2018), Stockholm, Sweden, July 13-19

First Page

2660

Last Page

2666

ISBN

978-0-9992411-2-7

Identifier

10.24963/ijcai.2018/369

Publisher

IJCAI

City or Country

Cambridge, MA

Copyright Owner and License

Authors

Additional URL

https://doi.org/10.24963/ijcai.2018/369
