Publication Type

Conference Proceeding Article

Version

acceptedVersion

Publication Date

12-2023

Abstract

Neural networks are an emerging data-driven programming paradigm widely used in many areas. Unlike traditional software systems consisting of decomposable modules, a neural network is usually delivered as a monolithic package, raising challenges for maintenance tasks such as model restructuring and re-adaptation. In this work, we propose DeepArc, a novel modularization method for neural networks, to reduce the cost of model maintenance tasks. Specifically, DeepArc decomposes a neural network into several consecutive modules, each of which encapsulates consecutive layers with similar semantics. The network modularization facilitates practical tasks such as refactoring the model to preserve existing features (e.g., model compression) and enhancing the model with new features (e.g., fitting new samples). The modularization and encapsulation allow us to restructure or retrain the model by pruning and tuning only a few localized neurons and layers. Our experiments show that (1) DeepArc can boost the runtime efficiency of state-of-the-art model compression techniques by 14.8%; and (2) compared to traditional model retraining, DeepArc needs to train less than 20% of the neurons on average to fit adversarial samples and repair under-performing models, leading to 32.85% faster training while achieving similar model prediction performance.
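The core idea in the abstract, grouping consecutive layers whose representations have similar semantics into modules, can be illustrated with a small sketch. This is not the authors' implementation; it assumes a generic representation-similarity measure (linear CKA is used here as one common choice) and a hypothetical similarity threshold for deciding module boundaries.

```python
import numpy as np

def linear_cka(X, Y):
    """Linear Centered Kernel Alignment between two activation
    matrices of shape (n_samples, n_features)."""
    X = X - X.mean(axis=0)  # center each feature
    Y = Y - Y.mean(axis=0)
    num = np.linalg.norm(Y.T @ X, 'fro') ** 2
    den = np.linalg.norm(X.T @ X, 'fro') * np.linalg.norm(Y.T @ Y, 'fro')
    return num / den

def modularize(layer_activations, threshold=0.9):
    """Group consecutive layers into modules: a new module starts
    whenever adjacent layers' representations fall below the
    (hypothetical) similarity threshold."""
    modules, current = [], [0]
    for i in range(1, len(layer_activations)):
        sim = linear_cka(layer_activations[i - 1], layer_activations[i])
        if sim >= threshold:
            current.append(i)   # similar semantics: same module
        else:
            modules.append(current)  # dissimilar: close the module
            current = [i]
    modules.append(current)
    return modules
```

For example, two layers with identical activations end up in one module, while a layer with unrelated activations starts a new one; maintenance tasks such as compression or retraining can then be localized to a single module rather than the whole network.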

Keywords

architecture, modularization, neural networks

Discipline

Software Engineering

Research Areas

Software and Cyber-Physical Systems

Publication

2023 IEEE/ACM 45th International Conference on Software Engineering (ICSE), Melbourne, May 14-20: Proceedings

First Page

1008

Last Page

1019

ISBN

9781665457019

Identifier

10.1109/ICSE48619.2023.00092

Publisher

IEEE

City or Country

Piscataway, NJ

Copyright Owner and License

Authors

Additional URL

https://doi.org/10.1109/ICSE48619.2023.00092
