Publication Type

Conference Proceeding Article

Version

Published version

Publication Date

8-2019

Abstract

Platform migration and customization have become an indispensable part of the deep neural network (DNN) development lifecycle. A high-precision but complex DNN trained in the cloud on massive data with powerful GPUs often goes through an optimization phase (e.g., quantization, compression) before deployment to a target device (e.g., a mobile device). A test set that effectively uncovers disagreements between a DNN and its optimized variant provides valuable feedback for debugging and further improving the optimization procedure. However, the minor inconsistencies between a DNN and its optimized version are often hard to detect and easily bypass the original test set. This paper proposes DiffChaser, an automated black-box testing framework to detect untargeted/targeted disagreements between version variants of a DNN. We demonstrate 1) its effectiveness by comparing it with state-of-the-art techniques, and 2) its usefulness in real-world DNN product deployment involving quantization and optimization.
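
The core check the abstract describes — whether two variants of the same model assign different top-1 labels to an input — can be illustrated with a minimal, framework-agnostic Python sketch. This is not the DiffChaser algorithm itself (the paper's framework generates such inputs via guided search); the models, weights, and quantization step below are toy stand-ins invented purely for illustration:

    import numpy as np

    def find_disagreements(predict_a, predict_b, inputs):
        # Indices of inputs whose top-1 labels differ between the two models.
        labels_a = np.argmax(predict_a(inputs), axis=1)
        labels_b = np.argmax(predict_b(inputs), axis=1)
        return np.flatnonzero(labels_a != labels_b)

    rng = np.random.default_rng(0)
    W = rng.normal(size=(10, 3))       # stand-in "full-precision" weights
    W_q = np.round(W * 4) / 4          # crude fake quantization of the weights

    predict_full = lambda x: x @ W     # toy original model (linear classifier)
    predict_quant = lambda x: x @ W_q  # its "quantized" variant

    x = rng.normal(size=(1000, 10))    # candidate test inputs
    idx = find_disagreements(predict_full, predict_quant, x)
    print(f"{len(idx)} of {len(x)} inputs trigger a top-1 disagreement")

As the abstract notes, such disagreements easily bypass an ordinary test set — randomly sampled inputs rarely hit them — which is why the paper proposes a search-based framework rather than a fixed test suite.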

Keywords

Uncertainty in AI: Uncertainty Representations; Machine Learning: Adversarial Machine Learning

Discipline

OS and Networks | Software Engineering

Research Areas

Software and Cyber-Physical Systems

Publication

Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence (IJCAI 2019), Macao, China, August 10-16, 2019

First Page

5772

Last Page

5778

ISBN

978-0-9992411-4-1

Identifier

10.24963/ijcai.2019/800

Publisher

International Joint Conferences on Artificial Intelligence Organization

City or Country

Macao, China
