"Mitigating regression faults induced by feature evolution in deep lear" by Hanmo YU, Zan WANG et al.
 

Publication Type

Journal Article

Version

publishedVersion

Publication Date

1-2025

Abstract

Deep learning (DL) systems have been widely utilized across various domains. However, the evolution of DL systems can result in regression faults. In addition to the evolution of DL systems through the incorporation of new data, feature evolution, such as the addition of new features, is also common and can introduce regression faults. In this work, we first investigate the underlying factors that are correlated with regression faults in feature evolution scenarios, i.e., redundancy and contribution shift. Based on our investigation, we propose a novel mitigation approach called FeaProtect, which aims to minimize the impact of these two factors. To evaluate the performance of FeaProtect, we conducted an extensive study comparing it with state-of-the-art approaches. The results show that FeaProtect outperforms the in-processing baseline approaches, with an average improvement of 50.6% ∼ 56.4% in terms of regression fault mitigation. We also show that FeaProtect can further enhance the effectiveness of mitigating regression faults by integrating with state-of-the-art post-processing approaches.
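The regression faults discussed in the abstract are, in essence, inputs the original model handles correctly but the evolved model gets wrong. Below is a minimal illustrative sketch of how such a regression fault rate could be measured; the function name, labels, and predictions are hypothetical examples for exposition, not artifacts or metrics from the paper.

```python
# Illustrative sketch (not from the paper): a regression fault is a sample the
# original model classifies correctly but the evolved model misclassifies.
import numpy as np

def regression_fault_rate(y_true, old_preds, new_preds):
    """Fraction of samples where the old model was right and the new model is wrong."""
    y_true = np.asarray(y_true)
    old_correct = np.asarray(old_preds) == y_true
    new_wrong = np.asarray(new_preds) != y_true
    return float(np.mean(old_correct & new_wrong))

# Hypothetical predictions before and after feature evolution.
y_true    = np.array([0, 1, 1, 0, 1])
old_preds = np.array([0, 1, 1, 0, 0])   # original model
new_preds = np.array([0, 1, 0, 0, 1])   # model retrained with added features
print(regression_fault_rate(y_true, old_preds, new_preds))  # 0.2
```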

Keywords

Regression Mitigation, Regression Fault, Deep Learning, Feature Evolution, Fault Mitigation

Discipline

Software Engineering

Publication

ACM Transactions on Software Engineering and Methodology

First Page

1

Last Page

32

ISSN

1049-331X

Identifier

10.1145/3712199

Publisher

Association for Computing Machinery (ACM)

Copyright Owner and License

Authors (CC BY)

Additional URL

https://doi.org/10.1145/3712199
