Publication Type

Conference Proceeding Article

Version

publishedVersion

Publication Date

4-2023

Abstract

Recent years have witnessed wider adoption of Automated Speech Recognition (ASR) techniques in various domains. Consequently, evaluating and enhancing the quality of ASR systems is of great importance. This paper proposes Asdf, an Automated Speech Recognition Differential Testing Framework for testing ASR systems. Asdf extends an existing ASR testing tool, CrossASR++, which synthesizes test cases from a text corpus. However, CrossASR++ fails to use the text corpus efficiently and provides limited information on how the failed test cases can improve ASR systems. To address these limitations, our tool incorporates two novel features: (1) a text transformation module to boost the number of generated test cases and uncover more errors in ASR systems, and (2) a phonetic analysis module to identify phonemes that the ASR systems tend to transcribe incorrectly. Asdf generates more high-quality test cases by applying various text transformation methods (e.g., changing tense) to the input text of a failed test case. By doing so, Asdf can utilize a small text corpus to generate a large number of audio test cases, something which CrossASR++ is not capable of. In addition, Asdf implements more metrics to evaluate the performance of ASR systems from multiple perspectives. Asdf performs phonetic analysis on the failed test cases to identify the phonemes that ASR systems tend to transcribe incorrectly, providing useful information for developers to improve ASR systems. A demonstration video of our tool is available at https://www.youtube.com/watch?v=DzVwfc3h9As. The implementation is available at https://github.com/danielyuenhx/asdf-differential-testing.
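To illustrate the workflow the abstract describes, the sketch below shows a minimal differential-testing loop: a seed sentence is transformed (here, a toy tense change), each variant is synthesized and fed to multiple ASR systems, and the phonemes of mis-transcribed words are tallied. The text_to_speech, transcribe_with, and to_phonemes functions are hypothetical placeholders for illustration only, not the actual Asdf or CrossASR++ interfaces.

```python
# Illustrative sketch only: the TTS, ASR, and grapheme-to-phoneme calls below
# are hypothetical stand-ins, not the real Asdf / CrossASR++ APIs.
from collections import Counter

def text_to_speech(text: str) -> bytes:
    """Placeholder TTS: would return synthesized audio for `text`."""
    return text.encode()  # stub

def transcribe_with(asr_name: str, audio: bytes) -> str:
    """Placeholder ASR call: would return the transcription from `asr_name`."""
    text = audio.decode()
    # Stub behaviour: pretend one system misrecognizes the word "walked".
    return text.replace("walked", "walk") if asr_name == "asr_b" else text

def transform(text: str) -> list[str]:
    """Toy text transformation: the original text plus one tense change."""
    return [text, text.replace("walks", "walked")]

def to_phonemes(word: str) -> list[str]:
    """Placeholder grapheme-to-phoneme lookup."""
    table = {"walked": ["W", "AO", "K", "T"], "walk": ["W", "AO", "K"]}
    return table.get(word, list(word.upper()))

def failed_words(reference: str, hypothesis: str) -> set[str]:
    """Words present in the reference text but missing from the hypothesis."""
    return set(reference.split()) - set(hypothesis.split())

phoneme_errors: Counter = Counter()
seed = "she walks to the station"

for variant in transform(seed):            # boost test cases from one seed
    audio = text_to_speech(variant)
    for asr in ("asr_a", "asr_b"):         # differential testing across ASRs
        hypothesis = transcribe_with(asr, audio)
        for word in failed_words(variant, hypothesis):
            phoneme_errors.update(to_phonemes(word))   # phonetic analysis

print(phoneme_errors.most_common(3))       # phonemes most often mis-transcribed
```

In the real tool, the failed test cases and per-phoneme error counts would point developers to the sounds their ASR system handles poorly; this sketch only demonstrates the shape of that loop.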

Keywords

Automated speech recognition, Automatic speech recognition system, Differential testing, Limited information, Phonetic analysis, Speech recognition systems, Test case, Testing framework, Testing tools, Text corpora

Discipline

Databases and Information Systems

Research Areas

Data Science and Engineering

Publication

Proceedings of the 16th IEEE International Conference on Software Testing, Verification and Validation (ICST), Dublin, Ireland, April 16-20, 2023

First Page

461

Last Page

463

ISBN

9781665456661

Identifier

10.1109/ICST57152.2023.00050

Publisher

IEEE

City or Country

New Jersey

Additional URL

https://doi.org/10.1109/ICST57152.2023.00050
