Publication Type

Conference Proceeding Article

Version

acceptedVersion

Publication Date

10-2015

Abstract

Recent years have seen increasing attention to the social aspects of software engineering, including studies of the emotions and sentiments experienced and expressed by software developers. Most of these studies reuse existing sentiment analysis tools such as SentiStrength and NLTK. However, these tools have been trained on product and movie reviews, and their results might therefore not be applicable in the software engineering domain. In this paper, we study whether sentiment analysis tools agree with the sentiment recognized by human evaluators (as reported in an earlier study) as well as with each other. Furthermore, we evaluate the impact of the choice of sentiment analysis tool on software engineering studies by conducting a simple study of differences in issue resolution times for positive, negative, and neutral texts. We repeat the study for seven datasets (issue trackers and Stack Overflow questions) and different sentiment analysis tools, and observe that disagreement between the tools can lead to contradictory conclusions.
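
For illustration, the following minimal Python sketch shows how an off-the-shelf sentiment analysis tool can be used to label software engineering texts as positive, negative, or neutral, which is the kind of per-text labeling whose tool-to-tool agreement the paper examines. It uses NLTK's VADER analyzer with conventional compound-score thresholds; the tool choice, thresholds, and example texts are assumptions for illustration, not the paper's exact setup.

# Minimal sketch (assumed setup): label texts with NLTK's VADER analyzer.
import nltk
from nltk.sentiment.vader import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # lexicon required by VADER

def label(text, analyzer):
    # Map VADER's compound score to a three-class label using the
    # commonly used +/-0.05 thresholds (an assumption, not the paper's).
    score = analyzer.polarity_scores(text)["compound"]
    if score >= 0.05:
        return "positive"
    if score <= -0.05:
        return "negative"
    return "neutral"

if __name__ == "__main__":
    analyzer = SentimentIntensityAnalyzer()
    texts = [
        "This fix works great, thanks!",
        "The build is broken again and nobody cares.",
        "Updated the documentation for the new API.",
    ]
    for t in texts:
        print(label(t, analyzer), "|", t)

Running a second tool (e.g. SentiStrength) over the same texts and comparing the resulting labels, for example by percentage agreement or a chance-corrected statistic, is the kind of comparison on which the paper's agreement analysis rests.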

Keywords

Androids, Humanoid robots, Labeling, Manuals, Sentiment analysis, Software, Software engineering

Discipline

Databases and Information Systems | Numerical Analysis and Scientific Computing | Software Engineering

Research Areas

Information Systems and Management

Publication

ICSME 2015: Proceedings of the 31st IEEE International Conference on Software Maintenance and Evolution (2015), Bremen, Germany, September 29 - October 1

First Page

531

Last Page

535

ISBN

9781467375320

Identifier

10.1109/ICSM.2015.7332508

Publisher

IEEE

City or Country

Piscataway, NJ

Additional URL

https://doi.org/10.1109/ICSM.2015.7332508
