Publication Type

Conference Proceeding Article

Version

acceptedVersion

Publication Date

7-2014

Abstract

Humans, especially the elderly, require frequent attention, continuous companionship, and deep understanding from others. To provide more specific and appropriate tender care to the elderly, knowing their affective states is a great advantage. Recent work on human emotion recognition shows promising results: expressed emotions can be successfully captured through visual, audio, and keyboard or touchpad stroke-pattern signals. Furthermore, human activities can be accurately recognized in context by non-intrusive sensors within or connected to smartphones. In this paper, we propose a computational model to characterize the affective states of the elderly based on their recognizable daily activities. By integrating such an understanding module into a humanoid agent residing on the smartphone platform, we make the mobile agent more human-like. The initial knowledge of the activity-affect associations is taken from published work in psychology and gerontology. Based on the provided training signals, our model adapts this activity-affect knowledge accordingly. Consequently, by modeling mood awareness of the elderly, our agent can carry out more specific tasks and provide more appropriate tender care.
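
The abstract describes an activity-affect association model that is seeded from published findings and then adapted from training signals. The sketch below is not the authors' implementation; it only illustrates, under assumed activity and affect labels and a simple learning-rate update, how such a table of associations could be initialized from priors and nudged by labelled observations.

# Minimal illustrative sketch (assumptions, not the paper's method):
# an activity-affect association table seeded with literature-derived
# priors and adapted from training signals.

ACTIVITIES = ["walking", "eating", "watching_tv", "socializing"]  # assumed labels
AFFECTS = ["positive", "neutral", "negative"]                     # assumed labels

class ActivityAffectModel:
    def __init__(self, priors, learning_rate=0.1):
        # priors: dict mapping activity -> {affect: initial weight},
        # e.g. derived from psychology/gerontology literature.
        self.weights = {a: dict(priors[a]) for a in priors}
        self.lr = learning_rate

    def predict(self, activity):
        # Return the affect with the largest association weight.
        scores = self.weights[activity]
        return max(scores, key=scores.get)

    def update(self, activity, observed_affect):
        # Training signal: move weights toward the observed affect,
        # then renormalize so they remain a valid distribution.
        for affect in self.weights[activity]:
            target = 1.0 if affect == observed_affect else 0.0
            w = self.weights[activity][affect]
            self.weights[activity][affect] = w + self.lr * (target - w)
        total = sum(self.weights[activity].values())
        for affect in self.weights[activity]:
            self.weights[activity][affect] /= total

# Usage example: start from uniform priors, adapt from one labelled observation.
priors = {a: {f: 1.0 / len(AFFECTS) for f in AFFECTS} for a in ACTIVITIES}
model = ActivityAffectModel(priors)
model.update("socializing", "positive")
print(model.predict("socializing"))  # -> "positive" after adaptation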

Discipline

Databases and Information Systems

Research Areas

Data Science and Engineering

Publication

Proceedings of the 2014 International Joint Conference on Neural Networks, Beijing, China, July 6-11

First Page

1

Last Page

8

Identifier

10.1109/IJCNN.2014.6889916

Publisher

IEEE

City or Country

Beijing, China
