Publication Type
Journal Article
Version
acceptedVersion
Publication Date
1-2022
Abstract
Person Re-IDentification (ReID) aims to re-identify persons from different viewpoints across multiple cameras. Capturing fine-grained appearance differences is often the key to accurate person ReID, because many identities can be differentiated only by examining these fine-grained differences. However, most state-of-the-art person ReID approaches, typically driven by a triplet loss, fail to effectively learn fine-grained features because they focus more on differentiating large appearance differences. To address this issue, we introduce a novel pairwise loss function that enables ReID models to learn fine-grained features by adaptively enforcing an exponential penalization on images with small differences and a bounded penalization on images with large differences. The proposed loss is generic and can be used as a plug-in replacement for the triplet loss to significantly enhance different types of state-of-the-art approaches. Experimental results on four benchmark datasets show that the proposed loss substantially outperforms a number of popular loss functions by large margins, and it also enables significantly improved data efficiency.
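Illustrative Sketch
The abstract's key idea, a pairwise penalty that responds steeply to small appearance differences and saturates for large ones, can be illustrated with a minimal, hypothetical sketch. The code below is not the paper's exact loss; the function name, the hyperparameters (alpha, margin), and the bounded exponential form 1 - exp(-alpha * err) are assumptions chosen only to show the general shape of such a penalty.

    # Hypothetical sketch only; NOT the paper's exact loss. It shows one
    # plausible reading of the abstract: a bounded, exponential-shaped penalty
    # whose gradient is largest for pairs with small residual errors (pushing
    # the model to resolve fine-grained differences), while pairs with large
    # errors contribute a bounded, saturating penalty.
    import torch
    import torch.nn.functional as F

    def fine_grained_pairwise_loss(emb_a, emb_b, same_id, alpha=4.0, margin=1.0):
        """emb_a, emb_b: (N, D) embeddings of the two images in each pair.
        same_id: (N,) float tensor, 1.0 if a pair shares an identity, else 0.0.
        alpha, margin: hypothetical hyperparameters (penalty sharpness, negative margin).
        """
        d = F.pairwise_distance(emb_a, emb_b)  # Euclidean distance per pair
        # Per-pair error: distance for positive pairs, margin violation for negatives.
        err = same_id * d + (1.0 - same_id) * torch.clamp(margin - d, min=0.0)
        # Bounded exponential penalty: steepest near zero error, saturates at 1.
        penalty = 1.0 - torch.exp(-alpha * err)
        return penalty.mean()

    # Example usage with random embeddings:
    # emb_a, emb_b = torch.randn(8, 128), torch.randn(8, 128)
    # same_id = torch.randint(0, 2, (8,)).float()
    # loss = fine_grained_pairwise_loss(emb_a, emb_b, same_id)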
Keywords
Person Re-Identification, Fine-grained Difference, Representation Learning, Triplet Loss, Pairwise Loss
Discipline
Artificial Intelligence and Robotics | Databases and Information Systems
Research Areas
Intelligent Systems and Optimization
Publication
IEEE Transactions on Multimedia
Volume
24
First Page
1665
Last Page
1677
ISSN
1520-9210
Identifier
10.1109/TMM.2021.3069562
Publisher
Institute of Electrical and Electronics Engineers (IEEE)
Citation
YAN, Cheng; PANG, Guansong; BAI, Xiao; LIU, Changhong; NING, Xin; and ZHOU, Jun.
Beyond triplet loss: Person re-identification with fine-grained difference-aware pairwise loss. (2022). IEEE Transactions on Multimedia. 24, 1665-1677.
Available at: https://ink.library.smu.edu.sg/sis_research/7023
Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.