Publication Type
Journal Article
Version
acceptedVersion
Publication Date
9-2023
Abstract
Integrating low-level edge features has proven effective in preserving clear boundaries of salient objects. However, the locality of edge features makes it difficult to capture globally salient edges, leading to distraction in the final predictions. To address this problem, we propose to produce distraction-free edge features by incorporating cross-scale holistic interdependencies between high-level features. In particular, we first formulate our edge feature extraction process as a boundary-filling problem, which enforces the edge features to focus on closed boundaries rather than disconnected background edges. Second, we propose to explore cross-scale holistic contextual connections between every position pair of high-level feature maps, regardless of their distances across scales. This mechanism selectively aggregates features at each position based on its connections to all the others, simulating the "contrast" stimulus of visual saliency. Finally, we present a complementary features integration module that fuses low- and high-level features according to their properties. Experimental results demonstrate that our proposed method outperforms previous state-of-the-art methods on benchmark datasets, with a fast inference speed of 30 FPS on a single GPU.
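Note on the cross-scale aggregation: the "connections between every position pair ... regardless of their distances" described above follows the non-local attention pattern, in which each position is re-weighted by its affinities to all positions across scales. Below is a minimal sketch of that idea, assuming PyTorch; the module name CrossScaleNonLocal, the channel sizes, the dot-product attention form, and the residual connection are hypothetical illustrations, not the authors' released code.

# A minimal PyTorch sketch of cross-scale non-local aggregation in the
# spirit of the abstract above. NOT the paper's implementation: module
# name, channel sizes, and residual connection are assumptions.
import torch
import torch.nn as nn

class CrossScaleNonLocal(nn.Module):
    """Re-weight each position of a feature map by its affinity to
    every position of two high-level feature maps at different scales."""

    def __init__(self, channels: int, reduced: int = None):
        super().__init__()
        reduced = reduced or max(channels // 2, 1)
        self.scale = reduced ** -0.5
        self.query = nn.Conv2d(channels, reduced, kernel_size=1)
        self.key = nn.Conv2d(channels, reduced, kernel_size=1)
        self.value = nn.Conv2d(channels, channels, kernel_size=1)
        self.out = nn.Conv2d(channels, channels, kernel_size=1)

    def forward(self, feat_a: torch.Tensor, feat_b: torch.Tensor):
        # feat_a: (B, C, Ha, Wa) finer scale; feat_b: (B, C, Hb, Wb) coarser.
        b, c, ha, wa = feat_a.shape
        # Queries come from the finer map; keys/values pool positions from
        # BOTH scales, so attention spans every position pair regardless
        # of spatial distance or scale.
        q = self.query(feat_a).flatten(2).transpose(1, 2)        # (B, Na, Cr)
        k = torch.cat([self.key(feat_a).flatten(2),
                       self.key(feat_b).flatten(2)], dim=2)      # (B, Cr, Na+Nb)
        v = torch.cat([self.value(feat_a).flatten(2),
                       self.value(feat_b).flatten(2)], dim=2)    # (B, C, Na+Nb)
        attn = torch.softmax(q @ k * self.scale, dim=-1)         # (B, Na, Na+Nb)
        agg = (attn @ v.transpose(1, 2)).transpose(1, 2)         # (B, C, Na)
        agg = agg.reshape(b, c, ha, wa)
        # Residual keeps the original response while adding the globally
        # aggregated "contrast" signal.
        return feat_a + self.out(agg)

# Quick shape check with hypothetical sizes:
# m = CrossScaleNonLocal(channels=64)
# out = m(torch.randn(2, 64, 16, 16), torch.randn(2, 64, 8, 8))
# out.shape -> torch.Size([2, 64, 16, 16])

Dot-product affinities followed by a softmax give each position a convex combination of all positions across both scales, which is one common way to realize the selective, distance-independent aggregation the abstract describes.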
Keywords
Feature extraction, Image edge detection, Object detection, Visualization, Filling, Task analysis, Convolution
Discipline
Graphics and Human Computer Interfaces | Software Engineering
Research Areas
Software and Cyber-Physical Systems
Publication
IEEE MultiMedia
Volume
30
Issue
3
First Page
63
Last Page
73
ISSN
1070-986X
Identifier
10.1109/MMUL.2023.3235936
Publisher
Institute of Electrical and Electronics Engineers
Citation
REN, Sucheng; LIU, Wenxi; JIAO, Jianbo; HAN, Guoqiang; and HE, Shengfeng.
Edge Distraction-aware Salient Object Detection. (2023). IEEE MultiMedia. 30, (3), 63-73.
Available at: https://ink.library.smu.edu.sg/sis_research/8273
Copyright Owner and License
Authors
Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
Additional URL
https://doi.org/10.1109/MMUL.2023.3235936