Stereo object proposals
Publication Type
Journal Article
Publication Date
2-2017
Abstract
Object proposal detection is an effective way of accelerating object recognition. Existing proposal methods are mostly based on detecting object boundaries, which may not be effective in cluttered backgrounds. In this paper, we leverage stereopsis as a robust and effective solution for generating object proposals. We first obtain a set of candidate bounding boxes through an adaptive transformation, which fits the bounding boxes tightly to object boundaries detected from rough depth and color information. A two-level hierarchy composed of proposal and cluster levels is then constructed to estimate object locations in an efficient and accurate manner. Three stereo-based cues, "exactness," "focus," and "distribution," are proposed for objectness estimation. A two-level hierarchical ranking scheme is proposed to obtain accurately ranked object proposals. A stereo dataset with 400 labeled stereo image pairs is constructed to evaluate the performance of the proposed method in both indoor and outdoor scenes. Extensive experimental evaluations show that the proposed stereo-based approach outperforms state-of-the-art methods with either a small or a large number of object proposals. As stereopsis complements color information, the proposed method can be integrated with existing proposal methods to obtain superior results.
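The abstract describes scoring each candidate bounding box with three stereo-based cues and then ranking the proposals. The sketch below is only a minimal illustration of how per-box cue scores could be combined into a ranked proposal list; the linear weighting, function name, and toy values are assumptions made for illustration and do not reproduce the paper's adaptive transformation or two-level hierarchical ranking.

```python
import numpy as np

def rank_proposals(boxes, exactness, focus, distribution, weights=(1.0, 1.0, 1.0)):
    """Rank candidate boxes by a weighted sum of three cue scores.

    boxes:        (N, 4) list/array of [x1, y1, x2, y2] candidates
    exactness, focus, distribution: length-N cue scores in [0, 1]
    weights:      relative importance of each cue (hypothetical, not from the paper)
    """
    scores = (weights[0] * np.asarray(exactness, dtype=float)
              + weights[1] * np.asarray(focus, dtype=float)
              + weights[2] * np.asarray(distribution, dtype=float))
    order = np.argsort(-scores)  # highest combined objectness first
    return np.asarray(boxes)[order], scores[order]

# Toy usage with three candidate boxes and made-up cue scores.
boxes = [[10, 10, 60, 80], [5, 5, 200, 150], [30, 40, 90, 120]]
ranked_boxes, ranked_scores = rank_proposals(
    boxes,
    exactness=[0.9, 0.3, 0.7],
    focus=[0.8, 0.2, 0.6],
    distribution=[0.7, 0.4, 0.5],
)
print(ranked_boxes[0], ranked_scores[0])  # best-ranked proposal and its score
```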
Keywords
Stereopsis, objectness estimation, object proposals, stereo object proposals
Discipline
Information Security
Research Areas
Information Systems and Management
Publication
IEEE Transactions on Image Processing
Volume
26
Issue
2
First Page
671
Last Page
683
ISSN
1057-7149
Identifier
10.1109/TIP.2016.2627819
Publisher
Institute of Electrical and Electronics Engineers
Citation
HUANG, Shao; WANG, Weiqiang; HE, Shengfeng; and LAU, Rynson W. H.
Stereo object proposals. (2017). IEEE Transactions on Image Processing. 26, (2), 671-683.
Available at: https://ink.library.smu.edu.sg/sis_research/7882
Additional URL
https://doi.org/10.1109/TIP.2016.2627819