2017 IEEE International Conference on Computer Vision (ICCV)
DOI: 10.1109/iccv.2017.593

Soft-NMS — Improving Object Detection with One Line of Code

Abstract: Non-maximum suppression is an integral part of the object detection pipeline. First, it sorts all detection boxes on the basis of their scores. The detection box M with the maximum score is selected and all other detection boxes with a significant overlap (using a pre-defined threshold) with M are suppressed. This process is recursively applied on the remaining boxes. As per the design of the algorithm, if an object lies within the predefined overlap threshold, it leads to a miss. To this end, we propose Soft-…
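
The pipeline sketched in the abstract can be written out in a few lines. Below is a minimal, illustrative Python re-implementation (not the authors' released code): the iou helper, the function names, and the default thresholds are assumptions for the example. Passing method="hard" reproduces classical NMS, while "linear" and "gaussian" apply the Soft-NMS score decay.

import numpy as np

def iou(box, boxes):
    # box: [x1, y1, x2, y2]; boxes: (N, 4) array. Plain intersection-over-union.
    x1 = np.maximum(box[0], boxes[:, 0])
    y1 = np.maximum(box[1], boxes[:, 1])
    x2 = np.minimum(box[2], boxes[:, 2])
    y2 = np.minimum(box[3], boxes[:, 3])
    inter = np.clip(x2 - x1, 0, None) * np.clip(y2 - y1, 0, None)
    area_a = (box[2] - box[0]) * (box[3] - box[1])
    area_b = (boxes[:, 2] - boxes[:, 0]) * (boxes[:, 3] - boxes[:, 1])
    return inter / (area_a + area_b - inter)

def soft_nms(boxes, scores, iou_thresh=0.5, sigma=0.5,
             score_thresh=0.001, method="gaussian"):
    # Greedy loop: pick the highest-scoring remaining box M, then rescale the
    # scores of the other boxes by a function of their overlap with M instead
    # of discarding them outright.
    boxes = np.asarray(boxes, dtype=float)
    scores = np.asarray(scores, dtype=float).copy()
    keep, order = [], list(range(len(scores)))
    while order:
        m = max(order, key=lambda i: scores[i])
        keep.append(m)
        order.remove(m)
        if not order:
            break
        ovr = iou(boxes[m], boxes[order])
        if method == "hard":        # classical NMS: overlapping boxes get weight 0
            weights = (ovr <= iou_thresh).astype(float)
        elif method == "linear":    # Soft-NMS, linear penalty above the threshold
            weights = np.where(ovr > iou_thresh, 1.0 - ovr, 1.0)
        else:                       # Soft-NMS, Gaussian penalty
            weights = np.exp(-(ovr ** 2) / sigma)
        scores[order] *= weights
        order = [i for i in order if scores[i] > score_thresh]
    return keep, scores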

Cited by 1,774 publications (1,070 citation statements)
References 32 publications
“…Repulsion loss [1] is designed to directly penalize the predicted box for shifting toward other ground-truth objects or toward other predicted boxes associated with different ground truths. Soft NMS [23] decays the detection scores of boxes depending on their overlap and eliminates no boxes in the process, so as to achieve high recall. Adaptive NMS [24] applies a dynamic suppression threshold according to the target density.…”
Section: Occlusion Handling
confidence: 99%
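
The Adaptive NMS rule mentioned in the passage above can be sketched as a small change to the same greedy loop. This is a hedged illustration under the assumption that each box carries a predicted crowd-density value; the names used here are illustrative, not the cited paper's code.

# Hedged sketch of an Adaptive-NMS-style dynamic threshold: each box m carries a
# predicted crowd-density value density[m] in [0, 1], and the suppression threshold
# used while processing m is raised in crowded regions.
def adaptive_threshold(base_iou_thresh, density_m):
    return max(base_iou_thresh, density_m)

# Inside the greedy loop of the soft_nms sketch above, the "hard" branch would become:
#   thresh_m = adaptive_threshold(iou_thresh, density[m])
#   weights = (ovr <= thresh_m).astype(float)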
“…After that, we apply non-maximum suppression (NMS) with an IoU threshold of 0.6. In the NMS procedure, instead of removing adjacent bounding boxes, we divide their scores by 10, which is similar to Soft-NMS [4]. In Figure 5, the performance of proposals generated by our method is quantified and compared with popular proposal techniques [18,26,38,2,7,5,9,21,1,30,31,17].…”
Section: Object Proposals
confidence: 99%
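
The constant-penalty variant described in the quoted passage can be sketched as a drop-in replacement for the weighting step of the soft_nms sketch above. The 0.6 threshold and the divide-by-10 factor follow the quote; the function name and signature are illustrative assumptions.

import numpy as np

# Constant-penalty weighting as described in the quoted passage: boxes that overlap
# the selected box M by more than 0.6 IoU keep only one tenth of their score,
# rather than being decayed by a continuous function of the overlap.
def constant_penalty_weights(ovr, iou_thresh=0.6, penalty=0.1):
    return np.where(ovr > iou_thresh, penalty, 1.0)

# Used in place of the Gaussian/linear weighting inside the greedy loop:
#   weights = constant_penalty_weights(ovr)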
“…We only use one scale, namely 512 × 512, for both training and testing. During inference, we run soft NMS [3] on the model outputs with a standard deviation parameter of 0.55 in the Gaussian weighting function.…”
Section: Implementation Details and Baselines
confidence: 99%
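
As a usage note, the setting in this passage corresponds to calling the Gaussian variant of the earlier soft_nms sketch with sigma = 0.55; the boxes and scores below are placeholder values for illustration only.

import numpy as np

# Placeholder detections standing in for real model outputs.
boxes = np.array([[10, 10, 60, 60], [12, 12, 62, 62], [100, 100, 150, 150]], dtype=float)
scores = np.array([0.95, 0.80, 0.70])

# Gaussian Soft-NMS with the standard-deviation parameter set to 0.55,
# matching the inference setting described in the quoted passage.
keep, rescored = soft_nms(boxes, scores, sigma=0.55, method="gaussian")
print(keep, rescored[keep])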