2022
DOI: 10.3390/s22051790
Detection of Farmland Obstacles Based on an Improved YOLOv5s Algorithm by Using CIoU and Anchor Box Scale Clustering

Abstract: It is necessary to detect multi-type farmland obstacles accurately and in real time for unmanned agricultural vehicles. An improved YOLOv5s algorithm based on the K-Means clustering algorithm and the CIoU loss function was proposed to improve detection precision and speed up real-time detection. The K-Means clustering algorithm was used to generate anchor box scales and thereby accelerate the convergence of model training. The CIoU loss function, combining the three geometric measures of overlap area, center…
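The anchor-generation step the abstract describes can be sketched as follows. This is a generic illustration of IoU-based K-Means clustering over box sizes, not the authors' code; the spread-by-area initialization and the median update rule are assumptions (YOLOv5 itself uses a randomized init plus a genetic refinement step).

```python
def iou_wh(box, anchor):
    """IoU between two (w, h) pairs, both anchored at the origin."""
    inter = min(box[0], anchor[0]) * min(box[1], anchor[1])
    union = box[0] * box[1] + anchor[0] * anchor[1] - inter
    return inter / union

def kmeans_anchors(boxes, k, iters=100):
    """Cluster (w, h) box sizes with K-Means, using 1 - IoU as the
    distance, the usual convention for YOLO anchor generation."""
    srt = sorted(boxes, key=lambda b: b[0] * b[1])
    # Assumption: seed the anchors evenly across the size range.
    anchors = [srt[i * (len(srt) - 1) // max(k - 1, 1)] for i in range(k)]
    for _ in range(iters):
        # Assign each box to the anchor with the highest IoU.
        clusters = [[] for _ in range(k)]
        for b in boxes:
            best = max(range(k), key=lambda i: iou_wh(b, anchors[i]))
            clusters[best].append(b)
        # Update each anchor to the per-dimension median of its cluster.
        new_anchors = []
        for i, cl in enumerate(clusters):
            if not cl:
                new_anchors.append(anchors[i])
                continue
            ws = sorted(w for w, _ in cl)
            hs = sorted(h for _, h in cl)
            new_anchors.append((ws[len(ws) // 2], hs[len(hs) // 2]))
        if new_anchors == anchors:
            break
        anchors = new_anchors
    return sorted(anchors, key=lambda a: a[0] * a[1])
```

Clustering the training set's ground-truth box sizes this way gives the detector initial anchors that already resemble the data, which is what shortens the regression distance and speeds up convergence.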

Cited by 29 publications (16 citation statements)
References 29 publications
“…Firstly, the effect of the information fusion of the mmWave radar and the camera was tested in a non-agricultural environment, as shown in Figure 13 . In Figure 13 a, obstacle targets of three types (people, houses, and trees) were detected with the camera alone, using the improved YOLOv5s algorithm from the literature [ 9 ]. After data fusion, however, eight target sequences, each comprising position, longitudinal speed, and category, were output: (−9.84, 13.21, 0.00, house), (−7.96, 17.57, 0.00, tree), (−5.32, 14.63, 0.00, house), (−0.21, 4.30, 0.00, person), (4.36, 13.51, 0.00, tree), (−4.78, 17.08, 0.00, nan), (−4.85, 12.83, 0.00, nan), (5.64, 13.97, 0.00, nan), as shown in Figure 13 b.…”
Section: Results
confidence: 99%
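The fused output described in the quote — a tuple of lateral position, longitudinal position, speed, and camera class, with "nan" when a radar track has no matching visual detection — can be modeled with a small data structure. The type and helper below are illustrative assumptions, not the cited paper's code.

```python
from dataclasses import dataclass

@dataclass
class FusedTarget:
    """One fused obstacle track: lateral/longitudinal position (m),
    longitudinal speed (m/s), and the camera class label ("nan" when
    the radar track has no matching visual detection)."""
    x: float
    y: float
    speed: float
    category: str

def split_confirmed(targets):
    """Separate camera-confirmed targets from radar-only ones."""
    confirmed = [t for t in targets if t.category != "nan"]
    radar_only = [t for t in targets if t.category == "nan"]
    return confirmed, radar_only
```

Applied to the eight sequences quoted above, this split yields five camera-confirmed targets and three radar-only tracks.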
“…For the visual inspection module, this paper adopted the improved YOLOv5s [ 9 ], which automatically generates anchor box scales with the K-Means algorithm to accelerate convergence and uses the CIoU loss function to reduce false and missed detections and thus improve accuracy. For the mmWave radar detection module, in order to reduce the amount of data fed into the fusion step, a three-step filtering algorithm was adopted — empty-target filtering, false-target filtering, and non-threat-target filtering — realized mainly through the relative distance, the effective target life cycle, and horizontal and vertical coordinate thresholds.…”
Section: Methods
confidence: 99%
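The three-step radar filter quoted above can be sketched as a single pass over the track list. The structure follows the quoted description, but the field names and every threshold value (`min_life`, `x_lim`, `y_lim`) are illustrative assumptions, not values from the cited paper.

```python
def filter_radar_targets(tracks, min_life=5, x_lim=5.0, y_lim=30.0):
    """Three-step mmWave radar track filter (sketch).

    Each track is a dict: 'x' lateral offset (m), 'y' longitudinal
    distance (m), 'dist' relative distance (m), 'life' consecutive
    detection cycles the track has survived.
    """
    kept = []
    for t in tracks:
        # 1. Empty-target filtering: the radar pads its output list
        #    with all-zero slots; drop tracks with no measured distance.
        if t["dist"] == 0.0:
            continue
        # 2. False-target filtering: require a minimum effective life
        #    cycle so single-frame noise returns are rejected.
        if t["life"] < min_life:
            continue
        # 3. Non-threat filtering: keep only tracks inside the lateral
        #    and longitudinal region relevant to the vehicle's path.
        if abs(t["x"]) > x_lim or t["y"] > y_lim:
            continue
        kept.append(t)
    return kept
```

Filtering before fusion keeps the association step cheap: only tracks that are real, persistent, and inside the threat region are matched against the camera detections.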
“…For the object detection task, the loss function for bounding box regression (BBR) [34] is crucial. The baseline model we use is the complete intersection over union (CIoU) loss [35], which considers three geometric measures: the overlap between the predicted and target boxes, the distance between their center points, and the consistency of their aspect ratios. The CIoU formula is shown below:…”
Section: Focal EIoU Loss
confidence: 99%
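The excerpt cuts off before the formula itself. For reference, the CIoU loss as commonly defined (Zheng et al., cited as [35] in the quote) is:

```latex
\mathcal{L}_{\mathrm{CIoU}} = 1 - \mathrm{IoU}
  + \frac{\rho^{2}\!\left(\mathbf{b}, \mathbf{b}^{gt}\right)}{c^{2}}
  + \alpha v,
\qquad
v = \frac{4}{\pi^{2}}
    \left(\arctan\frac{w^{gt}}{h^{gt}} - \arctan\frac{w}{h}\right)^{2},
\qquad
\alpha = \frac{v}{\left(1 - \mathrm{IoU}\right) + v}
```

Here $\mathbf{b}$ and $\mathbf{b}^{gt}$ are the centers of the predicted and ground-truth boxes, $\rho(\cdot)$ is the Euclidean distance, and $c$ is the diagonal length of the smallest box enclosing both. The three terms correspond one-to-one to the three geometric measures named in the quote: overlap ($\mathrm{IoU}$), center distance ($\rho^{2}/c^{2}$), and aspect-ratio consistency ($\alpha v$).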