2020
DOI: 10.48550/arxiv.2006.06385
Preprint

TensorFlow with user friendly Graphical Framework for object detection API

Cited by 2 publications (2 citation statements)
References 0 publications
“…In order to address the class number imbalance, we utilized the linear inverse class frequency to regulate the weighted cross-entropy losses, as suggested in previous studies [ 34 , 35 ]. For the implementation of the sperm acrosome object detection, we employed the TensorFlow object detection package [ 36 ] and its extension. The Faster R-CNN model was pretrained on the COCO dataset [ 37 ] and then fine-tuned using the training dataset to detect AR/Non-AR.…”
Section: Methods (citation type: mentioning)
Confidence: 99%
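The citing study describes weighting the cross-entropy loss by linear inverse class frequency to counter class imbalance. A minimal sketch of that weighting scheme is below; the function names and the normalization (scaling weights to sum to the number of classes) are illustrative assumptions, not the cited authors' implementation.

```python
import numpy as np

def inverse_frequency_weights(class_counts):
    """Linear inverse class frequency weights.

    Illustrative normalization: weights are scaled to sum to the
    number of classes, so a balanced dataset yields all-ones weights.
    """
    counts = np.asarray(class_counts, dtype=np.float64)
    weights = counts.sum() / counts              # inverse frequency
    return weights * len(counts) / weights.sum()  # normalize

def weighted_cross_entropy(probs, labels, weights):
    """Per-example cross-entropy scaled by each true class's weight."""
    probs = np.clip(probs, 1e-12, 1.0)           # avoid log(0)
    labels = np.asarray(labels)
    true_class_probs = probs[np.arange(len(labels)), labels]
    return -weights[labels] * np.log(true_class_probs)

# Hypothetical imbalanced two-class case (e.g. AR vs. Non-AR detections)
w = inverse_frequency_weights([900, 100])  # minority class gets the larger weight
```

With counts of 900 and 100, the minority class receives a weight 9x that of the majority class, so misclassifying rare examples costs proportionally more during training.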
“…Additionally, efforts have been made to enhance the neural network architecture by increasing its depth and complexity [5,[20][21][22][23][24]. Optimization of training hyperparameters has also been explored [25][26][27][28][29][30][31][32]. However, the literature lacks a clear consensus on the ideal size of a training dataset [17,[33][34][35][36][37][38], indicating a research gap concerning the optimal dataset scale for training object detectors.…”
Section: Literature Review (citation type: mentioning)
Confidence: 99%