2016 Sixth International Conference on Image Processing Theory, Tools and Applications (IPTA)
DOI: 10.1109/ipta.2016.7821025
OUHANDS database for hand detection and pose recognition

Abstract: In this paper, we propose a proprietary static hand pose database called OUHANDS, together with protocols for training and evaluating hand pose classification and hand detection methods. A comparison between the OUHANDS database and existing databases is given, and baseline results for both protocols are presented.

Cited by 49 publications (22 citation statements)
References 23 publications
“…We have collected a combined dataset containing a total of 24,535 images and over 41,000 hand instances. To ensure the diversity of the data collected, the dataset combines samples from different datasets (e.g., those from [12,24,44,45]) and other image sources. In addition, we are more interested in creating a realistic and diverse dataset in terms of viewpoints (first and third person views, etc.…”
Section: Experiments and Discussion
confidence: 99%
“…The utilization of hand-crafted features dominated early research in hand detection and gesture recognition. Most of these approaches relied on hand skin color, texture, and appearance features for hand detection and gesture recognition [8,9,10,11,12,13]. However, they succeed only in certain well-prepared environments.…”
Section: Introduction
confidence: 99%
“…This dataset (MATILAINEN et al., 2016) is aimed at evaluating both classification and segmentation methods. It contains manually segmented binary masks, as well as aligned depth and color frames.…”
Section: OUHANDS
confidence: 99%
“…The datasets used in this work contain gestures from more than one subject (BARCZAK et al., 2011; HSIAO et al., 2014; MATILAINEN et al., 2016). This makes it possible to use two different validation techniques:…”
Section: Validation Technique
confidence: 99%
“…The proposed method is evaluated on two benchmark datasets, i.e., OUHANDS [23] and HGR1 [24]. Our experiments show that by using a robust hand segmentation/recognition architecture, supported by an efficient data augmentation technique, we can develop a robust RGB-based HGR system, which has excellent performance in uncontrolled and unseen scenarios, at much lower training and computational costs than the current state-of-the-art.…”
Section: Introduction
confidence: 99%