2013
DOI: 10.1016/j.neucom.2011.10.047

A k-nearest-neighbor classifier with heart rate variability feature-based transformation algorithm for driving stress recognition

Cited by 107 publications (60 citation statements)
References 29 publications
“…In this data matrix, not all the elements are nonnegative. We used our real-valued matrix factorization method to represent this matrix and then used the factor vectors for K-nearest neighbors classification [55], [56]. The factorization result is shown in Figure 4.…”
Section: Methods (mentioning)
Confidence: 99%
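The citing work's real-valued matrix factorization method is not reproduced here; the sketch below is only an assumption about the pipeline it describes, using TruncatedSVD as a generic stand-in for a real-valued factorization and classifying the resulting factor vectors with k-nearest neighbors on placeholder data.

```python
# Sketch: factorize a real-valued data matrix, then classify the factor vectors with KNN.
# TruncatedSVD is a stand-in for the cited real-valued factorization; data are placeholders.
import numpy as np
from sklearn.decomposition import TruncatedSVD
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 40))        # real-valued data matrix (may contain negatives)
y = rng.integers(0, 2, size=200)      # placeholder binary labels

# Factorize the matrix and keep the low-dimensional factor vectors.
factorizer = TruncatedSVD(n_components=10, random_state=0)
Z = factorizer.fit_transform(X)

Z_train, Z_test, y_train, y_test = train_test_split(Z, y, test_size=0.3, random_state=0)

knn = KNeighborsClassifier(n_neighbors=5)
knn.fit(Z_train, y_train)
print("test accuracy:", knn.score(Z_test, y_test))
```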
“…KNN categorizes the feature space into binary or multiclass clusters by employing a training dataset, classifying data points according to the closest data points in the training dataset. KNN has been used in medical informatics, such as the detection of epilepsy [89], stress [90], and depression [85,91].…”
Section: KNN (mentioning)
Confidence: 99%
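As a minimal illustration of the KNN decision rule described in this statement, the sketch below implements the plain majority vote among the k closest training points under Euclidean distance; the training points and labels are placeholders for illustration only.

```python
# Sketch of the KNN rule: classify a query by majority vote among its k nearest training points.
import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x_query, k=3):
    """Return the majority label among the k training points closest to x_query."""
    dists = np.linalg.norm(X_train - x_query, axis=1)  # Euclidean distance to each training point
    nearest = np.argsort(dists)[:k]                    # indices of the k closest points
    return Counter(y_train[nearest]).most_common(1)[0][0]

X_train = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.1], [0.9, 1.0]])
y_train = np.array([0, 0, 1, 1])
print(knn_predict(X_train, y_train, np.array([0.95, 1.05]), k=3))  # -> 1
```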
“…To extract high-level HRV features from R-R intervals, we followed earlier studies on stress inference using HRV [19][20][21][22][46]. After the pre-processing method proposed above, we extracted the following hand-engineered HRV features:
a) HRV F1: LF power
b) HRV F2: HF power
c) HRV F3: LF/HF ratio
d) HRV F4: SDNN (standard deviation of R-R intervals)
e) HRV F5: RMSSD (root mean square of the successive differences of R-R intervals)
f) HRV F6: pNN50 (ratio of the number of successive R-R interval differences greater than 50 ms to the total number of R-R intervals)…”
Section: Self-report Of Perceived Mental Stress (mentioning)
Confidence: 99%
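A minimal sketch of how the HRV features listed above can be computed from an R-R interval sequence follows. The pre-processing and spectral settings of the cited works are not reproduced; Welch's method on a 4 Hz resampled tachogram and the conventional LF (0.04–0.15 Hz) and HF (0.15–0.40 Hz) bands are assumptions made for illustration.

```python
# Sketch: time- and frequency-domain HRV features from R-R intervals (in milliseconds).
import numpy as np
from scipy.signal import welch
from scipy.interpolate import interp1d

def hrv_features(rr_ms, fs_resample=4.0):
    rr_ms = np.asarray(rr_ms, dtype=float)
    diffs = np.diff(rr_ms)

    # Time-domain features
    sdnn = np.std(rr_ms, ddof=1)                        # HRV F4: SDNN
    rmssd = np.sqrt(np.mean(diffs ** 2))                # HRV F5: RMSSD
    pnn50 = np.sum(np.abs(diffs) > 50.0) / len(diffs)   # HRV F6: pNN50

    # Frequency-domain features: resample the irregular R-R series to a uniform
    # grid, then estimate its power spectrum with Welch's method (assumed setup).
    t = np.cumsum(rr_ms) / 1000.0                       # approximate beat times in seconds
    t_uniform = np.arange(t[0], t[-1], 1.0 / fs_resample)
    rr_uniform = interp1d(t, rr_ms, kind="cubic")(t_uniform)
    f, pxx = welch(rr_uniform - rr_uniform.mean(),
                   fs=fs_resample, nperseg=min(256, len(rr_uniform)))

    lf_band = (f >= 0.04) & (f < 0.15)
    hf_band = (f >= 0.15) & (f < 0.40)
    lf = np.trapz(pxx[lf_band], f[lf_band])             # HRV F1: LF power
    hf = np.trapz(pxx[hf_band], f[hf_band])             # HRV F2: HF power
    return {"LF": lf, "HF": hf, "LF/HF": lf / hf,       # HRV F3: LF/HF ratio
            "SDNN": sdnn, "RMSSD": rmssd, "pNN50": pnn50}

# Example on a synthetic recording of ~800 ms beats (placeholder data)
rng = np.random.default_rng(0)
print(hrv_features(800 + 40 * rng.standard_normal(400)))
```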
“…The first algorithm we use is a k-nearest neighbor classifier (denoted as kNN, k=1), using the extracted high-level features listed above as in [46]. The second algorithm is a feed-forward, backpropagation-based single-layer neural network which uses the original input sources (i.e., 1D R-R intervals and 1 Hz sampled thermal sequences) as low-level features.…”
Section: Labeling Strategy And Machine Learning Classifiers (mentioning)
Confidence: 99%
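A rough sketch of the two-classifier comparison described in this statement is given below. The feature matrices are placeholders, and scikit-learn's MLPClassifier with one hidden layer is only a stand-in for the cited single-layer network; the 1-nearest-neighbor configuration (k=1) follows the quoted setup.

```python
# Sketch: 1-NN on high-level HRV features vs. a simple feed-forward network on low-level input.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
y = rng.integers(0, 2, size=120)        # placeholder stress / no-stress labels
X_hrv = rng.normal(size=(120, 6))       # 6 high-level HRV features per window (placeholder)
X_raw = rng.normal(size=(120, 60))      # low-level 1D input, e.g. resampled R-R (placeholder)

knn1 = KNeighborsClassifier(n_neighbors=1)                      # kNN with k = 1, as quoted
mlp = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,    # stand-in single-layer network
                    random_state=0)

print("1-NN CV accuracy:", cross_val_score(knn1, X_hrv, y, cv=5).mean())
print("MLP  CV accuracy:", cross_val_score(mlp, X_raw, y, cv=5).mean())
```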