2021
DOI: 10.1007/978-981-16-0575-8_9

Fully Convolutional Network Bootstrapped by Word Encoding and Embedding for Activity Recognition in Smart Homes

Abstract: Activity recognition in smart homes is essential when we wish to propose automatic services for the inhabitants. However, it is a challenging problem because of the variability of environments, sensory-motor systems, and user habits, as well as the sparsity of signals and the redundancy of models. Therefore, end-to-end systems fail at automatically extracting key features, and need to access context and domain knowledge. We propose to tackle feature extraction for activity recognition in smart homes by merging methods of Natural…


Cited by 26 publications (28 citation statements). References 14 publications.
“…However, in recent years, researchers in the field of Natural Language Processing (NLP) have developed word embedding and language model techniques that allow deep learning algorithms to understand not only the meaning of words but also the structure of phrases and texts. A first attempt to add NLP word embedding to deep learning has shown better performance in daily activity recognition in smart homes [46]. Moreover, the use of the semantics of the HAR domain may allow the development of new learning techniques for quick adaptation, such as zero-shot learning, which is developed in Section 5.…”
Section: Semantics
confidence: 99%
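As a rough illustration of how NLP-style word embeddings can carry over to smart-home data, the sketch below (a minimal example assuming TensorFlow/Keras; the sensor-event vocabulary and embedding size are illustrative assumptions, not values from the cited work) treats each discrete sensor event as a token and maps it to a learned dense vector, exactly as a word embedding maps word ids to vectors.

# Minimal sketch (TensorFlow/Keras assumed): sensor events treated as "words".
# The vocabulary size and embedding dimension are illustrative only.
import tensorflow as tf

NUM_SENSOR_EVENTS = 128   # assumed size of the sensor-event "vocabulary"
EMBEDDING_DIM = 64        # assumed dense-vector size

# Maps each integer-encoded sensor event to a learned dense vector.
event_embedding = tf.keras.layers.Embedding(
    input_dim=NUM_SENSOR_EVENTS, output_dim=EMBEDDING_DIM)

# A "sentence" of three sensor events becomes a 3 x 64 matrix of embeddings.
example_sequence = tf.constant([[12, 57, 3]])
vectors = event_embedding(example_sequence)   # shape (1, 3, 64)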
“…Applied to HAR, they could model the context of the sensors and their order of appearance. Taking inspiration from References [46,66], we can draw a parallel between NLP and HAR: a word is analogous to a sensor event, a micro activity composed of sensor events is analogous to a sentence, and a compound activity composed of sub-activities is a paragraph. This parallel between words and sensor events has led to the combination of word encodings with deep learning to improve the performance of HAR in smart homes in Reference [46].…”
Section: Sequences Of Sub-activities
confidence: 99%
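One possible realization of this word/sensor-event parallel is sketched below (TensorFlow/Keras assumed): an embedded sequence of sensor-event tokens, the "sentence", is classified into activities by a small fully convolutional network. The vocabulary size, sequence length, filter counts, and number of activity classes are assumptions for illustration, not the architecture published in Reference [46].

# Sketch: embedded sensor-event sequence classified by a 1D fully
# convolutional network (no dense feature stage before the output).
import tensorflow as tf

VOCAB, SEQ_LEN, EMB, N_ACTIVITIES = 128, 200, 64, 10   # assumed sizes

inputs = tf.keras.Input(shape=(SEQ_LEN,), dtype="int32")           # token ids
x = tf.keras.layers.Embedding(VOCAB, EMB)(inputs)                  # word-style embedding
x = tf.keras.layers.Conv1D(128, 5, padding="same", activation="relu")(x)
x = tf.keras.layers.Conv1D(128, 5, padding="same", activation="relu")(x)
x = tf.keras.layers.GlobalAveragePooling1D()(x)                    # pool over time
outputs = tf.keras.layers.Dense(N_ACTIVITIES, activation="softmax")(x)

model = tf.keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")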
“…Regarding the EB0 method, the returned vector has 9408 dimensions; • GlobalAveragePooling2D: this parameter denotes a global pooling operation over the spatial dimensions of the data, applied to replace the fully connected layers of conventional CNNs. The operation generates one pooled feature map per class to be classified [84]. It was used here to drastically reduce the dimensions of the feature maps produced by the top_activation layer of EfficientNetB0 (EB0 baseline) and the mixed9 and mixed10 layers of InceptionV3 (IV3 and IV3 baseline).…”
Section: Feature Extraction By Pre-trained CNN Models
confidence: 99%
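For readers unfamiliar with this pooling step, the snippet below is a minimal sketch (TensorFlow/Keras assumed) of how GlobalAveragePooling2D can replace the fully connected head of a pre-trained EfficientNetB0, pooling the top_activation feature maps mentioned above into one vector per image. The input size and preprocessing are the standard Keras defaults, not details taken from the cited study.

# Minimal sketch: feature extraction with GlobalAveragePooling2D
# instead of fully connected layers, on a pre-trained EfficientNetB0.
import tensorflow as tf

base = tf.keras.applications.EfficientNetB0(include_top=False, weights="imagenet")
maps = base.get_layer("top_activation").output                 # spatial feature maps
pooled = tf.keras.layers.GlobalAveragePooling2D()(maps)        # one value per channel
extractor = tf.keras.Model(inputs=base.input, outputs=pooled)

# images: a preprocessed batch shaped (N, 224, 224, 3)
# feature_vectors = extractor(images)    # shape (N, 1280)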
“…Some of the challenges stem from irrelevant and routine activities, whereas others are related to a predefined target class. Unlabeled data can significantly reduce AR performance [53][54][55][56] because of their correlation with labeled data. In real-time AR services, which test all sensor data generated in real time in order, an AR model that has not been trained on such other data will erroneously recognize queries as predefined activities even though there is no correspondence between them.…”
Section: Introduction
confidence: 99%
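To make this failure mode concrete, the toy example below (an illustrative sketch, not taken from the cited work; class names, scores, and the rejection threshold are assumptions) shows why a closed-set classifier always maps an out-of-scope query onto one of its predefined activities: the argmax over the softmax output has no "none of the above" option unless a rejection rule, such as a confidence threshold, is added.

# Toy illustration (NumPy assumed): a closed-set activity classifier has no
# way to say "unknown", so an out-of-scope query is still forced onto one
# of the predefined classes. The threshold value is an arbitrary example.
import numpy as np

ACTIVITIES = ["cooking", "sleeping", "showering"]
softmax_scores = np.array([0.40, 0.35, 0.25])    # uncertain, out-of-scope query

predicted = ACTIVITIES[int(np.argmax(softmax_scores))]
print(predicted)                                  # "cooking", even if none apply

# A simple mitigation: reject low-confidence predictions.
THRESHOLD = 0.70
label = predicted if softmax_scores.max() >= THRESHOLD else "unknown"
print(label)                                      # "unknown"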