2018
DOI: 10.1155/2018/1789121
The Hierarchies of Multivalued Attribute Domains and Corresponding Applications in Data Mining

Abstract: In mobile computing, machine learning models for natural language processing (NLP) have become one of the most attractive research areas. Association rules among attributes are common knowledge patterns that can provide potentially useful information, such as mobile users' interests. In practice, almost every attribute is associated with a hierarchy over its domain. Given a relation R=(U,A) and a cut αa on the hierarchy of each attribute a, there is a corresponding rough relation RΦ, where Φ=(αa : a∈A).…
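The idea of generalizing a relation under a cut on each attribute's domain hierarchy can be sketched as follows. This is an illustrative reconstruction, not the paper's implementation: the toy hierarchy, the `parent` map, and the `generalize` function are all hypothetical.

```python
# Illustrative sketch (hypothetical names and data, not from the paper):
# each attribute value is generalized upward through its domain hierarchy
# until it reaches a node in the chosen cut.

# A hierarchy encoded as a child -> parent map (root has parent None).
parent = {
    "apple": "fruit", "pear": "fruit",
    "carrot": "vegetable",
    "fruit": "food", "vegetable": "food",
    "food": None,
}

def generalize(value, cut):
    """Walk up the hierarchy until a value in the cut is reached."""
    v = value
    while v is not None and v not in cut:
        v = parent[v]
    return v

# A cut is a set of hierarchy nodes fixing the granularity level.
cut = {"fruit", "vegetable"}

# Applying the cut to every tuple yields the coarser relation,
# on which association rules can then be mined at that granularity.
relation = [("u1", "apple"), ("u2", "pear"), ("u3", "carrot")]
coarse = [(u, generalize(a, cut)) for u, a in relation]
print(coarse)  # [('u1', 'fruit'), ('u2', 'fruit'), ('u3', 'vegetable')]
```

Choosing a higher cut (e.g. `{"food"}`) would merge more tuples and produce coarser, more general rules; a lower cut preserves finer distinctions.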

Cited by 5 publications (1 citation statement) · References 15 publications
“…Several problems remain unsolved. Some of the interesting problems are how to adjust subsequence length/ step length and regularize a network and how to use LSTM-FCN, ALSTM-FCN [9], hierarchies of feature [19], or time aware [20,21] to improve the performance of time series classification. In addition, we will try to apply the method of classification service in distributed cloud/fog environment [22][23][24][25][26][27][28][29][30][31][32] in the future.…”
Section: Discussion
confidence: 99%