2018 International Conference on High Performance Computing & Simulation (HPCS)
DOI: 10.1109/hpcs.2018.00137

Ranking Mutual Information Dependencies in a Summary-based Approximate Analytics Framework

Cited by 3 publications (2 citation statements)
References 20 publications
“…In this section, the feature weighting performance of the ELM method is compared against those of one unsupervised feature weighting method: PCA [14] and two supervised feature weighting methods: MI [15] and BPNN, in terms of dimensionality reduction. Here, BPNN is implemented with the use of 100 hidden nodes, the sigmoid activation function, learning rate = 0.2 and the maximum iteration = 5000.…”
Section: Dimensionality Reduction
confidence: 99%
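The excerpt above describes the comparison setup only in prose. The following is a minimal sketch of how such a feature-weighting comparison could look, assuming scikit-learn and synthetic data; the ELM weighting itself is not reproduced, and the per-feature weight definitions (summed PCA loadings, first-layer BPNN weight magnitudes) are illustrative assumptions rather than the cited work's exact procedure.

```python
# Minimal sketch (not the citing paper's code): compare the feature-weighting
# baselines named in the excerpt -- PCA (unsupervised), mutual information
# (supervised), and a BPNN with the stated hyperparameters -- on synthetic data.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.feature_selection import mutual_info_classif
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=500, n_features=20,
                           n_informative=5, random_state=0)

# Unsupervised: aggregate PCA loadings per feature as an importance proxy.
pca = PCA(n_components=5).fit(X)
pca_weights = np.abs(pca.components_).sum(axis=0)

# Supervised: mutual information between each feature and the class label.
mi_weights = mutual_info_classif(X, y, random_state=0)

# Supervised: back-propagation network with 100 hidden nodes, sigmoid activation,
# learning rate 0.2 and at most 5000 iterations, as stated in the excerpt.
bpnn = MLPClassifier(hidden_layer_sizes=(100,), activation="logistic",
                     learning_rate_init=0.2, max_iter=5000,
                     random_state=0).fit(X, y)
# One common (assumed) choice: weight each input by its first-layer weight mass.
bpnn_weights = np.abs(bpnn.coefs_[0]).sum(axis=1)

for name, w in [("PCA", pca_weights), ("MI", mi_weights), ("BPNN", bpnn_weights)]:
    print(name, "top features:", np.argsort(w)[::-1][:5])
```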
“…The last aspect is the analysis in a wider time window that captures trends and tendencies arising from the repetition of certain phenomena in a defined historical window. Calculating approximated results based on meta-descriptions and summaries are widely applied in many areas of interest, such as analytical databases [13], large relational data sets [12], redesigning and accelerating machine learning algorithms [2] or systems for monitoring health conditions for members of nursing homes [5]. A slightly different approach is to use Japanese candles as summaries [9], and then compute and process that data.…”
Section: B. Goal Description
confidence: 99%
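The second excerpt mentions computing approximate results from summaries such as Japanese candles rather than from raw data. Below is a small, hypothetical sketch of that idea in Python/pandas: minute-level readings are compressed into hourly OHLC candles, and an hourly mean is then estimated from the candle alone. The high/low midpoint estimator and all names are illustrative assumptions, not the method of reference [9] or of the summarized framework.

```python
# Minimal sketch of the "answer queries from summaries" idea: build Japanese
# candles (open/high/low/close per window) and approximate a statistic from them.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
ts = pd.Series(rng.normal(100, 5, size=24 * 60),
               index=pd.date_range("2021-01-01", periods=24 * 60, freq="min"))

# Summary: one OHLC candle per hour instead of 60 raw readings.
candles = ts.resample("1h").ohlc()

# Assumed approximation: estimate the hourly mean as the high/low midpoint,
# then compare against the exact mean computed from the raw data.
approx_mean = (candles["high"] + candles["low"]) / 2
exact_mean = ts.resample("1h").mean()
print("max absolute error of the summary-based estimate:",
      (approx_mean - exact_mean).abs().max())
```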