2022 5th International Conference on Signal Processing and Information Security (ICSPIS)
DOI: 10.1109/icspis57063.2022.10002437
Recommendation System Towards Residential Energy Saving Based on Anomaly Detection

Cited by 4 publications (5 citation statements)
References 18 publications
“…the trainable parameter of this transformation. The original attention score between each pair of nodes is computed by Eq (11): the two node embeddings are first concatenated, the concatenation is then dotted with a learnable weight vector, and finally a LeakyReLU activation is applied. This form of attention mechanism is often referred to as "additive attention", as distinct from the "dot-product" attention used in the Transformer.…”
Section: PLOS ONE (mentioning)
confidence: 99%
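The scoring scheme this excerpt describes can be sketched in a few lines of NumPy. This is a minimal illustration, not the cited paper's implementation: the embeddings and the weight vector `a` are random stand-ins for learned parameters, and the dimension `d` is arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4  # embedding dimension (illustrative)

# Random stand-ins for two node embeddings after the linear transform
h_i = rng.normal(size=d)
h_j = rng.normal(size=d)

def leaky_relu(x, slope=0.2):
    return np.where(x > 0, x, slope * x)

# "Additive" (GAT-style) score: concatenate the two embeddings,
# take a dot product with a learnable weight vector a, apply LeakyReLU
a = rng.normal(size=2 * d)  # learnable weight vector (random here)
e_additive = leaky_relu(a @ np.concatenate([h_i, h_j]))

# "Dot-product" (Transformer-style) score for contrast:
# scaled inner product of the two embeddings
e_dot = (h_i @ h_j) / np.sqrt(d)
```

The two scores differ only in how the pair of embeddings is combined; the additive form introduces an extra learned vector `a`, while the dot-product form has no pairwise parameters of its own.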
“…Use GAT to calculate the weight values between each pair of feature nodes; take the feature factors L1 and L2 as an example, as shown in Fig 7(B). GAT first applies a linear transformation to the two nodes separately, and the original attention coefficient e_12 between the nodes is calculated by Eqs (10) and (11). The coefficient e_12 is then assigned to the nodes in the graph using the mapping function, and, so that attention coefficients between different nodes can be compared, the attention weight is obtained by normalizing with the softmax function.…”
Section: PLOS ONE (mentioning)
confidence: 99%
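The normalization step this excerpt describes, turning raw coefficients such as e_12 into comparable attention weights via softmax, can be sketched as follows. All parameters (`W`, `a`) and feature nodes (`L1`, `L2`, `L3`) are random placeholders for illustration, not values from the cited work.

```python
import numpy as np

rng = np.random.default_rng(1)
d = 4                                  # feature dimension (illustrative)
W = rng.normal(size=(d, d))            # shared linear transform (random stand-in)
a = rng.normal(size=2 * d)             # learnable attention vector (random stand-in)
L1, L2, L3 = rng.normal(size=(3, d))   # hypothetical feature nodes

def leaky_relu(x, slope=0.2):
    return np.where(x > 0, x, slope * x)

def raw_score(h_i, h_j):
    """Original (unnormalized) attention coefficient e_ij: transform both
    nodes, concatenate, dot with a, apply LeakyReLU."""
    return leaky_relu(a @ np.concatenate([W @ h_i, W @ h_j]))

# Raw coefficients from L1 toward its neighbors L2 and L3
e = np.array([raw_score(L1, L2), raw_score(L1, L3)])

# Softmax over the neighborhood makes the coefficients comparable
# across nodes: the resulting weights are positive and sum to 1
alpha = np.exp(e) / np.exp(e).sum()
```

A real GAT layer would then use `alpha` to form a weighted sum of the neighbors' transformed features; only the coefficient computation and normalization are shown here.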
“…
• Explainability and transparency: Explainable AI can help users understand why a particular recommendation was made. Seeing the rationale behind the recommendations might make users more receptive to different content, reducing the filter bubble effect (Atalla et al, 2022).
• User-controlled recommendations: Allowing users to have more control over their recommendations, such as adjusting the degree of novelty or diversity, could also help alleviate the filter bubble problem.
• Cross-domain recommendations: Leveraging data from different domains can help in providing a broader range of recommendations.…”
Section: Open Issues and Future Research Directions (mentioning)
confidence: 99%
“…• Explainability and transparency: Explainable AI can help users understand why a particular recommendation was made. Seeing the rationale behind the recommendations might make users more receptive to different content, reducing the filter bubble effect (Atalla et al, 2022).…”
Section: Future Research Directions (mentioning)
confidence: 99%