2020 International Joint Conference on Neural Networks (IJCNN)
DOI: 10.1109/ijcnn48605.2020.9206597

Generalising Recursive Neural Models by Tensor Decomposition

Abstract: Most machine learning models for structured data encode the structural knowledge of a node by leveraging simple aggregation functions (in neural models, typically a weighted sum) of the information in the node's neighbourhood. Nevertheless, the choice of simple context aggregation functions, such as the sum, can be widely sub-optimal. In this work we introduce a general approach to model aggregation of structural context leveraging a tensor-based formulation. We show how the exponential growth in the size of t…
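To make the abstract's claim concrete, here is a minimal sketch (all names, shapes, and toy sizes are assumptions of ours, not the paper's code) contrasting the usual weighted-sum aggregation with a full tensor-based aggregation of a node's children:

```python
import numpy as np

rng = np.random.default_rng(0)
h = 4  # toy hidden size

W = rng.standard_normal((h, h))     # weights for sum aggregation
T = rng.standard_normal((h, h, h))  # full 3-way weight tensor

def sum_aggregate(children):
    """Common neural aggregation: a weighted sum over the children."""
    return np.tanh(W @ np.sum(children, axis=0))

def tensor_aggregate(h_left, h_right):
    """Tensor aggregation of two children: out_k = sum_ij T[i,j,k] a_i b_j.
    Unlike the sum, it models multiplicative interactions between children."""
    return np.tanh(np.einsum("ijk,i,j->k", T, h_left, h_right))

a, b = rng.standard_normal(h), rng.standard_normal(h)
print(sum_aggregate(np.stack([a, b])))  # blind to pairwise products a_i * b_j
print(tensor_aggregate(a, b))           # captures them, at h**3 weights
```

The catch, which the paper addresses through tensor decomposition, is that the full tensor needs h^(arity+1) weights, so it grows exponentially with the number of children.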

Cited by 5 publications (9 citation statements) | References 15 publications
“…Two input accumulation functions are discussed in the paper: one leveraging CP decomposition and the other relying on the Tensor-Train factorization. Similarly, [30] shows how a tensor recursive neuron can exploit the Tucker decomposition to control the trade-off between the size of the neural representation and the size of the neural parameter space.…”
Section: Neural Model Compression
mentioning, confidence: 99%
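As a rough illustration of the Tucker trade-off mentioned above, the following sketch (a toy of ours; the sizes h, r and every name are assumptions) replaces a full h x h x h recursive-neuron weight tensor with an r x r x r core plus three mode matrices, so the rank r dials the balance between representation size and parameter count:

```python
import numpy as np

# Hypothetical sizes: h = hidden size, r = Tucker rank (r << h).
h, r = 64, 8
rng = np.random.default_rng(0)

# Tucker factors replacing a full h x h x h weight tensor:
# a small core G plus one factor matrix per mode.
G  = rng.standard_normal((r, r, r)) * 0.1
U1 = rng.standard_normal((h, r)) * 0.1  # projects the left child
U2 = rng.standard_normal((h, r)) * 0.1  # projects the right child
U3 = rng.standard_normal((r, h)) * 0.1  # maps core output back to size h

def tucker_neuron(h_left, h_right):
    """Tensor recursive neuron with a Tucker-factorised weight tensor.

    Full bilinear form: out_k = sum_ij T[i, j, k] * h_left[i] * h_right[j],
    with T approximated as G x1 U1 x2 U2 x3 U3.
    """
    a = h_left @ U1                          # (r,) mode-1 projection
    b = h_right @ U2                         # (r,) mode-2 projection
    core = np.einsum("ijk,i,j->k", G, a, b)  # (r,) contract with the core
    return np.tanh(core @ U3)                # (h,) back to hidden size

out = tucker_neuron(rng.standard_normal(h), rng.standard_normal(h))
print(out.shape)  # (64,)
```

With h = 64 and r = 8, the full tensor would hold 64**3 = 262,144 weights, while the Tucker factors hold only 8**3 + 3 * 64 * 8 = 2,048.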
“…This section aims to introduce new Tree-LSTM models that extend the recurrent approach to tensor-based processing tailored for constituency trees. These two models rely on Tensor Tree-LSTM (Castellana and Bacciu, 2020a) and canonical tensor decomposition. We focus on such decomposition since it can be combined with a weight sharing constraint, developing a composition function which can exploit non-binary constituency trees without increasing the number of model parameters.…”
Section: Canonical Tree-LSTM
mentioning, confidence: 99%
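A minimal sketch of such a canonical (CP) composition with weight sharing, under our own assumed names and sizes: because the single factor matrix U is shared across child positions, the same function applies to nodes of any arity without adding parameters.

```python
import numpy as np

# Hypothetical sizes: h = hidden size, r = CP rank.
h, r = 64, 16
rng = np.random.default_rng(1)

U = rng.standard_normal((h, r)) * 0.1      # shared child projection (weight sharing)
W_out = rng.standard_normal((r, h)) * 0.1  # maps rank space back to hidden size

def cp_compose(children):
    """CP-style composition with a single shared factor matrix U.

    Each child is projected into rank space and the projections are
    multiplied elementwise; sharing U across child positions keeps the
    parameter count independent of the node arity.
    """
    z = np.ones(r)
    for h_c in children:
        z *= h_c @ U  # elementwise (Hadamard) product in rank space
    return np.tanh(z @ W_out)

# Works for any arity without adding parameters:
print(cp_compose([rng.standard_normal(h) for _ in range(3)]).shape)  # (64,)
```

The elementwise product in rank space is what a CP-factorised tensor contraction reduces to once the child factor matrices are tied.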
“…The Binary Tensor Tree-LSTM (Castellana and Bacciu, 2020a) computes the hidden state of an internal node v by:…”
Section: Binary Canonical Tree-LSTM
mentioning, confidence: 99%
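The equation itself is cut off in the excerpt, so we cannot reproduce it; the following is only a hypothetical gate-level sketch of a binary tensor Tree-LSTM step (every name and shape is our assumption, not the paper's formulation), in which each gate aggregates the two children through a full bilinear form instead of a weighted sum:

```python
import numpy as np

# Hypothetical illustration only: NOT the paper's exact equation,
# which is truncated in the excerpt above. Sizes are placeholders.
h = 32
rng = np.random.default_rng(2)

def bilinear(T, a, b):
    """Bilinear form: out_k = sum_ij T[i, j, k] * a[i] * b[j]."""
    return np.einsum("ijk,i,j->k", T, a, b)

# One full 3-way tensor per gate (input, two forgets, output, update)
# in the uncompressed model.
gates = {g: rng.standard_normal((h, h, h)) * 0.01 for g in "i fl fr o u".split()}

def tensor_tree_lstm(hl, cl, hr, cr):
    sig = lambda x: 1.0 / (1.0 + np.exp(-x))
    i  = sig(bilinear(gates["i"],  hl, hr))    # input gate
    fl = sig(bilinear(gates["fl"], hl, hr))    # forget gate, left child
    fr = sig(bilinear(gates["fr"], hl, hr))    # forget gate, right child
    o  = sig(bilinear(gates["o"],  hl, hr))    # output gate
    u  = np.tanh(bilinear(gates["u"], hl, hr)) # candidate update
    c = i * u + fl * cl + fr * cr              # new cell state
    return o * np.tanh(c), c                   # new hidden and cell states

hv, cv = tensor_tree_lstm(*[rng.standard_normal(h) for _ in range(4)])
print(hv.shape, cv.shape)  # (32,) (32,)
```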