2022
DOI: 10.1088/2632-2153/aca271

Interaction decompositions for tensor network regression

Abstract: It is well known that tensor network regression models operate on an exponentially large feature space, but questions remain as to how effectively they are able to utilize this space. Using a polynomial featurization, we propose the interaction decomposition as a tool that can assess the relative importance of different regressors as a function of their polynomial degree. We apply this decomposition to tensor ring and tree tensor network models trained on the MNIST and Fashion MNIST datasets, and find that up …
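As a rough illustration (not taken from the paper itself), the Python sketch below computes a degree-wise interaction decomposition by brute force. It assumes the common featurization phi(x_i) = (1, x_i), under which the model output is a multilinear polynomial in the inputs, and it uses a dense weight tensor W as a hypothetical stand-in for the fully contracted tensor ring or tree tensor network. The function name, W, and x are illustrative, and enumerating all 2^n monomials is only feasible for tiny n; the paper works with the factorized networks directly, so this shows the quantity being computed, not the authors' method.

import itertools
import numpy as np

def interaction_decomposition(W, x):
    # Model: f(x) = sum over s in {0,1}^n of W[s] * prod_i x_i^{s_i},
    # i.e. one weight per subset of inputs under phi(x_i) = (1, x_i).
    # Returns contrib, where contrib[d] is the total contribution of all
    # degree-d monomials, so contrib.sum() equals the full output f(x).
    n = x.shape[0]
    contrib = np.zeros(n + 1)
    for s in itertools.product((0, 1), repeat=n):
        degree = sum(s)
        monomial = np.prod([x[i] for i in range(n) if s[i] == 1]) if degree else 1.0
        contrib[degree] += W[s] * monomial
    return contrib

# Tiny usage example with n = 4 inputs; real tensor network models
# exploit the factorized structure instead of a dense W.
rng = np.random.default_rng(0)
n = 4
W = rng.normal(size=(2,) * n)
x = rng.uniform(size=n)
c = interaction_decomposition(W, x)
print(c, c.sum())  # per-degree contributions and the total output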

Cited by 3 publications (2 citation statements). References: 23 publications.

Citation statements:
“…Finally, Ref. [151] analyzes the contribution of polynomials of different degrees to the supervised learning performance of different architectures.…”
Section: TNs as a Generalization of the Main Model Architectures in ML (mentioning; confidence: 99%)
“…Finally, Ref. [59] analyzes the contribution of polynomials of different degrees to the supervised learning performance of different architectures.…”
Section: TNs as a Generalization of the Main Model Architectures in ML (mentioning; confidence: 99%)