2021
DOI: 10.21203/rs.3.rs-1035440/v1
Preprint
A Tensor Based Stacked Fuzzy Networks for Efficient Data Regression

Abstract: Random vector functional link networks and extreme learning machines have been extended with type-2 fuzzy sets through vector stacking methods; this extension leads to a new way of using tensors to construct the learning structure of a type-2 fuzzy-sets-based learning framework. In this paper, a type-2 fuzzy-sets-based random vector functional link network, a type-2 fuzzy-sets-based extreme learning machine, and a Tikhonov-regularized extreme learning machine are fused into a single network, and a tensor-based way of stacking data is used to incorporate t…
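The abstract names a Tikhonov-regularized extreme learning machine as one of the fused components. As a point of reference only, the sketch below shows that component in isolation: a regression ELM whose hidden weights are random and fixed and whose output weights are obtained in closed form with a ridge (Tikhonov) penalty. The function names, the tanh activation, and the regularization strength `lam` are illustrative assumptions, not the authors' implementation; the type-2 fuzzy sets and the tensor stacking described in the paper are not modeled here.

```python
# Minimal sketch (not the paper's method): Tikhonov-regularized ELM for regression.
# Hidden weights are random and fixed; only the output weights are fitted.
import numpy as np

rng = np.random.default_rng(0)

def elm_fit(X, y, n_hidden=100, lam=1e-2):
    """Solve the output weights of a ridge-regularized ELM in closed form."""
    W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights (kept fixed)
    b = rng.normal(size=n_hidden)                 # random biases (kept fixed)
    H = np.tanh(X @ W + b)                        # hidden-layer feature matrix
    # Tikhonov (ridge) solution: beta = (H^T H + lam * I)^-1 H^T y
    beta = np.linalg.solve(H.T @ H + lam * np.eye(n_hidden), H.T @ y)
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy regression example
X = rng.uniform(-1, 1, size=(200, 3))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.1 * rng.normal(size=200)
W, b, beta = elm_fit(X, y)
print("train MSE:", np.mean((elm_predict(X, W, b, beta) - y) ** 2))
```

Because only `beta` is learned, training reduces to a single linear solve, which is what makes ELM-style components cheap enough to stack or fuse with other learners.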

Cited by 1 publication (1 citation statement)
References: 75 publications (98 reference statements)
“…As discussed in §4, MPS suit one-dimensional data like time series and logarithmic TTNs or two-dimensional TNs like MERA or PEPS are better suited for images depending on the amount of local correlation. For text, the information scales even steeper [110], which requires three-dimensional PEPS or high dimensional MERA variants which have not been implemented on a quantum computer yet. Exploiting symmetries reduces the need for complexity within the structure, e.g.…”
Section: Tensor Network for Data Encoding (citation type: mentioning)
Confidence: 99%
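The citing work quoted above matches tensor-network layouts to the dimensionality of the data, noting that matrix product states (MPS) suit one-dimensional data such as time series. Purely to illustrate that point, the sketch below contracts a small MPS with an encoded 1-D sequence; the local feature map, bond dimension, and boundary convention are arbitrary assumptions and are not taken from either paper.

```python
# Minimal sketch (illustrative only): contracting a matrix product state (MPS)
# with a 1-D input sequence. Each time step has its own core tensor of shape
# (bond, phys, bond); the left-to-right chain of contractions mirrors the
# sequential structure of time-series data.
import numpy as np

rng = np.random.default_rng(1)

T, phys, bond = 8, 2, 4                          # sequence length, local feature dim, bond dim
cores = [rng.normal(scale=0.5, size=(bond, phys, bond)) for _ in range(T)]

def feature_map(x):
    """Map a scalar in [0, 1] to a 2-vector (a common local encoding choice)."""
    return np.array([np.cos(np.pi * x / 2), np.sin(np.pi * x / 2)])

def mps_contract(cores, sequence):
    """Contract the MPS with the encoded sequence, one site at a time."""
    state = np.ones(bond) / np.sqrt(bond)        # left boundary vector (assumed convention)
    for A, x in zip(cores, sequence):
        local = feature_map(x)                   # shape (phys,)
        M = np.einsum('ipj,p->ij', A, local)     # absorb the local feature into the core
        state = state @ M                        # carry the bond index to the next site
    return state.sum()                           # close the right boundary

x_seq = rng.uniform(0, 1, size=T)
print("MPS output:", mps_contract(cores, x_seq))
```

The memory cost grows linearly with sequence length and quadratically with the bond dimension, which is why MPS are a natural fit for one-dimensional data, while images and text tend to call for the richer structures (TTN, MERA, PEPS) mentioned in the quote.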