2019
DOI: 10.1016/j.neunet.2019.03.012

Short-term cognitive networks, flexible reasoning and nonsynaptic learning

Abstract: While the machine learning literature dedicated to fully automated reasoning algorithms is abundant, the number of methods enabling the inference process on the basis of previously defined knowledge structures is scanter. Fuzzy Cognitive Maps (FCMs) are neural networks that can be exploited towards this goal because of their flexibility to handle external knowledge. However, FCMs suffer from a number of issues that range from the limited prediction horizon to the absence of theoretically sound learning algorit…

Cited by 21 publications (16 citation statements)
References 33 publications
“…Aiming at conducting the simulations, we use the same datasets used in [3]. These datasets are traditional pattern classification problems adapted to the simulation problem where we need to predict the values of some variables from the values of others.…”
Section: Benchmark Problems (mentioning)
confidence: 99%
“…To overcome the FCMs' drawbacks, Nápoles et al. [3] introduced Short-term Cognitive Networks and a learning principle referred to as nonsynaptic learning. Instead of adjusting the weight matrix, nonsynaptic learning adjusts the transfer-function parameters controlling each neuron's excitation degree.…”
Section: Introduction (mentioning)
confidence: 99%
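The excerpt above captures the core of nonsynaptic learning: the weight matrix stays fixed (it encodes expert knowledge), and only the transfer-function parameters of each neuron are tuned. A minimal sketch of one reasoning step, assuming a generalized sigmoid f(x) = 1 / (1 + e^{-λ(x − h)}) with a per-neuron slope λ and offset h; the toy weight matrix and variable names are illustrative, not taken from the paper:

```python
import numpy as np

def stcn_step(x, W, lam, h):
    """One reasoning step with fixed weights W and tunable
    per-neuron sigmoid slope lam and offset h (the nonsynaptic
    parameters a learner would adjust instead of W)."""
    raw = W @ x
    return 1.0 / (1.0 + np.exp(-lam * (raw - h)))

# Toy example: 3 neurons with expert-defined (frozen) weights.
W = np.array([[0.0,  0.5, -0.3],
              [0.2,  0.0,  0.4],
              [-0.1, 0.6,  0.0]])
x = np.array([0.8, 0.2, 0.5])   # initial activation values
lam = np.ones(3)                # slopes, to be tuned nonsynaptically
h = np.zeros(3)                 # offsets, to be tuned nonsynaptically
y = stcn_step(x, W, lam, h)
```

Because only `lam` and `h` change during learning, the semantics attached to `W` by the domain expert are preserved.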
“…In this paper, we propose the Long Short-term Cognitive Networks (LSTCNs) to cope with the efficient and transparent forecasting of long univariate and multivariate time series. LSTCNs involve a sequential collection of Short-term Cognitive Network (STCN) blocks [34], each processing a specific time patch in the sequence. The STCN model allows for transparent reasoning since both weights and neurons map to specific features in the problem domain.…”
Section: Introduction (mentioning)
confidence: 99%
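The patch-wise idea described above can be sketched as follows; the two-layer sigmoid parameterization and all names here are illustrative assumptions, not the paper's exact architecture:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def stcn_block(X, W1, W2):
    # One illustrative STCN block: an inner sigmoid layer followed
    # by a sigmoid output layer; weights map to problem features.
    return sigmoid(sigmoid(X @ W1) @ W2)

rng = np.random.default_rng(0)
series = rng.random((12, 4))       # 12 time steps, 4 features
patches = series.reshape(3, 4, 4)  # 3 time patches of 4 steps each

# In an LSTCN, the weights fitted on patch t would be transferred
# (frozen) to the block handling patch t+1; here fixed weights are
# reused purely for illustration.
W1, W2 = rng.random((4, 4)), rng.random((4, 4))
preds = [stcn_block(p, W1, W2) for p in patches]
```

The transparency claim in the excerpt comes from this mapping: each row/column of the weight matrices corresponds to a named feature, so the learned values can be read directly.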
“…As a second contribution, we propose a deterministic learning algorithm to compute the tunable parameters of each STCN block. This highly efficient algorithm replaces the nonsynaptic learning method presented in [34]. As a final contribution, we present a feature influence score as a proxy for explaining the reasoning process of our neural system.…”
Section: Introduction (mentioning)
confidence: 99%
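A deterministic (non-iterative) fit of a block's parameters can be sketched with a ridge-regression-style closed-form solve; this is an assumption about the flavor of the algorithm, not the paper's exact procedure, and the influence score below is likewise only a simple illustrative proxy:

```python
import numpy as np

rng = np.random.default_rng(1)
H = rng.random((50, 8))   # activations entering a block's output layer
Y = rng.random((50, 3))   # targets for the same time patch
lam = 1e-2                # ridge penalty keeps the solve well-posed

# Deterministic one-shot estimate of the block's output weights:
# W = (H^T H + lam I)^{-1} H^T Y
W = np.linalg.solve(H.T @ H + lam * np.eye(8), H.T @ Y)

# Illustrative influence proxy: total absolute weight per input feature.
influence = np.abs(W).sum(axis=1)
```

Unlike gradient-based tuning, such a solve gives the same result on every run, which is what makes the learning step both efficient and reproducible.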
“…The proposed model relies on a brand-new neural system termed Long Short-term Cognitive Network (LSTCN) [30] that allows for online learning. In this model, each iteration processes a data chunk using a Short-term Cognitive Network (STCN) block [31] that operates with the knowledge transferred from the previous block. This means that the model can easily be retrained without compromising what the network has learned from previous data chunks.…”
Section: Introduction (mentioning)
confidence: 99%