2021
DOI: 10.1038/s41598-021-83182-4

Self-incremental learning vector quantization with human cognitive biases

Abstract: Human beings have adaptively rational cognitive biases for efficiently acquiring concepts from small-sized datasets. With such inductive biases, humans can generalize concepts by learning a small number of samples. By incorporating human cognitive biases into learning vector quantization (LVQ), a prototype-based online machine learning method, we developed self-incremental LVQ (SILVQ) methods that can be easily interpreted. We first describe a method to automatically adjust the learning rate that incorporates …


Cited by 7 publications (4 citation statements)
References 51 publications
“…As described in [19] and [20], LVQ is a supervised classification algorithm that is widely used because it is easy to implement and understand. The purpose of LVQ is to learn a codebook (or prototype set) for assigning an arbitrary input vector to a target class label from training data composed of input vectors x and corresponding labels y.…”
Section: Proposed Intelligent PIR System Using Learning Vector Quanti...
confidence: 99%
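The codebook learning described in this statement can be sketched as a minimal LVQ1 update rule: the winning (nearest) prototype is pulled toward a sample of the same class and pushed away otherwise. This is an illustrative sketch; the function names and hyperparameters are my own, not taken from [19], [20], or the SILVQ paper.

```python
import numpy as np

def train_lvq1(X, y, prototypes, proto_labels, lr=0.1, epochs=30):
    """Basic LVQ1: for each sample, move the nearest prototype
    toward it if the labels match, away from it otherwise."""
    P = prototypes.copy()
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            j = np.argmin(np.linalg.norm(P - xi, axis=1))  # winner
            if proto_labels[j] == yi:
                P[j] += lr * (xi - P[j])   # attract
            else:
                P[j] -= lr * (xi - P[j])   # repel
    return P

def predict(X, prototypes, proto_labels):
    """Assign each input vector the label of its nearest prototype."""
    idx = np.argmin(np.linalg.norm(prototypes[:, None] - X, axis=2), axis=0)
    return proto_labels[idx]
```

On two well-separated Gaussian clusters with one prototype initialized near each, the learned codebook classifies the training samples by nearest-prototype lookup.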
“…biased. Taniguchi, Sato, and Shirakawa (105) and Manome et al. (106) extend this idea by incorporating the same biases into neural networks and learning vector quantization, respectively, for different tasks.…”
Section: Decision
confidence: 99%
“…Chen and Lee [19] propose an incremental few-shot learning algorithm that makes use of deep incremental learning vector quantization to solve problems related to catastrophic forgetting in class-incremental tasks. A self-incremental learning vector quantization algorithm (SILVQ) is proposed in [20];…”
Section: B. Learning Vector Quantization
confidence: 99%
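The "self-incremental" aspect, growing the prototype set online rather than fixing its size in advance, can be illustrated with one common insertion heuristic: add a misclassified sample as a new prototype, otherwise nudge the winning prototype toward it. This is a generic sketch of incremental prototype learning, not SILVQ's exact rule.

```python
import numpy as np

def incremental_insert(X, y, lr=0.05):
    """Grow a prototype set online: when no prototype exists yet, or the
    nearest prototype carries the wrong class, adopt the sample as a new
    prototype; otherwise move the winner slightly toward the sample."""
    protos, labels = [], []
    for xi, yi in zip(X, y):
        if protos:
            d = np.linalg.norm(np.array(protos) - xi, axis=1)
            j = int(np.argmin(d))
            if labels[j] == yi:
                protos[j] = protos[j] + lr * (xi - protos[j])  # refine winner
                continue
        protos.append(xi.astype(float))  # insert new prototype
        labels.append(yi)
    return np.array(protos), np.array(labels)
```

On a small two-class stream this yields one prototype per cluster: correctly classified samples refine an existing prototype instead of adding a new one, keeping the codebook compact.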