2020
DOI: 10.3390/app10165631
Improvement of Heterogeneous Transfer Learning Efficiency by Using Hebbian Learning Principle

Abstract: Transfer learning algorithms have been widely studied for machine learning in recent times. In particular, in image recognition and classification tasks, transfer learning has shown significant benefits and is attracting considerable attention in the research community. When transferring knowledge between source and target tasks, a homogeneous dataset is not always available, and a heterogeneous dataset may be chosen in certain circumstances. In this article, we propose a way of improving transfer learning e…

Cited by 24 publications (14 citation statements)
References 75 publications
“…The suggested system adopts the Hebbian neural network (HNN) infrastructure [5]-[9], but with several modifications made to the HNN for the proposed system. These modifications include: i) the numbers of input nodes and output nodes are equal; ii) the learning process is an unsupervised approach; iii) it is a single layer (no hidden layer), a topology that enables simple implementation and fast learning compared to networks with hidden layers; and iv) signals propagate through the network in one direction only (feedforward), so each neuron depends only on its direct input signals.…”
Section: Modified Hebbian Neural Network (MHNN)
confidence: 99%
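The architecture described in the quote above, a single-layer, feedforward Hebbian network with equal input and output node counts and unsupervised updates, can be sketched as follows. This is a minimal illustration under those four stated properties; the class name, learning rate, and weight initialization are our own assumptions, not details from the cited paper.

```python
import numpy as np

class SingleLayerHebbian:
    """Sketch of a single-layer feedforward Hebbian network:
    equal input/output counts, no hidden layer, unsupervised learning."""

    def __init__(self, n_nodes, lr=0.01):
        # Square weight matrix: number of inputs equals number of outputs.
        # Small random init (illustrative choice, not from the paper).
        rng = np.random.default_rng(0)
        self.W = rng.normal(scale=0.1, size=(n_nodes, n_nodes))
        self.lr = lr

    def forward(self, x):
        # Signals flow in one direction only (feedforward).
        return self.W @ x

    def update(self, x):
        # Unsupervised Hebb rule: strengthen each weight in proportion
        # to the product of post-synaptic output and pre-synaptic input.
        y = self.forward(x)
        self.W += self.lr * np.outer(y, x)

net = SingleLayerHebbian(n_nodes=4)
before = net.W.copy()
net.update(np.ones(4))
```

Because there is no hidden layer and no backward pass, each update is a single outer product, which is what makes the topology fast to train relative to multi-layer networks.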
“…Therefore, it provides an algorithm for modifying the connection weights of a neural network. The Hebb rule offers a simplified, physiologically grounded model of the activity-dependent characteristics of synaptic plasticity and has been widely used in the artificial neural network field [5]-[9].…”
Section: Introduction
confidence: 99%
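The plain Hebb rule mentioned in the quote above can be written as a one-line weight update, Δw = η · y · x, where the weight change is proportional to the product of pre-synaptic input x and post-synaptic output y. The symbols and the learning rate value here are generic illustrations, not taken from any of the cited works.

```python
import numpy as np

def hebb_update(w, x, y, eta=0.1):
    """Hebb rule: return w + eta * y * x (element-wise over the inputs)."""
    return w + eta * y * np.asarray(x)

w = np.zeros(3)
x = np.array([1.0, 0.5, -1.0])
y = 1.0  # post-synaptic activation
w = hebb_update(w, x, y)
# w is now [0.1, 0.05, -0.1]
```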
“…The biological origin of Hebb's supervised learning was established from a neuroscience perspective: when two neurons are activated simultaneously, the change in link strength (also called plasticity) is proportional to the product of their activations [41,42]. This concept can therefore be translated mathematically into the adjustment of the PID parameters (k p , k i and k d ), which can be obtained through a neural solution of Equation (8), where η i are learning rates corresponding to w i (k) [43].…”
Section: PEMFC Control With ANN-PID 2.2.1 Control Design
confidence: 99%
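The Hebbian adjustment of PID gains described above can be sketched in code. The cited work's Equation (8) is not reproduced here; instead this follows the common supervised-Hebb form for ANN-PID controllers, w_i(k+1) = w_i(k) + η_i · e(k) · u(k) · x_i(k), with normalized weights used to form the control increment. The function name, error terms, and all numeric values are illustrative assumptions.

```python
import numpy as np

def hebbian_pid_step(w, e, e_prev, e_sum, u, etas):
    """One Hebbian-style adaptation step for PID weights (sketch only;
    assumes the common supervised-Hebb ANN-PID update form)."""
    # Error features for the three PID terms:
    # proportional, integral, derivative.
    x = np.array([e, e_sum, e - e_prev])
    # Hebb-like update: each gain has its own learning rate eta_i,
    # and the change is proportional to error, control signal, and x_i.
    w = w + etas * e * u * x
    # Normalization is a common stabilizer in ANN-PID schemes.
    w_norm = w / np.sum(np.abs(w))
    u_next = float(w_norm @ x)  # new control increment
    return w, u_next

w = np.array([0.3, 0.2, 0.1])        # initial (kp, ki, kd) weights
etas = np.array([0.05, 0.02, 0.01])  # per-gain learning rates
w, u = hebbian_pid_step(w, e=0.5, e_prev=0.6, e_sum=1.1, u=0.4, etas=etas)
```

The key point of the citation survives in the update line: each gain's weight is nudged by a term proportional to a product of activations, which is exactly the Hebbian principle carried over to controller tuning.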
“…Deep learning is one of the most popular machine learning approaches in computer vision [27][28][29]. Its main motivation is to build and simulate neural networks modeled on the human brain for analytical learning.…”
Section: Deep Learning
confidence: 99%