2017
DOI: 10.1109/jstsp.2017.2671789

Multitask Diffusion Adaptation Over Networks With Common Latent Representations

Abstract: Online learning with streaming data in a distributed and collaborative manner can be useful in a wide range of applications. This topic has been receiving considerable attention in recent years, with emphasis on both single-task and multitask scenarios. In single-task adaptation, agents cooperate to track an objective of common interest, while in multitask adaptation agents track multiple objectives simultaneously. Regularization is one useful technique to promote and exploit similarity among tasks in the latter…
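The abstract contrasts single-task diffusion adaptation, where all agents cooperatively track one common objective, with the multitask setting. As a point of reference, the standard single-task adapt-then-combine (ATC) diffusion LMS recursion can be sketched as below; the ring network, step size, and data model are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 4          # number of agents
M = 3          # filter length
w_star = rng.standard_normal(M)   # common parameter vector all agents track

# Doubly stochastic combination matrix for an assumed ring of 4 agents.
A = np.array([[0.50, 0.25, 0.00, 0.25],
              [0.25, 0.50, 0.25, 0.00],
              [0.00, 0.25, 0.50, 0.25],
              [0.25, 0.00, 0.25, 0.50]])

mu = 0.05                  # step size
W = np.zeros((N, M))       # per-agent estimates

for n in range(2000):
    # Adapt: each agent runs one LMS step on its own streaming datum.
    psi = np.empty_like(W)
    for k in range(N):
        u = rng.standard_normal(M)                     # regressor
        d = u @ w_star + 0.01 * rng.standard_normal()  # noisy measurement
        e = d - u @ W[k]
        psi[k] = W[k] + mu * e * u
    # Combine: each agent convexly averages its neighbors' intermediates.
    W = A @ psi

mse = np.mean((W - w_star) ** 2)
print(f"steady-state mean-square deviation: {mse:.2e}")
```

In the multitask variants discussed by the citing papers, each agent would track its own `w_star[k]`, and the combine step (or an added regularization term) is what couples the related tasks.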

Cited by 56 publications (25 citation statements)
References 50 publications
“…Behavior model (21) implies that the mean weight error behavior of DLMS depends on the type of periodic input variance σ²_{x,k}(n) and its period T, rather than on the random perturbation of the optimal weight vector or the type of white non-Gaussian input distribution u_k(n) [12].…”
Section: B. Mean Weight Error Behavior Analysis
confidence: 99%
“…The associate editor coordinating the review of this manuscript and approving it for publication was Xiao-Sheng Si. …extended to multitask scenarios [19]–[21]. Nevertheless, most of the existing analytical models of DLMS are limited to the stationary Gaussian assumption on the input distributions [19], [22]–[24].…”
Section: Introduction
confidence: 99%
“…With the development of automation and wireless communication technology, data acquisition, processing, transmission, control, and storage have become not only convenient and fast, but also safe and reliable in complicated networks [1,2]. Combining distributed computing, computer science, automatic control theory, wireless sensing, and microelectronics manufacturing, an intelligent network is a kind of large-scale distributed network system that integrates data awareness, intelligent learning, dynamic optimization, and wireless data communication [3,4].…”
Section: Introduction
confidence: 99%
“…For the latter, a subgradient and a proximal algorithm are introduced in [13] and [14], respectively. In [15] and [16], the authors derive solutions for other classes of multitask problems where the relations between the nodes are defined by common latent representations or local linear equality constraints, respectively. In [17], the authors solve a multitask problem by estimating the combination matrix.…”
Section: Introduction
confidence: 99%