2012
DOI: 10.1007/978-3-642-32820-6_39

Peer-to-Peer Multi-class Boosting

Abstract: We focus on the problem of data mining over large-scale, fully distributed databases, where each node stores only one data record. We assume that a data record is never allowed to leave the node where it is stored. Possible motivations for this assumption include privacy concerns or the lack of a centralized infrastructure. To tackle this problem, we earlier proposed the generic gossip learning framework (GoLF), but so far we have studied only basic linear algorithms. In this paper we implement the well-known boost…
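To make the setting concrete: in gossip learning, models travel between nodes while the data stays put. A minimal sketch, assuming a logistic-regression model updated by one SGD step on the node's single record — the function name, learning rate, and update rule are illustrative choices, not code from the paper:

    import math

    def local_update(w, x, y, lr=0.1):
        # One logistic-regression SGD step on this node's only record (x, y),
        # with y in {0, 1}. Only the model vector w ever travels between
        # nodes; the record (x, y) never leaves the node it is stored at.
        z = sum(wi * xi for wi, xi in zip(w, x))
        p = 1.0 / (1.0 + math.exp(-z))  # predicted probability of class 1
        return [wi - lr * (p - y) * xi for wi, xi in zip(w, x)]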



Cited by 3 publications (10 citation statements)
References 21 publications
“…This is very hard to do even without any extra measures, given that models perform random walks based on local decisions, and that merge operations are performed as well. This short informal reasoning motivates our ongoing work towards understanding and enhancing the privacy-preserving properties of gossip learning, while we extend the basic idea to tackle different learning scenarios (like concept drift [53,55]) and different models (like boosting models [54] and multi-armed bandit models [114]). The presented results are mainly based on our previous paper [94].…”
Section: Discussion
confidence: 99%
“…[Flattened table: publications mapped to thesis Chapters 3–7 — ICDM 2008 [89], TSD 2010 [91], EUROPAR 2010 [92], WETICE 2010 [90], EUROPAR 2011 [93], CCPE 2012 [94], SASO 2012 [55], SISY 2012 [53], EUROPAR 2012 [54], ICML 2013 [114]; the per-chapter assignments are not recoverable from the extraction] from the instantiations.…”
Section: Introduction
confidence: 99%
“…Since the nodes in the network store the received models locally (as the CURRENTMODEL, or by collecting the latest models in a bounded queue), they can use them to predict the labels of new instances without additional communication cost. [Pseudocode of the gossip learning skeleton spilled into this excerpt: loop: wait(Δ); p ← selectPeer(); send currentModel to p; end loop — procedure onReceiveModel(m): currentModel ← m; end procedure.] Moreover, incoming models can be combined as well, both locally (e.g., merging the received models, or implementing a local voting mechanism on the models in the queue) [88,89] and globally (e.g., finding the best model in the network) [52].…”
Section: Gossip Learning
confidence: 99%
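The skeleton pseudocode reconstructed in the excerpt above has two parts: an active loop that periodically sends the current model to a random peer, and a handler that stores a received model. A minimal executable sketch in Python, keeping the names from the fragments (selectPeer, onReceiveModel, Δ) and treating everything else — the peer registry, the queue bound, the model interface — as assumptions:

    import random
    import time

    DELTA = 1.0  # gossip period, the wait(Δ) of the pseudocode; value assumed

    class GossipLearner:
        # Sketch of the gossip learning skeleton quoted above.

        def __init__(self, peers, initial_model, queue_size=10):
            self.peers = peers              # assumed peer registry: list of nodes
            self.current_model = initial_model
            self.queue_size = queue_size    # bound on the local model queue
            self.model_queue = []

        def select_peer(self):
            # Uniform random choice; real systems use a peer sampling service.
            return random.choice([p for p in self.peers if p is not self])

        def active_loop(self):
            # loop: wait(Δ); p <- selectPeer(); send currentModel to p; end loop
            while True:
                time.sleep(DELTA)
                self.select_peer().on_receive_model(self.current_model)

        def on_receive_model(self, m):
            # procedure onReceiveModel(m): currentModel <- m; end procedure
            self.current_model = m
            self.model_queue = (self.model_queue + [m])[-self.queue_size:]

        def predict(self, x):
            # Local voting over the queued models, as described in the quote;
            # each model is assumed to expose a predict(x) -> label method.
            votes = {}
            for m in self.model_queue or [self.current_model]:
                label = m.predict(x)
                votes[label] = votes.get(label, 0) + 1
            return max(votes, key=votes.get)

In this simplest variant a received model just replaces the current one; the merging and best-model combinations mentioned in the excerpt would replace on_receive_model, while predict already illustrates the local voting mechanism.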
“…In this chapter we present the implementation in GoLF of a well-known technique called boosting, which was published in our paper [52]. Boosting techniques have attracted growing attention in machine learning due to their outstanding performance in many practical applications.…”
Section: Chapter
confidence: 99%
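The excerpt names boosting but does not reproduce the algorithm itself. Purely as a generic illustration of the kind of update a multi-class boosting model maintains, here is a SAMME-style reweighting round — standard multi-class AdaBoost, not necessarily the algorithm of [52], which targets the online, fully distributed setting:

    import math

    def samme_round(weights, predictions, labels, n_classes):
        # One SAMME-style multi-class AdaBoost reweighting round.
        # Generic batch illustration only.
        total = sum(weights)
        err = sum(w for w, p, y in zip(weights, predictions, labels)
                  if p != y) / total
        err = min(max(err, 1e-10), 1.0 - 1e-10)  # keep alpha finite

        # The log(K-1) term relaxes "better than 50%" to "better than
        # random guessing among K classes".
        alpha = math.log((1.0 - err) / err) + math.log(n_classes - 1)

        # Up-weight misclassified examples, then renormalize.
        new_w = [w * math.exp(alpha) if p != y else w
                 for w, p, y in zip(weights, predictions, labels)]
        norm = sum(new_w)
        return [w / norm for w in new_w], alpha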