2013 IEEE 13th International Conference on Data Mining
DOI: 10.1109/icdm.2013.90
Bayesian Discovery of Multiple Bayesian Networks via Transfer Learning

Abstract: Bayesian network structure learning algorithms with limited data are being used in domains such as systems biology and neuroscience to gain insight into the underlying processes that produce observed data. Learning reliable networks from limited data is difficult; transfer learning can therefore improve the robustness of learned networks by leveraging data from related tasks. Existing transfer learning algorithms for Bayesian network structure learning give a single maximum a posteriori estimate of ne…

Cited by 16 publications (13 citation statements); references 17 publications.
“…4 encodes an exchangeability assumption that any DAG G(k) is equally likely a priori to be similar to any other DAG G(l) (k ≠ l). Such an exchangeability assumption is implicit in much of the recent literature on multiple graphical models (Werhli and Husmeier, 2008; Oyen and Lane, 2013; Danaher et al., 2014). However, exchangeability will be inappropriate when the collection of subjects is heterogeneous, for instance containing groups or subgroups that correspond to differential neural connectivities.…”
Section: Results
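The equation the excerpt refers to (numbered 4 in the citing paper) is not reproduced here. As an illustration only, a generic exchangeable pairwise prior over a collection of DAGs can be written in the following form, where λ > 0 and the structural distance d are assumptions of this sketch, not the cited paper's exact formulation:

```latex
p\bigl(G^{(1)},\dots,G^{(K)}\bigr) \;\propto\; \prod_{k<l} \exp\bigl\{-\lambda\, d\bigl(G^{(k)},G^{(l)}\bigr)\bigr\}
```

Because the product runs symmetrically over all pairs with the same λ and d, every pair of DAGs is a priori equally likely to be similar, which is precisely the exchangeability property the excerpt describes as inappropriate for heterogeneous subject collections.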
“…Until very recently, estimation of more general DAGs required either the strong assumption that an ordering of the variables 1, …, P applied equally to all subjects (Oyen and Lane, 2013), or the use of expensive computational approximations such as Markov chain Monte Carlo that scale extremely poorly as either the number P of variables or the number K of subjects grows (Werhli and Husmeier, 2008). An exact algorithm that facilitates the joint estimation of multiple DAGs was recently developed in the sister paper Oates et al. (2014), viewing the estimation problem within a hierarchical Bayesian framework (somewhat similar to a random effects model for the graph structure) and applying advanced techniques from integer linear programming to obtain a maximum a posteriori estimate of all DAGs simultaneously.…”
Section: Introduction
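The joint-estimation objective sketched in the excerpt (per-network data fit plus a penalty coupling the graphs) can be illustrated with a minimal score function. This is a hypothetical simplification: the cited papers use richer hierarchical priors and exact ILP solvers, whereas here the coupling is just a pairwise structural Hamming distance with a single hypothetical strength parameter `lam`:

```python
import numpy as np

def structural_hamming(a, b):
    """Number of edge disagreements between two 0/1 adjacency matrices."""
    return int(np.sum(a != b))

def joint_penalized_score(fit_scores, adjacencies, lam=1.0):
    """Joint score for K networks: the sum of per-network data-fit scores
    minus lam times the total pairwise structural disagreement.
    A toy stand-in for the hierarchical MAP objective in the excerpt."""
    k = len(adjacencies)
    penalty = sum(structural_hamming(adjacencies[i], adjacencies[j])
                  for i in range(k) for j in range(i + 1, k))
    return sum(fit_scores) - lam * penalty

# Two tiny 2-node networks: one has the edge 0 -> 1, the other is empty.
g1 = np.array([[0, 1], [0, 0]])
g2 = np.array([[0, 0], [0, 0]])
score = joint_penalized_score([2.0, 2.0], [g1, g2], lam=0.5)
```

Maximizing such a score over candidate structures trades off how well each network fits its own data against how much the networks disagree, which is the essence of transfer here: with small `lam` the networks are learned independently, while large `lam` forces them toward a shared structure.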
“…Transfer learning, with the support of CI, is usually turned towards domains such as neural networks, fuzzy systems, and evolutionary computation. It is applied in real-world applications to: natural language processing (Huang et al. 2013; Swietojanski et al. 2012; Behbood et al. 2013b), computer vision (Cireşan et al. 2012; Kandaswamy et al. 2014; Shell and Coupland 2012), biology (Celiberto et al. 2011; Niculescu-Mizil and Caruana 2007; Oyen and Lane 2013), finance (Behbood et al. 2011, 2013a, 2014), and business management (Ma et al. 2012; Luis et al. 2010; Shell 2013).…”
Section: Related Work
“…We may find that learning to recognize apples might help us to recognize pears, or learning to play the electronic organ may facilitate learning the piano. The study of transfer learning has been inspired by the fact that human beings can utilize previously-acquired knowledge to solve new but similar problems much more quickly and effectively. The fundamental motivation for transfer learning in the field of machine learning focuses on the need for lifelong machine learning methods that retain and reuse previously learned knowledge.…”