2014 IEEE International Symposium on Information Theory
DOI: 10.1109/isit.2014.6874893
Social learning and distributed hypothesis testing

Abstract: This paper considers a problem of distributed hypothesis testing and social learning. Individual nodes in a network receive noisy local (private) observations whose distribution is parameterized by a discrete parameter (hypothesis). The marginals of the joint observation distribution conditioned on each hypothesis are known locally at the nodes, but the true parameter/hypothesis is not known. An update rule is analyzed in which nodes first perform a Bayesian update of their belief (distribution estima…

Cited by 52 publications (7 citation statements); References 15 publications
“…This assumption is technical assumption one and relaxes the assumption of bounded support for the likelihood ratio random variable in prior work [1], [3]-[6]. Next, we provide families of distributions which satisfy Assumption 2 even though they might have unbounded support.…”
Section: Assumption 2 (For Every Pair)
Confidence: 97%
“…Even though each individual cannot identify the parameter through local observations alone, the parameter may be collectively identifiable. In this paper we study the learning rule proposed in [1], which is based on local Bayesian updating followed by consensus averaging on a reweighting of the log beliefs of nodes. Under the assumptions of network-wide observability and connectivity, in [1], we characterized the rate of convergence.…”
Section: Introduction
Confidence: 99%
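The learning rule this statement describes — a local Bayesian update at each node, followed by consensus averaging of the log beliefs over the network — can be sketched as below. This is a minimal illustrative sketch, not the paper's implementation: the function name, the uniform weight matrix, and the array layout are assumptions introduced here.

```python
import numpy as np

def social_learning_step(beliefs, likelihoods, W):
    """One round of learn-then-average, as a sketch.

    beliefs:     (n, m) array; beliefs[i] is node i's current distribution
                 over the m hypotheses.
    likelihoods: (n, m) array; likelihoods[i, k] is the likelihood of node
                 i's private observation under hypothesis k (the locally
                 known conditional marginal).
    W:           (n, n) row-stochastic weight matrix respecting the
                 network topology (assumed here, e.g. uniform weights).
    """
    # Step 1: local Bayesian update of each node's belief.
    posterior = beliefs * likelihoods
    posterior /= posterior.sum(axis=1, keepdims=True)
    # Step 2: consensus averaging in the log domain (i.e. a weighted
    # geometric average of neighbors' posteriors), then renormalize.
    log_avg = W @ np.log(posterior)
    new_beliefs = np.exp(log_avg)
    new_beliefs /= new_beliefs.sum(axis=1, keepdims=True)
    return new_beliefs
```

Averaging in the log domain rather than on the beliefs themselves is what makes the aggregate evidence add up across nodes, so the network can identify a hypothesis that no single node could distinguish from its local observations alone.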