2009 Ninth IEEE International Conference on Data Mining
DOI: 10.1109/icdm.2009.56

Stacked Gaussian Process Learning

Abstract: Triggered by a market relevant application that involves making joint predictions of pedestrian and public transit flows in urban areas, we address the question of how to utilize hidden common cause relations among variables of interest in order to improve performance in the two related regression tasks. Specifically, we propose stacked Gaussian process learning, a meta-learning scheme in which a base Gaussian process is enhanced by adding the posterior covariance functions of other related tasks to its covari…
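The abstract only sketches the construction at a high level. As a rough illustration of the idea, and not the paper's exact formulation, the snippet below augments a base covariance function with the (scaled) posterior covariance function of a Gaussian process fitted on a related task. The RBF base kernel, the weight `rho`, and the noise level are assumptions made for this sketch.

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
    """Squared-exponential base covariance (assumed stand-in for the base GP's kernel)."""
    d2 = (np.sum(X1 ** 2, axis=1)[:, None]
          + np.sum(X2 ** 2, axis=1)[None, :]
          - 2.0 * X1 @ X2.T)
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

def posterior_cov(Xa, Xb, X_related, noise=1e-2, **kern):
    """Posterior covariance of a GP conditioned on the related task's training inputs."""
    K = rbf_kernel(X_related, X_related, **kern) + noise * np.eye(len(X_related))
    K_inv = np.linalg.inv(K)
    Ka = rbf_kernel(Xa, X_related, **kern)
    Kb = rbf_kernel(Xb, X_related, **kern)
    return rbf_kernel(Xa, Xb, **kern) - Ka @ K_inv @ Kb.T

def stacked_cov(Xa, Xb, X_related, rho=0.5, **kern):
    """Base covariance plus the scaled posterior covariance of a related task's GP."""
    return rbf_kernel(Xa, Xb, **kern) + rho * posterior_cov(Xa, Xb, X_related, **kern)
```

In this sketch the target-task GP would simply use `stacked_cov` (with the related task's training inputs fixed) as its covariance function; since a posterior covariance is positive semidefinite, the sum remains a valid kernel.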

Cited by 34 publications (29 citation statements); references 15 publications. Citing publications range from 2010 to 2021.

Citation statements (ordered by relevance):
“…The work of [15] used GP to represent the traffic flow over a network of only highways and defined the correlation of speeds between highway segments to depend only on the geodesic (i.e., shortest path) distance of these segments with respect to the network topology; their features are not considered. The work of [43] maintained a mixture of two independent GPs for flow prediction such that the correlation structure of one GP utilized road segment features while that of the other GP depended on manually specified relations (instead of geodesic distance) between segments with respect to an undirected network topology. Different from the above works, we propose a relational GP whose correlation structure exploits the geodesic distance between segments based on the topology of a directed road network with vertices denoting road segments and edges indicating adjacent segments weighted by dissimilarity of their features, hence tightly integrating the features and relational information.…”
Section: Related Work, A. Models for Predicting Spatiotemporally
confidence: 99%
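The citing authors contrast their relational GP with the stacked-GP approach. The sketch below is a minimal, hypothetical illustration of the kind of covariance they describe (correlation decaying with geodesic distance on a directed road network whose edge weights are feature dissimilarities), not their actual model; the Euclidean dissimilarity and the exponential decay form are assumptions.

```python
import networkx as nx
import numpy as np

def build_road_graph(segment_features, adjacency):
    """Directed road network: vertices are road segments, edges connect adjacent
    segments and are weighted by the dissimilarity of their feature vectors."""
    G = nx.DiGraph()
    for u, v in adjacency:
        w = float(np.linalg.norm(np.asarray(segment_features[u])
                                 - np.asarray(segment_features[v])))
        G.add_edge(u, v, weight=w)
    return G

def geodesic_cov(G, segments, lengthscale=1.0, variance=1.0):
    """Covariance whose correlation decays with geodesic (shortest-path) distance."""
    dist = dict(nx.all_pairs_dijkstra_path_length(G, weight="weight"))
    n = len(segments)
    K = np.zeros((n, n))
    for i, u in enumerate(segments):
        for j, v in enumerate(segments):
            d = dist.get(u, {}).get(v, np.inf)  # unreachable pairs get zero correlation
            K[i, j] = variance * np.exp(-d / lengthscale)
    return K
```

Note that a kernel defined directly on shortest-path distances is not guaranteed to be positive semidefinite in general, so such a construction would need additional care in practice.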
“…The first two equalities expand the second component by the same trick as that in (43). The third and fourth equalities exploit (10) and (12), respectively.…”
Section: Appendix B, Proof of Theorem 2b
confidence: 99%
“…"Short-term" can be subjectively defined based on the temporal scale of the sensor readings. [GPs have been] demonstrated to be an effective tool for modeling and predicting various traffic phenomena such as mobility demand [8], traffic congestion [25], short-term traffic volume [44], travel time [18], and pedestrian and public transit flows in urban areas [28]. Indeed, comparative studies on short-term traffic volume prediction showed that GPs outperform other methods such as autoregressive integrated moving average, support vector machine, and multilayer feedforward neural network for the task [44], [48].…”
Section: Introduction
confidence: 99%