2023
DOI: 10.1007/s10115-023-01847-0
Harnessing heterogeneity in space with statistically guided meta-learning

Cited by 2 publications (3 citation statements) | References 30 publications
“…Embedding spatial data in the training phase provides a meaningful solution, but the unknown spatial extent of patterns poses a major challenge. Li et al. [254] proposed a spatial moderator that generalizes learned spatial patterns and spatialized network structures from the original region to a new region. The method also introduced model-agnostic meta-learning to address the issue that trained deep learning models are otherwise limited to individual partitions or locations.…”
Section: Meta Learning (mentioning)
Confidence: 99%
“…Sample-reweighting and self-training approaches (Bickel, Brückner, and Scheffer 2007; An et al. 2022a; He et al. 2023) also aim to reduce the distribution gap between training and testing sets, either by assigning higher weights to training samples that are feature-wise more similar to test samples, or by including high-confidence pseudo-labels on test samples during training. In addition, heterogeneity-aware learning tackles variability through data partitioning and network branching (Xie et al. 2021, 2023). While these methods address distribution shifts, they do not consider changes of groups (locations) between training and test for fairness applications.…”
Section: Introduction (mentioning)
Confidence: 99%
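The sample-reweighting idea quoted above can be sketched in a few lines. The following is a minimal illustration (not any cited paper's implementation): a logistic discriminator is fit to separate training from test covariates, and its odds p/(1-p) estimate the density ratio p_test(x)/p_train(x), which serves as per-sample training weights. The Gaussian toy data and all hyperparameters are assumptions for the example.

```python
import numpy as np

# Toy covariate shift: test covariates are shifted relative to training ones.
rng = np.random.default_rng(0)
x_train = rng.normal(0.0, 1.0, size=(200, 1))  # training covariates
x_test = rng.normal(1.0, 1.0, size=(200, 1))   # shifted test covariates

# Discriminator d(x) = sigmoid(w*x + b): probability that x came from the
# test set. Fit by plain gradient descent on the logistic loss.
X = np.vstack([x_train, x_test])
t = np.concatenate([np.zeros(200), np.ones(200)])  # 0 = train, 1 = test
w, b = 0.0, 0.0
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(X[:, 0] * w + b)))
    g = p - t                        # logistic-loss gradient factor
    w -= 0.05 * np.mean(g * X[:, 0])
    b -= 0.05 * np.mean(g)

# Density-ratio weights for training samples: p/(1-p) is proportional to
# p_test(x)/p_train(x), so samples resembling the test set get more weight.
p_tr = 1.0 / (1.0 + np.exp(-(x_train[:, 0] * w + b)))
weights = p_tr / (1.0 - p_tr)
```

A downstream model would then minimize a weighted loss, e.g. `np.mean(weights * per_sample_losses)`, so that training emphasizes the region of feature space the test set occupies.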
“…While these methods address distribution shifts, they also do not consider the changes of groups (locations) between training and test for fairness applications. Meta-learning: the gradient-by-gradient training of model-agnostic meta-learning (MAML) allows it to learn an initial model that can be quickly fine-tuned to the test data with only a small number of observations (Finn, Abbeel, and Levine 2017; Ren et al. 2018; Xie et al. 2023; Chen et al. 2023). Recent developments have also started exploring the use of MAML in fairness-aware learning (Zhao et al. 2020, 2022a).…”
Section: Introduction (mentioning)
Confidence: 99%
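The MAML training loop described above can be sketched as follows. This is a first-order approximation (the dependence of the adapted weights on the initialization is treated as identity, as in FOMAML) on toy one-parameter linear-regression tasks y = a·x; the task distribution, learning rates, and iteration counts are all assumptions for illustration, not the cited papers' setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def loss_grad(w, x, y):
    """Gradient of the mean squared error for the model y_hat = w * x."""
    return np.mean(2.0 * (w * x - y) * x)

def maml_train(w0, n_iters=500, inner_lr=0.05, outer_lr=0.01):
    """Learn an initialization w0 that adapts to a new task in one step."""
    for _ in range(n_iters):
        a = rng.uniform(-2.0, 2.0)            # sample a task (its slope)
        x = rng.uniform(-1.0, 1.0, size=10)
        y = a * x
        # Inner step: adapt the initialization to this task's data.
        w_adapted = w0 - inner_lr * loss_grad(w0, x, y)
        # Outer step: descend the post-adaptation loss (first-order MAML:
        # gradient taken at w_adapted, ignoring second-order terms).
        w0 = w0 - outer_lr * loss_grad(w_adapted, x, y)
    return w0

w0 = maml_train(0.0)
```

At test time, a single inner gradient step on a handful of observations from a new task fine-tunes `w0` toward that task's slope, which is exactly the few-shot adaptation property the quoted passage attributes to MAML.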