2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
DOI: 10.1109/cvpr42600.2020.01340
DPGN: Distribution Propagation Graph Network for Few-Shot Learning

Abstract: This letter proposes a few-shot physics-guided spatial temporal graph convolutional network (FPG-STGCN) to rapidly solve unit commitment (UC). Firstly, the STGCN is tailored to parameterize UC. Then, a few-shot physics-guided learning scheme is proposed. It exploits a few typical UC solutions yielded by a commercial optimizer to escape from local minima, and leverages the augmented Lagrangian method for constraint satisfaction. To further enable both feasibility and continuous relaxation of integers in the learning process, …

Cited by 212 publications (129 citation statements)
References 12 publications
“…Transductive few-shot methods [1], [5], [12], [26], [31], [50], [59] assume that the model can access the whole query set at once. A transductive episodic-wise adaptive metric (TEAM) [31] defined the optimization process as a standard semi-definite programming problem to train a generalizable classifier.…”
Section: Related Work
confidence: 99%
“…A transductive episodic-wise adaptive metric (TEAM) [31] defined the optimization process as a standard semi-definite programming problem to train a generalizable classifier. A distribution propagation graph network (DPGN) [50] proposed utilizing both the distribution-level and instance-level relations by designing a dual complete graph network consisting of a point graph and a distribution graph. [5] proposed transductive fine-tuning, which pursues outputs with a peaked posterior or low Shannon entropy, and a hardness metric to deliver a standardized evaluation protocol.…”
Section: Related Work
confidence: 99%
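The transductive fine-tuning idea cited above, preferring query predictions with a peaked posterior (low Shannon entropy), can be sketched with a toy entropy measure. The function names and toy logits below are illustrative assumptions, not code from the cited paper:

```python
import numpy as np

def softmax(logits):
    """Row-wise softmax with the usual max-shift for numerical stability."""
    z = logits - logits.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def shannon_entropy(probs, eps=1e-12):
    """Mean Shannon entropy (nats) of a batch of class-probability rows."""
    return float(-np.mean(np.sum(probs * np.log(probs + eps), axis=1)))

# A peaked (confident) posterior has lower entropy than a flat one;
# transductive fine-tuning would push query posteriors toward the former.
confident = softmax(np.array([[8.0, 0.0, 0.0]]))
uniform = softmax(np.array([[1.0, 1.0, 1.0]]))
```

Minimizing this entropy over the unlabeled query set is what makes the fine-tuning transductive: the queries themselves shape the adapted classifier.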
“…A learnable scaling parameter is added to the metric function, which greatly improves the classification performance. GNN [38], EGNN [39], and DPGN [40] introduced graph networks to transfer sample distance measurement from Euclidean space to non-Euclidean space to achieve target classification.…”
Section: Prototype
confidence: 99%
“…Metric-learning-based methods [11,5,12,13] find better similarity metrics for image embeddings. Graph-based methods [14,9] construct a complete graph that connects all samples of a few-shot task to perform information propagation, and the queries are classified based on their relations to the shots. Besides, there are also image-retrieval-based works [15] and reinforcement-learning-based ones [16].…”
Section: Related Work
confidence: 99%
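The graph-based scheme described above, a complete graph over all samples of a few-shot task with information propagated from shots to queries, can be sketched as a toy label-propagation routine. The Gaussian similarity, `alpha` damping, and all names below are illustrative assumptions, not the cited methods' actual formulations:

```python
import numpy as np

def propagate_labels(features, support_labels, n_classes, alpha=0.5, n_iter=20):
    """Toy label propagation over a fully connected similarity graph.

    features: (n, d) embeddings, support rows first, then query rows.
    support_labels: integer labels for the support rows.
    Returns predicted class indices for the query rows.
    """
    n = features.shape[0]
    n_s = len(support_labels)
    # Gaussian similarity between every pair of samples (complete graph).
    d2 = np.sum((features[:, None] - features[None, :]) ** 2, axis=-1)
    w = np.exp(-d2)
    np.fill_diagonal(w, 0.0)
    w /= w.sum(axis=1, keepdims=True)          # row-normalise the edge weights
    y = np.zeros((n, n_classes))
    y[np.arange(n_s), support_labels] = 1.0    # one-hot labels for the shots
    f = y.copy()
    for _ in range(n_iter):
        # Mix propagated neighbour information with the fixed support labels.
        f = alpha * (w @ f) + (1 - alpha) * y
    return f[n_s:].argmax(axis=1)
```

For instance, two support points at `[0, 0]` (class 0) and `[5, 5]` (class 1) pull nearby queries toward their respective classes through the edge weights.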
“…The performance is reported as the mean accuracy and the 95% confidence interval. Usually, the training set is also sampled into many training few-shot tasks to optimize the model [5,8,9]. Recently, subspace-learning-based methods for few-shot learning [6,7] have drawn increasing attention.…”
Section: Introduction
confidence: 99%
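The evaluation protocol mentioned above, mean accuracy with a 95% confidence interval over many sampled test episodes, can be sketched as follows using the common normal-approximation half-width; the helper name is illustrative:

```python
import math

def mean_and_ci95(accuracies):
    """Mean episode accuracy and 95% confidence half-width.

    accuracies: per-episode accuracy values from the sampled few-shot tasks.
    Uses the normal approximation 1.96 * s / sqrt(n) with the sample
    standard deviation s, as commonly reported in few-shot benchmarks.
    """
    n = len(accuracies)
    mean = sum(accuracies) / n
    var = sum((a - mean) ** 2 for a in accuracies) / (n - 1)
    half_width = 1.96 * math.sqrt(var / n)
    return mean, half_width
```

A result would then be reported as, e.g., "mean ± half_width %" over the sampled episodes.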