2023 IEEE/CVF Winter Conference on Applications of Computer Vision (WACV)
DOI: 10.1109/wacv56688.2023.00250

HyperShot: Few-Shot Learning by Kernel HyperNetworks

Cited by 14 publications (5 citation statements)
References 13 publications
“…• Non-Gaussian Gaussian process (NGGP) [42]. Expands on basic Gaussian process techniques for few-shot learning by modeling the posterior distribution with an ODE-based normalizing flow. • MetaNet.…”
Section: MolBERT + Attentive Neural Process (ANP)
confidence: 99%
“…where f⁻¹_β(y) = z. CNFs are designed to model complex probability distributions for low-dimensional data, which has been confirmed in various applications including point cloud generation [27], future prediction [29], and probabilistic few-shot regression [23]. Compared to models like RealNVP [5] or Glow [11], they can be successfully applied to one-dimensional data and achieve better results on tabular datasets.…”
Section: Decision Tree Ensembles
confidence: 99%
“…The low-dimensional representation w of the sparse embedding o is further passed to the conditional CNF module as a conditioning factor. We postulate to use the variant of the conditional flow-based model provided in [27,23], where w is delivered to the function of dynamics, g β (z(t), t, w). The transformation function is given by eq.…”
Section: TreeFlow
confidence: 99%
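The conditional flow described in the quote above can be sketched minimally: the conditioning factor w is passed as an extra argument to the dynamics function g_β(z(t), t, w), and the flow transformation is obtained by integrating that ODE over time. The specific dynamics function below (a simple affine field modulated by w) and the fixed-step Euler integrator are hypothetical stand-ins for illustration; the cited models use learned neural dynamics and adaptive ODE solvers.

```python
import numpy as np

def g_beta(z, t, w):
    # Hypothetical conditional dynamics: an affine vector field whose
    # decay rate is modulated by the conditioning vector w.
    return -z * (1.0 + w.sum()) + 0.1 * t

def integrate_cnf(z0, w, t0=0.0, t1=1.0, steps=100):
    """Fixed-step Euler integration of dz/dt = g_beta(z, t, w)."""
    z = z0.copy()
    dt = (t1 - t0) / steps
    t = t0
    for _ in range(steps):
        z = z + dt * g_beta(z, t, w)  # Euler update
        t += dt
    return z

# Transform a base sample z0 under conditioning w:
z1 = integrate_cnf(np.array([1.0]), np.array([0.5]))
```

Because the conditioning enters only through the dynamics, the same integrator serves any w, which is what makes this formulation convenient for conditional density modeling.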
“…Kernel hypernetworks combine the hypernetwork paradigm with kernel methods to realize a new strategy that mimics the human way of learning. Marcin Sendera et al. [67] first proposed this method in 2023. It relies directly on kernel-based representations of the support examples and a hypernetwork to create the classification module for the query set.…”
Section: Kernel Hypernetwork
confidence: 99%
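The description in the quote above can be illustrated with a small sketch: support embeddings are reduced to a kernel matrix, a hypernetwork maps that kernel-based representation to classifier weights, and queries are classified via their kernel values against the support set. Every concrete choice here (the RBF kernel, the random linear hypernetwork, all function names) is a hypothetical stand-in for illustration, not HyperShot's actual architecture.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Pairwise RBF kernel values between two sets of embeddings.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def hypernet(K, n_classes, rng):
    # Hypothetical hypernetwork: a linear map from the flattened
    # support-support kernel matrix to per-class classifier weights.
    W = rng.standard_normal((n_classes * K.shape[0], K.size)) * 0.01
    return (W @ K.ravel()).reshape(n_classes, K.shape[0])

def classify_query(support, query, n_classes, seed=0):
    rng = np.random.default_rng(seed)
    K_ss = rbf_kernel(support, support)       # kernel representation of support
    weights = hypernet(K_ss, n_classes, rng)  # classifier generated by hypernetwork
    K_qs = rbf_kernel(query, support)         # queries represented via kernels to support
    logits = K_qs @ weights.T
    return logits.argmax(axis=1)
```

The key design point the quote highlights is that the classifier never sees raw embeddings: both the generated weights and the query features live in the kernel space induced by the support set.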