2020 International Joint Conference on Neural Networks (IJCNN) 2020
DOI: 10.1109/ijcnn48605.2020.9207213
Fuzzy Graph Neural Network for Few-Shot Learning

Cited by 21 publications (18 citation statements)
References 19 publications
“…Code generation models have been applied to a variety of tasks, including test generation [19], docstring generation [20], code search [17,21], type inference [22,23,24], and more [25]. We focus on the natural-language-to-code task (NL2Code): given the description of a function in natural language, complete the function body.…”
Section: The Natural Language to Code Task
confidence: 99%
“…Other tasks. Although we focus specifically on benchmarks for the code generation task, there are many other tasks that have been used to evaluate code generation models, including generating unit tests from code [19], code search [17,21], and type inference [22,23,24]. Lu et al [20] propose a suite of evaluation datasets for ten tasks, including code translation, docstring generation, and code summarization.…”
Section: Related Work
confidence: 99%
“…A GNN learns the current node's representation by repeatedly combining the features of neighboring nodes via the message-passing algorithm, thereby generating similar representations for strongly linked nodes. Owing to these advantages, some approaches [13], [24], [25] have incorporated GNNs into the few-shot learning domain and demonstrated promising results. The authors of [13] used a GNN as a label-propagation module to predict the labels of unlabeled nodes.…”
Section: Graph Neural Network
confidence: 99%
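The neighborhood aggregation described in the statement above can be sketched as a single message-passing step. This is a minimal illustrative assumption, not the cited paper's exact model: each node's new feature mixes its own feature with the mean of its neighbors' features, so strongly linked nodes drift toward similar representations.

```python
# One simplified GNN message-passing step (illustrative sketch):
# h_v' = (1 - alpha) * h_v + alpha * mean(h_u for u in N(v))
from typing import Dict, List

def message_passing_step(
    features: Dict[int, List[float]],
    neighbors: Dict[int, List[int]],
    alpha: float = 0.5,
) -> Dict[int, List[float]]:
    """Aggregate each node's neighbor features and mix them into its own."""
    new_features = {}
    for v, h_v in features.items():
        nbrs = neighbors.get(v, [])
        if nbrs:
            # mean of neighbor features, dimension by dimension
            mean = [
                sum(features[u][d] for u in nbrs) / len(nbrs)
                for d in range(len(h_v))
            ]
        else:
            mean = h_v  # isolated node: nothing to aggregate
        new_features[v] = [
            (1 - alpha) * x + alpha * m for x, m in zip(h_v, mean)
        ]
    return new_features

# Two mutually linked nodes converge to the same representation.
feats = {0: [1.0], 1: [0.0]}
adj = {0: [1], 1: [0]}
for _ in range(5):
    feats = message_passing_step(feats, adj)
```

After repeated steps the two connected nodes share the representation `[0.5]`, which is the "similar representations for strongly linked nodes" effect the quoted statement refers to.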
“…Wei et al. (2020) developed LambdaNet for inferring types for TypeScript. Given a program, LambdaNet first transforms it into a type dependency graph, where nodes are type variables for subexpressions in the program and hyperedges express constraints (such as the subtyping relation or type equality).…”
Section: Related Work
confidence: 99%
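The type dependency graph described above can be sketched in a few lines. This is a hedged illustration, not LambdaNet's actual data structure: type variables stand for subexpressions, and an equality hyperedge is propagated here with a simple union-find, so all variables constrained to be equal end up in one class.

```python
# Illustrative sketch of a type dependency graph: nodes are type
# variables for subexpressions; equality constraints (one kind of
# hyperedge) are merged with union-find. Names are assumptions.
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class TypeDependencyGraph:
    parent: Dict[str, str] = field(default_factory=dict)

    def find(self, v: str) -> str:
        """Return the representative of v's equality class (path-halving)."""
        self.parent.setdefault(v, v)
        while self.parent[v] != v:
            self.parent[v] = self.parent[self.parent[v]]
            v = self.parent[v]
        return v

    def add_equality(self, a: str, b: str) -> None:
        """Constraint edge: type variables a and b denote the same type."""
        ra, rb = self.find(a), self.find(b)
        if ra != rb:
            self.parent[ra] = rb

    def same_type(self, a: str, b: str) -> bool:
        return self.find(a) == self.find(b)

# Example: the assignments `x = y; y = z` constrain all three
# subexpressions to share a type.
g = TypeDependencyGraph()
g.add_equality("x", "y")
g.add_equality("y", "z")
```

In the real system such constraints feed a graph neural network that predicts types for the variables; this sketch only shows how equality constraints tie type variables together.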