2023
DOI: 10.1109/tcad.2022.3217421
Pretraining Graph Neural Networks for Few-Shot Analog Circuit Modeling and Design

Cited by 15 publications (14 citation statements)
References 19 publications
“…Examples include the integration of logic synthesis [22] [23] [24], reinforcement learning for field-programmable gate array performance [22], and transistor placement [16]. Notably, these examples do not necessarily consider circuit-to-graph reversibility, because the output consists of images of the component layout [16] or numerical values for simulation [13]. A method similar to the one-hot embedding vector concatenates the real values of the circuit constants onto a one-hot vector [25] [26] [27].…”
Section: A. Previous Work
confidence: 99%
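The embedding described in the statement above can be made concrete with a short sketch. The following Python snippet is a minimal illustration, not code from the cited works: the component-type set, the log scaling, and the function name embed_component are all assumptions made for the example.

import numpy as np

# Hypothetical set of component types; the cited works define their own.
COMPONENT_TYPES = ["resistor", "capacitor", "inductor", "nmos", "pmos"]

def embed_component(comp_type: str, value: float) -> np.ndarray:
    """One-hot component-type vector with the real-valued circuit constant appended."""
    one_hot = np.zeros(len(COMPONENT_TYPES))
    one_hot[COMPONENT_TYPES.index(comp_type)] = 1.0
    # Append the circuit constant; log-scaling (an assumption here) keeps
    # values of very different magnitudes comparable.
    return np.concatenate([one_hot, [np.log10(value)]])

# Example: a 1 kOhm resistor -> [1. 0. 0. 0. 0. 3.]
print(embed_component("resistor", 1e3))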
“…Unlike the continuous values found in images and natural language, graphs take discrete values, which leaves a degree of freedom in how a circuit is transformed into a graph. Conventionally, circuit components and wiring are both defined as nodes, with edges connecting these nodes [13] [14]. Because the roles of circuit components and wiring differ, a constraint requires component and wiring nodes to alternate.…”
Section: Introduction
confidence: 99%
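A small sketch of this conventional representation follows. It is an illustration only, assuming the networkx library and a made-up two-resistor divider; node names such as R1 and vin are hypothetical. Components and wires (nets) both become nodes, and every edge connects a component node to a wiring node, so the two kinds alternate along any path.

import networkx as nx

G = nx.Graph()
# Component nodes of a hypothetical resistor divider.
G.add_node("R1", kind="component")
G.add_node("R2", kind="component")
# Wiring (net) nodes.
for net in ["vin", "vmid", "gnd"]:
    G.add_node(net, kind="wire")
# Edges respect the alternating constraint: component <-> wire only.
G.add_edges_from([("R1", "vin"), ("R1", "vmid"), ("R2", "vmid"), ("R2", "gnd")])

# Every edge joins nodes of different kinds, i.e., the graph is bipartite.
assert all(G.nodes[u]["kind"] != G.nodes[v]["kind"] for u, v in G.edges)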
“…Feature concatenation is a numerical representation of circuit inputs and outputs that is tuned by minimizing the loss function. Attempts have been made to include different circuit topologies and obtain predictions, as in [50], where two circuit types were included in the study: ladder circuits and two-stage operational amplifier circuits, with 20k training instances of resistor ladders having 2 to 10 branches, each branch count weighted equally. The model is based on the DeepGEN architecture and was able to make predictions on ladder circuits with a higher number of branches.…”
Section: Branch
confidence: 99%
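The dataset setup described above (branch counts drawn uniformly from 2 to 10, evaluation on larger ladders) can be sketched as follows. This is not the cited pipeline; the function name sample_ladder, the resistance ranges, and the test branch counts are assumptions for illustration.

import random

def sample_ladder(n_branches: int) -> dict:
    """One ladder instance: per-branch series and shunt resistances in ohms."""
    return {
        "n_branches": n_branches,
        "series": [random.uniform(1e2, 1e4) for _ in range(n_branches)],
        "shunt": [random.uniform(1e2, 1e4) for _ in range(n_branches)],
    }

# 20k training instances, branch counts weighted equally over 2..10.
train = [sample_ladder(random.randint(2, 10)) for _ in range(20_000)]
# Generalization is then tested on ladders larger than any seen in training.
test = [sample_ladder(n) for n in range(11, 16) for _ in range(100)]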
“…Recently, various studies have focused on learning movements comprising multiple short-term movements [1]. A common method involves using hierarchical architectures [13], [14]. These typically consist of low-level policies that learn primitive behaviors and high-level policies that plan sequences of low-level policies.…”
Section: Related Work, A. Imitation Learning
confidence: 99%
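The hierarchical layout described in this last statement can also be sketched briefly. The snippet below is a toy illustration, not any cited system: the class names, the linear low-level policies, and the round-robin high-level schedule are all assumptions.

import numpy as np

class LowLevelPolicy:
    """A primitive behavior; here simply a fixed linear state-to-action map."""
    def __init__(self, weights: np.ndarray):
        self.weights = weights

    def act(self, state: np.ndarray) -> np.ndarray:
        return self.weights @ state

class HighLevelPolicy:
    """Plans a sequence over primitives; here a fixed round-robin schedule."""
    def __init__(self, primitives: list):
        self.primitives = primitives

    def act(self, state: np.ndarray, t: int) -> np.ndarray:
        return self.primitives[t % len(self.primitives)].act(state)

policy = HighLevelPolicy([LowLevelPolicy(np.eye(2)), LowLevelPolicy(-np.eye(2))])
print(policy.act(np.array([1.0, 0.5]), t=0))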