2021 58th ACM/IEEE Design Automation Conference (DAC)
DOI: 10.1109/dac18074.2021.9586172
Local Bayesian Optimization For Analog Circuit Sizing

Cited by 16 publications (5 citation statements) · References 12 publications
“…The common ground between GA MOO and TS efficient optimization is that both algorithms begin by randomly initializing some set of data points (Touloupas and Sotiriadis, 2021). However, the GA produces offspring from parents through cross-over and mutation, while in the TS algorithm new data points are generated using sample functions from the Gaussian processes.…”
Section: Discussion
confidence: 99%
“…There is a large body of work on BO for single-objective optimization [13,15]. Standard BO methods have been applied to a variety of problems including solving simple analog circuit design optimization and synthesis problems [27,33,31,49,26,34,42,32,41].…”
Section: Related Work
confidence: 99%
“…However, there is no way to yield exact, analytical expressions for the samples of GP models [13]. Typically, TS samples the values of the functions at a predefined, finite-length vector of input points [10], and determines the query point by finding the best value among them. However, in the case of high-dimensional variable spaces, this quantization approach is not practical, since exponentially many points need to be considered to achieve good coverage.…”
Section: Algorithm 1: BO Algorithm
confidence: 99%
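The grid-based (quantized) Thompson sampling described in that statement can be sketched as follows. This is a minimal illustration, not the paper's implementation: the toy objective, the 200-point grid, and the RBF kernel lengthscale are all assumptions chosen for the example.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=0.2):
    # Squared-exponential kernel between two sets of 1-D inputs.
    d = A[:, None] - B[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

rng = np.random.default_rng(0)

# Toy objective observed at a few points (stands in for circuit simulations).
f = lambda x: np.sin(3 * x) + 0.5 * x
X_obs = np.array([0.1, 0.4, 0.9])
y_obs = f(X_obs)

# Predefined, finite-length vector of candidate inputs (the quantization step).
X_grid = np.linspace(0.0, 1.0, 200)

# GP posterior mean and covariance on the grid (noise-free, with jitter).
K = rbf_kernel(X_obs, X_obs) + 1e-8 * np.eye(len(X_obs))
Ks = rbf_kernel(X_grid, X_obs)
Kss = rbf_kernel(X_grid, X_grid)
K_inv = np.linalg.inv(K)
mu = Ks @ K_inv @ y_obs
cov = Kss - Ks @ K_inv @ Ks.T

# Thompson sampling: draw one joint posterior sample over the grid and
# query the grid point where the sampled function is largest.
sample = rng.multivariate_normal(mu, cov + 1e-8 * np.eye(len(X_grid)))
x_query = X_grid[np.argmax(sample)]
```

The statement's scalability objection is visible here: covering a D-dimensional design space with a grid of this density needs 200^D candidates, which is why the cited work replaces the grid with analytic sample approximations.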
“…Therefore, in this work, we use analytic approximations of the posterior samples using Random Fourier Features (RFF) [26], instead of relying on quantization. RFFs are a set of basis cosine functions φ(x) = [φ_i(x)]_{i=1}^{M} that approximate the GP's kernel function as k(x, x′) ≈ φ(x)^T φ(x′) and can also be used to approximate a sample from the GP as a linear model [26]: f_sample(x) ≈ φ(x)^T θ, (10) where θ is an M-dimensional vector drawn from a Gaussian distribution. In this work, at each iteration, N_s ≥ 1 approximate analytic samples are created through RFF, for each of the objective and the constraint functions.…”
Section: Algorithm 1: BO Algorithm
confidence: 99%
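A minimal sketch of the RFF construction from that statement, under stated assumptions: the kernel is RBF, M and the lengthscale are illustrative, and θ is drawn from the standard normal, which approximates a *prior* GP sample. The paper's posterior samples would additionally condition θ on the observed data via Bayesian linear regression, which is omitted here.

```python
import numpy as np

rng = np.random.default_rng(0)

d, M = 2, 500            # input dimension, number of random features (assumed)
lengthscale = 1.0

# Random Fourier Features for the RBF kernel k(x, x') = exp(-||x - x'||^2 / (2 l^2)):
# phi(x) = sqrt(2/M) * cos(W x + b), with rows of W ~ N(0, 1/l^2) and b ~ U[0, 2*pi).
W = rng.normal(0.0, 1.0 / lengthscale, size=(M, d))
b = rng.uniform(0.0, 2 * np.pi, size=M)

def phi(x):
    # The M-dimensional cosine feature map [phi_i(x)]_{i=1..M}.
    return np.sqrt(2.0 / M) * np.cos(W @ x + b)

# Kernel approximation: k(x, x') ≈ phi(x)^T phi(x').
x, xp = rng.normal(size=d), rng.normal(size=d)
k_true = np.exp(-np.sum((x - xp) ** 2) / (2 * lengthscale ** 2))
k_rff = phi(x) @ phi(xp)

# Analytic GP sample as a linear model, eq. (10): f_sample(x) ≈ phi(x)^T theta,
# with theta an M-dimensional Gaussian vector (here N(0, I), i.e. a prior sample).
theta = rng.normal(size=M)
f_sample = lambda x: phi(x) @ theta
```

Because f_sample is an explicit analytic function rather than a draw over a fixed grid, its maximizer can be found with a continuous optimizer, which is what makes this approach practical in high-dimensional sizing spaces.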