2014 IEEE Symposium on Computational Intelligence for Communication Systems and Networks (CIComms)
DOI: 10.1109/cicomms.2014.7014633
ANN based optimization of resonating frequency of split ring resonator

Cited by 4 publications (3 citation statements) · References 11 publications
“…With time, it will be integrated into all levels of technology readiness, becoming commonplace as an assistive tool in the computational design of metamaterials.

No. | Authors | Year | ML method | Design task
…  | Weng, Ding, Hu, et al [266] | 2020 | Deep Learning | Classification
10 | Melo Filho, Angeli, Ophem, et al [267] | 2020 | ANN | Inverse Design
11 | Chen, Lu, Karniadakis, et al [268] | 2020 | Deep Learning | Inverse Design
12 | Kollmann, Abueidda, Koric, et al [269] | 2020 | Deep Learning | Optimization Framework
13 | Qu, Zhu, Shen, et al [270] | 2020 | ANN | Optimization Framework
14 | Lai, Amirkulova, and Gerstoft [271] | 2021 | CNN, GAN | Inverse Design
15 | Gurbuz, Kronowetter, Dietz, et al [272] | 2021 | GAN | Inverse Design
16 | Amirkulova, Tran, and Khatami [273] | 2021 | Deep Learning | Inverse Design
17 | Wu, Liu, Jahanshahi, et al [274] | 2021 | ANN | Inverse Design
18 | Shah, Zhuo, Lai, et al [275] | 2021 | RL | Optimization Framework
19 | Tran, Amirkulova, and Khatami [276] | 2022 | ANN | Inverse Design
20 | Wiest, Seepersad, and Haberman [277] | 2022 | GNN | Inverse Design
21 | Amirkulova, Zhou, Abbas, et al [278] | 2022 | Deep Learning | Inverse Design
22 | Tran, Khatami, and Amirkulova [279] | 2022 | CNN | Inverse Design
23 | Li, Chen, Li, et al [280] | 2023 | CNN | Inverse Design
24 | Li, Chen, Li, et al [281] | 2023 | Deep Learning | Inverse Design
25 | Wang, Chen, Xu, et al [282] | 2023 | ANN | Inverse Design
Application field: Electromagnetics
26 | Jiang, Xiao, Liu, et al [283] | 2010 | ANN and scaled conjugate gradient | Surrogate model
27 | Freitas, Rêgo, and Vasconcelos [140] | 2011 | ANN | Surrogate model
28 | Vasconcelos, Rêgo, and Cruz [139] | 2012 | ANN | Surrogate model
29 | Sarmah, Sarma, and Baruah [172] | 2015 | ANN | Optimization framework
30 | Saha and Maity [138] | 2016 | ANN | Surrogate model
31 | Nanda, Sahu, and Mishra [108] | 2019 | ANN | Inverse design
32 | An, Fowler, Shalaginov, et al [144] | 2019 | ANN | Surrogate model
33 | Yuze, Hai, and Qinglin [157] | 2019 | CNN | Classification and clustering
34 | Liu, Zhang, and Cui [156] | 2019 | CNN | Optimization framework
35 | Hodge, Mishra, and Zaghloul [284] | 2019 | DC-GAN | Inverse design
36 | Hodge, Mishra, and Zaghloul…”
Section: The Causal Relationship Problem
Citation type: mentioning
confidence: 99%
“…In their findings, they note that although this gradient-based method might still get caught in a local minimum, it is much faster than conventional numerical gradient-based approaches, such as finite-difference methods (FDM). Sarmah et al. [172] proposed the use of an ANN-based optimization framework, trained using several gradient descent algorithms. Their findings suggest that ANN is highly suitable for the optimization of design parameters and, in their case, for defining the performance of […] [174] discusses the optimization of ANN architecture and model parameters in more detail.…”
Citation type: mentioning
confidence: 99%
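The surrogate-plus-gradient-descent idea described in the statement above can be sketched in a few lines. The sketch below is only an illustration under assumed names and synthetic data, not code from Sarmah et al. [172]: it trains a small ANN to map hypothetical SRR geometry parameters to resonant frequency, then runs gradient descent through the trained network to search for a geometry matching a target frequency, with automatic differentiation supplying the gradients that a finite-difference approach would otherwise have to estimate.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Hypothetical SRR design parameters: [ring radius (mm), gap width (mm), trace width (mm)].
# A smooth synthetic function stands in for full-wave simulation or measured data.
def stand_in_simulator(x):
    r, g, w = x[:, 0], x[:, 1], x[:, 2]
    return 10.0 / (r + 0.5 * w) + 2.0 * g  # arbitrary stand-in response, "GHz"

X = torch.rand(500, 3) * torch.tensor([4.0, 0.5, 0.5]) + torch.tensor([1.0, 0.1, 0.1])
y = stand_in_simulator(X).unsqueeze(1)

# Small feed-forward surrogate trained with a gradient-descent optimizer.
net = nn.Sequential(nn.Linear(3, 16), nn.Tanh(),
                    nn.Linear(16, 16), nn.Tanh(),
                    nn.Linear(16, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-2)
for _ in range(2000):
    opt.zero_grad()
    loss = nn.functional.mse_loss(net(X), y)
    loss.backward()
    opt.step()

# Inverse step: freeze the network and adjust the geometry itself to hit a target
# frequency. Autograd gives analytic gradients, so no finite-difference probing.
for p in net.parameters():
    p.requires_grad_(False)

target = torch.tensor([[3.5]])                        # desired resonance, "GHz"
geom = torch.tensor([[2.0, 0.3, 0.3]], requires_grad=True)
geom_opt = torch.optim.SGD([geom], lr=1e-2)
for _ in range(500):
    geom_opt.zero_grad()
    err = (net(geom) - target).pow(2).mean()
    err.backward()
    geom_opt.step()

print("candidate geometry:", geom.detach().numpy())
print("surrogate-predicted frequency:", net(geom).item())
```

As the quoted passage notes, such a gradient search can still settle in a local minimum; in practice one would restart it from several initial geometries and keep the best candidate.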
“…ANNs are machine-learning tools that are very effective when experimentally generated data are available. ANNs are used where there is no mathematical formula to calculate the output for a given input and no linear relationship between the input and output parameters, so ANNs are the best tool for the prediction and optimization [3][4][5] of output parameters. In such a scenario, an ANN reduces the complex calculations [6-8] and the computational cost.…”
Section: Introduction
Citation type: mentioning
confidence: 99%
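The role this introduction assigns to ANNs, learning an input-output mapping for which no closed-form expression is available, amounts to fitting a regression model on sampled data. Below is a minimal sketch, with hypothetical parameter names and synthetic samples rather than the paper's own dataset, that fits a small scikit-learn MLP to (geometry, resonant frequency) pairs and then predicts the frequency of an unseen design, replacing a costly simulation run with a cheap forward pass.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical dataset: columns = [outer radius (mm), gap width (mm), substrate eps_r];
# target = resonant frequency (GHz). A smooth stand-in replaces measured/simulated data.
X = rng.uniform(low=[1.0, 0.1, 2.2], high=[5.0, 0.6, 10.2], size=(400, 3))
y = 12.0 / (X[:, 0] * np.sqrt(X[:, 2])) + 3.0 * X[:, 1]

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

# Scale the inputs, then fit a small two-hidden-layer ANN regressor.
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(20, 20), activation="tanh",
                 solver="adam", max_iter=5000, random_state=0),
)
model.fit(X_tr, y_tr)

print("R^2 on held-out designs:", model.score(X_te, y_te))
print("predicted frequency for r=3.0 mm, g=0.3 mm, eps_r=4.4:",
      model.predict([[3.0, 0.3, 4.4]])[0], "GHz")
```

Once trained on a modest number of samples, the network evaluates a new design in microseconds, which is what makes it attractive inside an optimization loop.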