2022
DOI: 10.1080/03772063.2022.2058631

RETRACTED ARTICLE: Internet of Thing based Koch Fractal Curve Fractal Antennas for Wireless Applications

Cited by 13 publications (9 citation statements)
References 10 publications
“…Although the above models have improved the accuracy of summary generation, the recurrent neural network and its variants are all time-step-based sequence structures, which seriously hinders parallel training of the model [16][17][18]; the inference process is therefore limited by memory, which reduces the encoding and decoding speed of the summary generation model and increases training overhead [19][20][21][22][23]. On the other hand, the above works optimize the model to maximize the ROUGE score or maximum likelihood without considering the coherence or fluency of the summary sentences [24][25][26], and they rely on ground-truth annotated summary text in advance. With supervised training, the data cost involved in model training is high.…”
Section: Related Work
confidence: 99%
“…As this technique supports the “Wiener Filter”, which is able to observe noisy images, an objective function is required. The objective function as per FCM is [28, 29]: [equation not recovered from source] where
G_pq = fuzzy factor, defined in terms of:
x_p, x_r = pixels
d_pr = spatial Euclidean distance between x_p and x_r
N_p = set of neighbors
σ_qp = fuzzy membership of the p-th pixel with respect to cluster q
σ_qr = neighbor of σ_qp
N = total number of pixels in the image
C = cluster centre
V = fuzziness of the significant division …”
Section: Methods
confidence: 99%
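The objective function quoted above extends standard fuzzy c-means (FCM) with a spatial fuzzy factor G_pq. As an illustration only, the sketch below implements plain FCM clustering (the baseline the quoted paper builds on); the neighborhood term G_pq and the Wiener filtering step are deliberately omitted, and the function name `fcm` and its parameters are my own, not from the cited work.

```python
import numpy as np

def fcm(X, C, m=2.0, iters=100, tol=1e-5, seed=0):
    """Plain fuzzy c-means on features X of shape (N,) or (N, d).

    Returns (U, V): U is the (C, N) fuzzy membership matrix (columns sum
    to 1), V is the (C, d) matrix of cluster centres. The spatial fuzzy
    factor G_pq from the quoted objective function is NOT implemented.
    """
    rng = np.random.default_rng(seed)
    X = np.atleast_2d(np.asarray(X, dtype=float).T).T  # ensure (N, d)
    N = X.shape[0]
    U = rng.random((C, N))
    U /= U.sum(axis=0)                     # memberships sum to 1 per point
    for _ in range(iters):
        Um = U ** m
        # Update centres as membership-weighted means.
        V = (Um @ X) / Um.sum(axis=1, keepdims=True)
        # Squared Euclidean distance of every point to every centre.
        d2 = ((X[None, :, :] - V[:, None, :]) ** 2).sum(-1)
        d2 = np.fmax(d2, 1e-12)            # guard against division by zero
        # Standard FCM membership update: u_qp ∝ d_qp^{-2/(m-1)}.
        inv = d2 ** (-1.0 / (m - 1))
        U_new = inv / inv.sum(axis=0)
        if np.abs(U_new - U).max() < tol:
            U = U_new
            break
        U = U_new
    return U, V
```

On two well-separated intensity clusters (e.g. pixel values near 0 and near 10), the centres converge to the cluster means and each pixel's membership concentrates on its own cluster.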
“…As this technique supports the "Wiener Filter", which is able to observe noisy images, an objective function is required. The objective function as per FCM is [28,29]…”
Section: Proposed IFRFCM for Image Segmentation
confidence: 99%
“…First, the author provided an arrival-service model for FoT data traffic based on multilayer waiting lines with finite-size intervals. The proposed policy was compared to first-in-first-out and multi-priority-discipline queue strategies through a complete study of wait times and gaps in wait times [23][24][25]. Machine learning and clustering based methods have been used for text analysis [26], the internet of things [27][28][29][30][31][32][33], and disease detection [34].…”
Section: Related Work
confidence: 99%
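The comparison of first-in-first-out against multi-priority queue disciplines in the statement above can be illustrated with a toy single-server simulation. This is a generic two-class M/M/1 sketch under assumptions of my own (Poisson arrivals, exponential service, non-preemptive priority, two classes with class 0 high), not the cited paper's arrival-service model; the function `simulate` and its parameters are hypothetical.

```python
import heapq
import random

def simulate(discipline, n=20000, lam=0.8, mu=1.0, seed=1):
    """Single-server queue with two job classes (0 = high priority).

    discipline='fifo' serves in arrival order; discipline='priority'
    serves the lowest class first, non-preemptively, FIFO within a class.
    Returns the mean waiting time per class.
    """
    rng = random.Random(seed)
    t, jobs = 0.0, []
    for _ in range(n):
        t += rng.expovariate(lam)          # Poisson arrivals, rate lam
        jobs.append((t, rng.randint(0, 1)))  # (arrival time, class)
    free_at, heap, j = 0.0, [], 0
    waits = {0: [], 1: []}
    while j < n or heap:
        # Admit every job that arrived before the server becomes free
        # (or the next job outright if the queue is empty).
        while j < n and (not heap or jobs[j][0] <= free_at):
            arr, cls = jobs[j]
            key = cls if discipline == "priority" else arr
            heapq.heappush(heap, (key, arr, cls))
            j += 1
        key, arr, cls = heapq.heappop(heap)
        start = max(free_at, arr)
        waits[cls].append(start - arr)
        free_at = start + rng.expovariate(mu)  # exponential service, rate mu
    return {c: sum(w) / len(w) for c, w in waits.items()}
```

Under the priority discipline, high-class jobs wait less than low-class jobs and less than they would under FIFO, which is the qualitative gap in wait times the quoted study examines.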