Simulated annealing for optimization of graphs and sequences
2021 | DOI: 10.1016/j.neucom.2021.09.003


Cited by 20 publications (17 citation statements, all of type "mentioning") | References 55 publications

“…Our work on unsupervised summarization follows the recent progress of search-based text generation (Liu et al., 2021a; Kumar et al., 2020). Schumann et al. (2020) formulate summarization as word-level extraction (with order preserved), and apply edit-based discrete local search to maximize a heuristically designed objective.…”
Section: Search-based Summarization (mentioning)
confidence: 99%
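To make the cited approach concrete, here is a minimal Python sketch of edit-based discrete local search for word-level extraction in the spirit of Schumann et al. (2020). The scoring callable `score`, the fixed-length swap move, and the greedy acceptance rule are illustrative assumptions, not the paper's exact objective or search schedule.

import random

def local_search_summary(words, score, budget, n_iters=2000, seed=0):
    # Edit-based local search over a word-selection mask (order preserved).
    # `score` is a stand-in objective (e.g., fluency + similarity); any
    # callable mapping a word list to a float works for this sketch.
    assert 0 < budget < len(words)
    rng = random.Random(seed)
    n = len(words)
    mask = [False] * n
    for i in rng.sample(range(n), budget):   # random initial selection
        mask[i] = True

    def extract(m):
        return [w for w, keep in zip(words, m) if keep]

    best = score(extract(mask))
    for _ in range(n_iters):
        # Propose an edit: swap one kept word out and one dropped word in,
        # so the summary length stays fixed at `budget`.
        i_in = rng.choice([i for i in range(n) if not mask[i]])
        i_out = rng.choice([i for i in range(n) if mask[i]])
        mask[i_in], mask[i_out] = True, False
        cand = score(extract(mask))
        if cand >= best:
            best = cand                            # keep the improving edit
        else:
            mask[i_in], mask[i_out] = False, True  # revert the edit
    return extract(mask)

# Toy usage with a placeholder objective that favors longer words:
words = "the quick brown fox jumps over the lazy dog".split()
print(local_search_summary(words, lambda ws: sum(len(w) for w in ws), budget=3))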
“…In this work, we propose a heuristic, non-convex optimization algorithm, namely simulated annealing (SA), for the structure optimization of partially connected neural networks after pruning [47]. The choice of simulated annealing was motivated by the success of the algorithm in various problems involving network/graph structures with a large number of configurations and complicated cost surfaces with many local minima [48][49][50].…”
Section: Network Optimization Using Simulated Annealing (mentioning)
confidence: 99%
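As an illustration of this use of SA, below is a minimal Python sketch of simulated annealing over a fixed-sparsity connectivity mask. The flat 0/1 mask representation, the `loss` callable, and the geometric cooling schedule are assumptions made for the sketch; the cited work's actual network representation and cost function may differ.

import math
import random

def anneal_mask(mask, loss, t0=1.0, alpha=0.95, n_temps=100, steps_per_t=50, seed=0):
    # Simulated annealing over a pruning mask (list of 0/1 connection flags).
    # A move swaps one kept and one pruned connection, preserving sparsity;
    # worse moves are accepted with Metropolis probability exp(-delta / T).
    rng = random.Random(seed)
    cur = list(mask)
    cur_loss = loss(cur)
    best, best_loss = list(cur), cur_loss
    t = t0
    for _ in range(n_temps):                  # cooling loop (temperature updates)
        for _ in range(steps_per_t):
            i = rng.choice([k for k, v in enumerate(cur) if v == 1])
            j = rng.choice([k for k, v in enumerate(cur) if v == 0])
            cand = list(cur)
            cand[i], cand[j] = 0, 1           # swap keeps the pruning ratio fixed
            delta = loss(cand) - cur_loss
            if delta <= 0 or rng.random() < math.exp(-delta / t):
                cur, cur_loss = cand, cur_loss + delta
                if cur_loss < best_loss:
                    best, best_loss = list(cur), cur_loss
        t *= alpha                            # geometric cooling schedule
    return best, best_loss

# Toy usage: prefer masks that keep connections with large (made-up) weights.
weights = [0.9, 0.1, 0.5, 0.7, 0.2, 0.8]
init = [1, 0, 1, 0, 0, 1]                     # keep 3 of 6 connections
print(anneal_mask(init, lambda m: -sum(w for w, v in zip(weights, m) if v)))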
“…After updating the temperature for approximately 100 to 150 iterations, the network performance is observed to converge. The time consumption of Algorithm 1 for different MLLs (1, 10, 20, 30, 40, 50, 60, 70, 80, 90) is measured for each of three different fractions of gradual pruning percentage p (2%, 5%, 10%) in Figure 10. Even though the extreme cases are included in the time consumption evaluation, MLLs of 20 to 50 seem to be enough, as can be observed in Figure 7.…”
Section: Time Complexity of the SA-based Pruning Process (mentioning)
confidence: 99%
“…To improve the dissimilarity between the outputs and input sentences, Lin and Wan [2021] leverage multi-round paraphrase generation and back translation. Also, Liu et al. [2021] propose multi-round modification in the hope of producing lexically different sequences (e.g., paraphrases). In addition, Li et al. [2019] generate paraphrases of a sentence at different levels of granularity in a disentangled way.…”
Section: Related Work (mentioning)
confidence: 99%
“…The abstract rule was first introduced to study the language acquisition of infants [Marcus et al, 1999]. In this paper, we use an abstract rule to represent a type of paraphrase transformation.…”
Section: Abstract Rules in Paraphrase Generation (mentioning)
confidence: 99%