2006
DOI: 10.1007/s00373-006-0649-0

Toughness in Graphs – A Survey

Abstract: In this survey we have attempted to bring together most of the results and papers that deal with toughness related to cycle structure. We begin with a brief introduction and a section on terminology and notation, and then try to organize the work into a few self-explanatory categories. These categories are circumference, the disproof of the 2-tough conjecture, factors, special graph classes, computational complexity, and miscellaneous results as they relate to toughness. We complete the survey with some tough …

Cited by 107 publications (92 citation statements)
References 125 publications
Citing publications span 2008–2022

Citation statements, ordered by relevance:
“…Since, for example, instead of requiring that the number of edges is 10 on average we now require that the square of the number of edges is 100 on average, we do not expect this change to cause differences that are too big. Indeed, we can show that, because the new MaxEnt ensemble is concentrated around the mean edge number, the differences are actually rather small (see the last paragraph on p. 3 of our published paper [17]), and the new model still respects (10) to a good approximation. Fig 33 shows the corrected MaxEnt distribution by edge number (black) for the case of the nefarious network.…”
Section: Can We Eliminate Degeneracy? (mentioning)
confidence: 90%
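
To make the comparison in the statement above concrete, here is a minimal numerical sketch, not taken from the cited paper; the parameters and the single-variable treatment are illustrative assumptions. It models the MaxEnt ensemble as a distribution over the edge count m alone, with all graphs sharing an edge count treated as equiprobable, and compares the distribution obtained by constraining the mean of m to 10 with the one obtained by constraining the mean of m^2 to 100.

import math

# Editorial illustration: the ensemble is summarized by the edge count m of a
# graph on n nodes, with all graphs of equal edge count equiprobable, so
# P(m) is proportional to C(M, m) * exp(-theta * f(m)), where M = n*(n-1)/2.
n = 30
M = n * (n - 1) // 2

def distribution(theta, f):
    # P(m) for m = 0..M under statistic f and Lagrange multiplier theta.
    logw = [math.lgamma(M + 1) - math.lgamma(m + 1) - math.lgamma(M - m + 1)
            - theta * f(m) for m in range(M + 1)]
    top = max(logw)
    w = [math.exp(x - top) for x in logw]   # shift to avoid overflow
    z = sum(w)
    return [x / z for x in w]

def expectation(p, g):
    return sum(g(m) * p[m] for m in range(M + 1))

def solve_theta(f, target, lo=-50.0, hi=50.0):
    # <f(m)> decreases as theta grows, so plain bisection works.
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if expectation(distribution(mid, f), f) > target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

theta_lin = solve_theta(lambda m: m, 10.0)        # constrain <m>   = 10
theta_sq = solve_theta(lambda m: m * m, 100.0)    # constrain <m^2> = 100
p_lin = distribution(theta_lin, lambda m: m)
p_sq = distribution(theta_sq, lambda m: m * m)

tv = 0.5 * sum(abs(a - b) for a, b in zip(p_lin, p_sq))
print("<m> with linear constraint:    ", expectation(p_lin, lambda m: m))
print("<m> with quadratic constraint: ", expectation(p_sq, lambda m: m))
print("total variation between P(m)'s:", tv)

The multipliers are found by bisection, and the printed total-variation distance quantifies how far apart the two edge-count distributions end up.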
“…As we defined previously as part of this project [6], [7], the vulnerability backbone of a network is the smallest percolating/connected subgraph whose removal minimizes the size of the largest component in the remaining graph. One can show that finding the vulnerability backbone is an NP-hard problem, similar to computing the toughness measure of the graph [10]. Thus, we must work with heuristic algorithms.…”
Section: Network Vulnerability and Shattering (mentioning)
confidence: 99%
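
Because the statement above points out that finding the vulnerability backbone is NP-hard, a heuristic is the natural fallback. The sketch below is a simplified greedy variant, not the cited authors' algorithm: it drops the requirement that the removed subgraph itself be connected and simply deletes, one node at a time, whichever node most shrinks the largest remaining component; the budget k and the example graph are illustrative.

import networkx as nx

def largest_component_size(G):
    if G.number_of_nodes() == 0:
        return 0
    return max(len(c) for c in nx.connected_components(G))

def greedy_shatter(G, k):
    # Greedily pick up to k nodes; each pick is the node whose removal most
    # shrinks the largest connected component of what remains.
    H = G.copy()
    removed = []
    for _ in range(k):
        best_node, best_size = None, largest_component_size(H)
        for v in list(H.nodes()):
            size = largest_component_size(nx.restricted_view(H, [v], []))
            if size < best_size:
                best_node, best_size = v, size
        if best_node is None:   # no single removal helps any further
            break
        H.remove_node(best_node)
        removed.append(best_node)
    return removed

# Illustrative usage on a small standard graph.
G = nx.karate_club_graph()
cut = greedy_shatter(G, 5)
H = G.copy()
H.remove_nodes_from(cut)
print("removed:", cut, "-> largest remaining component:", largest_component_size(H))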
“…We obtain here necessary and sufficient conditions for an adversarial infection to become pandemic (i.e., infect all nodes) by relating the existence of such strategies to the notion of toughness of the graph. The notion of graph toughness was first introduced in order to study conditions for the existence of Hamiltonian cycles in graphs; see [4]. Given an undirected graph G = (V, E) and a subset of nodes S, let |S| be the size of S and c_G(S) be the number of connected components in the graph induced on V \ S, obtained from G by deleting all nodes in the set S. The toughness of the graph G, where G is not the complete graph, denoted by τ(G), is defined as follows…”
Section: Pandemic Infections (mentioning)
confidence: 99%
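
The quoted definition is cut off by the excerpt; the standard formula it presumably continues with is τ(G) = min |S| / c_G(S), the minimum taken over all S ⊆ V with c_G(S) ≥ 2. The brute-force sketch below evaluates exactly this minimum (exponential time, in line with the hardness noted in the previous statement) and assumes a connected, non-complete input graph; the function name and test graphs are illustrative.

from itertools import combinations
import networkx as nx

def toughness(G):
    # tau(G) = min over vertex sets S with c(G - S) >= 2 of |S| / c(G - S).
    # Brute force over all subsets, so only usable on very small graphs;
    # assumes G is connected and not complete (otherwise no such S exists).
    nodes = list(G.nodes())
    best = float("inf")
    for r in range(1, len(nodes)):
        for S in combinations(nodes, r):
            H = G.copy()
            H.remove_nodes_from(S)
            c = nx.number_connected_components(H)
            if c >= 2:
                best = min(best, len(S) / c)
    return best

print(toughness(nx.cycle_graph(6)))   # a cycle has toughness 1.0
print(toughness(nx.path_graph(6)))    # a path has toughness 0.5

On small examples this matches the known values: a cycle has toughness 1 and a path has toughness 1/2.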
“…The best available lower bound is due to Bauer, Broersma and Veldman [3], who constructed non-Hamiltonian graphs with toughness arbitrarily close to 9/4. Partial results related to Chvátal's conjecture have been obtained in various restricted classes of graphs (see the survey [2] for details). A number of these results concern chordal graphs.…”
Section: Introduction (mentioning)
confidence: 99%