2008
DOI: 10.1007/978-3-540-79456-1_28
An Improved Multi-set Algorithm for the Dense Subset Sum Problem

Cited by 13 publications (14 citation statements). References 12 publications.
“…The failure probability given by Theorem 4.1 differs at first sight qualitatively from those in [11,14] for the special case of Wagner's algorithm in that it does not decay to zero with the list length. However, our bound applies to the optimal algorithm for given n, m, q and c, while in [11,14] the list length m is required to be larger by a factor of α in order to achieve the bound on the failure probability (which decays exponentially with α). Obviously, since we achieve a constant failure probability for the given list length m, increasing the list length by a factor of α would allow us to run α independent trials of our algorithm, which also causes the failure probability to decrease exponentially with α.…”
Section: Bounding the Failure Probability
confidence: 85%
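The amplification argument quoted above can be sketched numerically: if a single run of the algorithm fails with constant probability p, then α independent trials all fail with probability p^α, which decays exponentially in α. This is a minimal illustration of that probability calculation, not the cited algorithm itself.

```python
# Sketch of the repeated-trials amplification argument: alpha independent
# trials, each failing with constant probability p, all fail together with
# probability p**alpha (exponential decay in alpha).
def overall_failure_probability(p: float, alpha: int) -> float:
    """Failure probability after alpha independent trials."""
    return p ** alpha

# Example: constant per-trial failure probability 0.5.
probs = [overall_failure_probability(0.5, a) for a in (1, 2, 4, 8)]
print(probs)  # [0.5, 0.25, 0.0625, 0.00390625]
```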
“…In this construction, the length of the lists has to be roughly the square of the length that Wagner's algorithm prescribes. In 2008, Shallue [14] modified Lyubashevsky's algorithm so that the merging step selects a larger subset of valid pairs. As a result, in order to achieve non-trivial failure probability the lists need to be of length O(m log m) where m is the length required by Wagner's algorithm.…”
Section: Related Work
confidence: 99%
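The list-length comparison in the passage above can be made concrete with a toy calculation. The numbers here are hypothetical, chosen only to illustrate the asymptotic gap: for a nominal Wagner list length m, Lyubashevsky's construction needs roughly m² elements while Shallue's modification needs O(m log m).

```python
import math

# Hypothetical illustration (not taken from the paper): compare the list
# lengths required by the three approaches for a nominal Wagner length m.
#   Wagner:        m
#   Lyubashevsky:  ~m**2
#   Shallue:       O(m log m), shown here with constant 1
def list_lengths(m: int) -> dict:
    return {
        "wagner": m,
        "lyubashevsky": m * m,
        "shallue": m * math.ceil(math.log2(m)),
    }

print(list_lengths(1024))
# {'wagner': 1024, 'lyubashevsky': 1048576, 'shallue': 10240}
```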
“…The hardness of breaking SS(n, M) depends on the ratio between n and log M, which is usually referred to as the density of the subset sum instance. When n/log M is less than 1/n or larger than n/log² n, the problem can be solved in polynomial time [LO85,Fri86,FP05,Lyu05,Sha08]. However, when the density is constant or even as small as O(1/log n), there are currently no algorithms that require less than 2^Ω(n) time.…”
Section: Introduction
confidence: 99%
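The density thresholds quoted above can be sketched as a small classifier. The helper below is hypothetical (not from the cited paper): it computes the density n/log₂(M) of a subset sum instance SS(n, M) and applies the stated thresholds, under which density below 1/n or above n/log² n admits polynomial-time algorithms, while constant or O(1/log n) density is the hard regime.

```python
import math

# Hypothetical classifier for a subset sum instance SS(n, M), using the
# density thresholds quoted above (density = n / log2(M)).
def density(n: int, M: int) -> float:
    return n / math.log2(M)

def regime(n: int, M: int) -> str:
    d = density(n, M)
    if d < 1 / n or d > n / math.log2(n) ** 2:
        return "polynomial-time solvable"
    return "no known sub-exponential algorithm"

# A very low-density instance: n = 100 items, modulus M = 2**(100**3).
print(regime(100, 2 ** (100 ** 3)))  # polynomial-time solvable
# A density-1 instance: n = 100, M = 2**100 -- the hard regime.
print(regime(100, 2 ** 100))  # no known sub-exponential algorithm
```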
“…The hardness of SS(n, q) depends on the so-called density, which is defined by the ratio δ := n/log q. In case δ < 1/n or δ > n/log² n, the problem can be solved in polynomial time [20,13,12,21,32]. In case δ is o(1) or even as small as O(1/log n), the problem is considered to be hard.…”
Section: Introduction
confidence: 99%