2006
DOI: 10.1007/11818175_7
Rankin’s Constant and Blockwise Lattice Reduction

Abstract: Lattice reduction is a hard problem of interest to both public-key cryptography and cryptanalysis. Despite its importance, extremely few algorithms are known. The best algorithm known in high dimension is due to Schnorr, proposed in 1987 as a block generalization of the famous LLL algorithm. This paper deals with Schnorr's algorithm and potential improvements. We prove that Schnorr's algorithm outputs better bases than what was previously known: namely, we decrease all former bounds on Schnorr's appro…

Cited by 50 publications (93 citation statements)
References 27 publications
“…Within blocks, one performs HKZ-reductions (or calls to an SVP solver), and blocks are handled in an LLL manner. This vague description has been instantiated in several precisely analyzed hierarchies of reduction algorithms [74,22,23] and is also the basis of the famous heuristic BKZ algorithm [75] implemented in NTL [77]. The best complexity/quality trade-off currently known [23] allows one to find a basis…”
Section: Some Background On Euclidean Lattices
confidence: 99%
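The blockwise hierarchies quoted above all handle blocks "in an LLL manner". As a rough point of reference only (this is plain textbook LLL over exact rationals, not the blockwise algorithm of the paper; the delta parameter 3/4 is the classical choice), a minimal sketch:

```python
from fractions import Fraction

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def gso(B):
    """Gram-Schmidt orthogonalization: returns (B*, mu coefficients)."""
    n = len(B)
    Bs = [list(b) for b in B]
    mu = [[Fraction(0)] * n for _ in range(n)]
    for i in range(n):
        for j in range(i):
            mu[i][j] = dot(B[i], Bs[j]) / dot(Bs[j], Bs[j])
            Bs[i] = [x - mu[i][j] * y for x, y in zip(Bs[i], Bs[j])]
    return Bs, mu

def lll(basis, delta=Fraction(3, 4)):
    """Textbook LLL; assumes linearly independent integer rows.
    Recomputes the GSO at every step: very slow, illustrative only."""
    B = [[Fraction(x) for x in row] for row in basis]
    n = len(B)
    k = 1
    while k < n:
        # size-reduce b_k against b_{k-1}, ..., b_0
        for j in range(k - 1, -1, -1):
            _, mu = gso(B)
            q = round(mu[k][j])
            if q:
                B[k] = [x - q * y for x, y in zip(B[k], B[j])]
        Bs, mu = gso(B)
        # Lovász condition: accept b_k or swap and backtrack
        if dot(Bs[k], Bs[k]) >= (delta - mu[k][k - 1] ** 2) * dot(Bs[k - 1], Bs[k - 1]):
            k += 1
        else:
            B[k - 1], B[k] = B[k], B[k - 1]
            k = max(k - 1, 1)
    return [[int(x) for x in row] for row in B]
```

Since LLL only applies unimodular operations, the reduced basis spans the same lattice (same determinant up to sign), and its first vector satisfies the classical LLL length bound.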
“…Up to now, only approximation algorithms, such as [7,8,13,25], are efficient, and all known exact algorithms are proven to require exponential time. However, almost all known approximation algorithms (such as [8,25]) invoke some exact algorithm for solving SVP on low-dimensional lattices to improve the quality of their outputs.…”
Section: Introduction
confidence: 99%
“…When k increases, the cost increases as well, but the quality of the bases improves. The recent hierarchies [9,10] achieve better trade-offs but follow the same general strategy. In practice, the Schnorr-Euchner BKZ algorithm [32] seems to be the best, at least for small values of k. The HKZ reduction uses the Kannan-Fincke-Pohst (KFP) enumeration of short lattice vectors [19,8].…”
Section: Introduction
confidence: 99%
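The Kannan-Fincke-Pohst enumeration mentioned above prunes a search tree using norm bounds on the Gram-Schmidt coordinates. As a much cruder stand-in for illustration (exhaustive search over a coefficient box; the `bound` parameter is an ad-hoc assumption, not part of KFP, and the cost is exponential in the dimension):

```python
from itertools import product

def shortest_vector_bruteforce(B, bound=3):
    """Naive SVP: try every nonzero coefficient vector in [-bound, bound]^n.
    Crude stand-in for KFP enumeration, which instead prunes branches
    whose partial Gram-Schmidt norm already exceeds the current record."""
    dim = len(B[0])
    best, best_norm2 = None, None
    for coeffs in product(range(-bound, bound + 1), repeat=len(B)):
        if not any(coeffs):
            continue  # skip the zero vector
        v = [sum(c * row[i] for c, row in zip(coeffs, B)) for i in range(dim)]
        norm2 = sum(x * x for x in v)
        if best_norm2 is None or norm2 < best_norm2:
            best, best_norm2 = v, norm2
    return best, best_norm2
```

This is only correct when the true shortest vector's coefficients fall inside the box, which is why real enumeration derives rigorous per-level bounds instead of a fixed `bound`.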
“…The main reason is that the algorithmic facet of lattice reduction remains mysterious. In particular, the theoretically best algorithms [9,10] seem to remain slower than heuristic ones such as [32], whose practical behaviors are themselves suspicious. Let us discuss NTL's BKZ routine [33] which implements [32] and is the only publicly available such implementation: when the so-called block-size k is around 30, the number of internal calls to SVP in dimension k seems to explode suddenly (although the corresponding quantity decreases with k in the theoretical algorithms); when k increases, BKZ seems to require more precision for the underlying floating-point computations, although the considered bases should become more orthogonal, which implies a better conditioning with respect to numerical computations.…”
Section: Introduction
confidence: 99%