Abstract. We investigate the average-case complexity of a generalization of the compact knapsack problem to arbitrary rings: given m (random) ring elements $a_1, \ldots, a_m \in R$ and a (random) target value $b \in R$, find coefficients $x_1, \ldots, x_m \in S$ (where S is an appropriately chosen subset of R) such that $\sum_{i=1}^{m} a_i \cdot x_i = b$. We consider compact versions of the generalized knapsack where the set S is large and the number of weights m is small. Most variants of this problem considered in the past (e.g., when R = Z is the ring of integers) can be easily solved in polynomial time even in the worst case. We propose a new choice of the ring R and subset S that yields generalized compact knapsacks that are seemingly very hard to solve on the average, even for very small values of m. Namely, we prove that for any unbounded function m = ω(1) with arbitrarily slow growth rate, solving our generalized compact knapsack problems on the average is at least as hard as the worst-case instance of various approximation problems over cyclic lattices. Specific worst-case lattice problems considered in this paper are the shortest independent vector problem SIVP and the guaranteed distance decoding problem GDD (a variant of the closest vector problem, CVP) for approximation factors $n^{1+\epsilon}$ almost linear in the dimension of the lattice. Our results yield very efficient and provably secure one-way functions (based on worst-case complexity assumptions) with key size and time complexity almost linear in the security parameter n. Previous constructions with similar security guarantees required quadratic key size and computation time. Our results can also be formulated as a connection between the worst-case and average-case complexity of various lattice problems over cyclic and quasi-cyclic lattices.
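To make the one-way function concrete, the sketch below evaluates the forward direction $f_a(x_1, \ldots, x_m) = \sum_{i=1}^{m} a_i \cdot x_i$ in a specific ring. The abstract does not fix the parameters; as an illustrative assumption suggested by the cyclic-lattice setting, the ring is taken to be R = Z_p[X]/(X^n - 1) and S the polynomials with coefficients in {0, ..., d-1}. All names (n, p, m, d, ring_mul, knapsack) and the toy parameter values are hypothetical and far too small for security; only evaluating f is easy, while recovering the small x_i from (a_1, ..., a_m, b) is the problem conjectured to be hard.

```python
import random

def ring_mul(a, x, n, p):
    """Multiply two degree-<n polynomials modulo (X^n - 1, p), i.e. cyclic convolution mod p."""
    c = [0] * n
    for i, ai in enumerate(a):
        for j, xj in enumerate(x):
            c[(i + j) % n] = (c[(i + j) % n] + ai * xj) % p
    return c

def ring_add(a, b, p):
    """Add two ring elements coefficient-wise mod p."""
    return [(ai + bi) % p for ai, bi in zip(a, b)]

def knapsack(a_list, x_list, n, p):
    """Evaluate the generalized compact knapsack f_a(x_1, ..., x_m) = sum_i a_i * x_i in R."""
    acc = [0] * n
    for a, x in zip(a_list, x_list):
        acc = ring_add(acc, ring_mul(a, x, n, p), p)
    return acc

# Toy usage: m random weights a_i, inputs x_i with small {0, ..., d-1} coefficients.
n, p, m, d = 8, 257, 3, 2  # hypothetical toy parameters, for illustration only
a_list = [[random.randrange(p) for _ in range(n)] for _ in range(m)]
x_list = [[random.randrange(d) for _ in range(n)] for _ in range(m)]
b = knapsack(a_list, x_list, n, p)  # forward evaluation (easy); inversion is the hard direction
```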