2012 IEEE 53rd Annual Symposium on Foundations of Computer Science
DOI: 10.1109/FOCS.2012.21
Higher Cell Probe Lower Bounds for Evaluating Polynomials

Abstract: In this paper, we study the cell probe complexity of evaluating a degree-n polynomial P over a finite field F of size at least n^{1+Ω(1)}. More specifically, we show that any static data structure for evaluating P(x), where x ∈ F, must use Ω(lg |F| / lg(Sw/(n lg |F|))) cell probes to answer a query, where S denotes the space of the data structure in number of cells and w the cell size in bits. This bound holds in expectation for randomized data structures with any constant error probability δ < 1/2. Our…
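To make the stated bound concrete, here is a worked substitution (my own illustration, not taken from the abstract): for a linear-space data structure, i.e. Sw = O(n lg |F|) bits, the denominator is a constant and the bound collapses to Ω(lg |F|) probes, which is Ω(lg n) since |F| ≥ n^{1+Ω(1)}. This matches the Ω(lg N) bound for linear space quoted in the citation statements below.

```latex
% Worked substitution (illustrative; assumes linear space, Sw = O(n \lg|F|) bits,
% and reads the denominator as \Theta(1) in that regime).
t \;=\; \Omega\!\left(\frac{\lg |F|}{\lg\!\bigl(Sw \,/\, (n \lg |F|)\bigr)}\right)
\quad\Longrightarrow\quad
t \;=\; \Omega(\lg |F|) \;=\; \Omega(\lg n),
\qquad \text{since } |F| \ge n^{1+\Omega(1)}.
```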

Cited by 34 publications (23 citation statements) · References 29 publications
“…The main motivating factor for studying the pointer machine is that we can prove polynomially high and often tight lower bounds for range reporting problems using the framework of Chazelle [11]. This stands in sharp contrast to the highest query time lower bound proved for any static data structure problem in the word-RAM, a mere Ω(lg N), even for linear space data structures [18]. Note that since any memory cell stores a constant number of pointers, one cannot generally hope for a query time below Θ(lg N).…”
Section: Models of Computation (mentioning)
confidence: 89%
“…More specifically, we use ideas from the technique now formally known as cell sampling [18,38,32]. This technique derives lower bounds based on one key observation: if a data structure/streaming algorithm probes t memory words on an update, then there is a set C of t memory words such that at least m/S^t updates probe only memory words in C, where m is the number of distinct updates in the problem (for data structure lower bound proofs, we typically consider queries rather than updates, and we obtain tighter lower bounds by forcing C to have near-linear size).…”
Section: Technique (mentioning)
confidence: 99%
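The "key observation" quoted above is a pigeonhole count. The following sketch of that count is my paraphrase of the standard cell-sampling argument, not a quote from the cited papers:

```latex
% Pigeonhole behind cell sampling: each of the m updates probes at most t of the
% S memory words, so its probe set is one of at most \binom{S}{t} \le S^t sets,
% and some fixed set C must therefore capture a 1/S^t fraction of the updates.
\binom{S}{t} \;\le\; S^{t}
\quad\Longrightarrow\quad
\exists\, C \subseteq \{\text{cells}\},\ |C| \le t:\;
\bigl|\{\text{updates probing only cells of } C\}\bigr| \;\ge\; \frac{m}{S^{t}}.
```

As the quote notes, for static query lower bounds one instead forces C to have near-linear size, which captures a much larger fraction of the queries and yields the tighter bounds.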
“…The problem with this construction is that the Vandermonde matrix is dense, resulting in an evaluation time of Ω(k) if we simply store the coefficients of the polynomial. The lower bounds by Siegel [22], and later Larsen [17], as presented in Table 1, show that a data structure for evaluating a polynomial of degree k − 1 using time t < k must use space at least k(u/k)^{1/t}. The data structure of [15] presents a step in this direction, but is still far from the lower bound for k-independent functions.…”
Section: Background and Overview (mentioning)
confidence: 99%
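For intuition on the two ends of this space/time tradeoff, here is a minimal Python sketch (my own illustration, not code from the cited papers; the prime p and degree bound k are arbitrary placeholders): storing only the coefficients gives Θ(k) evaluation time via Horner's rule, while tabulating all u = |F| values gives O(1) query time at O(u) space.

```python
# Illustrative sketch (assumption: field Z_p with a small prime p; names are placeholders).
# Two extreme data structures for evaluating a fixed polynomial of degree k-1 over Z_p.
import random

p = 10_007   # a prime, standing in for the field/universe size u
k = 16       # number of coefficients, i.e. degree k-1

coeffs = [random.randrange(p) for _ in range(k)]

def eval_horner(x: int) -> int:
    """Coefficient representation: space O(k) words, time Theta(k) per query (Horner's rule)."""
    acc = 0
    for c in reversed(coeffs):
        acc = (acc * x + c) % p
    return acc

# Tabulated representation: space O(u) words, time O(1) per query.
table = [eval_horner(x) for x in range(p)]

def eval_table(x: int) -> int:
    return table[x]

# Sanity check that both representations agree.
assert all(eval_horner(x) == eval_table(x) for x in random.sample(range(p), 50))
```

The quoted bound of k(u/k)^{1/t} interpolates between these endpoints: at t = 1 it demands space about u (the full table), while at t ≈ k it allows space about k (the bare coefficients).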