An important question in theoretical computer science is to determine the best possible running time for solving a problem at hand. For geometric optimization problems, we often understand their complexity on a rough scale, but not very well on a finer scale. One such example is the two-dimensional knapsack problem for squares. There is a polynomial time $(1+\varepsilon)$-approximation algorithm for it (i.e., a PTAS), but the running time of this algorithm is triple exponential in $1/\varepsilon$, i.e., $\Omega\!\left(n^{2^{2^{1/\varepsilon}}}\right)$. A double or triple exponential dependence on $1/\varepsilon$ is inherent in how this and several other algorithms for other geometric problems work. In this paper, we present an EPTAS for knapsack for squares, i.e., a $(1+\varepsilon)$-approximation algorithm with a running time of $O_{\varepsilon}(1)\cdot n^{O(1)}$. In particular, the exponent of $n$ in the running time does not depend on $\varepsilon$ at all! Since there can be no FPTAS for the problem (unless $\mathrm{P}=\mathrm{NP}$), this is the best kind of approximation scheme we can hope for. To achieve this improvement, we introduce two new key ideas: First, we present a fast method to guess the $\Omega\!\left(2^{2^{1/\varepsilon}}\right)$ relatively large squares of a suitable near-optimal packing, instead of using brute-force enumeration. Secondly, we introduce an indirect guessing framework to define the sizes of the cells for the remaining squares. In the previous PTAS, each of these steps needs a running time of $\Omega\!\left(n^{2^{2^{1/\varepsilon}}}\right)$, and we improve both to $O_{\varepsilon}(1)\cdot n^{O(1)}$. We complete our result by giving an algorithm for two-dimensional knapsack for rectangles under $(1+\varepsilon)$-resource augmentation. In this setting, we also improve the best known running time of $\Omega\!\left(n^{1/\varepsilon^{1/\varepsilon}}\right)$ to $O_{\varepsilon}(1)\cdot n^{O(1)}$ and even compute a solution with optimal profit, in contrast to the best previously known polynomial time algorithm for this setting, which computes only an approximation. We believe that our new techniques have the potential to be useful for other settings as well.
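To place these bounds, the following is a brief recap of the standard running-time regimes for $(1+\varepsilon)$-approximation schemes; the function $f$ below is generic notation introduced only for this summary and is not taken from the results above.
\begin{align*}
  \text{PTAS:}  \quad & n^{f(1/\varepsilon)} && \text{previous algorithm for squares: } \Omega\!\left(n^{2^{2^{1/\varepsilon}}}\right)\\
  \text{EPTAS:} \quad & f(1/\varepsilon)\cdot n^{O(1)} && \text{this paper: } O_{\varepsilon}(1)\cdot n^{O(1)}\text{, exponent of } n \text{ independent of } \varepsilon\\
  \text{FPTAS:} \quad & \operatorname{poly}(n,1/\varepsilon) && \text{ruled out for this problem unless } \mathrm{P}=\mathrm{NP}
\end{align*}
Here $f$ denotes some computable function of $1/\varepsilon$ alone; an EPTAS confines the dependence on $1/\varepsilon$ to the multiplicative factor $f(1/\varepsilon)$, which is why it is the strongest scheme one can hope for once an FPTAS is excluded.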