This paper first surveys the near-total lack of superlinear lower bounds in complexity theory for "natural" computational problems with respect to many models of computation. We note that the dividing line between models where such bounds are known and those where none are known falls where the model allows non-local communication with memory at unit cost. We study a model that imposes a "fair cost" for non-local communication, and obtain modest superlinear lower bounds for some problems via a Kolmogorov-complexity argument. We then look to the larger picture of what it will take to prove really striking lower bounds, and distill from our work and others' a concept of information vicinity that may offer new tools and modes of analysis to a young field that rather lacks them.