We consider an M server system in which each server can service at most one update packet at a time. The system designer controls (1) scheduling, i.e., the order in which packets are serviced, (2) routing, i.e., the server that an arriving update packet joins for service, and (3) the service time distribution, with the service rate held fixed. Given a fixed update generation process, we prove a strong age-delay and age-delay variance tradeoff, wherein, as the average age of information (AoI) approaches its minimum, the packet delay and its variance approach infinity. In order to prove this result, we consider two special cases of the M server system, namely, a single server system with last come first served preemptive service and an infinite server system. In both cases, we derive sufficient conditions to show that three heavy-tailed service time distributions, namely Pareto, lognormal, and Weibull, asymptotically minimize the average AoI as their tails get heavier, and establish the age-delay tradeoff results. We provide an intuitive explanation as to why such a seemingly counterintuitive age-delay tradeoff is natural, and why it should exist in many systems.
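The single-server LCFS-preemptive case can be illustrated with a small Monte Carlo sketch (not from the paper; all parameters are illustrative). Under preemptive LCFS, a fresh arrival always displaces the packet in service, so with memoryless (Poisson) arrivals a packet is delivered exactly when its service time is shorter than the gap to the next arrival. The sketch below compares the time-average AoI under deterministic service against a heavy-tailed Pareto service with the same mean, assuming arrival rate 1 and mean service time 0.5:

```python
import numpy as np

rng = np.random.default_rng(0)

def avg_age_lcfs_preemptive(service_sampler, lam=1.0, n_arrivals=200_000):
    """Estimate time-average AoI in a single-server LCFS queue with preemption.

    Arrival i (Poisson, rate lam) is delivered iff its service time is
    shorter than the gap to arrival i+1; otherwise it is preempted.
    """
    gaps = rng.exponential(1.0 / lam, n_arrivals)   # interarrival times
    t = np.cumsum(gaps)                             # arrival instants
    s = service_sampler(n_arrivals)                 # service times
    delivered = s[:-1] < gaps[1:]                   # survives until completion?
    d = t[:-1][delivered] + s[:-1][delivered]       # delivery instants
    a = s[:-1][delivered]                           # age right after delivery
    # Integrate the sawtooth age curve between consecutive deliveries:
    # age starts at a[k] and grows linearly over the interval y[k].
    y = np.diff(d)
    area = np.sum(a[:-1] * y + 0.5 * y**2)
    return area / (d[-1] - d[0])

lam, mean_s = 1.0, 0.5
det = avg_age_lcfs_preemptive(lambda n: np.full(n, mean_s), lam)
alpha = 1.5                                # Pareto shape; tail heavier as alpha -> 1
xm = mean_s * (alpha - 1) / alpha          # scale chosen so E[S] = mean_s
par = avg_age_lcfs_preemptive(
    lambda n: xm / (1.0 - rng.random(n)) ** (1.0 / alpha), lam)
print(f"deterministic service: avg AoI ~ {det:.3f}")
print(f"Pareto(alpha=1.5):     avg AoI ~ {par:.3f}")
```

With the mean held fixed, the Pareto service yields a lower average AoI than deterministic service: most service times are very short, and the rare long ones are simply preempted. The same shape parameter alpha = 1.5 gives an infinite service-time variance, hinting at the delay-variance blow-up the tradeoff describes.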