This paper proposes a methodology for computing Maximally Allowable Transfer Intervals (MATIs) that provably stabilize nonlinear Networked Control Systems (NCSs) in the presence of disturbances and signal delays. Accordingly, given a desired level of system performance (in terms of Lp-gains), quantitative MATI vs. delay trade-offs are obtained. By combining impulsive delayed system modeling with Lyapunov-Razumikhin-type arguments, we are able to handle even the so-called large delays. Namely, the computed MATIs are allowed to be smaller than the delays present in the NCS. In addition, our stability results are provided for the class of Uniformly Globally Exponentially Stable (UGES) scheduling protocols; the well-known Round Robin (RR) and Try-Once-Discard (TOD) protocols are examples of UGES protocols. Apart from the inclusion of large delays, another salient feature of our methodology is the handling of corrupted data. To that end, we propose the notion of Lp-stability with bias. Furthermore, the Zeno-free property of our methodology is demonstrated. Finally, a comparison with state-of-the-art work is provided utilizing the batch reactor benchmark problem.
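As a rough sketch of the proposed notion (the notation here is assumed for illustration rather than taken from the paper: state x, disturbance w, output y, and a nonnegative bias signal b), Lp-stability with bias from w to y with gain gamma can be understood as the existence of constants K >= 0 and gamma >= 0 such that
\[
  \|y\|_{\mathcal{L}_p[t_0,t]} \;\le\; K\,\|x(t_0)\| \;+\; \gamma\,\|w\|_{\mathcal{L}_p[t_0,t]} \;+\; \|b\|_{\mathcal{L}_p[t_0,t]} \qquad \text{for all } t \ge t_0,
\]
so that, compared with the standard Lp-stability estimate, the additional bias term accounts for the persistent effect of corrupted data that does not vanish with the disturbance.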