This paper compares the quality of service of 4G and 5G New Radio (NR) across different sub-6 GHz frequency bands in an urban micro-cellular outdoor setting. An updated version of LTE-Sim is used in 4G to obtain the exponential effective signal-to-interference-plus-noise ratio, which in turn determines the modulation and coding scheme (MCS). System capacity is evaluated through system-level simulations of a video application at 3.1 Mb/s under the proportional fair (PF) scheduler, using LTE-Sim for 4G and the 5G-air-simulator for 5G NR. In 4G only, the modified largest weighted delay first (M-LWDF) scheduler is also compared against PF. In both 4G and 5G NR, the best system performance is reached for cell radii of at least twice the breakpoint distance, which are preferable to the shortest cell radii. The packet loss ratio (PLR) is higher for cell radii, R, shorter than the breakpoint distance, d′_BP. For d′_BP ≤ R ≤ 1000 m, the PLR first decreases and then increases with R. For a target PLR below 2%, the highest maximum average goodput in 4G is obtained with the M-LWDF scheduler (a 10-25% increase over PF). This maximum occurs in the 2.6 GHz and 3.5 GHz frequency bands for 300 m ≤ R ≤ 500 m, while at 5.62 GHz the highest goodput occurs for the longest cell radii. With 5G NR and the PF scheduler, the maximum average goodput in our simulations increases from ≈ 14.1 Mb/s (in 4G) to 26.1 Mb/s for a 20 MHz bandwidth.
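For reference, the exponential effective SINR mapping mentioned above compresses the per-subcarrier SINRs into a single effective value used for MCS selection; a standard formulation (the calibration values used in the paper are not reproduced here) is

\[
\gamma_{\mathrm{eff}} = -\beta \,\ln\!\left( \frac{1}{N} \sum_{n=1}^{N} \exp\!\left( -\frac{\gamma_n}{\beta} \right) \right),
\]

where \(\gamma_n\) is the SINR on subcarrier (or resource) \(n\), \(N\) is the number of subcarriers, and \(\beta\) is a calibration parameter that depends on the MCS; the MCS is then chosen from \(\gamma_{\mathrm{eff}}\) via link-level look-up curves.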
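Likewise, assuming the usual 3GPP urban micro-cell (street canyon) path-loss definition (e.g., TR 38.901), the breakpoint distance referred to above takes the form

\[
d'_{\mathrm{BP}} = \frac{4\, h'_{\mathrm{BS}}\, h'_{\mathrm{UT}}\, f_c}{c},
\]

where \(h'_{\mathrm{BS}}\) and \(h'_{\mathrm{UT}}\) are the effective base-station and user-terminal antenna heights (the physical heights reduced by the effective environment height, 1 m in UMi), \(f_c\) is the carrier frequency in Hz, and \(c\) is the speed of light; higher carrier frequencies therefore correspond to longer breakpoint distances.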