Federated Learning (FL) is a machine learning paradigm that learns from data kept locally in order to safeguard client privacy, and local SGD is typically employed on clients' devices to improve communication efficiency. However, such a scheme is currently constrained by the slow and unstable convergence induced by clients' heterogeneous data. In this work, we identify three under-explored phenomena of biased local learning that may explain the challenges caused by local updates in supervised FL. As a remedy, we propose FedAug, a novel unified algorithm that reduces the local learning bias on both features and classifiers to tackle these challenges. FedAug consists of two components: AugMean and AugCA. AugMean alleviates bias in the local classifiers by balancing the models' output distributions. AugCA learns client-invariant features that are close to the global features yet clearly distinct from those learned from other input distributions. In a series of experiments, we show that FedAug consistently outperforms other state-of-the-art (SOTA) FL and domain generalization (DG) baselines, and that each of its two components (AugMean and AugCA) yields individual performance gains.
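To make the AugCA objective more concrete, the following is a minimal PyTorch sketch of an InfoNCE-style alignment loss that pulls local features toward the corresponding global-model features while pushing them away from features of another input distribution. The function name, signature, and the use of a single negative view are illustrative assumptions, not the paper's actual implementation.

```python
import torch
import torch.nn.functional as F

def contrastive_alignment_loss(local_feat, global_feat, other_feat, tau=0.5):
    """Illustrative client-invariance objective (hypothetical, not FedAug's exact loss).

    Treats (local, global) features as a positive pair and (local, other-distribution)
    features as a negative pair, InfoNCE-style.
    """
    z = F.normalize(local_feat, dim=1)        # local-model representations
    z_glob = F.normalize(global_feat, dim=1)  # global-model representations (positives)
    z_neg = F.normalize(other_feat, dim=1)    # other-distribution representations (negatives)

    pos = torch.sum(z * z_glob, dim=1) / tau  # cosine similarity to the positive
    neg = torch.sum(z * z_neg, dim=1) / tau   # cosine similarity to the negative

    logits = torch.stack([pos, neg], dim=1)            # shape: (batch, 2)
    labels = torch.zeros(z.size(0), dtype=torch.long)  # positive pair sits at index 0
    return F.cross_entropy(logits, labels)

# Usage sketch: features from the local model, the frozen global model, and a
# strongly augmented view standing in for another input distribution.
if __name__ == "__main__":
    b, d = 8, 128
    loss = contrastive_alignment_loss(torch.randn(b, d),
                                      torch.randn(b, d),
                                      torch.randn(b, d))
    print(loss.item())
```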