The evolution of antimicrobial resistance often occurs in a variable environment, as antimicrobials are given periodically to a patient or added to and removed from a growth medium. This environmental variability strongly shapes the fitness landscape of microorganisms, and thus the evolution of resistance. Indeed, mutations conferring resistance often carry a fitness cost in the absence of antimicrobial, which may be compensated by subsequent mutations. As antimicrobial is added or removed, the relevant fitness landscape thus switches from a fitness valley to an ascending landscape, or vice versa.

Here, we investigate the effect of these time-varying patterns of selection within a stochastic model. We focus on a homogeneous microbial population of fixed size subjected to a periodic alternation of phases of absence and presence of an antimicrobial that stops growth. Combining analytical approaches and stochastic simulations, we quantify how the time necessary for fit resistant bacteria to take over the microbial population depends on the period of the alternations. We demonstrate that fast alternations strongly accelerate the evolution of resistance, and that the acceleration plateaus once the period becomes sufficiently short. Moreover, the acceleration is stronger for larger populations. For asymmetric alternations, featuring phases of different durations with and without antimicrobial, we show that the time taken by the population to fully evolve resistance exhibits a broad minimum. At this minimum, if the alternations are sufficiently fast, the very first resistant mutant that appears ultimately leads to full resistance evolution within the population. This dramatic acceleration of the evolution of antimicrobial resistance is likely to occur in realistic situations, and can have important implications, both clinical and experimental.
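To make the model ingredients concrete, the sketch below implements a minimal Moran-type simulation of a fixed-size population under a periodic alternation of drug-free and drug phases, with sensitive, resistant, and resistant-compensated types. This is an illustrative sketch under assumed parameters, not the implementation used in the study: the values of N, mu, delta, and T_half are hypothetical, chosen only to reflect the ingredients described above (a fitness cost delta of resistance in the absence of antimicrobial, compensation by a subsequent mutation, and an antimicrobial that stops the growth of sensitive cells).

```python
# Minimal sketch (not the study's code) of a Moran-type model in a periodically
# switching environment. Types: 0 = sensitive, 1 = resistant (costly),
# 2 = resistant-compensated. All parameter values are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

N = 100          # fixed population size (assumption)
mu = 1e-3        # mutation probability per division, 0 -> 1 -> 2 (assumption)
delta = 0.1      # fitness cost of non-compensated resistance (assumption)
T_half = 50      # duration of each phase, in Moran steps (assumption)

# fitness[environment][type]; the antimicrobial stops growth of sensitive cells
fitness = {
    "no_drug": np.array([1.0, 1.0 - delta, 1.0]),
    "drug":    np.array([0.0, 1.0 - delta, 1.0]),
}

counts = np.array([N, 0, 0])   # start from a fully sensitive population
t = 0
while counts[2] < N:           # run until compensated resistance fixes
    env = "drug" if (t // T_half) % 2 else "no_drug"
    w = fitness[env] * counts  # total reproductive weight of each type
    if w.sum() == 0:           # all growth stopped: no division this step
        t += 1
        continue
    # Moran step: one fitness-weighted birth, one uniform death
    parent = rng.choice(3, p=w / w.sum())
    offspring = min(parent + 1, 2) if rng.random() < mu else parent
    dead = rng.choice(3, p=counts / counts.sum())
    counts[offspring] += 1
    counts[dead] -= 1
    t += 1

print(f"Compensated resistance fixed after {t} Moran steps")
```

Averaging the fixation time over many such runs while scanning T_half, or using unequal durations for the two phases, would probe the kind of period-dependence and asymmetry effects quantified above.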