Background and objectives: There is significant interest in identifying clinically effective drug treatment regimens that minimize the de novo evolution of antimicrobial resistance in pathogen populations. However, in vivo studies that vary treatment regimens and directly measure drug resistance evolution are rare. Here, we experimentally investigate the role of drug dose and treatment timing in resistance evolution in an animal model.

Methodology: In a series of experiments, we measured the emergence of atovaquone-resistant mutants of Plasmodium chabaudi in laboratory mice, as a function of the dose and timing of treatment with the antimalarial drug atovaquone.

Results: Increasing the concentration of atovaquone increased the likelihood of high-level resistance emergence. Treating very early or very late in infection reduced the risk of resistance, likely as a result of parasite population size at the time of treatment, although for late treatment we could not exclude an influence of the immune response. When we varied the starting inoculum, resistance was most likely at intermediate inoculum sizes, but this did not correlate directly with population size at the time of treatment.

Conclusions and implications: (i) Higher doses do not always minimize resistance emergence and can result in the competitive release of parasites with high-level resistance. (ii) Altering treatment timing affects the risk of resistance emergence, but not as a simple function of population size at the time of treatment. (iii) Finding the 'right' dose and 'right' time to maximize clinical gains and limit resistance emergence can vary with biological context and was non-trivial even in our simplified experiments.