High-fidelity photodetection enables the transfer of the low noise inherent to optical oscillators to the microwave domain. However, when photodetecting optical signals of the highest timing stability, photodiode flicker (1/f) noise can dominate the resulting timing jitter at timescales longer than ~1 ms. With the goal of improving femtosecond-level timing fidelity in the transfer from the optical to the microwave domain, we vary the duty cycle of a train of optical pulses and show that the photodetector flicker phase noise on a photonically generated 1 GHz microwave signal can be reduced by ~10 dB under ultrashort-pulse illumination, reaching a 1/f level as low as -140 dBc/Hz at 1 Hz offset frequency. In addition, we find a strong correlation between amplitude and phase flicker noise, implying that a single baseband noise source can modulate both quadratures of the microwave carrier. These findings push the ultimate limits of the timing stability that can be transferred from optics to electronics.
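
To indicate the timing scale implied by this flicker level, the following worked example (illustrative only; the integration band $f_1$--$f_2$ is an assumed choice, not taken from the measurement) converts a 1/f phase noise of $-140$ dBc/Hz at 1 Hz offset on a $\nu_0 = 1$ GHz carrier into integrated rms timing jitter, using the standard relation $S_\phi(f) = 2\,\mathcal{L}(f)$:
\[
S_\phi(f) = \frac{2\times 10^{-14}}{f/\mathrm{Hz}}\ \mathrm{rad^2/Hz},
\qquad
\sigma_t = \frac{1}{2\pi\nu_0}\left[\int_{f_1}^{f_2} S_\phi(f)\,df\right]^{1/2}
= \frac{\sqrt{2\times 10^{-14}\,\ln(f_2/f_1)}}{2\pi\times 10^{9}\ \mathrm{Hz}}
\approx 68\ \mathrm{as}
\quad (f_1 = 1\ \mathrm{Hz},\ f_2 = 10\ \mathrm{kHz}),
\]
i.e., well below the femtosecond level referenced above.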