Multi-photon Bragg diffraction is a powerful method for fast, coherent momentum transfer of atom waves. However, laser noise, Doppler detunings, and cloud expansion limit its efficiency in large momentum transfer (LMT) pulse sequences. We present simulation studies of robust Bragg pulses developed through numerical quantum optimal control. Optimized pulse performance under noise and cloud inhomogeneities is analyzed and compared to that of analogous Gaussian and adiabatic rapid passage pulses in simulated LMT Mach-Zehnder interferometry sequences. The optimized pulses maintain robust population transfer and phase response over a broader range of noise, resulting in superior contrast in LMT sequences with thermal atom clouds and intensity inhomogeneities. Large optimized LMT sequences require less total pulse area than their Gaussian counterparts, making them less susceptible to spontaneous emission loss. The optimized sequences maintain over five times better contrast at tens of $\hbar k$ of momentum separation, and the improvement grows with larger momentum transfer. Such pulses could allow operation of Bragg atom interferometers with unprecedented sensitivity, improved contrast, and hotter atom sources.