Quantitative analysis performance is widely regarded as the Achilles' heel of laser-induced breakdown spectroscopy (LIBS). Improving the raw spectral signal is fundamental to accurate quantification, and two targets are commonly pursued: enhancing the signal-to-noise ratio (SNR) and reducing signal uncertainty. Most LIBS studies optimize the signal by maximizing the SNR; however, no definitive conclusion has yet been reached on which criterion should guide signal optimization. Our group has maintained that the lowest signal uncertainty should be the optimization criterion, and this claim is verified in this article. Quantitative analysis was performed on brass samples at three representative pressures: atmospheric pressure (100 kPa), the pressure yielding the maximal SNR (60 kPa), and the pressure yielding the lowest signal uncertainty (5 kPa), each under the optimal spatiotemporal window determined in a previous study. The results show that a pressure of 60 kPa decreased the accuracy while improving the precision of the quantitative analysis, whereas a pressure of 5 kPa yielded both the highest accuracy and the best precision. The causes of these changes are analyzed in detail in terms of matrix effects and signal uncertainty. Therefore, selecting the pressure corresponding to the lowest signal uncertainty better improves LIBS quantitative analysis performance, and signal uncertainty reduction is recommended as a more important direction for the LIBS community.
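
To make the two optimization targets concrete, the following is a minimal sketch, not taken from the article, of how SNR and signal uncertainty (expressed as relative standard deviation, RSD) can be computed from replicate line intensities. The pressure labels and all numerical values below are synthetic and purely illustrative.

```python
import numpy as np

def snr(peak_intensities, background):
    """Signal-to-noise ratio: mean net line intensity divided by the
    standard deviation of a nearby background region."""
    return np.mean(peak_intensities) / np.std(background, ddof=1)

def rsd(values):
    """Relative standard deviation (%), used here as a proxy for
    shot-to-shot signal uncertainty."""
    return 100.0 * np.std(values, ddof=1) / np.mean(values)

# Synthetic replicate measurements of one emission line (arbitrary units);
# the specific means and scatters are assumptions for illustration only.
rng = np.random.default_rng(0)
line_100kPa = rng.normal(1000, 120, size=50)  # stronger signal, larger scatter
line_5kPa   = rng.normal(600, 20, size=50)    # weaker signal, smaller scatter
background  = rng.normal(50, 10, size=200)

for label, line in [("100 kPa", line_100kPa), ("5 kPa", line_5kPa)]:
    print(f"{label}: SNR = {snr(line, background):6.1f}, "
          f"signal uncertainty (RSD) = {rsd(line):5.1f} %")
```

In this toy setting, a condition can show a higher SNR yet a larger RSD, which is why maximizing SNR and minimizing signal uncertainty can point to different optimal pressures.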