Severe matrix effects and high signal uncertainty are two key bottlenecks for quantification with laser-induced breakdown spectroscopy (LIBS) and for its wider application. Based on the understanding that matrix effects and signal uncertainty jointly affect the plasma parameters, and thereby the spectral intensity and LIBS quantification performance, a data selection method based on plasma temperature matching (DSPTM) is proposed to reduce both. By selecting, for all samples, spectra with smaller plasma temperature differences, the method builds the quantification model on spectra less affected by matrix effects and signal uncertainty, thereby improving the final quantification performance. When applied to the quantitative analysis of zinc (Zn) content in brass alloys, both accuracy and precision improved with either a univariate model or multiple linear regression (MLR). Specifically, for the univariate model, the root-mean-square error of prediction (RMSEP), determination coefficient (R^2), and relative standard deviation (RSD) improved from 3.30%, 0.864, and 18.8% to 1.06%, 0.986, and 13.5%, respectively; for MLR, they improved from 3.22%, 0.871, and 26.2% to 1.07%, 0.986, and 17.4%, respectively. These results demonstrate that DSPTM is an effective way to reduce matrix effects and improve repeatability by selecting reliable data.
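The core selection step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name, the `keep_fraction` parameter, and the use of a shared reference temperature are all assumptions; the abstract does not specify how per-shot plasma temperatures are obtained (commonly via Boltzmann plots) or how the matching threshold is chosen.

```python
import numpy as np

def select_by_temperature_matching(temperatures, spectra, t_ref, keep_fraction=0.5):
    """Keep the shots whose plasma temperature is closest to a reference.

    temperatures : (n_shots,) per-shot plasma temperatures (hypothetically
                   estimated beforehand, e.g. from Boltzmann plots)
    spectra      : (n_shots, n_wavelengths) per-shot spectra
    t_ref        : reference temperature shared by all samples (assumed here)
    keep_fraction: fraction of shots to retain (illustrative parameter)
    """
    temperatures = np.asarray(temperatures, dtype=float)
    n_keep = max(1, int(round(keep_fraction * len(temperatures))))
    # Rank shots by absolute temperature mismatch and keep the best-matched ones.
    order = np.argsort(np.abs(temperatures - t_ref))
    idx = np.sort(order[:n_keep])
    return idx, np.asarray(spectra)[idx]

# Toy example: 6 shots with 4 spectral channels.
rng = np.random.default_rng(0)
temps = np.array([9800.0, 10050.0, 10500.0, 9950.0, 11200.0, 10020.0])
spec = rng.random((6, 4))
idx, selected = select_by_temperature_matching(temps, spec, t_ref=10000.0)
print(idx)                     # indices of the best-matched shots
print(selected.mean(axis=0))   # averaged spectrum fed to the calibration model
```

The selected spectra (here simply averaged) would then feed the univariate or MLR calibration; the actual model-building details are outside this sketch.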