Climate trends estimated from historical radiosounding time series may be significantly affected by the choice of regression method, as well as by the subsampling of the dataset often adopted in specific applications. These contributions to the uncertainty of trend estimates have been quantified in the literature, although only for specific pairs of regression methods and for less recent periods characterized by smaller temperature trends than those observed over the last two decades. This paper investigates the sensitivity of trend estimates to four linear regression methods (parametric and nonparametric) and to the artificial subsampling of the same dataset, using historical radiosounding time series from 1978 onwards available in version 2 of the Integrated Global Radiosonde Archive (IGRA). Results show that long-term decadal trends may have non-negligible uncertainties related to the choice of the regression method, the percentage of data available, the amount of missing data, and the number of stations selected in the dataset. The choice of the regression method introduces uncertainties in the decadal trends ranging from −0.10 to −0.01 K decade⁻¹ for temperature in the lower stratosphere at 100 hPa and from 0.2 to 0.8% decade⁻¹ for relative humidity (RH) in the middle troposphere at 300 hPa. For temperature, differences can increase up to 0.4 K decade⁻¹ at 300 hPa when the amount of missing data exceeds 50% of the original dataset, while for RH, significant differences are observed in the lower troposphere at 925 hPa for almost all datasets. Finally, subsampling effects on trend estimation are quantified by artificially reducing the size of the IGRA dataset: results show that these effects are reduced when at least 60 stations, with up to 76% of data available, are considered for temperature and at least 40 stations for RH.
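
To illustrate the kind of sensitivity analysis described above, the following minimal sketch compares a parametric (ordinary least squares) and a nonparametric (Theil-Sen) estimate of a decadal trend on a monthly anomaly series, and shows the effect of randomly removing data to mimic subsampling. This is not the paper's code: the synthetic series, the 40-year length, the 50% missing-data fraction, and the function names are illustrative assumptions only.

```python
# Minimal sketch (illustrative, not the paper's actual workflow): parametric vs.
# nonparametric decadal trend estimates and the effect of artificial subsampling.
import numpy as np
from scipy.stats import linregress, theilslopes

rng = np.random.default_rng(0)

# Synthetic monthly temperature anomalies (K) over 40 years with a 0.2 K/decade trend.
n_months = 40 * 12
t_years = np.arange(n_months) / 12.0
series = 0.02 * t_years + rng.normal(0.0, 0.5, n_months)

def decadal_trends(t, y):
    """Return (OLS, Theil-Sen) trends in K per decade, ignoring missing values."""
    ok = ~np.isnan(y)
    ols = linregress(t[ok], y[ok]).slope * 10.0   # parametric fit
    sen = theilslopes(y[ok], t[ok])[0] * 10.0     # nonparametric: median of pairwise slopes
    return ols, sen

# Full series vs. a series with 50% of the months randomly set to missing.
full = decadal_trends(t_years, series)
gappy = series.copy()
gappy[rng.choice(n_months, n_months // 2, replace=False)] = np.nan
sub = decadal_trends(t_years, gappy)

print(f"full dataset: OLS {full[0]:+.3f}, Theil-Sen {full[1]:+.3f} K/decade")
print(f"50% missing : OLS {sub[0]:+.3f}, Theil-Sen {sub[1]:+.3f} K/decade")
```

Comparing the spread between the two estimators, with and without artificially introduced gaps, gives a simple analogue of the method- and subsampling-related uncertainties quantified in the paper.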