The coordinate time series determined with the Global Positioning System (GPS) contain annual and semi-annual periods that are routinely modeled by two periodic signals with constant amplitude and phase-lag. However, the amplitude and phase-lag of the seasonal signals vary slightly over time. Various methods have been proposed to model these variations, such as Wavelet Decomposition (WD), writing the amplitude of the seasonal signal as a Chebyshev polynomial that is a function of time (CP), Singular Spectrum Analysis (SSA), and using a Kalman Filter (KF). Using synthetic time series, we investigate the ability of each method to capture the time-varying seasonal signal in time series with different noise levels. We demonstrate that the precision with which the varying seasonal signal can be estimated depends on the ratio of the variations in the seasonal signal to the noise level. For most GPS time series, this ratio is between 0.05 and 0.1. Within this range, the WD and CP methods have the most difficulty separating the seasonal signal from the noise. The most precise estimates of the variations are given by the SSA and KF methods. For real GPS data, SSA and KF can model 49-84% and 77-90% of the variance of the true varying seasonal signal, respectively.
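As an illustration of the conventional baseline the abstract describes, the sketch below fits the constant-amplitude annual plus semi-annual model to a synthetic coordinate series by ordinary least squares. The synthetic series, noise level, and all variable names are assumptions for demonstration, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(0, 10, 1 / 365.25)  # time in years, daily sampling (assumed)

# Synthetic series: linear trend + annual + semi-annual cosines + white noise.
# Amplitudes (3.0, 1.0) and phase-lags (0.5, 1.0 rad) are illustrative.
truth = (2.0 * t
         + 3.0 * np.cos(2 * np.pi * t - 0.5)
         + 1.0 * np.cos(4 * np.pi * t - 1.0))
y = truth + rng.normal(0.0, 1.0, t.size)

# Design matrix of the constant-amplitude model: intercept, rate,
# and a cosine/sine pair for each of the two periods.
A = np.column_stack([
    np.ones_like(t), t,
    np.cos(2 * np.pi * t), np.sin(2 * np.pi * t),   # annual
    np.cos(4 * np.pi * t), np.sin(4 * np.pi * t),   # semi-annual
])
x, *_ = np.linalg.lstsq(A, y, rcond=None)

# Recover amplitude and phase-lag of the annual term from the
# estimated cosine/sine coefficients.
amp_annual = np.hypot(x[2], x[3])
phase_annual = np.arctan2(x[3], x[2])
```

Because the two harmonics have constant coefficients, this model cannot follow the slow amplitude and phase-lag variations that WD, CP, SSA, and KF are designed to capture; its residuals absorb those variations together with the noise.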