At L-band (1–2 GHz), and particularly in microwave radiometry (1.413 GHz), vegetation has traditionally been modeled with the τ-ω model. This model has also been used to compensate for vegetation effects in Global Navigation Satellite Systems Reflectometry (GNSS-R) with modest success. This manuscript presents an analysis of the vegetation impact on GPS L1 C/A (coarse/acquisition code) signals in terms of attenuation and depolarization. A dual-polarized instrument with commercial off-the-shelf (COTS) GPS receivers as back-ends was installed under a beech forest for more than a year, collecting carrier-to-noise density ratio (C/N0) data. These data were compared to different ground-truth datasets (greenness, blueness, and redness indices, sky cover index, rain data, leaf area index (LAI), and normalized difference vegetation index (NDVI)). The highest correlation is observed between C/N0 and NDVI, with R² coefficients larger than 0.85 independently of the elevation angle, suggesting that, for a beech forest, NDVI is a good descriptor of signal attenuation at L-band, which is known to be related to the vegetation optical depth (VOD). Depolarization effects were also studied and were found to be significant at elevation angles as large as ~50°. The data were also fitted to a simple τ-ω model to estimate the single-scattering albedo (ω) in order to compensate for vegetation scattering effects in GNSS-R soil moisture retrieval algorithms. It is found that, even when an elevation-angle dependence is included (ω(θe)), the ω(θe) model is not related to the NDVI at elevation angles smaller than ~67°. This limits the range of elevation angles that can be used for soil moisture retrievals using GNSS-R. Finally, errors in the GPS-derived position were computed over time to assess the impact of vegetation on positioning accuracy.
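For reference, a compact sketch of the τ-ω (zeroth-order radiative transfer) formulation discussed above, written in its standard form rather than in the exact notation of this manuscript. The one-way canopy transmissivity experienced by a direct GPS signal arriving at elevation angle θe is

\[
\gamma(\theta_e) = \exp\!\left(-\frac{\tau}{\sin\theta_e}\right),
\]

so the corresponding C/N0 attenuation in decibels is approximately \(-10\log_{10}\gamma \approx 4.34\,\tau/\sin\theta_e\). In emission, the brightness temperature of the soil–canopy system at polarization p, assuming a common physical temperature T for soil and canopy, is

\[
T_{B,p} = T\left[(1-\Gamma_p)\,\gamma + (1-\omega)\,(1-\gamma)\,(1+\Gamma_p\,\gamma)\right],
\]

where τ is the vegetation optical depth (VOD), ω the single-scattering albedo, and Γp the soil reflectivity.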