Vehicular networking based on federated learning improves data privacy and security compared with centralized approaches. However, it still suffers from single-point-of-failure risks, membership inference attacks, residual privacy concerns, and high communication overhead. This paper addresses these challenges by combining federated differential privacy with blockchain integration, and reduces communication costs through ternary gradient quantization and model compression. In the differential privacy experiments, federated differential privacy protection achieves accuracy closer to the no-privacy baseline than traditional differential privacy protection, particularly when C ≥ 2. In the ternary gradient experiments, the transmitted training gradients are reduced by 14.99×, 15.54×, and 15.97× on the three datasets. In the layer sensitivity experiments, accuracy at top = 97%, 94%, and 91% is comparable to that at top = 100% (uncompressed). In the blockchain experiments, the training performance and overall trends of the blockchain-based models are similar to those of the non-blockchain models.
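The roughly 15-16× reduction reported above is consistent with replacing 32-bit gradient values with 2-bit ternary symbols plus a per-tensor scale. Below is a minimal NumPy sketch of TernGrad-style stochastic ternarization, given only as an illustration of the general technique rather than the paper's exact implementation; the function names and the client/server framing are assumptions.

```python
import numpy as np

def ternarize(grad: np.ndarray, rng: np.random.Generator):
    """Stochastically quantize a gradient tensor to {-1, 0, +1} plus a scalar scale.

    Each entry becomes sign(g) with probability |g| / s (where s = max |g|)
    and 0 otherwise, so s * ternary is an unbiased estimate of the gradient.
    """
    s = float(np.max(np.abs(grad)))
    if s == 0.0:
        return np.zeros_like(grad, dtype=np.int8), 0.0
    prob = np.abs(grad) / s                       # keep-probability per entry
    mask = rng.random(grad.shape) < prob          # Bernoulli sampling
    ternary = (np.sign(grad) * mask).astype(np.int8)
    return ternary, s                             # 2-bit symbols + one float scale

def dequantize(ternary: np.ndarray, s: float) -> np.ndarray:
    """Server-side reconstruction of the approximate gradient."""
    return ternary.astype(np.float32) * s

# Usage: a client ternarizes its local gradient before uploading it.
rng = np.random.default_rng(0)
grad = rng.normal(size=(1000,)).astype(np.float32)
q, scale = ternarize(grad, rng)
restored = dequantize(q, scale)
# Sending 2-bit symbols instead of 32-bit floats gives roughly a 16x
# reduction, in line with the ~15-16x ratios reported in the abstract.
print(np.mean(np.abs(grad - restored)))
```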