Virtual reality (VR) services have become increasingly popular but present challenges for wireless communications due to their large data requirements. In this work, we consider a dynamically changing VR scenario and propose a joint caching, computing, and communication (3C) strategy, subject to bounded latency, power, caching, and computing constraints, to minimize the long-term discounted delay and energy consumption of VR projection. Our approach involves a three-layer communication system consisting of a cloud server, UAV (Unmanned Aerial Vehicle) base stations with mMIMO (massive Multiple-Input Multiple-Output) acting as edge servers, and mobile user devices. To satisfy different users' requirements, we design eight service routes for 3C decisions. We then employ federated multi-agent deep reinforcement learning (RL) to help users obtain optimal service routes, influenced by their location, orientation, and content preference, with the edge servers acting as learning agents. For the RL part, we design multi-input, multi-output actor and critic networks deployed on the edge servers. For the Federated Learning (FL) part, we present the federated averaging process and mathematically prove its convergence. Simulation results demonstrate that our proposed algorithm effectively reduces training loss, converges smoothly, and lowers delay and energy consumption by approximately 18.3% and 25.6%, respectively.
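To make the federated averaging step concrete, the sketch below illustrates one possible aggregation of per-agent model parameters weighted by local sample counts; the function and variable names (e.g., `fed_average`, `local_weights`) are illustrative assumptions and not taken from the paper's implementation.

```python
import numpy as np


def fed_average(local_weights, sample_counts):
    """Weighted federated averaging of per-agent model parameters.

    local_weights: list of dicts mapping layer name -> np.ndarray,
                   one dict per edge-server agent.
    sample_counts: list of ints, number of local training samples per agent.
    Returns a single dict of globally averaged parameters.
    """
    total = float(sum(sample_counts))
    global_weights = {}
    for name in local_weights[0]:
        # Each agent contributes in proportion to its local data volume.
        global_weights[name] = sum(
            (n / total) * w[name] for w, n in zip(local_weights, sample_counts)
        )
    return global_weights


# Toy usage: three edge-server agents, each holding a small two-layer actor network.
agents = [
    {"fc1": np.random.randn(4, 8), "fc2": np.random.randn(8, 2)} for _ in range(3)
]
counts = [120, 80, 200]
global_model = fed_average(agents, counts)
print({name: w.shape for name, w in global_model.items()})
```

In this sketch, the averaged parameters would then be broadcast back to the edge servers before the next round of local actor-critic training; the actual update schedule and weighting used in the paper may differ.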