This paper considers permissive supervisor synthesis for probabilistic systems modeled as Markov decision processes (MDPs). Such systems are prevalent in power grids, transportation networks, communication networks, and robotics. In contrast to centralized and optimization-based planning, we propose a novel supervisor synthesis framework based on learning and compositional model checking that generates permissive local supervisors in a distributed manner. Leveraging recent advances in assume-guarantee reasoning for the verification of probabilistic systems, our framework avoids constructing the composed system, thereby alleviating state space explosion, and learns the supervisors iteratively from the counterexamples returned by verification. Our approach is guaranteed to terminate in finitely many steps and to be correct.