By offloading storage and computing resources to the network edge, mobile edge computing (MEC) has emerged as a promising architecture for reducing transmission delay and bandwidth waste in mobile multimedia services. This paper focuses on a multi-service scenario in MEC systems, where the MEC server simultaneously provides three multimedia services for edge users: live streaming, buffered streaming, and low-latency enhanced mobile broadband applications. To satisfy the various quality of service (QoS) requirements of these applications, the 5G QoS model is applied: packets from multimedia applications with the same or similar requirements are mapped into the same QoS flow, and each QoS flow is processed individually. Effectively scheduling the limited radio resources among QoS flows is therefore an intractable problem. To address it, a QoS evaluation model is designed and a QoS maximization problem is formulated. Furthermore, a deep reinforcement learning method, the deep Q-network (DQN), is adopted to dynamically allocate radio resources. Simulation results validate that the proposed algorithm outperforms round-robin and priority-based scheduling algorithms in the multi-service scenario.

INDEX TERMS Deep Q-network, deep reinforcement learning, mobile edge computing, multimedia, quality of service.
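As an illustrative aside rather than the paper's implementation, the minimal sketch below shows the kind of DQN-based scheduler the abstract describes: at each scheduling interval the agent observes per-flow state and chooses which QoS flow receives the next resource block. The class names, state layout, and per-flow features (queue length, head-of-line delay, throughput) are assumptions made here for illustration only.

```python
# Illustrative sketch only: a DQN agent that, each scheduling interval, picks which
# QoS flow is granted the next resource block. State features, dimensions, and all
# names (QNetwork, DQNScheduler) are hypothetical, not taken from the paper.
import random
from collections import deque

import numpy as np
import torch
import torch.nn as nn
import torch.optim as optim

N_FLOWS = 3               # live streaming, buffered streaming, low-latency eMBB (assumed)
STATE_DIM = N_FLOWS * 3   # assumed per-flow features: queue length, HOL delay, throughput


class QNetwork(nn.Module):
    """Maps the observed per-flow state to a Q-value for each candidate allocation."""
    def __init__(self, state_dim: int, n_actions: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, n_actions),
        )

    def forward(self, x):
        return self.net(x)


class DQNScheduler:
    def __init__(self, state_dim: int = STATE_DIM, n_actions: int = N_FLOWS, gamma: float = 0.99):
        self.q = QNetwork(state_dim, n_actions)
        self.target_q = QNetwork(state_dim, n_actions)
        self.target_q.load_state_dict(self.q.state_dict())
        self.optimizer = optim.Adam(self.q.parameters(), lr=1e-3)
        self.replay = deque(maxlen=10_000)
        self.gamma = gamma
        self.n_actions = n_actions

    def act(self, state: np.ndarray, epsilon: float) -> int:
        """Epsilon-greedy choice of which QoS flow gets the next resource block."""
        if random.random() < epsilon:
            return random.randrange(self.n_actions)
        with torch.no_grad():
            q_values = self.q(torch.as_tensor(state, dtype=torch.float32))
        return int(q_values.argmax().item())

    def store(self, s, a, r, s_next, done):
        """Reward r would come from a QoS evaluation of the flows (assumed here)."""
        self.replay.append((s, a, r, s_next, done))

    def sync_target(self):
        """Periodically copy the online network into the target network."""
        self.target_q.load_state_dict(self.q.state_dict())

    def learn(self, batch_size: int = 64):
        if len(self.replay) < batch_size:
            return
        batch = random.sample(self.replay, batch_size)
        s, a, r, s_next, done = map(np.array, zip(*batch))
        s = torch.as_tensor(s, dtype=torch.float32)
        a = torch.as_tensor(a, dtype=torch.int64).unsqueeze(1)
        r = torch.as_tensor(r, dtype=torch.float32)
        s_next = torch.as_tensor(s_next, dtype=torch.float32)
        done = torch.as_tensor(done, dtype=torch.float32)
        # Standard DQN target: r + gamma * max_a' Q_target(s', a') for non-terminal states.
        q_sa = self.q(s).gather(1, a).squeeze(1)
        with torch.no_grad():
            target = r + self.gamma * (1 - done) * self.target_q(s_next).max(1).values
        loss = nn.functional.mse_loss(q_sa, target)
        self.optimizer.zero_grad()
        loss.backward()
        self.optimizer.step()
```

In such a sketch, the reward signal would be derived from a QoS evaluation of the flows (e.g., penalizing delay violations and rewarding satisfied rate targets), which stands in for the paper's QoS evaluation model without reproducing it.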