Explicit signal coordination embeds prior traffic-engineering knowledge and is widely deployed worldwide. With the recent popularity of reinforcement learning, many researchers have turned to implicit signal coordination; however, such methods must learn coordination from scratch. To make full use of prior knowledge, this study proposes an explicit coordinated signal control (ECSC) method that uses a soft actor–critic to determine the cycle length, fundamentally resolving the difficulties traditional methods face in cycle-length determination. Among the available reinforcement learning methods, the soft actor–critic was selected. A single agent is assigned to the arterial. An action is defined as the selection of a cycle length from a set of candidates. The state is represented as a feature vector comprising the current cycle length and the features of each leg at every intersection. The reward is defined as the number of departures, which indirectly minimizes system vehicle delay. Simulation results indicate that ECSC significantly outperforms the baseline methods in system vehicle delay across nearly all demand scenarios and in throughput under high-demand scenarios. ECSC revitalizes explicit signal coordination and offers new perspectives on applying reinforcement learning to signal coordination.
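The state–action–reward formulation summarized above can be sketched as a toy environment. Everything below — the candidate cycle lengths, the per-leg feature choices, and the reward dynamics — is an illustrative assumption for exposition, not the paper's implementation:

```python
import random

# Candidate cycle lengths in seconds (hypothetical values; the paper's
# candidate set is not specified in the abstract).
CYCLE_CANDIDATES = [60, 80, 100, 120]


class ArterialEnv:
    """Toy arterial MDP: a single agent selects one cycle length
    for the whole arterial at each decision step."""

    def __init__(self, n_intersections=3, legs_per_intersection=4, seed=0):
        self.n = n_intersections
        self.legs = legs_per_intersection
        self.rng = random.Random(seed)
        self.cycle = CYCLE_CANDIDATES[0]

    def _leg_features(self):
        # Per-leg features (e.g. queue length, arrival rate) -- stand-ins
        # here, drawn at random purely for illustration.
        return [self.rng.uniform(0.0, 1.0)
                for _ in range(self.n * self.legs)]

    def state(self):
        # State = current cycle length plus features of every leg
        # at every intersection, as described in the abstract.
        return [self.cycle] + self._leg_features()

    def step(self, action_idx):
        # Action = index into the candidate cycle lengths.
        self.cycle = CYCLE_CANDIDATES[action_idx]
        # Reward = departures; modeled here as a made-up concave
        # function of cycle length, peaking at 90 s.
        departures = 100 - abs(self.cycle - 90)
        return self.state(), departures


env = ArterialEnv()
s0 = env.state()                 # 1 cycle-length entry + 3*4 leg features
s1, reward = env.step(2)         # select the 100 s candidate
```

A soft actor–critic agent would then be trained against this interface, with the policy output restricted to the discrete candidate set.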