To address the complexity and limited repeatability of existing broadcast filming systems, a new broadcast filming system was developed. Korean music broadcasts in particular follow a shooting sequence of stage and lighting installation, rehearsal, lighting effect production, and main shooting; this sequence is complex and involves many people. As contactless ("untact") production has become necessary owing to COVID-19, we developed an automatic shooting system that can produce the same results as this sequence with a minimum number of people. The developed system is built around a simulator. After a virtual stage is constructed in the simulator, the dancers' movements are acquired during rehearsal using ultra-wideband (UWB) and two-dimensional (2D) LiDAR sensors. The acquired movement data are placed on the virtual stage, and camera effects are produced using virtual cameras installed in the simulator. Each camera effect comprises pan, tilt, and zoom, and a camera director designs these effects while observing the movements of the virtual dancers on the virtual stage. In this study, four cameras were used: three were controlled in pan, tilt, and zoom, and the fourth served as a fixed camera for full shots. Video shooting is performed according to the pan, tilt, and zoom values of the three cameras and the switcher data. To assess the lighting effects, the dancer-only video recorded during rehearsal is overlaid in the simulator with the lighting video produced by the lighting director through the existing broadcast filming process. The lighting director reviews the overlaid video and then corrects or emphasizes the parts that require it. This method produced lighting effects better optimized for the music and choreography than existing lighting effect production methods. Finally, the performance of the developed simulator and system was verified by shooting a K-pop performance using the pan, tilt, and zoom control plan of the selected cameras, the switcher sequence, and the produced lighting effects.
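To illustrate how such a virtual-camera effect might be derived from the tracked movement data, the following minimal Python sketch computes pan and tilt angles from the camera-to-dancer vector and maps distance to a normalized zoom level. The coordinate convention, function and parameter names, and the linear zoom mapping are illustrative assumptions, not the system's actual interface.

```python
import math
from dataclasses import dataclass

# Hypothetical sketch: deriving a pan/tilt/zoom command for a virtual camera
# from one tracked dancer position on the virtual stage. The coordinate frame,
# names, and zoom mapping are illustrative assumptions, not the paper's API.

@dataclass
class PTZCommand:
    pan_deg: float   # rotation about the vertical axis
    tilt_deg: float  # elevation angle toward the target
    zoom: float      # normalized zoom level in [0, 1]

def ptz_toward(camera_xyz, target_xyz, min_dist=2.0, max_dist=20.0):
    """Aim a camera at a target point; apply tighter zoom for farther targets."""
    dx = target_xyz[0] - camera_xyz[0]
    dy = target_xyz[1] - camera_xyz[1]
    dz = target_xyz[2] - camera_xyz[2]
    pan = math.degrees(math.atan2(dy, dx))          # heading in the stage plane
    tilt = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    dist = math.sqrt(dx * dx + dy * dy + dz * dz)
    # Map distance linearly onto a normalized zoom range, clamped to [0, 1].
    zoom = max(0.0, min(1.0, (dist - min_dist) / (max_dist - min_dist)))
    return PTZCommand(pan, tilt, zoom)

# Example: one frame of rehearsal tracking data (UWB/LiDAR-derived position).
cmd = ptz_toward(camera_xyz=(0.0, -8.0, 3.0), target_xyz=(1.5, 2.0, 1.0))
print(f"pan={cmd.pan_deg:.1f} deg, tilt={cmd.tilt_deg:.1f} deg, zoom={cmd.zoom:.2f}")
```

In practice, per-frame commands like these would presumably be smoothed over time and sequenced with the switcher data before driving the three controllable cameras.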
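Similarly, the lighting assessment step could be sketched as a frame-by-frame blend of the dancer-only rehearsal video with the lighting director's video; the file names, blend weights, and OpenCV-based preview loop below are assumptions for illustration only, not the simulator's implementation.

```python
import cv2

# Hypothetical sketch of the overlay step: blending the dancer-only rehearsal
# video with the lighting video so the lighting can be assessed against the
# choreography. Inputs and weights are illustrative only.
dancers = cv2.VideoCapture("rehearsal_dancers.mp4")   # assumed input file
lighting = cv2.VideoCapture("lighting_effects.mp4")   # assumed input file

while True:
    ok_d, frame_d = dancers.read()
    ok_l, frame_l = lighting.read()
    if not (ok_d and ok_l):
        break  # stop at the end of the shorter video
    # Match frame sizes, then blend; weights above 1.0 in total let the
    # lighting appear cast onto the dancers rather than dimming them.
    frame_l = cv2.resize(frame_l, (frame_d.shape[1], frame_d.shape[0]))
    overlay = cv2.addWeighted(frame_d, 0.7, frame_l, 0.6, 0)
    cv2.imshow("lighting assessment", overlay)
    if cv2.waitKey(33) & 0xFF == ord("q"):  # ~30 fps preview; 'q' quits
        break

dancers.release()
lighting.release()
cv2.destroyAllWindows()
```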