Cyber-physical systems (CPS) integrate the physical world with digital technologies and are expected to change our everyday lives significantly. With the rapid development of CPS, the importance of artificial intelligence (AI) has been increasingly recognized. Concurrently, adversarial attacks, which cause incorrect predictions in AI models, have emerged as a new risk. Such attacks are no longer limited to digital data and now extend to the physical environment; they are therefore considered to pose serious practical threats to CPS. In this paper, we focus on the “adversarial camera stickers attack,” a physical adversarial attack that affixes adversarial noise directly to a camera lens. Because the noise perturbs every image captured by the camera, it must be universal. To realize more effective adversarial camera stickers, we propose a new method for generating adversarial noise that is more universal than that of previous research. We first show that the existing method for generating the noise of adversarial camera stickers does not always yield universal perturbations. We then address this drawback by improving the underlying optimization problem. Furthermore, we implement our proposed method and achieve an attack success rate 2.5 times higher than that of the existing method. Our experiments demonstrate that our method generates more universal adversarial noise, highlighting its potential to inform stronger security measures against adversarial attacks in CPS.
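The key constraint above is universality: a single perturbation must fool the model on many different inputs at once, because the sticker sits on the lens and affects every captured frame. The toy sketch below illustrates that idea only; it is not the paper's method. It optimizes one shared perturbation against a hypothetical linear classifier by gradient ascent on the average margin loss over a batch, with an L2 projection to keep the noise small (all names and parameters are illustrative assumptions).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for camera-captured images: 64 samples, 16 features,
# classified by a fixed random linear model (illustrative, not from the paper).
n, d, k = 64, 16, 4
W = rng.normal(size=(d, k))
X = rng.normal(size=(n, d))
y = (X @ W).argmax(axis=1)  # by construction, the clean model is 100% accurate


def accuracy(delta):
    """Accuracy when the SAME perturbation delta is applied to every image."""
    return (((X + delta) @ W).argmax(axis=1) == y).mean()


def universal_perturbation(steps=200, lr=0.05, eps=2.0):
    """Optimize one perturbation over the whole batch (the 'universal' idea):
    maximize the average margin of the runner-up class over the true class."""
    delta = np.zeros(d)
    for _ in range(steps):
        logits = (X + delta) @ W
        masked = logits.copy()
        masked[np.arange(n), y] = -np.inf
        runner = masked.argmax(axis=1)  # strongest wrong class per image
        # Gradient of mean (runner-up logit - true logit) w.r.t. delta
        grad = (W[:, runner] - W[:, y]).mean(axis=1)
        delta += lr * grad
        # Project onto an L2 ball so the shared noise stays small
        norm = np.linalg.norm(delta)
        if norm > eps:
            delta *= eps / norm
    return delta


delta = universal_perturbation()
clean_acc = accuracy(np.zeros(d))
adv_acc = accuracy(delta)
```

Because the gradient is averaged over all images before each step, the resulting `delta` is a compromise direction that degrades accuracy across the batch rather than on any single input, which is the property a camera-sticker perturbation needs.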