Wrong parking incidents pose a pervasive challenge in urban environments, disrupting the smooth flow of traffic, compromising safety, and contributing to a range of logistical problems. Unauthorized parking occurs when vehicles are parked in locations not designated for that purpose, creating difficulties for both authorities and the general public. This paper addresses this persistent challenge with a system that combines the YOLO v8 deep learning architecture, used to detect vehicles parked in unauthorized slots, with the FusionNet model, used to enhance the accuracy of license plate detection. The objective is to improve parking management efficiency by accurately identifying vehicles and their occupancy status in designated parking areas. The methodology begins with the collection and preprocessing of parking space images, followed by training YOLO v8 to identify vehicles and parking spaces in real time. Trained on a diverse dataset covering varied parking scenarios, including instances of unauthorized parking, the model achieves an accuracy of 98.50% in identifying vehicles outside designated areas, with a precision of 96.17%, specificity of 97.42%, and sensitivity of 96.19%, surpassing MobileNet, CNN, ANN, DNN, and EfficientNet models. For license plate recognition, FusionNet segments characters from detected plates, enabling accurate extraction of the alphanumeric information associated with each vehicle. The integrated system provides timely identification of parking violations and supports effective enforcement actions through the captured license plate data. Results demonstrate the model's effectiveness in real-world scenarios and its potential to improve urban safety and efficiency. Implemented in Python, the proposed solution aims to streamline parking management, improve compliance with parking regulations, and enhance overall urban mobility.
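As a rough illustration of the detection stage described above, the following sketch assumes the Ultralytics YOLOv8 Python API and a hypothetical set of designated slot coordinates; it flags detected vehicles whose bounding boxes do not sufficiently overlap any designated slot. The weights file, slot coordinates, and IoU threshold are illustrative assumptions, not the actual configuration used in this work.

```python
# Illustrative sketch only: flags vehicles parked outside designated slots
# using a pretrained YOLOv8 detector (assumes the Ultralytics package).
from ultralytics import YOLO

# Hypothetical designated parking slots as (x1, y1, x2, y2) pixel boxes.
DESIGNATED_SLOTS = [(50, 400, 250, 600), (300, 400, 500, 600)]
VEHICLE_CLASSES = {2, 3, 5, 7}  # COCO class ids: car, motorcycle, bus, truck


def iou(a, b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    union = ((a[2] - a[0]) * (a[3] - a[1])
             + (b[2] - b[0]) * (b[3] - b[1]) - inter)
    return inter / union if union else 0.0


def find_violations(image_path, iou_threshold=0.3):
    """Return bounding boxes of vehicles not matching any designated slot."""
    model = YOLO("yolov8n.pt")          # generic pretrained weights, for illustration
    result = model(image_path)[0]
    violations = []
    for box in result.boxes:
        if int(box.cls) not in VEHICLE_CLASSES:
            continue
        xyxy = box.xyxy[0].tolist()
        # A vehicle is flagged if it overlaps no designated slot enough.
        if all(iou(xyxy, slot) < iou_threshold for slot in DESIGNATED_SLOTS):
            violations.append(xyxy)
    return violations


if __name__ == "__main__":
    print(find_violations("parking_lot.jpg"))
```

In practice, the flagged bounding boxes would be passed to the license plate detection and character segmentation stage to recover the alphanumeric identity of each violating vehicle.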