Recent progress in real-time object detection and classification has been dramatically accelerated by Embedded Artificial Intelligence (EAI) and Deep Learning (DL). However, real-time object detection and classification with deep learning demand substantial memory and computational power, which makes deep learning methods difficult to deploy on edge devices. This paper proposes a new, highly efficient Field Programmable Gate Array (FPGA)-based real-time object detection and classification system using You Only Look Once (YOLO) v3 Tiny for edge computing. The proposed system is instantiated within an Advanced Driver Assistance System (ADAS) for evaluation, where traffic light detection and classification are crucial to ensure driver safety. The system uses a camera connected to the Kria KV260 FPGA development board to detect and classify traffic lights. The Bosch Small Traffic Light Dataset (BSTLD) has been used to train the YOLO model, and Xilinx Vitis AI has been used to quantize and compile it. The proposed system detects and classifies traffic light signals from a high-definition (HD) video stream at 15 frames per second (FPS) with 99% accuracy. In addition, it consumes only 3.5 W of power, demonstrating its suitability for edge devices. On-road experimental results show fast, precise, and reliable detection and classification of traffic lights. Overall, this paper demonstrates a low-cost and highly efficient FPGA-based system for real-time object detection and classification.

INDEX TERMS FPGAs, Object detection and classification, YOLO, Edge computing.
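As a minimal illustration of the deployment flow summarized above, the sketch below shows how a Vitis AI-compiled YOLOv3 Tiny model could be run on the KV260's DPU through the Vitis AI Runtime (VART) Python API. This is not the paper's implementation: the model file name, camera index, input scaling, and the omitted YOLO post-processing step are placeholders, and the exact input/output buffer types depend on the quantized model and Vitis AI version.

```python
# Hypothetical sketch: running a Vitis AI-compiled YOLOv3 Tiny .xmodel on the
# Kria KV260 DPU via VART. File names, camera index, and scaling are placeholders.
import cv2
import numpy as np
import vart
import xir


def get_dpu_subgraph(xmodel_path):
    # Deserialize the compiled model and select the subgraph mapped to the DPU.
    graph = xir.Graph.deserialize(xmodel_path)
    subgraphs = graph.get_root_subgraph().toposort_child_subgraph()
    return [s for s in subgraphs
            if s.has_attr("device") and s.get_attr("device").upper() == "DPU"][0]


def main():
    # "yolov3_tiny_bstld.xmodel" is a placeholder for the compiled model file.
    runner = vart.Runner.create_runner(
        get_dpu_subgraph("yolov3_tiny_bstld.xmodel"), "run")
    in_tensor = runner.get_input_tensors()[0]
    out_tensors = runner.get_output_tensors()
    _, height, width, _ = tuple(in_tensor.dims)  # DPU input is NHWC

    cap = cv2.VideoCapture(0)  # camera attached to the KV260 (index assumed)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # Resize to the network input size; fixed-point scaling of pixel values
        # (per the model's quantization parameters) is omitted for brevity.
        img = cv2.resize(frame, (width, height)).astype(np.int8)
        input_data = [np.expand_dims(img, axis=0)]
        output_data = [np.empty(tuple(t.dims), dtype=np.int8, order="C")
                       for t in out_tensors]

        # Submit the frame to the DPU and block until inference completes.
        job_id = runner.execute_async(input_data, output_data)
        runner.wait(job_id)

        # YOLO box decoding and non-maximum suppression would follow here,
        # e.g. boxes = decode_yolo_outputs(output_data)  # hypothetical helper

    cap.release()


if __name__ == "__main__":
    main()
```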