The use of drones is becoming increasingly common in modern daily life. One of the most challenging tasks associated with these vehicles is the development of perception and autonomous navigation systems. Competitions such as the Artificial Intelligence Robotic Racing (AIRR) and Autonomous Drone Racing were launched to drive advances in such strategies, requiring the development of integrated systems for autonomous navigation in a racing environment. In this context, this paper presents an improved integrated solution for autonomous drone racing, which focuses on simple, robust, and computationally efficient techniques suitable for embedded hardware. The strategy is divided into four modules: (i) a trajectory planner that computes a path passing through the sequence of desired gates; (ii) a perception system that obtains the global pose of the vehicle using an onboard camera; (iii) a localization system that fuses information from several sensors to estimate the drone's states; and (iv) an artificial vector field-based controller in charge of following the planned path using the estimated states. To evaluate the performance of the entire integrated system, we extended the FlightGoggles simulator with gates similar to those used in the AIRR competition. A computational cost analysis demonstrates the high performance of the proposed perception method running on the limited hardware commonly embedded in aerial robots. In addition, we evaluate the localization system by comparing it with ground truth and under disturbances in the location of the gates. Results on a representative race circuit based on the AIRR competition show the competitive performance of the proposed strategy, presenting it as a feasible solution for drone racing systems.