In this paper, we propose a real-time visual mapping scheme that can be implemented on a low-cost embedded system for consumer-level radio-controlled (RC) drones. A three-dimensional occupancy grid map is built from a trajectory estimated by fusing data from multiple on-board sensors: two downward-facing cameras, two forward-facing cameras, a GPS receiver, a magnetic compass, and an inertial measurement unit (IMU) with 3-axis accelerometers and gyroscopes. By exploiting a low-cost FPGA and ARM NEON intrinsics, we run our visual odometry and mapping algorithms on board at 10 Hz. We also present a hierarchical multi-sensor fusion algorithm that provides a robust trajectory for mapping. Finally, we verify the feasibility of our approach, together with several potential applications, through experiments in complex indoor and outdoor environments.