Unlike traditional visualization methods, augmented reality (AR) inserts virtual objects and information directly into digital representations of the real world, making these objects and data easier to understand and interact with. The integration of AR and GIS is a promising way to display spatial information in context. However, most existing AR-GIS applications only provide local spatial information at a fixed location, which leads to limited legibility, information clutter, and incomplete spatial relationships. In addition, indoor space structures are complex and GPS is unavailable indoors, so indoor AR systems are further impeded by their limited capacity to detect and display location and semantic information. To address these problems, we track the camera position with a localization technique that fuses Bluetooth Low Energy (BLE) and pedestrian dead reckoning (PDR). The multi-sensor fusion algorithm employs a particle filter. Based on the position and direction of the phone, spatial information is automatically registered onto the live camera view. The proposed algorithm extracts a bounding box from the indoor map and matches it to the real-world scene. Finally, the indoor map and semantic information are rendered into the real world based on the spatial relationship between the indoor map and the live camera view, computed in real time. Experimental results demonstrate that the average positioning error of our approach is 1.47 m, and 80% of the errors are within approximately 1.8 m. This positioning accuracy effectively supports the AR and indoor map fusion technique, which links rich indoor spatial information to real-world scenes. The method is not only suitable for traditional indoor navigation tasks, but is also promising for crowdsourced data collection and indoor map reconstruction.
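To make the BLE/PDR fusion concrete, the following is a minimal particle-filter sketch of the kind of estimator described above. It is not the paper's actual implementation: the state layout (2D position only), noise parameters, function names, and the use of a single BLE-derived position fix per update are all illustrative assumptions.

```python
import numpy as np

# Minimal sketch: fuse PDR step updates (prediction) with BLE position
# fixes (correction) in a particle filter. All parameters are assumed.
N_PARTICLES = 500
STEP_NOISE = 0.15     # m, assumed std of PDR step-length error
HEADING_NOISE = 0.05  # rad, assumed std of PDR heading error
BLE_NOISE = 2.0       # m, assumed std of a BLE-derived position fix

rng = np.random.default_rng(0)

def init_particles(x0, y0, spread=1.0):
    """Particles hold (x, y); start scattered around an assumed initial position."""
    pts = rng.normal([x0, y0], spread, size=(N_PARTICLES, 2))
    w = np.full(N_PARTICLES, 1.0 / N_PARTICLES)
    return pts, w

def predict(pts, step_len, heading):
    """Propagate each particle by one PDR step with perturbed length and heading."""
    length = step_len + rng.normal(0.0, STEP_NOISE, N_PARTICLES)
    theta = heading + rng.normal(0.0, HEADING_NOISE, N_PARTICLES)
    pts[:, 0] += length * np.cos(theta)
    pts[:, 1] += length * np.sin(theta)
    return pts

def update(pts, w, ble_xy):
    """Reweight particles by the likelihood of a BLE position fix, then resample."""
    d2 = np.sum((pts - ble_xy) ** 2, axis=1)
    w = w * np.exp(-0.5 * d2 / BLE_NOISE ** 2)
    w /= w.sum()
    idx = rng.choice(N_PARTICLES, N_PARTICLES, p=w)  # multinomial resampling
    return pts[idx].copy(), np.full(N_PARTICLES, 1.0 / N_PARTICLES)

def estimate(pts, w):
    """Weighted-mean position, used to register the indoor map onto the camera view."""
    return np.average(pts, axis=0, weights=w)

# Example: one PDR step followed by one BLE correction.
pts, w = init_particles(0.0, 0.0)
pts = predict(pts, step_len=0.7, heading=np.pi / 4)
pts, w = update(pts, w, ble_xy=np.array([0.6, 0.5]))
print(estimate(pts, w))
```

In practice, the phone's heading would also drive the registration of the indoor map onto the live camera view; the sketch above only covers the position-tracking loop.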