Purpose: This work aims to develop a system that enables improved visualization of Architectural Engineering and Construction (AEC) 3D information for application in the design, construction, and management of the built environment.

Method: A novel Augmented Reality (AR) system is presented that uses a single standard digital camera and, contrary to other investigated approaches, relies neither on markers inserted in the scene nor on positioning or inertial technologies. The system is solely image-based and consists of two stages. In a first, offline stage, a 3D map of the scene is automatically constructed from a set of digital images, and the augmenting information (e.g. the 3D model of the building asset) is subsequently registered with this map. The 3D map reconstruction employs structure-from-motion techniques with SURF features (the resulting map consisting of a set of 3D-referenced SURF features), followed by a Poisson mesh reconstruction procedure. The second stage consists of online operations. The positions of target digital images (e.g. from a video stream or a head-mounted camera) are automatically calculated using a robust SURF feature matching procedure that is optimized for three different situations (initialization, tracking, and resetting), implementing octrees for efficient 3D pruning and kd-trees for efficient feature matching. Once each input image is positioned within the map, the view is augmented. A notable feature of the dense mesh scene reconstruction conducted in the present work is that it enables occlusions of the augmenting data by the static scene to be taken into account.

Results & Discussion: Several experiments validate the proposed system and demonstrate its overall performance: near real-time processing speed and very accurate, stable positioning. The limitations of the current system are also discussed, including its limited processing speed and the need for adequately textured scenes.
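To illustrate the offline meshing step, the following minimal sketch shows how a sparse point map produced by structure-from-motion could be turned into a dense surface with Poisson reconstruction, so that static scene geometry can later occlude the augmenting 3D model. It is not the authors' implementation: it uses the Open3D library, and the function name `build_occlusion_mesh` and the input array `map_points_3d` are hypothetical.

```python
import numpy as np
import open3d as o3d

def build_occlusion_mesh(map_points_3d):
    # map_points_3d: (N, 3) array of scene points assumed to come from the
    # offline structure-from-motion reconstruction.
    pcd = o3d.geometry.PointCloud()
    pcd.points = o3d.utility.Vector3dVector(np.asarray(map_points_3d, dtype=np.float64))

    # Poisson surface reconstruction requires oriented normals.
    pcd.estimate_normals(
        search_param=o3d.geometry.KDTreeSearchParamHybrid(radius=0.1, max_nn=30))

    mesh, densities = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(
        pcd, depth=9)

    # Trim poorly supported vertices (low Poisson density), a common
    # post-processing step to remove spurious surface.
    densities = np.asarray(densities)
    mesh.remove_vertices_by_mask(densities < np.quantile(densities, 0.05))
    return mesh
```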
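The online localization stage could similarly be sketched, purely for illustration, with OpenCV: SURF descriptors of a query frame are matched against the descriptors of the pre-built 3D map using a FLANN kd-tree, and the camera pose is recovered from the resulting 2D-3D correspondences with PnP and RANSAC. The function `localize_frame` and its inputs are assumptions rather than the authors' code, and SURF requires an opencv-contrib build with the non-free modules enabled.

```python
import cv2
import numpy as np

def localize_frame(frame_gray, map_points_3d, map_descriptors, camera_matrix):
    # Detect and describe SURF features in the query frame.
    surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)
    keypoints, descriptors = surf.detectAndCompute(frame_gray, None)
    if descriptors is None:
        return None

    # Approximate nearest-neighbour matching with a FLANN kd-tree
    # (algorithm=1 selects the kd-tree index).
    flann = cv2.FlannBasedMatcher(dict(algorithm=1, trees=4), dict(checks=50))
    matches = flann.knnMatch(descriptors, map_descriptors, k=2)

    # Lowe's ratio test keeps only distinctive matches.
    good = [pair[0] for pair in matches
            if len(pair) == 2 and pair[0].distance < 0.7 * pair[1].distance]
    if len(good) < 6:
        return None  # too few correspondences; the caller could trigger a reset

    object_pts = np.float32([map_points_3d[m.trainIdx] for m in good])
    image_pts = np.float32([keypoints[m.queryIdx].pt for m in good])

    # Robust camera pose from 2D-3D correspondences.
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        object_pts, image_pts, camera_matrix, None,
        iterationsCount=200, reprojectionError=3.0)
    return (rvec, tvec) if ok else None
```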