Many industrial sectors face increasing production demands and pressure to reduce costs without compromising quality. Whereas mass production relies on well-established protocols, small production facilities with small lot sizes struggle to update their highly changeable production at reasonable cost. The use of robotics and automation has grown significantly in recent years, but versatile robotic manipulators are still not commonly used in small factories. Beyond the investment required to enable efficient and profitable use of robot technology, the effort needed to program robots is only economically viable for large lot sizes.

Generating robot programs for specific manufacturing tasks still relies on programming trajectory waypoints by hand. The use of virtual simulation software and the availability of digital models of the specimens can facilitate robot programming. Nevertheless, in many cases, the virtual models are not available or there are unavoidable differences between the virtual and real setups, leading to inaccurate robot programs and time-consuming manual corrections. These problems could be avoided by measuring the real geometry and position of the specimen, which creates the paradox of having to plan robot paths for surface mapping before the originally intended robot task can be approached. Previous works have demonstrated the use of robotically manipulated optical sensors to map the geometry of samples. However, simple user-defined robot paths, which are not optimized for the part geometry, typically cause some areas of the sample to be mapped with insufficient accuracy or to be missed entirely by the optical sensor.

This work presents an autonomous framework that enables adaptive surface mapping without any prior knowledge of the part geometry being transferred to the system. The article gives an overview of the related work in the field, a detailed description of the proposed framework, and a proof of its functionality through both simulated and experimental evidence.