Microassembly is an innovative alternative to the complex microfabrication process of MOEMS. It usually relies on microrobots controlled by a human operator. The reliability of this approach has already been confirmed for micro-optical technologies. However, the characterization of assemblies has shown that the operator is the main source of inaccuracies in teleoperated microassembly, so there is great interest in automating the microassembly process. One of the constraints of automation at the microscale is the lack of high-precision sensors capable of providing full information about the object position. Thus, the use of vision-based feedback represents a very promising approach to automating the microassembly process. The purpose of this paper is to characterize techniques for object position estimation based on visual data, i.e. the visual tracking techniques from the ViSP library. These algorithms allow the 3D object pose to be recovered from a single view of the scene and the CAD model of the object. The performance of the three main types of model-based trackers is analyzed and quantified: the edge-based, the texture-based and the hybrid tracker. The problems of visual tracking at the microscale are discussed. The control of the micromanipulation station used in the framework of our project is performed using a new Simulink block set. Experimental results are shown and demonstrate the possibility of obtaining a repeatability below 1 µm.
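As a rough illustration of how such a ViSP model-based tracker is typically driven (this sketch is not taken from the paper: the image and model file names, the camera intrinsics and the initial pose are placeholders), the edge-based tracker class vpMbEdgeTracker can be set up as follows; the texture-based (vpMbKltTracker) and hybrid (vpMbEdgeKltTracker) variants expose the same interface.

#include <iostream>

#include <visp/vpCameraParameters.h>
#include <visp/vpHomogeneousMatrix.h>
#include <visp/vpImageIo.h>
#include <visp/vpMbEdgeTracker.h>

int main()
{
  // Grey-level image of the scene under the microscope (placeholder file name).
  vpImage<unsigned char> I;
  vpImageIo::read(I, "frame.pgm");

  // Edge-based model-based tracker.
  vpMbEdgeTracker tracker;

  // Assumed camera intrinsics (px, py, u0, v0); in practice they come from calibration.
  vpCameraParameters cam(800., 800., I.getWidth() / 2., I.getHeight() / 2.);
  tracker.setCameraParameters(cam);

  // CAD model of the object to track (placeholder file name).
  tracker.loadModel("object.cao");

  // Coarse initial pose of the object in the camera frame (placeholder values);
  // initClick() could be used instead for interactive initialisation.
  vpHomogeneousMatrix cMo_init(0., 0., 0.5, 0., 0., 0.);
  tracker.initFromPose(I, cMo_init);

  // Track the object in the current image and retrieve its 3D pose.
  tracker.track(I);
  vpHomogeneousMatrix cMo;
  tracker.getPose(cMo);
  std::cout << "Estimated pose cMo:" << std::endl << cMo << std::endl;

  return 0;
}

In a live setup the same track()/getPose() calls would simply be repeated on every newly grabbed frame, the recovered pose feeding the visual control loop of the micromanipulation station.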