Computer systems are used in Precision Agriculture (PA) to provide the relational sampling, accuracy, and data processing required by agricultural practices and schemes that are not common in conventional agriculture, and that demand higher production costs and research directed at remote sensing for the mapping and inspection of crop rows. Tasks are carried out using a priori and On-the-Go sensors, both proprioceptive and exteroceptive, together with embedded instrumentation, geographic information, and implements already present in production and farming; these resources support the stages of planting, maturation, maintenance, and harvesting of a given crop. Such tasks are also aided by terrestrial agricultural mobile robots, such as autonomous vehicle platforms that move between the crop rows to acquire field data. Given this context, and considering the investment in and development of these technologies, the goal of this work is to support the inspection, quantification, and qualification of agricultural crops (citrus) in a planting area by analyzing and identifying data for sensor fusion, combining digital image processing, thermal imaging, optical flow sensors, and reflectance measurements. The approach is based on extracting objects from real natural scenes, identifying items such as fruits, grasses, stems, branches, and leaves, and thus providing a qualitative and quantitative data set for the analyzed crop. In the computer vision and sensor fusion system, Gigabit Ethernet cameras, sonars, a thermal camera, infrared optical flow sensors, and monochrome CMOS sensors are mounted on two sides of the vehicle structure, referenced to the geometric center of the agricultural mobile robot. After data acquisition, image and signal processing techniques are applied for the segmentation of non-homogeneous regions and for pattern recognition through statistical classifiers. These techniques are implemented in MATLAB and OpenCV and embedded in a computer platform. Cognitive classifiers that perform pattern and class matching are built on a combination of weighted data fusion techniques, so that, during the locomotion of the agricultural mobile robot, the image processing steps and the combination of classification parameters can be revisited retrospectively, comparing existing data with likely changes in the crop and its biomass.

Figure 3.6: In the upper sequence of images, from left to right: the visible image, the NIR image, and the UVFL image; in the lower sequence, from left to right: the image of the fruit with a surface irregularity, segmentation using CART, and segmentation using LDA (adapted from BLASCO et al., 2007b).

Figure 3.7: Segmentation and classification steps, considering the normalized difference index, the variance filter, the Cb and Cr components, and the labeling of the detected object, separated from the background (PAYNE et al., 2013).

Figure 3.8: Registered multimodal images of apple tree leaves, showing seven levels of pathogens that cause surface irregularities, 21 d...
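As a concrete illustration of the Cb/Cr-based segmentation and statistical classification referred to above (and in the captions of Figures 3.6 and 3.7), the sketch below shows one possible pipeline in Python with OpenCV and scikit-learn: an RGB frame is converted to the YCrCb color space, the Cr and Cb chrominance components are thresholded to separate candidate fruit regions from the background, and simple per-region statistics are fed to a linear discriminant (LDA) classifier. The thresholds, feature choices, file names, and labels are assumptions for demonstration only and do not reproduce the implementation described in this work.

# Hypothetical sketch of Cb/Cr-based segmentation followed by a statistical
# classifier (LDA). Thresholds, features, file names and labels are illustrative only.
import cv2
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis


def region_feature_vector(ycrcb, region_mask, region_stats):
    """Simple per-region statistics used as classifier features."""
    y, cr, cb = cv2.split(ycrcb)
    return [
        float(np.mean(y[region_mask])),
        float(np.mean(cr[region_mask])),
        float(np.mean(cb[region_mask])),
        float(region_stats[cv2.CC_STAT_AREA]),
    ]


def segment_fruit_regions(bgr_image, cr_min=140, cb_max=120):
    """Separate candidate fruit pixels from background using Cr/Cb thresholds."""
    ycrcb = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2YCrCb)
    _, cr, cb = cv2.split(ycrcb)
    # Citrus fruit tends to show high Cr (red chrominance) and low Cb (assumed thresholds).
    mask = ((cr > cr_min) & (cb < cb_max)).astype(np.uint8) * 255
    # Morphological cleanup of the non-homogeneous mask.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
    num, labels, stats, _ = cv2.connectedComponentsWithStats(mask)
    regions = []
    for i in range(1, num):  # label 0 is the background
        if stats[i, cv2.CC_STAT_AREA] > 200:  # discard small fragments
            region_mask = labels == i
            regions.append(region_feature_vector(ycrcb, region_mask, stats[i]))
    return mask, np.array(regions)


if __name__ == "__main__":
    frame = cv2.imread("crop_row_frame.png")  # hypothetical input frame
    mask, features = segment_fruit_regions(frame)

    # Train/apply a statistical classifier on previously labeled regions
    # (e.g. 0 = leaf/branch, 1 = fruit); the training data files are assumed to exist.
    train_X = np.load("region_features.npy")  # hypothetical training features
    train_y = np.load("region_labels.npy")    # hypothetical training labels
    clf = LinearDiscriminantAnalysis().fit(train_X, train_y)
    if len(features) > 0:
        print(clf.predict(features))

In practice, the hand-tuned Cr/Cb thresholds above would be replaced by whatever segmentation and weighted data fusion scheme the work actually adopts; the sketch only makes the sequence "chrominance-based segmentation, region features, statistical classification" concrete.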