Blender is free and open-source 3D animation software that can be used as a simulation tool in metrology to build numerical models for the design and optimisation of camera-based measurement systems. In this work, the suitability of Blender for modelling reality from an observer's point of view, according to the laws of optics, is explored. Two experiments were conducted in both real-world and Blender modelling environments: one using individual cameras for a simple measurement task, the other addressing multi-camera position optimisation. The objective was to verify whether virtual cameras created in Blender can perceive and measure objects in the same manner as real cameras in an equivalent environment. The results demonstrate that, in its native modelling format, Blender satisfies the optical metrology characteristics of measurement, but that the correlation between Blender output and real-world results is highly sensitive to initial modelling parameters such as light intensity, camera definition and object surface texture.
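
To illustrate the kind of initial modelling parameters referred to above, the following is a minimal sketch using Blender's Python API (bpy) to define a virtual camera, a light source and a render resolution. The specific focal length, sensor size, light energy and placement values are illustrative assumptions only, not the settings used in the experiments reported here.

```python
import bpy

# Camera data-block: focal length and sensor size (both in mm) mirror the
# intrinsic parameters of the physical camera being modelled.
cam_data = bpy.data.cameras.new("metrology_cam")
cam_data.lens = 16.0           # focal length in mm (assumed value)
cam_data.sensor_width = 11.3   # sensor width in mm (assumed value)

cam_obj = bpy.data.objects.new("metrology_cam", cam_data)
cam_obj.location = (0.0, -1.0, 0.5)          # camera position in metres (assumed)
bpy.context.scene.collection.objects.link(cam_obj)

# Light intensity is one of the parameters to which the real/virtual
# correlation is reported to be sensitive.
light_data = bpy.data.lights.new("key_light", type='POINT')
light_data.energy = 100.0                    # light power in watts (assumed value)
light_obj = bpy.data.objects.new("key_light", light_data)
light_obj.location = (1.0, -1.0, 2.0)
bpy.context.scene.collection.objects.link(light_obj)

# Render resolution defines the pixel grid on which image measurements are made.
scene = bpy.context.scene
scene.render.resolution_x = 1920
scene.render.resolution_y = 1080
```

Matching these scene parameters to the real measurement set-up is the step on which, as noted above, the agreement between simulated and real measurements most strongly depends.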