The article presents the influence of the distance between the measuring head and the tested object on the results of eddy current defectoscopy. The tests were conducted on two inner rings of a tapered roller bearing in which artificial test defects were machined. A one-millimetre hole represented a surface defect; rectangular internal blind holes represented subsurface defects. The research was performed with a SSEC III PC defectoscope connected to a mobile PC; the measuring device is part of a system for the automatic quality control of bearing rings. The test was conducted for slits (head-to-object gaps) of the following widths: 0.1, 0.2, 0.3, 0.4, and 0.5 mm. The obtained characteristics are a superposition of the pulse caused by the detected defect and the sine wave caused by the axial runout of the tested bearing ring. A correlation was observed between the slit width, the value of the defect signal, and the sine wave caused by the axial runout of the tested ring. The concept of a "coefficient of defect detection" was introduced: the quotient of the defect-signal value and the peak-to-peak value of the sine wave caused by the axial runout of the tested bearing ring. Increasing the slit width from 0.1 to 0.5 mm causes a 35–50% decrease in the coefficient of defect detection, depending on the test defect. A handle ensuring constant pressure of the measuring head on the tested surface was proposed.
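The coefficient of defect detection defined above is a simple quotient, and can be sketched in a few lines of Python. This is a minimal illustration, not the authors' processing code: the function name, the synthetic runout sine, and all numeric values are hypothetical, assumed only for demonstration.

```python
import numpy as np

def coefficient_of_defect_detection(defect_pulse_value: float,
                                    runout_wave: np.ndarray) -> float:
    """Quotient of the defect-signal value and the peak-to-peak value of
    the sine wave caused by axial runout (per the article's definition)."""
    peak_to_peak = runout_wave.max() - runout_wave.min()
    return defect_pulse_value / peak_to_peak

# Hypothetical runout sine: amplitude 1.0 (arbitrary units),
# so its peak-to-peak value is 2.0.
t = np.linspace(0.0, 1.0, 1000)
runout = 1.0 * np.sin(2 * np.pi * t)

# Hypothetical defect-pulse value of 4.0 gives a coefficient near 2.0.
coeff = coefficient_of_defect_detection(4.0, runout)
```

A larger slit attenuates the defect pulse more strongly than the runout sine, so this ratio falls as the gap grows, matching the 35–50% decrease reported above.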