The gaze point and gaze line, measured with an eye tracking device, can be used in various interaction interfaces, such as mobile robot programming in an immersive virtual environment. The robot's path should be generated without tedious eye gestures; instead, it should be inferred from the context. The resulting trajectory, whose quality depends on the precision of the estimated gaze point, can enable physically disabled people to steer a wheelchair with their eyes. The goal of this study is to assess the accuracy of gaze point computation based on eye tracking in an immersive virtual environment. The point in space where the gaze directions of the left and right eye converge provides a measure of the distance to the gazed object. This distance is needed whenever the user wants to indicate a point in space, or when two or more selectable objects are placed one behind the other. In this work, several experiments were conducted to assess the accuracy of convergence point detection in space.
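
Because measured gaze rays rarely intersect exactly, the convergence point is typically taken as the point of closest approach of the two rays. The following Python/NumPy sketch illustrates this idea under the assumption that each eye is described by an origin and a gaze direction in a common coordinate frame; the function name, the example eye separation, and the near-parallel threshold are illustrative and not taken from this work.

```python
import numpy as np

def convergence_point(o_left, d_left, o_right, d_right, eps=1e-6):
    """Closest point of approach of the left and right gaze rays.

    o_*: 3D eye (ray) origins; d_*: gaze directions (need not be unit length).
    Returns the midpoint of the shortest segment joining the two rays,
    or None if the rays are nearly parallel (i.e. the eyes do not converge).
    """
    w0 = o_left - o_right
    a = np.dot(d_left, d_left)
    b = np.dot(d_left, d_right)
    c = np.dot(d_right, d_right)
    d = np.dot(d_left, w0)
    e = np.dot(d_right, w0)

    denom = a * c - b * b          # ~0 when the gaze directions are parallel
    if denom < eps:
        return None

    t = (b * e - c * d) / denom    # parameter along the left gaze ray
    s = (a * e - b * d) / denom    # parameter along the right gaze ray

    p_left = o_left + t * d_left
    p_right = o_right + s * d_right
    return 0.5 * (p_left + p_right)

# Hypothetical example: eyes 6.4 cm apart, both fixating a point 2 m ahead.
o_l = np.array([-0.032, 0.0, 0.0])
o_r = np.array([0.032, 0.0, 0.0])
target = np.array([0.0, 0.0, 2.0])
p = convergence_point(o_l, target - o_l, o_r, target - o_r)
gaze_distance = np.linalg.norm(p - 0.5 * (o_l + o_r))  # distance to gazed object
```

The estimated distance follows directly from the convergence point as the norm of the vector from the midpoint between the eyes to that point; small angular errors in the measured gaze directions translate into increasingly large depth errors as the gazed object moves farther away, which is precisely what the experiments reported here quantify.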