Here we propose a fully instrumental, objective method (hand tracking) for the real-time quantification of the perception of stimuli presented in various modalities. The method is based on measuring the dynamic error with which a subject exercises motor control over a real or virtual (computer-generated) stimulus by means of a contactless manipulator. The method takes its name from its predecessor, eye tracking; in both cases, the psychophysiological characteristics of the subject are measured by monitoring the parameters of his or her motor activity. This principle has been implemented as a hardware-software complex, the Hand tracker, which has been tested on the perception of two virtual stimuli: "segment orientation angle" and "figure color against the background." With the first stimulus, individual diagrams of perception accuracy were obtained in healthy subjects who were asked to guess the inclination angle of the segment. The results allowed us to identify the parameters of a neural model simulating the corresponding primary recognizer. With the second stimulus, we found that in patients with focal brain lesions the thresholds of color differentiation were abnormally high and, unlike those of healthy subjects, did not depend on the semantic content of the stimulus. Since the Hand tracker provides simultaneous control over several (up to eight) independent parameters of auditory and visual stimuli, the proposed method can be used to study intermodal connections and/or collective behavior, where different team members control different parameters of the stimulus while trying to solve a common problem. The article also addresses the prospect of implementing the hand tracking method as dedicated applications for smartphones with embedded motion sensors.
Such a portable device could be used in epidemiological studies of perception (primary cognitive functions) to elucidate age- and gender-related, regional, social, and other stratifications of populations.
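The abstract does not specify how the dynamic error of motor control is computed; a common choice for such tracking tasks is the root-mean-square deviation between the stimulus parameter trajectory and the subject's hand-controlled response. The following minimal sketch illustrates that idea under the assumption of evenly sampled, time-aligned trajectories; the function name and the angle values are purely illustrative and not taken from the article.

```python
import math

def dynamic_tracking_error(target, response):
    """Root-mean-square deviation between a stimulus parameter trajectory
    (e.g., segment orientation angle over time) and the subject's
    hand-controlled response, sampled at the same instants.
    Smaller values indicate more accurate perception and control."""
    if len(target) != len(response):
        raise ValueError("trajectories must be sampled at the same instants")
    n = len(target)
    return math.sqrt(sum((t - r) ** 2 for t, r in zip(target, response)) / n)

# Hypothetical trial: target segment orientation angles (degrees) over five
# samples, and the angles a subject reproduced with the manipulator.
target_angles = [0.0, 15.0, 30.0, 45.0, 60.0]
response_angles = [2.0, 14.0, 33.0, 44.0, 58.0]
err = dynamic_tracking_error(target_angles, response_angles)
```

In a multi-parameter setup such as the one described (up to eight independent stimulus parameters), an error of this kind could be computed per parameter and per subject, yielding the individual perception-accuracy diagrams mentioned above.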