Jitter in an electronic signal is any deviation in, or displacement of, the signal in time. This paper investigates the decomposition of two types of jitter, periodic and random, in noisy signals. Generally, an oscilloscope generates an eye diagram by overlaying sweeps of different segments of a long data stream driven by the reference clock signal. Because no clock reference signal is available here, we instead use the fast Fourier transform with time-lag correlation of the signal, and we apply this technique to simulated noisy signals. We separately injected random jitter (of known amount), periodic jitter (of known frequency and amount), and both together into sinusoidal signals at various modulation frequencies. The approach is validated by several experiments over a range of jitter parameter values. When we separately injected random jitter (5 ps) and periodic jitter (5 ps at 4.37 MHz) into the signal, we recovered 4.52 ± 0.25 ps and 4.93 ± 0.04 ps at 4.40 ± 0.04 MHz, respectively.
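As a rough illustration of this kind of reference-free decomposition, the sketch below simulates jittered edge times, forms a time-interval error (TIE) sequence by subtracting a least-squares straight-line "clock", reads the periodic jitter from an FFT peak, and takes the random jitter as the RMS of the residual. The signal parameters, Hann window, zero-padding, and least-squares steps are illustrative assumptions, not the paper's exact FFT/time-lag-correlation implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

f0 = 100e6                   # nominal edge rate of the simulated signal (Hz)
n_edges = 4000               # number of simulated edges
f_pj, a_pj = 4.37e6, 5e-12   # injected periodic jitter: frequency (Hz), amplitude (s)
s_rj = 5e-12                 # injected random jitter: RMS (s)

# Ideal edge times, then perturb them with periodic + random jitter.
t_ideal = np.arange(n_edges) / f0
t_edges = (t_ideal
           + a_pj * np.sin(2 * np.pi * f_pj * t_ideal)
           + s_rj * rng.standard_normal(n_edges))

# Time-interval error: subtract a least-squares straight-line fit, which plays
# the role of the missing reference clock (unknown period and phase).
k = np.arange(n_edges)
tie = t_edges - np.polyval(np.polyfit(k, t_edges, 1), k)

# FFT of the TIE sequence (Hann window, zero-padded for finer frequency
# interpolation): periodic jitter appears as a narrow peak, random jitter as
# a broadband floor.
win = np.hanning(n_edges)
spec = np.fft.rfft(tie * win, n=8 * n_edges)
freqs = np.fft.rfftfreq(8 * n_edges, d=1 / f0)
amp = 2 * np.abs(spec) / win.sum()

peak = np.argmax(amp[1:]) + 1                 # skip the DC bin
pj_freq, pj_amp = freqs[peak], amp[peak]

# Random jitter: RMS of the TIE after removing a least-squares sinusoid at the
# detected periodic-jitter frequency.
basis = np.column_stack([np.sin(2 * np.pi * pj_freq * t_ideal),
                         np.cos(2 * np.pi * pj_freq * t_ideal)])
coef, *_ = np.linalg.lstsq(basis, tie, rcond=None)
rj_rms = np.std(tie - basis @ coef)

print(f"periodic jitter ~{pj_amp * 1e12:.2f} ps at {pj_freq / 1e6:.2f} MHz; "
      f"random jitter ~{rj_rms * 1e12:.2f} ps RMS")
```

With the injected values above (5 ps periodic at 4.37 MHz and 5 ps random), such a sketch should recover estimates close to the injected amounts, in the spirit of the figures quoted in the abstract.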
The precision and accuracy of time-of-flight full-field range cameras are important for many applications; however, a number of noise sources degrade both. Many of these noise sources, such as nonlinearity, multipath interference, and harmonic cancellation, are well investigated. Jitter on the camera light and shutter signals, by contrast, has barely been investigated. Here we measure periodic and random jitter on the light signal of a camera. We use signal processing techniques to construct a reference signal and hence determine the jitter. The performance of the proposed method is examined using the MESA Imaging SwissRanger 4000. We found periodic jitter at two frequencies, 0.12 and 5.04 MHz, and random jitter of 164 ± 4 ps on the light signal of the camera.
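The abstract does not spell out the processing chain; the sketch below shows one plausible way to construct a reference from repeated captures of the periodic light signal, by averaging cycles and measuring each cycle's time offset against the average by cross-correlation with sub-sample peak interpolation. The capture layout (one cycle per row), the averaging step, and the parabolic interpolation are assumptions for illustration, not the authors' processing of the SwissRanger 4000 data.

```python
import numpy as np

def measure_time_offsets(cycles, dt):
    """cycles: array of shape (n_cycles, n_samples), one captured cycle per row.
    dt: sample spacing in seconds.  Returns zero-mean per-cycle time offsets (s)."""
    reference = cycles.mean(axis=0)           # averaging suppresses random jitter
    reference = reference - reference.mean()
    n = reference.size
    lags = np.arange(-(n - 1), n)             # lag axis of a 'full' correlation
    offsets = np.empty(len(cycles))
    for i, c in enumerate(cycles):
        xc = np.correlate(c - c.mean(), reference, mode="full")
        k = int(np.argmax(xc))
        # Parabolic interpolation around the correlation peak for sub-sample lags.
        if 0 < k < len(xc) - 1:
            denom = xc[k - 1] - 2.0 * xc[k] + xc[k + 1]
            frac = 0.5 * (xc[k - 1] - xc[k + 1]) / denom if denom != 0 else 0.0
        else:
            frac = 0.0
        offsets[i] = (lags[k] + frac) * dt    # sign follows np.correlate's convention
    return offsets - offsets.mean()

# Example with a simulated 30 MHz light signal sampled at 5 GS/s and 50 ps RMS
# of injected timing offset (all values are arbitrary for the demonstration).
rng = np.random.default_rng(0)
dt, n_samples, n_cycles = 0.2e-9, 167, 500
t = np.arange(n_samples) * dt
shifts = 50e-12 * rng.standard_normal(n_cycles)
cycles = np.array([np.sin(2 * np.pi * 30e6 * (t - s)) for s in shifts])
offsets = measure_time_offsets(cycles, dt)
print(f"recovered timing jitter: {offsets.std() * 1e12:.1f} ps RMS (injected 50 ps)")
```

Periodic jitter would then appear as peaks in the FFT of the offset sequence, as in the previous sketch, with the random jitter given by the RMS of the residual.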
The determination of one's movement through the environment (visual odometry or self-motion estimation) from monocular sources such as video is an important research problem because of its relevance to robotics and autonomous vehicles. The traditional computer vision approach to this problem tracks visual features across frames to obtain 2-D image motion estimates from which the camera motion can be derived. We present an alternative scheme which uses the properties of motion-sensitive cells in the primate brain to derive the image motion and the camera heading vector. We tested heading estimation using a camera mounted on a linear translation table with the line of sight of the camera set at a range of angles relative to straight ahead (0° to 50° in 10° steps). The camera velocity was also varied (0.2, 0.4, 0.8, 1.2, 1.6, and 2.0 m/s). Our biologically based method produced accurate heading estimates over a wide range of test angles and camera speeds. Our approach has the advantage of being a one-shot estimator that does not require iterative search techniques to find the heading.
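The authors' estimator is built from primate MT/MST-like motion templates; as a simpler stand-in that shares the one-shot, non-iterative character, the sketch below recovers the heading from a synthetic translational flow field by solving a linear least-squares problem for the focus of expansion (FOE), assuming pure translation as on a linear translation table. The focal length, grid, noise level, and flow model are illustrative assumptions, not the paper's biologically based method.

```python
import numpy as np

f_px = 800.0                       # assumed focal length in pixels
heading_deg = 30.0                 # true heading angle to recover

# Synthetic translational flow on a pixel grid: for pure translation the flow
# at image point p is proportional to (p - FOE), scaled by inverse depth.
rng = np.random.default_rng(1)
xs, ys = np.meshgrid(np.linspace(-320, 320, 25), np.linspace(-240, 240, 19))
foe_true = np.array([f_px * np.tan(np.radians(heading_deg)), 0.0])
inv_depth = rng.uniform(0.5, 2.0, xs.shape)          # varying scene depth
u = (xs - foe_true[0]) * inv_depth + rng.normal(0, 0.2, xs.shape)
v = (ys - foe_true[1]) * inv_depth + rng.normal(0, 0.2, xs.shape)

# One-shot FOE estimate: each flow vector must be parallel to (p - FOE), i.e.
# (x - fx) * v - (y - fy) * u = 0, which is linear in the unknowns (fx, fy).
A = np.column_stack([v.ravel(), -u.ravel()])
b = (xs * v - ys * u).ravel()
foe_est, *_ = np.linalg.lstsq(A, b, rcond=None)

# Heading angle relative to the optical axis follows from the FOE position.
heading_est = np.degrees(np.arctan2(foe_est[0], f_px))
print(f"estimated heading: {heading_est:.2f} degrees (true {heading_deg})")
```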