This research exploits digital techniques to test analog circuits. A novel test methodology and an optimization algorithm for generating the test stimulus have been developed, with the aim of detecting as many manufacturing defects as possible in the production of mixed-signal systems. The test stimulus is a discrete-interval binary sequence identified by the optimization algorithm. The response of the analog circuit under test (CUT) is digitized with one-bit resolution by a comparator, and the digitized responses from the actual circuit and from a fault-free simulation are compared for fault recognition. A figure-of-merit is defined to measure the ability of a given binary sequence to detect all possible faults. Input sequences with good performance are generally too long to permit an exhaustive search of all candidates, so iterative optimization is employed instead: an optimum sequence is reached when no further modification improves the figure-of-merit. The optimization is performed with computer-based simulation of the circuit under test, so faults and tolerances can be introduced as required and all aspects of behavior can be modelled under controlled conditions. The digitized response to the optimum sequence is stored for use in the actual test application. The methodology has been validated on an analog filter: all catastrophic failures and all parametric failures (component values deviating from nominal by more than six times the normal tolerance) are detected, with detection probability greater than 98%. Benefits of the methodology include the ease of applying binary signals to analog subsystems and the reduced hardware required for both stimulus generation and response processing.
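The search procedure described above can be sketched in code. The following is a minimal, hypothetical illustration only: the CUT is stood in for by a toy FIR model whose output a comparator digitizes to one bit, faults are modelled as perturbed filter coefficients, and the iterative optimization is rendered as simple bit-flip hill climbing on the figure-of-merit (the fraction of modelled faults whose one-bit response differs from the fault-free one). The abstract does not specify the actual circuit model, fault list, or search strategy, so every name and parameter here is an assumption.

```python
import random


def comparator_response(seq, coeffs, threshold=0.0):
    """One-bit digitized response: a toy FIR 'circuit' thresholded by a comparator.
    (Illustrative stand-in for the simulated CUT; not the paper's circuit model.)"""
    out = []
    state = [0.0] * len(coeffs)  # shift register of past input bits
    for bit in seq:
        state = [float(bit)] + state[:-1]
        y = sum(c * s for c, s in zip(coeffs, state))
        out.append(1 if y > threshold else 0)
    return out


def figure_of_merit(seq, nominal, faults):
    """Fraction of modelled faults whose digitized response differs from fault-free."""
    ref = comparator_response(seq, nominal)
    detected = sum(comparator_response(seq, f) != ref for f in faults)
    return detected / len(faults)


def optimize_sequence(length, nominal, faults, seed=1):
    """Bit-flip hill climbing: stop when no single-bit change improves the figure-of-merit."""
    rng = random.Random(seed)
    seq = [rng.randint(0, 1) for _ in range(length)]
    best = figure_of_merit(seq, nominal, faults)
    improved = True
    while improved:
        improved = False
        for i in range(length):
            seq[i] ^= 1  # try flipping one bit of the stimulus
            fom = figure_of_merit(seq, nominal, faults)
            if fom > best:
                best, improved = fom, True
            else:
                seq[i] ^= 1  # revert: flip did not help
    return seq, best


# Hypothetical example: a 3-tap nominal model and three sign-flip (catastrophic-style) faults.
nominal = [0.5, -0.3, 0.2]
faults = [[-0.5, -0.3, 0.2], [0.5, 0.3, 0.2], [0.5, -0.3, -0.2]]
seq, fom = optimize_sequence(32, nominal, faults)
```

The returned `seq` is a local optimum: no single bit flip can raise the figure-of-merit further, mirroring the stopping rule stated in the abstract. The fault-free digitized response to `seq` would then be stored for comparison during actual testing.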