Abstract-As modern communication transceivers scale to multi-Gbps speeds, the power consumption and cost of high-resolution, high-speed analog-to-digital converters (ADCs) become a crucial bottleneck in realizing "mostly digital" receiver architectures that leverage Moore's law. This bottleneck could potentially be alleviated by designing analog front ends for the more specific goal of analog-to-information conversion (i.e., preserving the digital information residing in the received signal). As one possible approach towards this goal, we consider a generalization of the standard flash ADC: instead of implementing n-bit quantization of a sample by passing it through 2^n − 1 slicers, as in a standard ADC, the slicers are dispersed in time as well as space (i.e., amplitude). Considering BPSK over a dispersive channel, we first show, using ideas similar to those underlying compressive sensing, that randomly dispersing enough one-bit slicers over space and time provides information sufficient for reliable demodulation. We then propose an iterative algorithm for optimizing the design of the sampling times and amplitude thresholds, and provide numerical results showing that the number of slicers can be significantly reduced relative to a conventional flash ADC at comparable bit error rate (BER). These system-level results motivate further investigation, in terms of both circuit and system design, of architectures beyond the conventional ADC for analog front ends in high-speed communication.
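
To make the contrast concrete, the following minimal Python sketch illustrates the quantization model described above: a flash ADC applies 2^n − 1 amplitude thresholds to one sample per symbol, whereas the dispersed alternative spreads one-bit slicers randomly over both sampling time and threshold. All parameter values (channel taps, oversampling factor, slicer counts) are hypothetical; this is an illustration of the idea, not the authors' simulation code or the optimized slicer design developed in the paper.

```python
# Illustrative sketch (assumed parameters): space-time dispersed one-bit
# slicers vs. a conventional flash ADC, for BPSK over a dispersive channel.
import numpy as np

rng = np.random.default_rng(0)

# BPSK symbols passed through a short dispersive channel, oversampled 8x.
n_sym, osf = 64, 8
symbols = rng.choice([-1.0, 1.0], size=n_sym)
channel = np.array([0.8, 0.5, 0.3])                  # assumed impulse response
waveform = np.convolve(np.repeat(symbols, osf), np.repeat(channel, osf) / osf)
waveform = waveform[: n_sym * osf]                   # "analog" grid, osf points per symbol

def flash_adc(x, n_bits):
    """Conventional flash ADC: 2^n - 1 slicers applied at one instant per symbol."""
    thresholds = np.linspace(x.min(), x.max(), 2 ** n_bits - 1)
    samples = x[osf // 2 :: osf]                     # one sample per symbol
    return np.array([(s > thresholds).sum() for s in samples])

def dispersed_slicers(x, n_slicers):
    """One-bit slicers dispersed randomly over time (sampling instant) and amplitude."""
    offsets = rng.integers(0, x.size, size=n_slicers)            # random sampling times
    thresholds = rng.uniform(x.min(), x.max(), size=n_slicers)   # random amplitude levels
    return (x[offsets] > thresholds).astype(int), offsets, thresholds

flash_out = flash_adc(waveform, n_bits=4)            # 15 slicers per sampling instant
bits, t_idx, amp = dispersed_slicers(waveform, n_slicers=32)
print("flash codes (first symbols):", flash_out[:8])
print("dispersed one-bit outputs  :", bits[:16])
```

Recovering the transmitted bits from the dispersed one-bit outputs (the demodulation step argued via compressive-sensing-style reasoning, and the iterative optimization of the sampling times and thresholds) is not shown here.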