The work identifies the first lattice decoding solution that achieves, in the general outage-limited MIMO setting and in the high-rate and high-SNR limit, both a vanishing gap to the error performance of the (DMT-optimal) exact solution of preprocessed lattice decoding and a computational complexity that is subexponential in the number of codeword bits. The proposed solution employs lattice reduction (LR)-aided regularized (lattice) sphere decoding together with proper timeout policies. These performance and complexity guarantees hold for most MIMO scenarios, all reasonable fading statistics, all channel dimensions and all full-rate lattice codes.

In sharp contrast to this very manageable complexity, the complexity of other standard preprocessed lattice decoding solutions is revealed here to be extremely high. Specifically, the work is the first to quantify the complexity of these lattice (sphere) decoding solutions and to prove the surprising result that the complexity required to achieve a certain rate-reliability performance is exponential in the lattice dimensionality and in the number of codeword bits, and in fact matches, in common scenarios, the complexity of ML-based solutions. Through this sharp contrast, the work is able, for the first time, to rigorously demonstrate and quantify the pivotal role of lattice reduction as a special complexity-reducing ingredient.

Finally, the work analytically refines transceiver DMT analysis, which generally fails to address potentially massive gaps between theory and practice. Instead, the adopted vanishing-gap condition guarantees that, given a sufficiently high SNR, the decoder's error curve is arbitrarily close to the optimal error curve of the exact solution. This is a much stronger condition than DMT optimality, which only guarantees an error gap that is subpolynomial in SNR and can thus be unbounded, rendering it generally unacceptable for practical implementations.
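To make the distinction concrete, the two conditions can be written as follows (a minimal illustration; the notation P_dec, P_ex, g(rho) and d(r) is assumed here rather than taken from the abstract). Writing rho for the SNR, P_ex(rho) for the error probability of the exact preprocessed lattice decoder and P_dec(rho) for that of the considered decoder,
\[
  g(\rho) \;\triangleq\; \frac{P_{\mathrm{dec}}(\rho)}{P_{\mathrm{ex}}(\rho)} \;\ge\; 1,
  \qquad \text{vanishing gap:}\quad \lim_{\rho\to\infty} g(\rho) \;=\; 1,
\]
whereas DMT optimality at multiplexing gain $r$ only requires matching error exponents,
\[
  \lim_{\rho\to\infty} \frac{\log P_{\mathrm{dec}}(\rho)}{\log \rho}
  \;=\;
  \lim_{\rho\to\infty} \frac{\log P_{\mathrm{ex}}(\rho)}{\log \rho}
  \;=\; -\,d(r),
\]
which still allows $g(\rho)$ to grow subpolynomially in $\rho$ (for instance like $\log\rho$) and hence without bound.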
In the setting of computer vision, algorithmic searches often aim to identify an object of interest inside large sets of images or videos. Towards reducing the often astronomical complexity of this search, one can use pruning to filter out objects that are sufficiently distinct from the object of interest, thus obtaining a pruning gain in the form of an overall reduced search space.

Motivated by practical computer vision scenarios such as time-constrained human identification in biometric-based video surveillance systems, we analyze the stochastic behavior of time-restricted search pruning over large and unstructured data sets which are furthermore random and varying, and where, in addition, pruning itself is not fully reliable but is instead prone to errors. In this stochastic setting we apply the information-theoretic method of types as well as information divergence techniques to explore the natural tradeoff that appears between pruning gain and reliability, and proceed to study the typical and atypical gain-reliability behavior, giving insight into how often pruning might fail to substantially reduce the search space. The result applies, as is, to a plethora of computer vision applications where efficiency and reliability are intertwined bottlenecks in the overall system performance, and the simplicity of the obtained expressions allows for rigorous and insightful assessment of the pruning gain-reliability behavior in such applications, as well as for intuition into designing general object recognition systems.
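A minimal sketch of the kind of method-of-types reasoning involved (the Bernoulli keep/prune model and the symbols q, alpha, n below are illustrative assumptions, not the paper's model): if each of n database objects independently survives pruning with nominal probability q, then the atypical event that the surviving fraction reaches some alpha > q, i.e. that the realized pruning gain 1/alpha falls well short of the typical gain 1/q, has probability decaying exponentially in n with exponent given by the binary KL divergence D(alpha || q).

    import math
    import random

    def kl_bernoulli(a, q):
        """Binary KL divergence D(a || q) in nats."""
        def term(p, r):
            return 0.0 if p == 0.0 else p * math.log(p / r)
        return term(a, q) + term(1.0 - a, 1.0 - q)

    def atypicality_exponent(alpha, q):
        """Method-of-types (Sanov/Chernoff) exponent for the atypical event
        {surviving fraction >= alpha} with alpha > q:
            P(surviving fraction >= alpha) <= exp(-n * D(alpha || q))."""
        return kl_bernoulli(alpha, q)

    def empirical_atypicality(n, q, alpha, trials=20000):
        """Monte-Carlo estimate of P(surviving fraction >= alpha), for comparison."""
        hits = 0
        for _ in range(trials):
            survivors = sum(1 for _ in range(n) if random.random() < q)
            if survivors / n >= alpha:
                hits += 1
        return hits / trials

    if __name__ == "__main__":
        q, alpha, n = 0.05, 0.10, 400   # nominal keep prob., atypical surviving fraction, set size
        exponent = atypicality_exponent(alpha, q)
        print(f"typical pruning gain 1/q = {1/q:.0f}, atypical gain 1/alpha = {1/alpha:.0f}")
        print(f"exponent D(alpha||q) = {exponent:.4f} nats, bound exp(-n*D) = {math.exp(-n*exponent):.2e}")
        print(f"empirical probability = {empirical_atypicality(n, q, alpha):.2e}")

The exponent captures the gain-reliability tension in this toy setting: making pruning more aggressive (smaller q) raises the typical gain but also changes how likely, and how costly in reliability, the atypical low-gain or over-pruned realizations are.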