Sparse superposition codes, or sparse regression codes (SPARCs), are a recent class of codes for reliable communication over the AWGN channel at rates approaching the channel capacity. Approximate message passing (AMP) decoding, a computationally efficient technique for decoding SPARCs, has been proven to be asymptotically capacity-achieving for the AWGN channel. In this paper, we refine the asymptotic result by deriving a large deviations bound on the probability of AMP decoding error. This bound gives insight into the error performance of the AMP decoder for large but finite problem sizes, giving an error exponent as well as guidance on how the code parameters should be chosen at finite block lengths. For an appropriate choice of code parameters, we show that for any fixed rate less than the channel capacity, the decoding error probability decays exponentially in n/(log n)^{2T}, where T, the number of AMP iterations required for successful decoding, is bounded in terms of the gap from capacity.

Despite these strong theoretical guarantees, the rates achieved by this decoder for practical block lengths are significantly less than C. Subsequently, an adaptive soft-decision iterative decoder was proposed by Cho and Barron [3], with improved finite-length performance at rates closer to capacity. Theoretically, the decoding error probability of the adaptive soft-decision decoder was shown to decay exponentially in n/(log n)^{2T}, where T is the minimum number of iterations [4], [5].

Recently, decoders for SPARCs based on Approximate Message Passing (AMP) techniques were proposed in [6]-[8]. AMP decoding has several attractive features, notably the absence of tuning parameters, its superior empirical performance at finite block lengths, and its low complexity when implemented using implicitly defined Hadamard design matrices [7], [8].
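Schematically, a large deviations bound of this type takes the following form, where the constant prefactor $K_T$ and exponent constant $\kappa_T$ are placeholders for the paper's explicit expressions, which depend on the number of iterations $T$ and on the gap from capacity:

```latex
\[
  P(\text{decoding error}) \;\le\; K_T \,
  \exp\!\left( - \frac{\kappa_T \, n}{(\log n)^{2T}} \right).
\]
```

For any fixed $T$, the exponent grows almost linearly in the block length $n$, which is why the bound is described as decaying exponentially in $n/(\log n)^{2T}$.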
Furthermore, its decoding performance in each iteration can be predicted using a deterministic scalar iteration called 'state evolution'.

In this paper, we provide a non-asymptotic analysis of the AMP decoder proposed in [7]. There, it was proved that the state evolution predictions for the AMP decoder are asymptotically accurate, and that for any fixed rate R < C, the probability of decoding error goes to zero with growing block length. However, this result did not specify the rate of decay of the probability of error. In this paper, we refine the asymptotic result of [7] and derive a large deviations bound for the probability of error of the AMP decoder (Theorem 1). This bound gives insight into the error performance of the AMP decoder for large but finite problem sizes, giving an error exponent as well as guidance on how the code parameters should be chosen at finite block lengths.

The error probability bound for the AMP decoder is of the same order as the bound for the Cho-Barron soft-decision decoder [4], [5]: both bounds decay exponentially in n/(log n)^{2T}, where T is the minimum number of iterations. However, the AMP decoder has slightly lower complexity and has been empirically found to h...