The gap between the diminishing benefits of technology scaling and the projected growth in computing demand from future workloads creates a need for new sources of computing efficiency. Fortunately, the workloads that are driving demand across the computing spectrum also present new opportunities. At the server end, the demand for computing is driven by the need to organize, analyze, interpret, and search through exploding volumes of digital data. In mobile devices and deeply embedded systems, much of the computing demand is driven by the creation and consumption of richer media and the need to interact more naturally and intelligently with users and the environment.

These workloads exhibit intrinsic application resilience: the ability to produce outputs of acceptable quality even when some of their computations are performed in an approximate or imprecise manner [1], [2]. This resilience to approximations stems collectively from the following characteristics: (i) the notion of a unique, golden numerical result simply does not exist; instead, a range of answers is acceptable (e.g., search, recommendation systems); (ii) even when a golden result exists, the best algorithms fall well short of perfection, so users are conditioned to accept good-enough results (most recognition problems); (iii) the algorithms are designed to deal with significant noise in their input data, which endows them with a natural resilience to "errors" introduced by approximating the computations themselves; and (iv) they utilize computation patterns (such as aggregation or iterative refinement) that cause the effects of approximations to be self-healed or averaged down. In addition to these qualitative factors, recent research efforts have quantitatively established the high degree of intrinsic resilience in many applications.
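The averaging-down effect in characteristic (iv) can be illustrated with a minimal sketch (ours, not from the cited works): injecting a zero-mean error into each element of an aggregation perturbs the aggregate far less than it perturbs any individual element, since independent errors cancel roughly as 1/sqrt(n).

```python
import random

# Illustrative sketch: perturb each element before aggregating and
# observe that the aggregate (a mean) self-heals the per-element errors.
random.seed(1)
n = 100_000
xs = [random.uniform(0.0, 1.0) for _ in range(n)]
# Each element suffers an approximation error of up to +/- 0.05.
noisy = [x + random.uniform(-0.05, 0.05) for x in xs]

exact_mean = sum(xs) / n
approx_mean = sum(noisy) / n

# The zero-mean per-element errors largely cancel in the aggregate:
# the mean's error shrinks roughly as 1/sqrt(n), so element-level
# errors of ~0.05 leave the aggregate essentially unchanged.
print(abs(exact_mean - approx_mean))
```

Iterative-refinement patterns behave similarly: later iterations correct errors introduced by approximating earlier ones.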
For example, our analysis of a benchmark suite of 12 recognition, vision, and multimedia applications shows that, on average, 83% of the runtime is spent in computations that can tolerate at least some degree of approximation [3]. Therefore, there is a potential to leverage intrinsic resilience in a broad context.

Approximate computing broadly refers to techniques that exploit the forgiving nature (intrinsic resilience) of applications to realize improvements in performance or energy efficiency at various layers of the computing stack, spanning circuits, architecture, and software. Approximate computing is a vibrant research topic that has attracted the attention of researchers from many different communities [4]-[10]. We will describe an integrated framework that we have developed for approximate computing in both software and hardware, including a supporting design methodology.

(This work was supported in part by the National Science Foundation under grant no. 1018621.)

Software: In software, approximate computing can reduce the run-time complexity of programs by skipping computations [1] or by relaxing synchronization or reducing communication between threads to ena...
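The computation-skipping idea above can be sketched as loop perforation, a common software approximation technique in which a loop processes only every k-th iteration. The sketch below is illustrative (the function names and the quality knob `skip` are ours, not from the cited work):

```python
import random

def mean_exact(xs):
    """Exact aggregate: processes every sample."""
    return sum(xs) / len(xs)

def mean_perforated(xs, skip=2):
    """Approximate aggregate via loop perforation: process only every
    `skip`-th sample, trading output quality for ~skip-x less work.
    `skip` acts as a tunable knob between accuracy and efficiency."""
    sub = xs[::skip]
    return sum(sub) / len(sub)

random.seed(0)
data = [random.gauss(10.0, 1.0) for _ in range(10_000)]

exact = mean_exact(data)
approx = mean_perforated(data, skip=4)  # ~4x fewer additions
print(exact, approx)
```

Because the workload is an aggregation over noisy data, the perforated result stays close to the exact one even though three quarters of the work is skipped.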