Abstract--We propose algorithms for constructing linear embeddings of a finite dataset V ⊂ R^d into a k-dimensional subspace with provable, nearly optimal distortion. First, we propose an exhaustive-search-based algorithm that yields a k-dimensional linear embedding with distortion at most opt(k) + δ, for any δ > 0, where opt(k) is the smallest achievable distortion over all possible orthonormal embeddings. This algorithm is space-efficient and can be implemented in a single pass over the data V. However, its runtime is exponential in k. Second, we propose a convex-programming-based algorithm that yields an O(k/δ)-dimensional orthonormal embedding with distortion at most (1 + δ) opt(k). The runtime of this algorithm is polynomial in d and independent of k. Several experiments demonstrate the benefits of our approach over conventional linear embedding techniques, such as principal components analysis (PCA) or random projections.
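To make the setting concrete, the following is a minimal sketch of the two baseline linear embeddings mentioned above (PCA and Gaussian random projections) together with one common notion of worst-case distortion over the pairwise secants of V. The function names and the exact distortion measure (max_distortion) are illustrative assumptions, not the paper's definitions or algorithms.

```python
import numpy as np

def pairwise_differences(V):
    """All pairwise difference vectors (secants) of the point set V (n x d)."""
    idx_i, idx_j = np.triu_indices(V.shape[0], k=1)
    return V[idx_i] - V[idx_j]

def max_distortion(Psi, V):
    """Worst-case multiplicative distortion of the linear map Psi (k x d) over
    all pairwise distances of V. This is one common notion of distortion,
    assumed here for illustration; it presumes the points of V are distinct."""
    S = pairwise_differences(V)
    orig = np.linalg.norm(S, axis=1)
    emb = np.linalg.norm(S @ Psi.T, axis=1)
    return np.max(np.abs(emb**2 / orig**2 - 1.0))

def pca_embedding(V, k):
    """Top-k principal directions (orthonormal rows) as a k x d embedding."""
    Vc = V - V.mean(axis=0)
    _, _, Vt = np.linalg.svd(Vc, full_matrices=False)
    return Vt[:k]

def random_projection(d, k, rng):
    """Scaled Gaussian random projection (k x d)."""
    return rng.standard_normal((k, d)) / np.sqrt(k)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, d, k = 200, 50, 10
    V = rng.standard_normal((n, d))
    print("PCA distortion:              ", max_distortion(pca_embedding(V, k), V))
    print("Random-projection distortion:", max_distortion(random_projection(d, k, rng), V))
```

These baselines give the reference point against which the proposed exhaustive-search and convex-programming algorithms are compared in the experiments.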