This paper considers decentralized nonconvex optimization, where the cost function is distributed over the agents. Noting that information compression is a key tool for reducing the heavy communication load of decentralized algorithms, in which agents iteratively communicate with their neighbors, we propose three decentralized primal-dual algorithms with compressed communication. The first two algorithms apply to a general class of compressors with bounded relative compression error, while the third is suitable for two general classes of compressors with bounded absolute compression error. We show that the proposed decentralized algorithms with compressed communication have convergence properties comparable to those of state-of-the-art algorithms without communication compression. Specifically, when each local cost function is smooth, they find first-order stationary points at a sublinear rate O(1/T), where T is the total number of iterations; under the additional condition that the global cost function satisfies the Polyak-Łojasiewicz condition, they find global optima at a linear rate. Numerical simulations are provided to illustrate the effectiveness of the theoretical results.
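
As a concrete illustration of the compressor class the abstract mentions, the following is a minimal sketch (not taken from the paper) of top-k sparsification, a standard compressor that satisfies a bounded relative compression error: keeping the k largest-magnitude entries of a d-dimensional vector guarantees ||C(x) - x||^2 <= (1 - k/d) ||x||^2, since the discarded entries are the smallest. The function name `top_k` and the chosen dimensions are illustrative assumptions.

```python
import numpy as np

def top_k(x, k):
    """Compress x by keeping its k largest-magnitude entries and zeroing the rest."""
    out = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-k:]  # indices of the k largest-magnitude entries
    out[idx] = x[idx]
    return out

rng = np.random.default_rng(0)
d, k = 100, 10
x = rng.standard_normal(d)
cx = top_k(x, k)

# Bounded relative compression error: ||C(x) - x||^2 <= (1 - k/d) ||x||^2,
# because the zeroed entries are exactly the d - k smallest in magnitude.
err = np.linalg.norm(cx - x) ** 2
bound = (1 - k / d) * np.linalg.norm(x) ** 2
print(err <= bound)
```

Compressors with bounded absolute error (e.g., uniform quantization with a fixed step size) instead satisfy ||C(x) - x||^2 <= c for a constant c independent of ||x||, which is the second class referred to above.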