Stored process data in the form of high-fidelity time trends are a resource for data-driven process analyses such as statistical monitoring, minimum variance control loop benchmarking, fault detection, data reconciliation and the development of inferential sensors. However, many commercial data historians compress the data before archiving them, and the question therefore arises of how useful the compressed data are for the intended purposes. This article examines the impact of compression on data-driven methods and presents an automated algorithm by which the presence of piecewise linear compression may be inferred during the pre-processing phase of a data-driven analysis. The results show that compression interferes with many types of data-driven analysis, and the paper strongly recommends caution in the use of compressed process data archives.
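As an illustration only, and not the detection algorithm developed in the paper, the hypothetical sketch below shows one simple heuristic for flagging a trend that may have been reconstructed from a piecewise linear compressed archive: linear interpolation between retained points drives the second differences of the resampled trend to (numerically) zero over long runs, so an unusually high fraction of near-zero second differences is suspicious. The function name, tolerance and threshold are assumptions chosen for the example.

```python
import numpy as np


def compression_suspected(x, zero_tol=1e-12, threshold=0.3):
    """Heuristic flag for possible piecewise linear compression.

    Illustrative only: computes the fraction of second differences that
    are numerically zero, which is large for trends rebuilt by linear
    interpolation between sparsely retained archive points.
    """
    x = np.asarray(x, dtype=float)
    d2 = np.diff(x, n=2)                       # second differences
    frac_zero = np.mean(np.abs(d2) <= zero_tol)
    return frac_zero >= threshold, frac_zero


# Compare an uncompressed noisy trend with a crude stand-in for a
# compressed archive (keep every 20th sample, reconstruct by interpolation).
rng = np.random.default_rng(0)
t = np.arange(500)
raw = np.sin(0.05 * t) + 0.05 * rng.standard_normal(t.size)
reconstructed = np.interp(t, t[::20], raw[::20])

print(compression_suspected(raw))            # low fraction -> not flagged
print(compression_suspected(reconstructed))  # high fraction -> flagged
```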