Entropy has been a common index for quantifying the complexity of time series across a variety of fields. Here, we introduce increment entropy (IncrEn), a measure of the complexity of time series in which each increment is mapped onto a word of two letters, one corresponding to the sign of the increment and the other to its magnitude. IncrEn is defined as the Shannon entropy of these words. Simulations on synthetic data and tests on epileptic electroencephalogram (EEG) signals demonstrate its ability to detect abrupt changes, whether energetic (e.g., spikes or bursts) or structural. The computation of IncrEn makes no assumptions about the time series, so it is applicable to arbitrary real-world data.
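To make the construction concrete, the sketch below implements the idea described above under stated assumptions: increments are encoded as (sign, quantized magnitude) letter pairs, words are formed from m consecutive pairs, and the Shannon entropy of the word distribution is returned. The parameter names m and R, and the choice to quantize magnitudes against the standard deviation of the increments, are illustrative assumptions rather than the paper's exact definition.

```python
import numpy as np
from collections import Counter

def increment_entropy(x, m=2, R=4):
    """Sketch of increment entropy (IncrEn).

    Each increment is mapped to a two-letter word: its sign and a
    quantized magnitude (quantized here against the standard deviation
    of the increments with resolution R -- an assumed scheme). The
    Shannon entropy of words of m consecutive letter pairs is returned.
    """
    x = np.asarray(x, dtype=float)
    v = np.diff(x)                          # increments of the series
    signs = np.sign(v).astype(int)          # sign letter: -1, 0, +1
    sd = np.std(v)
    if sd == 0:
        return 0.0                          # constant increments carry no information
    # magnitude letter: quantize |increment| relative to the spread of increments
    mags = np.minimum((np.abs(v) * R / sd).astype(int), R)
    letters = list(zip(signs, mags))
    # overlapping words of m consecutive (sign, magnitude) pairs
    words = [tuple(letters[i:i + m]) for i in range(len(letters) - m + 1)]
    counts = Counter(words)
    total = sum(counts.values())
    p = np.array([c / total for c in counts.values()])
    return float(-np.sum(p * np.log2(p)))

# Example: white noise is expected to yield a higher IncrEn than a smooth sine wave.
rng = np.random.default_rng(0)
print(increment_entropy(rng.standard_normal(1000)))
print(increment_entropy(np.sin(np.linspace(0, 10 * np.pi, 1000))))
```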