Since the work of Page in the 1950s, the problem of detecting an abrupt change in the distribution of stochastic processes has received a great deal of attention. In particular, a deep connection has been established between Lorden's minimax approach to change-point detection and the widely used CUSUM procedure, first for discrete-time processes and subsequently for some of their continuous-time counterparts. However, results for processes with jumps are still scarce, while the practical importance of such processes has grown substantially since the turn of the century. In this work we consider the problem of detecting a change in the distribution of continuous-time processes with independent and stationary increments, i.e., Lévy processes, and our main result shows that CUSUM is indeed optimal in Lorden's sense. This is the most natural continuous-time analogue of the seminal work of Moustakides [12] for sequentially observed random variables that are assumed to be i.i.d. before and after the change-point. From a practical perspective, the approach we adopt is appealing, as it consists in approximating the continuous-time problem by a suitable sequence of discrete-time change-point problems with equispaced sampling points, for each of which a CUSUM procedure is shown to be optimal.
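To fix ideas, the discrete-time CUSUM recursion underlying the sampled problems can be sketched as follows. This is a minimal illustrative sketch, not the construction used in the paper: the function names, the Gaussian pre/post-change model, and the threshold are all assumptions chosen for the example.

```python
import random

def cusum_alarm_time(observations, loglik_ratio, threshold):
    """Run the CUSUM recursion S_n = max(0, S_{n-1} + l_n), where l_n is the
    log-likelihood ratio of the n-th observation, and return the first
    (1-based) index n with S_n >= threshold, or None if no alarm is raised."""
    s = 0.0
    for n, x in enumerate(observations, start=1):
        s = max(0.0, s + loglik_ratio(x))
        if s >= threshold:
            return n
    return None

# Illustrative model (an assumption for this sketch): i.i.d. N(0, 1)
# observations before the change and N(1, 1) after.  The log-likelihood
# ratio of N(1, 1) against N(0, 1) at a point x is x - 1/2.
llr = lambda x: x - 0.5

random.seed(0)
change_point = 200
data = [random.gauss(0.0, 1.0) for _ in range(change_point)] + \
       [random.gauss(1.0, 1.0) for _ in range(200)]

alarm = cusum_alarm_time(data, llr, threshold=8.0)
```

Before the change the statistic drifts back toward zero, so with a moderate threshold false alarms are rare; after the change the positive drift of the log-likelihood ratios pushes the statistic over the threshold shortly after the change-point.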