This paper deals with a Maximum Likelihood (ML) receiver for a nonlinearly distorted OFDM signal transmitted over a flat channel with AWGN. The nonlinearity destroys the orthogonality between subcarriers; consequently, the per-subcarrier decision, which is optimal for a linear power amplifier (PA), is no longer optimal. The exact ML receiver must find the minimum Euclidean distance between the received vector and the set of all possible OFDM symbols passed through the same nonlinearity, an approach whose complexity grows exponentially with the number of subcarriers. To reduce this complexity, we propose a sub-optimal receiver that minimizes the Euclidean distance, viewed as a cost function, by the gradient descent algorithm. Due to the nonlinearity, however, the cost function is non-convex. To overcome this obstacle, we propose a method to classify the solution, i.e., to decide whether the achieved minimum is local or global, and we modify the gradient descent algorithm to avoid convergence to local minima. Simulation results show that the proposed receiver outperforms the conventional OFDM receiver and iterative receivers in terms of symbol error rate (SER).
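To illustrate the core idea (not the paper's exact algorithm), the sketch below minimizes the Euclidean-distance cost by plain gradient descent. It assumes a memoryless soft-limiter PA model and a unitary IFFT, and estimates the gradient by finite differences; all function names, the step size, and the PA parameters are illustrative assumptions rather than details taken from the paper, and the local/global minimum classification step is omitted.

```python
import numpy as np

def soft_limiter(x, a_sat=1.0):
    """Memoryless soft-limiter PA model (an assumption for this sketch):
    clips the envelope at a_sat while preserving the phase."""
    mag = np.abs(x)
    return np.where(mag <= a_sat, x, a_sat * x / np.maximum(mag, 1e-12))

def cost(X_freq, y_time, a_sat=1.0):
    """Euclidean-distance cost ||y - f(IFFT(X))||^2 for a candidate
    frequency-domain symbol vector X_freq (unitary IFFT scaling)."""
    x_time = np.fft.ifft(X_freq) * np.sqrt(len(X_freq))
    return np.sum(np.abs(y_time - soft_limiter(x_time, a_sat)) ** 2)

def gradient_descent_receiver(y_time, X0, a_sat=1.0, mu=0.05, n_iter=200):
    """Descend the (non-convex) cost from an initial guess X0, e.g. the
    per-subcarrier hard decisions of a conventional OFDM receiver.
    The gradient is approximated by finite differences on the real and
    imaginary parts; an analytic gradient of the PA model would be used
    in a practical receiver."""
    X = X0.astype(complex).copy()
    eps = 1e-6
    for _ in range(n_iter):
        c0 = cost(X, y_time, a_sat)
        g = np.zeros_like(X)
        for k in range(len(X)):
            for d in (1.0, 1j):  # perturb real, then imaginary part
                Xp = X.copy()
                Xp[k] += eps * d
                g[k] += d * (cost(Xp, y_time, a_sat) - c0) / eps
        X -= mu * g  # steepest-descent step
    return X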