Among current technologies for analysing large volumes of data, the MapReduce processing model stands out in Big Data. MapReduce is implemented in frameworks such as Hadoop, Spark, or Flink, which manage program executions according to the resources available at runtime. Developers must therefore design their programs to support all possible non-deterministic executions; however, a program may still fail due to a design fault. Debugging these faults is difficult because the data are processed non-deterministically in parallel and the fault is caused not directly by the code, but by its design. This paper presents a framework called MRDebug that includes two debugging techniques focused on MapReduce design faults: a spectrum-based fault localization technique that locates the root cause of these faults by analysing several executions of the test case, and a Delta Debugging technique that isolates the data relevant to trigger the failure. An empirical evaluation with 13 programs shows that MRDebug is effective in debugging these faults, especially when fault localization is performed on the reduced data. In summary, MRDebug automatically provides valuable information to understand MapReduce design faults, as it helps locate their root cause and obtains a minimal dataset that triggers the failure.
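To illustrate the data-reduction idea mentioned above (a minimal sketch, not the paper's actual implementation), the following Java code shows a simplified Delta Debugging loop that shrinks a list of input records to a smaller subset that still makes the test fail. The class name DeltaDebugSketch, the generic record type, and the predicate fails are hypothetical placeholders; in practice, fails would stand for running the MapReduce test case against the candidate input.

import java.util.ArrayList;
import java.util.List;
import java.util.function.Predicate;

/**
 * Simplified Delta Debugging sketch (complement-removal variant of ddmin):
 * shrinks a list of input records while the failure-inducing predicate holds.
 * Hypothetical illustration, not MRDebug's implementation.
 */
public final class DeltaDebugSketch {

    public static <T> List<T> minimize(List<T> input, Predicate<List<T>> fails) {
        List<T> current = new ArrayList<>(input);
        int granularity = 2;
        while (current.size() >= 2) {
            List<List<T>> chunks = split(current, granularity);
            boolean reduced = false;
            // Try each complement: drop one chunk and check whether the failure persists.
            for (int i = 0; i < chunks.size(); i++) {
                List<T> complement = new ArrayList<>();
                for (int j = 0; j < chunks.size(); j++) {
                    if (j != i) complement.addAll(chunks.get(j));
                }
                if (fails.test(complement)) {
                    current = complement;                       // keep the smaller failing input
                    granularity = Math.max(granularity - 1, 2); // coarsen again after progress
                    reduced = true;
                    break;
                }
            }
            if (!reduced) {
                if (granularity >= current.size()) break;       // nothing more can be removed
                granularity = Math.min(granularity * 2, current.size()); // refine the split
            }
        }
        return current;
    }

    /** Splits the list into n contiguous, non-empty chunks (n <= list.size()). */
    private static <T> List<List<T>> split(List<T> list, int n) {
        List<List<T>> chunks = new ArrayList<>();
        int start = 0;
        for (int i = 0; i < n; i++) {
            int end = start + (list.size() - start) / (n - i);
            chunks.add(new ArrayList<>(list.subList(start, end)));
            start = end;
        }
        return chunks;
    }
}

In this sketch, the returned list is a failing input that cannot be reduced further by removing any of the chunks tried, which mirrors the goal stated in the abstract of obtaining a minimal dataset that triggers the failure.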