The age of big data testing has arrived, yet traditional software testing may be unable to handle such large volumes of unstructured data. Testing seeks to evaluate the quality of a product by locating residual defects. The software development industry is under constant pressure to deliver new software that functions correctly while remaining reliable. Testing accounts for a major share of application costs because it encompasses quality assurance, fidelity, and cost reduction. To this end, software companies employ testing methods that include both manual and automated testing tools. The questions that now emerge are how to construct a high-performance platform for testing big data efficiently, and how to select an appropriate method for locating relevant items within big data. Software test cases aim to uncover faults by applying systematic verification plans. To explore these matters, this paper begins with a brief introduction to big data testing, followed by a discussion of Hadoop testing. Important open issues and directions for further research are also presented as next steps for big data testing.