Big data emerged when relational database systems could no longer keep up with the enormous volumes of unstructured data generated by organizations, social media, and other data-producing sources. The amount of data added every day creates an urgent and growing need for data processing solutions such as Hadoop. The MapReduce programming model is a common approach for processing and handling very large datasets, particularly in big data research. HDFS, the distributed, scalable, and portable file system underlying the Hadoop architecture, is notable for being built entirely in Java. This computing environment suffers from two issues. First, intruders who gain access to the system can steal or corrupt the data stored in it. To address this, the AES encryption mechanism has been applied within HDFS so that data stored there can be protected. I conducted extensive research on the security challenges surrounding big data in the context of Hadoop, along with the various solutions and technologies used to secure it.
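To make the AES protection concrete, the following is a minimal, self-contained sketch in Java (the language HDFS itself is written in) using the standard javax.crypto API. The 256-bit key size and the GCM cipher mode are illustrative assumptions for this sketch, not necessarily the parameters of the HDFS integration described here; in a real deployment the key would come from a key management service rather than being generated ad hoc.

```java
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.GCMParameterSpec;
import java.nio.charset.StandardCharsets;
import java.security.SecureRandom;

public class AesHdfsSketch {
    public static void main(String[] args) throws Exception {
        // Generate a 256-bit AES key (assumption: in practice this would be
        // obtained from a key management service, not generated on the fly).
        KeyGenerator keyGen = KeyGenerator.getInstance("AES");
        keyGen.init(256);
        SecretKey key = keyGen.generateKey();

        // AES-GCM requires a fresh random 12-byte IV per encryption.
        byte[] iv = new byte[12];
        new SecureRandom().nextBytes(iv);

        byte[] plaintext = "sample HDFS file contents".getBytes(StandardCharsets.UTF_8);

        // Encrypt the data before it is written to HDFS.
        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
        cipher.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(128, iv));
        byte[] ciphertext = cipher.doFinal(plaintext);

        // Decrypt after reading back, using the same key and IV.
        cipher.init(Cipher.DECRYPT_MODE, key, new GCMParameterSpec(128, iv));
        byte[] recovered = cipher.doFinal(ciphertext);

        System.out.println(new String(recovered, StandardCharsets.UTF_8));
    }
}
```

With this kind of scheme, an intruder who reaches the stored blocks sees only ciphertext; recovering the original data requires the encryption key, which is kept outside the file system itself.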