The Hadoop framework is used to process big data in parallel. Big data is characterized not only by its volume but also by the variety of its formats and the velocity at which it arrives, which makes traditional relational database management systems unsuitable for processing it. Hadoop is among the most popular frameworks for processing big data. Its architecture comprises several components, including the NameNode, DataNode, JobTracker, and TaskTracker, and overall performance depends on how these components execute. The central challenge in the Hadoop framework is to reduce job processing time, which in turn depends on several factors: job scheduling, resource allocation, and the performance of MapReduce after data encryption. The proposed research focuses on overcoming the challenges of scheduling, resource allocation, and security. Hadoop data security is itself a proposed research area, namely, finding the most suitable encryption algorithm, one that encrypts Hadoop data without degrading Hadoop performance.
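To make the encryption-overhead question concrete, the following is a minimal sketch, not part of the proposed system, of how the per-block cost of a candidate cipher could be measured in isolation before integrating it with Hadoop. The class name, the choice of AES in CTR mode, and the 64 MB buffer (the default HDFS block size in Hadoop 1.x) are all illustrative assumptions.

import java.security.SecureRandom;
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.IvParameterSpec;

// Hypothetical micro-benchmark: encrypt one HDFS-block-sized buffer and
// report throughput, as a stand-in for the overhead an encrypted write adds.
public class AesOverheadSketch {
    public static void main(String[] args) throws Exception {
        byte[] data = new byte[64 * 1024 * 1024];  // 64 MB buffer (assumed block size)
        new SecureRandom().nextBytes(data);

        // Generate a 128-bit AES key (algorithm choice is an assumption here)
        KeyGenerator kg = KeyGenerator.getInstance("AES");
        kg.init(128);
        SecretKey key = kg.generateKey();

        // CTR mode needs a random initialization vector
        byte[] iv = new byte[16];
        new SecureRandom().nextBytes(iv);

        Cipher cipher = Cipher.getInstance("AES/CTR/NoPadding");
        cipher.init(Cipher.ENCRYPT_MODE, key, new IvParameterSpec(iv));

        // Time a single full-buffer encryption pass
        long start = System.nanoTime();
        byte[] encrypted = cipher.doFinal(data);
        double elapsedSec = (System.nanoTime() - start) / 1e9;

        System.out.printf("Encrypted %d MB in %.3f s (%.1f MB/s)%n",
                data.length / (1024 * 1024), elapsedSec,
                (data.length / (1024.0 * 1024.0)) / elapsedSec);
    }
}

Repeating such a measurement across candidate algorithms (for example AES variants versus lighter-weight ciphers) would give a baseline for judging which one meets the stated goal of encrypting Hadoop data without degrading job performance.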