Electroencephalography (EEG) is one of the most common and noninvasive methods for analyzing and identifying abnormalities in the human brain. EEG is a continuous measure of brain activity. In contrast, when the EEG measurement is bounded in time and synchronized to an external stimulus, the resulting signal is known as an Event-Related Potential (ERP). ERPs make it possible to observe and explore the brain's responses to specific sensory, cognitive, or motor events in real time with high temporal resolution. Among the various experimental designs, the oddball paradigm is widely used in EEG studies; in an oddball experiment, brain responses to frequent and infrequent stimuli are measured. However, the success of ERP research depends heavily on the analysis of clean data sets. Unfortunately, EEG is a mixture of neural and nonneural activity, and the nonneural activity introduces significant sources of noise unrelated to the brain's response to the external stimulus. These non-EEG components are known as artifacts; they degrade EEG quality by lowering the signal-to-noise ratio (SNR) and can distort the information extracted in a study. To address this problem, the purpose of this research is to introduce a machine learning (ML) approach that screens EEG/ERP data and removes epochs corrupted by artifacts, thereby producing a clean data set. Three unsupervised ML algorithms are applied to identify noisy epochs, and the DBSCAN method is found to perform best with 93.43% accuracy. Finally, the success of this study will allow ERP research to obtain cleaner data sets under normal laboratory conditions with less preprocessing complexity.
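
The abstract does not specify the per-epoch features or DBSCAN parameters used; the sketch below is only an illustration of how DBSCAN-based epoch screening could look with scikit-learn. The function name `flag_artifact_epochs`, the amplitude/variance feature set, and the `eps`/`min_samples` values are assumptions for demonstration, not the paper's actual implementation.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import DBSCAN

def flag_artifact_epochs(epochs, eps=1.5, min_samples=10):
    """Flag artifact-contaminated epochs with DBSCAN.

    epochs : ndarray of shape (n_epochs, n_channels, n_samples)
    Returns a boolean mask (True = epoch flagged as artifact).
    """
    # Simple per-epoch features (illustrative; the paper's feature set may differ):
    ptp = epochs.max(axis=(1, 2)) - epochs.min(axis=(1, 2))      # peak-to-peak amplitude
    var = epochs.var(axis=(1, 2))                                # overall variance
    grad = np.abs(np.diff(epochs, axis=2)).max(axis=(1, 2))      # max sample-to-sample jump

    features = StandardScaler().fit_transform(np.column_stack([ptp, var, grad]))

    # DBSCAN labels low-density points as -1 ("noise"); treat those epochs as artifacts.
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(features)
    return labels == -1

# Usage with synthetic data: 200 epochs, 32 channels, 256 samples each.
rng = np.random.default_rng(0)
epochs = rng.normal(size=(200, 32, 256))
epochs[:10] *= 8  # inject a few high-amplitude "artifact" epochs
mask = flag_artifact_epochs(epochs)
clean_epochs = epochs[~mask]
print(f"Removed {mask.sum()} of {len(epochs)} epochs")
```

Because DBSCAN marks low-density outliers as noise rather than forcing them into a cluster, it lends itself naturally to this kind of unsupervised artifact screening; the parameters shown here would need tuning to the actual recording setup.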