Analyzing Electroencephalography (EEG)/Magnetoencephalography (MEG) brain source signals allows for a better understanding of brain activity and the diagnosis of various brain-related conditions or injuries. Due to the high complexity of these measurements and their low spatial resolution, different techniques have been employed to enhance the quality of the obtained results. The objective of this work is to employ state-of-the-art approaches and to develop algorithms with higher analysis reliability. As a pre-processing step, subspace denoising and artifact removal approaches are considered in order to provide a method that automates and improves the estimation of the Number of Components (NoC) for artifacts such as Eye Blinking (EB). Using synthetic EEG-like simulations and real MEG data, it is shown that the proposed method is more reliable than the conventional manual method in estimating the NoC. For Independent Component Analysis (ICA)-based approaches, the method proposed in this thesis estimates the number of components with an accuracy of 98.7%.

The thesis is also devoted to improving source localization techniques, which aim to estimate the locations of the sources within the brain that elicit the measured time series. In this context, after obtaining a practical insight into the performance of the popular L2-Regularization-based approaches, a post-processing thresholding method is introduced. The proposed method improves the spatial resolution of L2-Regularization inverse solutions, especially for Standardized Low-Resolution Electromagnetic Tomography (sLORETA), a well-known and widely used inverse solution. As part of the proposed method, a novel noise variance estimator is introduced, which combines the kurtosis statistic with the data (noise) entropy. This new noise variance estimation technique allows the proposed method to outperform existing ones. The algorithm is validated on synthetic EEG data using well-established validation metrics. It is shown that the proposed solution improves the resolution of conventional methods by performing the thresholding/denoising automatically and without loss of critical information.
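To make the ICA-based NoC estimation step more concrete, the following is a minimal Python sketch (NumPy/SciPy/scikit-learn) of counting eye-blink-like components in an ICA decomposition of simulated EEG-like data. The kurtosis criterion and the cut-off value used here are illustrative assumptions for demonstration only; they are not the estimation rule proposed in the thesis, which is not specified in the abstract.

```python
# Illustrative sketch (not the thesis algorithm): estimate the number of
# eye-blink-like artifact components (NoC) from an ICA decomposition of
# simulated EEG-like data.
import numpy as np
from scipy.stats import kurtosis
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)

# Synthetic EEG-like data: 32 channels, 10 s at 250 Hz.
n_channels, fs, duration = 32, 250, 10
t = np.arange(fs * duration) / fs
background = rng.standard_normal((n_channels, t.size))      # ongoing activity
blink = np.exp(-(((t % 2.0) - 0.2) ** 2) / 0.005)           # periodic blink-like bursts
mixing = rng.standard_normal(n_channels)                    # toy spatial projection of the blink
data = background + 5.0 * np.outer(mixing, blink)

# FastICA expects (n_samples, n_features); transpose channels/time accordingly.
ica = FastICA(n_components=n_channels, random_state=0, max_iter=1000)
sources = ica.fit_transform(data.T).T                       # (n_components, n_samples)

# Flag super-Gaussian (spiky) components as blink-like; the kurtosis cut-off
# below is a hypothetical choice, not a value taken from the thesis.
k = kurtosis(sources, axis=1, fisher=True)
noc_estimate = int(np.sum(k > 5.0))
print(f"Estimated number of eye-blink components (NoC): {noc_estimate}")
```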
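Similarly, the sketch below shows the general shape of an L2-regularized minimum-norm inverse with sLORETA-style standardization followed by a post-processing threshold. Because the abstract does not give the form of the proposed kurtosis/entropy noise-variance estimator, a median-absolute-deviation (MAD) noise floor is substituted here purely for demonstration; the lead field, source configuration, regularization heuristic, and threshold are all toy assumptions.

```python
# Illustrative sketch: L2-regularized minimum-norm inverse, sLORETA-style
# standardization, and a simple post-processing threshold. The MAD-based
# noise-floor estimate is a stand-in for the thesis's kurtosis+entropy
# noise variance estimator, whose exact form is not given in the abstract.
import numpy as np

rng = np.random.default_rng(1)
n_sensors, n_sources = 64, 500

# Toy forward model (lead field) and a single active source.
L = rng.standard_normal((n_sensors, n_sources))
j_true = np.zeros(n_sources)
j_true[123] = 1.0
y = L @ j_true + 0.05 * rng.standard_normal(n_sensors)      # noisy sensor snapshot

# L2-regularized minimum-norm estimate: J = L^T (L L^T + lam*I)^{-1} y.
lam = 0.1 * np.trace(L @ L.T) / n_sensors                   # heuristic regularization level
G = L @ L.T + lam * np.eye(n_sensors)
j_mne = L.T @ np.linalg.solve(G, y)

# sLORETA standardization (scalar-orientation case): divide each source power
# by its estimated variance, the diagonal of L^T (L L^T + lam*I)^{-1} L.
M = np.linalg.solve(G, L)                                   # (n_sensors, n_sources)
res_diag = np.einsum('ij,ij->j', L, M)
j_sloreta = j_mne ** 2 / res_diag

# Stand-in noise-floor estimate (MAD) and thresholding of the standardized map;
# the 3-sigma cut-off is a hypothetical choice for this demo.
noise_level = 1.4826 * np.median(np.abs(j_sloreta - np.median(j_sloreta)))
threshold = np.median(j_sloreta) + 3.0 * noise_level
active = np.flatnonzero(j_sloreta > threshold)
print("Sources surviving the threshold:", active)
```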