2022
DOI: 10.1007/s11517-022-02570-8
Efficient analysis of COVID-19 clinical data using machine learning models

Abstract: Because of the rapid spread of COVID-19 to almost every part of the globe, huge volumes of data and case studies have been made available, providing researchers with a unique opportunity to find trends and make discoveries like never before by leveraging such big data. This data is of many different varieties and can be of different levels of veracity, e.g., precise, imprecise, uncertain, and missing, making it challenging to extract meaningful information from such data. Yet, efficient analyses of this contin…

Cited by 31 publications (9 citation statements)
References 39 publications
“…K-Means [50,51]: clustering algorithms that can detect complex patterns, based on a partition scheme that groups data into several clusters.
PCA, principal component analysis [26,51-53]: a statistical procedure that relies on linear transformation to reduce the dimensionality of datasets while preserving crucial information.
AE, autoencoders [40]: perform dimensionality reduction similar to PCA; however, unlike PCA, which relies on linear transformation, AEs carry out non-linear transformation using deep neural networks.
SOM, self-organizing maps [54,55]: an unsupervised machine learning technique that clusters high-dimensional data into low-dimensional outputs, with a structure similar to artificial neural networks (ANNs); the difference is that SOMs use competitive learning, whereas ANNs use error-correction learning such as back-propagation with gradient descent.
LDA, Latent Dirichlet Allocation [56]: a Bayesian unsupervised clustering method, often employed to group the topics of a set of documents into clusters.
t-SNE, t-stochastic neighborhood embedding [57]: an unsupervised non-linear embedding for dimensionality reduction; it embeds points from a higher dimension into a lower dimension while trying to preserve the local structure of the data.
UMAP, uniform manifold approximation and projection [58,59]: a flexible non-linear dimension reduction algorithm based on Riemannian geometry and algebraic topology that learns the manifold structure of the data and finds a low-dimensional embedding preserving the essential topological structure of that manifold.
RFF, Random Fourier Features [60]: an approximate kernel method that maps the given data to a low-dimensional randomized feature space based on a Euclidean inner product space.
Docquier, Golenvaux, and Nijssen use a PCA analysis to reduce the dimensionality of the origin- and destination-specific containment measures, extract the first two principal components, and propose that the first component can be interpreted as an average index of the stringency of containment measures, while the second captures testing and tracing policies [18]. Trajanoska, Trajanov, and Eftimov cluster countries with similarly balanced diets using SOM.…”
Section: Methods [Source] Explanation (citation type: mentioning)
confidence: 99%
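The statement above reads as a compact glossary of unsupervised techniques. As a minimal sketch of the kind of pipeline it describes (dimensionality reduction followed by clustering), the snippet below chains scikit-learn's PCA and KMeans on synthetic data; it is an illustration of the general technique, not code from the cited study, and the sample sizes and parameters are assumptions.

```python
# Minimal sketch: PCA for dimensionality reduction, then K-Means on the
# low-dimensional embedding. Synthetic data; parameters are illustrative only.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(42)
X = rng.normal(size=(300, 50))            # 300 hypothetical samples, 50 features

pca = PCA(n_components=2)                 # linear projection onto 2 principal components
X_2d = pca.fit_transform(X)

kmeans = KMeans(n_clusters=3, n_init=10, random_state=42)
labels = kmeans.fit_predict(X_2d)         # cluster the reduced representation

print("explained variance ratio:", pca.explained_variance_ratio_)
print("cluster sizes:", np.bincount(labels))
```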
“…Converting input data into fixed-length numerical vectors for applying different machine learning algorithms such as classification and clustering is a common practice across numerous fields like smart grid [14], [15], graph analytics [16], [17], [18], [19], [20], electromyography [21], clinical data analysis [22], network security [23], and text classification [24]. Authors in [5] use the position weight matrix-based approach to compute feature embeddings for spike sequences.…”
Section: Related Work (citation type: mentioning)
confidence: 99%
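To make the idea of fixed-length numerical vectors concrete, here is a hedged sketch of one common embedding, a k-mer frequency vector for protein-like sequences. It illustrates the general practice the statement mentions; it is not the position-weight-matrix approach of [5], and the alphabet, k, and example sequence are illustrative assumptions.

```python
# Sketch: map a variable-length sequence to a fixed-length vector of
# normalised k-mer counts. Generic illustration, not the embedding of [5].
from itertools import product

ALPHABET = "ACDEFGHIKLMNPQRSTVWY"  # assumed amino-acid alphabet

def kmer_vector(seq: str, k: int = 2) -> list[float]:
    """Return a fixed-length (len(ALPHABET)**k) vector of k-mer frequencies."""
    kmers = ["".join(p) for p in product(ALPHABET, repeat=k)]
    index = {km: i for i, km in enumerate(kmers)}
    vec = [0.0] * len(kmers)
    for i in range(len(seq) - k + 1):
        km = seq[i:i + k]
        if km in index:                   # skip ambiguous characters
            vec[index[km]] += 1.0
    total = sum(vec) or 1.0
    return [v / total for v in vec]       # normalise away length differences

embedding = kmer_vector("MFVFLVLLPLVSSQCVNLT")  # toy spike-like fragment
print(len(embedding))                            # 400 dimensions, fixed length
```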
“…The probable number of COVID-19 cases is predicted with the help of different deep learning algorithms, such as recurrent neural network (RNN) and long short-term memory (LSTM) networks. Sarwan Ali et al. [3] efficiently analyse the COVID-19 clinical data using machine learning algorithms. In big data analysis, the use of machine learning algorithms is considered a natural approach that quickly extracts the relevant information from the dataset to perform the analysis.…”
Section: Literature Review (citation type: mentioning)
confidence: 99%
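As a hedged sketch of the LSTM-based case prediction this statement refers to, the PyTorch snippet below trains a small LSTM to forecast the next value of a synthetic case-count series from a 14-day window. The model size, window length, and data are assumptions for illustration, not the cited authors' setup.

```python
# Sketch: one-step-ahead forecasting of a (synthetic) case-count series with an LSTM.
import torch
import torch.nn as nn

torch.manual_seed(0)
series = torch.cumsum(torch.rand(200), dim=0)        # synthetic "cumulative cases"
series = (series - series.mean()) / series.std()     # normalise

def make_windows(ts, window=14):
    """Slice a 1-D series into (14-day window -> next value) training pairs."""
    xs, ys = [], []
    for i in range(len(ts) - window):
        xs.append(ts[i:i + window])
        ys.append(ts[i + window])
    return torch.stack(xs).unsqueeze(-1), torch.stack(ys).unsqueeze(-1)

X, y = make_windows(series)                          # X: (N, 14, 1), y: (N, 1)

class CaseLSTM(nn.Module):
    def __init__(self, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):
        out, _ = self.lstm(x)                        # out: (N, window, hidden)
        return self.head(out[:, -1, :])              # predict from the last time step

model = CaseLSTM()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(50):                              # short demo training loop
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()

print(f"final training MSE: {loss.item():.4f}")
```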