2022
DOI: 10.2174/2666796701999201204115545
Therapeutic Options for the Treatment of 2019-Novel Coronavirus in India: A Review

Abstract: Purpose: From 30 January to 31 May 2020, more than 182,143 confirmed cases of COVID-19 were reported in India, along with 86,984 recoveries and 5,164 deaths. More than 53 other countries are also affected by this pandemic virus. However, the lack of specific drugs to prevent or treat this pandemic disease is a major problem in the current scenario. In this regard, this systematic review was conducted to identify the therapeutic approaches and research ongoing in India against COVID-19. Methods: …

Cited by 2 publications (1 citation statement)
References 35 publications
“…[4] On the one hand, compared to GBDT, which uses only first-order derivatives, the XGBoost algorithm uses second-order derivatives to make the loss function more accurate, and it adds a regularization term Ω(f_k) to the objective function to control the complexity of the model and avoid overfitting. On the other hand, XGBoost is an advanced version of the gradient boosting decision tree: it grows the decision tree by repeated feature splitting and, in the process of learning each tree, fits the error between the actual values and the model's predicted values to improve prediction accuracy, as shown in Figure 1 and Figure 2. [5] Figure 1: XGBoost algorithm integration flow (left); Figure 2: flowchart of the gradient boosting decision tree model (right).…”
Section: Principles of the Methodology
Classification: mentioning, confidence: 99%
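
The quoted passage describes the two pieces of XGBoost's design: a second-order (Newton-style) approximation of the loss, and a regularization term Ω(f_k) = γT + ½λ‖w‖², where T is the number of leaves in tree f_k and w its leaf weights. As a minimal illustrative sketch (not taken from the cited papers; it assumes the open-source `xgboost` and `scikit-learn` packages and uses synthetic data), the snippet below shows how those two penalty components map onto XGBoost's `gamma` and `reg_lambda` parameters:

```python
# Minimal sketch: XGBoost's regularization term Omega(f) = gamma*T + 0.5*lambda*||w||^2
# is exposed through the `gamma` and `reg_lambda` parameters. Synthetic data,
# purely for illustration.
import numpy as np
import xgboost as xgb
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))
y = X[:, 0] ** 2 + np.sin(X[:, 1]) + 0.1 * rng.normal(size=1000)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = xgb.XGBRegressor(
    n_estimators=200,   # number of boosted trees f_k in the ensemble
    learning_rate=0.1,
    max_depth=4,
    gamma=1.0,          # gamma: penalty per leaf (the gamma*T term in Omega)
    reg_lambda=1.0,     # lambda: L2 penalty on leaf weights (0.5*lambda*||w||^2)
)
model.fit(X_train, y_train)

pred = model.predict(X_test)
print("test MSE:", mean_squared_error(y_test, pred))
```

Raising `gamma` or `reg_lambda` shrinks or prunes the trees, which is exactly the complexity control the quoted passage attributes to Ω(f_k).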