Stationary Mahalanobis kernel SVM for credit risk evaluation (2018)
DOI: 10.1016/j.asoc.2018.07.005

Cited by 48 publications (21 citation statements)
References 21 publications
“…Then, only b must be found to obtain the prediction function f(x). According to the Karush-Kuhn-Tucker (KKT) conditions [18], several equations are obtained as follows:…”
Section: SVR
confidence: 99%
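For reference, the KKT conditions of standard ε-SVR that this excerpt alludes to recover the bias b from any support vector whose multiplier lies strictly inside (0, C). A standard formulation (not quoted from the cited work) is:

```latex
% prediction function: f(x) = \sum_j (\alpha_j - \alpha_j^*)\,K(x_j, x) + b
% from the KKT conditions, for any multiplier strictly between 0 and C:
b = y_i - \sum_j (\alpha_j - \alpha_j^*)\,K(x_j, x_i) - \varepsilon,
    \quad \text{for } 0 < \alpha_i < C,
b = y_i - \sum_j (\alpha_j - \alpha_j^*)\,K(x_j, x_i) + \varepsilon,
    \quad \text{for } 0 < \alpha_i^* < C.
```

In practice b is often averaged over all such unbounded support vectors for numerical stability.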
“…Despite major advances in credit scoring model development and accuracy, the use of such models in actual practice remains a challenge. This limited uptake is attributed to many of the computational methods currently developed, such as Neural Networks, Bayesian Networks and Genetic Algorithms, operating in a black-box manner, making it difficult for users to gain a basic understanding of the system [5]. This approach leaves the model developer as, at best, the only one knowledgeable about the operation of the system, which consequently shuts out third-party users.…”
Section: Problem Statement
confidence: 99%
“…5 would have a degree of membership of about 0.3 for the Very Low (VLOW) membership function and a membership of 0.7 for the Low membership function.…”
confidence: 99%
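The degrees of membership in this excerpt behave like overlapping triangular fuzzy membership functions. The sketch below assumes a hypothetical 0–20 input scale with VLOW peaking at 0 and LOW peaking at 10; these breakpoints are illustrative assumptions, not taken from the cited paper:

```python
def tri(x, a, b, c):
    """Triangular membership function: 0 at a, peak 1 at b, back to 0 at c."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)   # rising edge
    return (c - x) / (c - b)       # falling edge

# hypothetical fuzzy sets on a 0-20 scale (assumed, not from the paper)
def vlow(x):
    return tri(x, -10.0, 0.0, 10.0)   # peak at 0, fades out by 10

def low(x):
    return tri(x, 0.0, 10.0, 20.0)    # peak at 10

# under these assumed sets, an input of 7 is 0.3 VLOW and 0.7 LOW
memberships = (vlow(7.0), low(7.0))
```

With complementary overlapping triangles like these, the two membership degrees of any point in the overlap region sum to 1, which matches the 0.3/0.7 split described in the excerpt.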
“…Through a learning algorithm, it can automatically search for support vectors that have a superior discriminating ability to classify. A classifier constructed using SVM maximises the distance between classes and has higher classification accuracy and better generalization; hence, it can obtain a superior learning and classification effect under small-sample learning conditions and can avoid the curse of dimensionality and over-learning problems (Guneralp et al, 2013; Jiang et al, 2018; Salimi et al, 2018). Applying SVM theory to hyperspectral remote sensing image classification can effectively overcome the Hughes phenomenon and achieve superior results compared to conventional statistical pattern classification methods (Dalponte et al, 2008; Melgani & Bruzzone, 2004).…”
Section: Introduction
confidence: 99%
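The margin-maximisation idea this excerpt describes can be sketched with a minimal Pegasos-style sub-gradient solver for the linear soft-margin SVM. The toy data, regularisation constant, and epoch count below are illustrative assumptions, not details from the cited papers:

```python
def pegasos(data, lam=0.1, epochs=100):
    """Pegasos-style sub-gradient descent on the linear soft-margin SVM
    objective (bias-free variant): lam/2 * ||w||^2 + average hinge loss."""
    w = [0.0] * len(data[0][0])
    t = 0
    for _ in range(epochs):
        for x, y in data:
            t += 1
            eta = 1.0 / (lam * t)  # standard Pegasos step size schedule
            margin = y * sum(wi * xi for wi, xi in zip(w, x))
            # shrink step from the regulariser's gradient
            w = [(1.0 - eta * lam) * wi for wi in w]
            if margin < 1.0:       # hinge sub-gradient is active
                w = [wi + eta * y * xi for wi, xi in zip(w, x)]
    return w

# toy data, linearly separable through the origin
data = [((2.0, 2.0), 1), ((3.0, 1.0), 1), ((1.0, 3.0), 1),
        ((-2.0, -2.0), -1), ((-3.0, -1.0), -1), ((-1.0, -3.0), -1)]
w = pegasos(data)
correct = all(y * sum(wi * xi for wi, xi in zip(w, x)) > 0 for x, y in data)
```

The shrink step pulls w toward zero (widening the margin) while the hinge step pushes misclassified or margin-violating points outward, so the solver settles on a separator with all training margins positive.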