2018
DOI: 10.1016/j.ins.2018.04.063
Semi-supervised active learning for support vector machines: A novel approach that exploits structure information in data

Cited by 33 publications (14 citation statements)
References 17 publications
“…This is because, in some practical cases, it is difficult to obtain sufficient fault data and labels. Moreover, active learning [295][296][297][298] and transfer learning [230,299], which can address real-life fault detection and diagnosis cases using unlabeled data, should be seriously considered. However, negative transfer should be avoided in engineering scenarios.…”
Section: Discussion
confidence: 99%
“…For this reason, Zhou et al [16] proposed a safe semi-supervised support vector machine (S4VM). Bernhard Sick et al [17] showed that a semi-supervised support vector machine (SemiSVM) can effectively exploit structure information in data and greatly improve identification performance using unlabeled data. On the other hand, the large number of instruments in the actual chemical process introduces noise and redundant process variables.…”
Section: Background and Significance
confidence: 99%
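The SemiSVM cited above exploits cluster structure in unlabeled data. A minimal stand-in for that idea, not the paper's algorithm, is scikit-learn's generic self-training wrapper around an SVM; the two-moons dataset and all parameter values below are illustrative assumptions:

```python
import numpy as np
from sklearn.datasets import make_moons
from sklearn.svm import SVC
from sklearn.semi_supervised import SelfTrainingClassifier

# Two-moons data: cluster structure that unlabeled points can reveal.
X, y = make_moons(n_samples=400, noise=0.1, random_state=0)

# Hide most labels: -1 marks a sample as unlabeled for SelfTrainingClassifier.
rng = np.random.default_rng(0)
y_partial = np.full_like(y, -1)
labeled = rng.choice(len(y), size=20, replace=False)
y_partial[labeled] = y[labeled]

# Wrap an SVM (probability estimates required) in a self-training loop:
# confidently pseudo-labeled points join the training set each round.
base = SVC(kernel="rbf", gamma=2.0, probability=True, random_state=0)
model = SelfTrainingClassifier(base, threshold=0.9)
model.fit(X, y_partial)

acc = float((model.predict(X) == y).mean())
print(f"accuracy with 20 labels + unlabeled data: {acc:.3f}")
```

The point of the sketch is that the decision boundary settles between the two clusters even though only 20 of 400 points carry labels, which is the "structure information" the excerpt refers to.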
“…The process includes 22 continuous measured variables (Table 2) and 20 preset fault modes (Table 3). Among them: CMV (3) E feed; CMV (4) A and C feed; CMV (5) Recycle flow; CMV (6) Reactor feed; CMV (7) Reactor pressure; CMV (8) Reactor level; CMV (9) Reactor temperature; CMV (10) Purge flow; CMV (11) Separator temperature; CMV (14) Separator underflow; CMV (15) Stripper level; CMV (16) Stripper pressure; CMV (17) Stripper underflow; CMV (18) Stripper temperature; CMV (19) Stripper steam flow; CMV (20) Compressor work; CMV (21) Reactor cooling water outlet temperature; CMV (22) Condenser cooling water outlet temperature. Figure 4 shows that the total variance contribution rate of the first 12 principal components has reached 83.15% (more than 80%), so the first 12 principal components can reflect information about all variables. Figure 5 shows the eigenvalues of the first 12 principal components as well.…”
Section: Process Description
confidence: 99%
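The excerpt retains principal components until their cumulative variance contribution exceeds 80%. A sketch of that selection rule, using PCA via SVD on a synthetic stand-in for the 22 process measurements (the real Tennessee Eastman data is not bundled here, so the correlated random matrix below is an assumption):

```python
import numpy as np

rng = np.random.default_rng(1)
# Stand-in for the 22 process measurements: a few latent factors
# drive correlated columns, so a few principal components dominate.
latent = rng.normal(size=(500, 4))
mixing = rng.normal(size=(4, 22))
X = latent @ mixing + 0.1 * rng.normal(size=(500, 22))

# PCA via SVD of the centered data matrix.
Xc = X - X.mean(axis=0)
_, s, _ = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)
cumulative = np.cumsum(explained)

# Keep the smallest number of components whose cumulative
# variance contribution exceeds 80%, as in the excerpt.
k = int(np.searchsorted(cumulative, 0.80) + 1)
print(f"{k} components explain {cumulative[k-1]:.2%} of the variance")
```

On the real process data the same rule yields 12 components at 83.15%; here the synthetic data needs far fewer because only four latent factors generate it.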
“…Generally, multiple semi-supervised terms lead to major empirical improvements on real-world tasks [21], [22]. SALSVM [23] and SELSVM [24] implement an active learning and an ensemble learning technique for SVM, respectively. Unfortunately, these offline semi-supervised learning algorithms are still unable to directly address long-running, large-scale OS²L problems because of time and memory constraints.…”
Section: Introduction
confidence: 99%
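The active learning for SVM mentioned here (SALSVM in the excerpt) is typically pool-based: query the label of the unlabeled point the current model is least certain about. The following is a generic margin-based uncertainty-sampling sketch, not the cited paper's method; the dataset, query budget, and kernel choice are all assumptions:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

# Synthetic binary task standing in for a labeled/unlabeled pool.
X, y = make_classification(n_samples=300, n_features=5, random_state=0)
rng = np.random.default_rng(0)

labeled = list(rng.choice(len(y), size=10, replace=False))
pool = [i for i in range(len(y)) if i not in labeled]

clf = SVC(kernel="linear")
for _ in range(20):
    clf.fit(X[labeled], y[labeled])
    # Query the pool point closest to the decision boundary
    # (smallest |decision_function|), i.e. the most uncertain one.
    margins = np.abs(clf.decision_function(X[pool]))
    pick = pool.pop(int(np.argmin(margins)))
    labeled.append(pick)

clf.fit(X[labeled], y[labeled])
acc = clf.score(X, y)
print(f"accuracy after {len(labeled)} queried labels: {acc:.3f}")
```

Margin sampling concentrates the label budget near the separating hyperplane, which is why active SVM variants often reach a given accuracy with far fewer labels than random sampling.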