2024
DOI: 10.1109/tsc.2023.3331020
JARVIS: Joining Adversarial Training With Vision Transformers in Next-Activity Prediction

Vincenzo Pasquadibisceglie,
Annalisa Appice,
Giovanna Castellano
et al.

Abstract: In this paper, we propose a novel predictive process monitoring approach, named JARVIS, that is designed to achieve a balance between accuracy and explainability in the task of next-activity prediction. To this aim, JARVIS represents different process executions (traces) as patches of an image and uses this patch-based representation within a multi-view learning scheme combined with Vision Transformers (ViTs). Using multi-view learning we guarantee good accuracy by leveraging the variety of information recorde…

Cited by 2 publications (1 citation statement)
References 36 publications
“…Cremer et al tested an architecture on 3 datasets for drug toxicity classification [79]. A variation of Attention Rollout was employed by Pasquadibisceglie et al to generate heatmaps in a framework for next-activity prediction in process monitoring [80]. Attention Rollout was employed in conjunction with Grad-CAM (see Section 5.3) by Neto et al to detect metaplasia in upper gastrointestinal endoscopy [81].…”
Section: Attention-based Methods
Mentioning confidence: 99%