2021 IEEE International Conference on Big Data (Big Data)
DOI: 10.1109/bigdata52589.2021.9671925
Soft Sensing Transformer: Hundreds of Sensors are Worth a Single Word

Cited by 16 publications (4 citation statements)
References 17 publications
“…New transformer models have been developed, such as the Soft Sensing Transformer Model [24], which aims to transform sensor readings into structured formats in the same way that sentences are structured in natural language, thus providing a novel method to handle sensor data in industrial settings. Although this model demonstrates the potential of machine learning techniques in interpreting and structuring sensor data, it does not consider end users and their information needs.…”
Section: Related Work
confidence: 99%
“…One task similar to ours is soft sensing classification. Soft Sensing Transformer (SST) [20] demonstrates the similarities between sensor readings and text data, and applies Transformer encoder [3] into this task. ConFormer [21] integrates the structures of both CNN and Transformer.…”
Section: Related Work
confidence: 99%
“…In the wafer manufacturing setting, advancements in state-of-the-art soft-sensing models have recently been made, but with a focus on classification, which is distinct from our soft-sensing regression problem. One such model, Soft-sensing Transformer (SST) [21], utilizes a Transformer encoder [3] to demonstrate the similarities between sensor readings and text data. Another model, ConFormer [22], leverages multi-head convolution modules to achieve fast and lightweight operations while still being able to learn robust representations through multi-head design, similar to transformers.…”
Section: Related Work
confidence: 99%
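
The excerpts above describe the Soft Sensing Transformer (SST) as treating a vector of readings from many sensors like a tokenized sentence and passing it through a Transformer encoder for a soft-sensing classification task. The sketch below illustrates that general idea in PyTorch; it is not the architecture from the paper, and the layer sizes, the per-sensor learned embedding, the mean pooling, and the two-class head are illustrative assumptions.

```python
# Minimal sketch (not the authors' code): sensor readings as "tokens"
# fed to a standard Transformer encoder for soft-sensing classification.
# All dimensions and layer choices are illustrative assumptions.
import torch
import torch.nn as nn

class SoftSensingEncoderSketch(nn.Module):
    def __init__(self, num_sensors=128, d_model=64, nhead=4,
                 num_layers=2, num_classes=2):
        super().__init__()
        # Each scalar sensor reading is projected to a d_model "token" embedding.
        self.value_proj = nn.Linear(1, d_model)
        # A learned embedding per sensor plays the role of positional encoding.
        self.sensor_embed = nn.Embedding(num_sensors, d_model)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=num_layers)
        self.classifier = nn.Linear(d_model, num_classes)

    def forward(self, readings):
        # readings: (batch, num_sensors) raw sensor values
        batch, num_sensors = readings.shape
        tokens = self.value_proj(readings.unsqueeze(-1))   # (B, S, d_model)
        ids = torch.arange(num_sensors, device=readings.device)
        tokens = tokens + self.sensor_embed(ids)           # add sensor identity
        hidden = self.encoder(tokens)                      # (B, S, d_model)
        pooled = hidden.mean(dim=1)                        # simple mean pooling
        return self.classifier(pooled)                     # class logits

# Usage: classify outcomes from 128 simulated sensor readings.
model = SoftSensingEncoderSketch()
logits = model(torch.randn(8, 128))   # -> shape (8, 2)
```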