12th IEEE International on-Line Testing Symposium (IOLTS'06)
DOI: 10.1109/iolts.2006.32
Evaluating One-Hot Encoding Finite State Machines for SEU Reliability in SRAM-based FPGAs

Cited by 60 publications (22 citation statements)
References 4 publications
“…Certain features, such as protocol, service or flag, are not presented numerically, which is why one-hot coding was used [47]. In addition, some of the characteristics, such as duration or src bytes (sbytes), present data with widely dispersed values over a wide numerical range, so they are normalized by both the min-max function…”
Section: Data Set Under Study (mentioning)
confidence: 99%
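The preprocessing the excerpt describes — one-hot coding for non-numeric features and min-max scaling for widely dispersed numeric ones — can be sketched as follows. This is a minimal illustration, not the cited paper's implementation; the feature values below are invented, with only the feature roles (categorical `protocol`, numeric `duration`) taken from the quote.

```python
def one_hot(values):
    """Map each categorical value to a one-hot indicator vector."""
    categories = sorted(set(values))          # fixed column order
    index = {c: i for i, c in enumerate(categories)}
    rows = [[1 if index[v] == j else 0 for j in range(len(categories))]
            for v in values]
    return rows, categories

def min_max(values):
    """Scale widely dispersed numeric values into the [0, 1] range."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1                     # guard against constant columns
    return [(v - lo) / span for v in values]

# Hypothetical samples of a categorical and a numeric feature:
protocols = ["tcp", "udp", "tcp", "icmp"]
encoded, cats = one_hot(protocols)            # each row is a 3-dim 0/1 vector
durations = [0.0, 12.5, 3000.0, 42.0]
scaled = min_max(durations)                   # min maps to 0.0, max to 1.0
```

Min-max scaling keeps the relative spacing of values while bounding the range, which is why it pairs naturally with 0/1 one-hot columns in a single feature matrix.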
“…Ideal methods would be robust to handling mixed data types as each value may contain meaningful information. Alternatively, data may be altered to a single type by using methods such as the categorization of continuous variables, and dummy or one-hot encoding [57] of continuous variables. However, the drawback of this approach is that the type of variable encoding used has been shown to affect results [11].…”
Section: Challenges (mentioning)
confidence: 99%
“…Its principle is to map words to fixed dense data vectors, namely Word Embedding [10]. Compared with traditional one-hot encoding [11,12], the bag-of-words model [13], and the vector space model [14,15], etc., the distributed representation possesses relatively good semantic feature expression competence, and in the meantime the data form can be read and processed efficiently by neural networks. In recent years, word vectors have become widely popular in the field of text semantic modeling, which can be attributed to Google's open-source Word2vec word vector tool [16].…”
Section: Introduction (mentioning)
confidence: 99%
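The limitation of one-hot encoding that the excerpt alludes to can be made concrete: every word becomes a vocabulary-sized vector, and any two distinct words are orthogonal, so no semantic similarity is expressed — unlike dense embeddings, where related words can have a high inner product. A small sketch (the vocabulary words are illustrative, not from the cited work):

```python
def one_hot_vocab(vocab):
    """One-hot word vectors: each word is a |V|-dimensional 0/1 vector."""
    return {w: [1 if i == j else 0 for j in range(len(vocab))]
            for i, w in enumerate(vocab)}

def dot(u, v):
    """Inner product, a simple proxy for similarity between vectors."""
    return sum(a * b for a, b in zip(u, v))

vocab = ["king", "queen", "apple"]
vecs = one_hot_vocab(vocab)
# Under one-hot, "king" and "queen" are exactly as dissimilar as
# "king" and "apple": all distinct word pairs are orthogonal.
```

Dense embeddings such as those produced by Word2vec trade this sparse, similarity-blind representation for low-dimensional real-valued vectors learned from context.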