1991
DOI: 10.1016/0020-0190(91)90073-q
SLR(k) covering for LR(k) grammars

Cited by 7 publications (16 citation statements). References 0 publications.
“…One way to directly assess epistemic uncertainty is to calculate the distance between training and testing activations. As a model is unlikely to produce reasonable outputs for features far from any seen during training, this is a reliable signal for bad model performance ( Lee et al, 2018b ).…”
Section: Methods (confidence: 99%)
“…Model activations have covariance, and they do not necessarily resemble the mode for high-dimensional spaces ( Wei et al, 2015 ), so the Euclidean distance is not appropriate for identifying unusual activation patterns. Instead, inspired by the work of Lee et al (2018b) , we make use of the Mahalanobis distance , which rescales samples into a space without covariance. Fig.…”
Section: Methods (confidence: 99%)
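The check described in the quote above — measuring how far a test activation lies from the training distribution after rescaling away covariance — can be sketched in a few lines of numpy. This is a minimal illustration, not the cited authors' code; the function name and the regularization constant `eps` are our own choices.

```python
import numpy as np

def mahalanobis_distance(train_acts, test_acts, eps=1e-6):
    """Mahalanobis distance of each test activation from the
    training-activation distribution.

    Unlike Euclidean distance, this rescales by the inverse covariance
    of the training activations, so correlated, high-variance
    directions do not dominate the distance.
    """
    mu = train_acts.mean(axis=0)
    cov = np.cov(train_acts, rowvar=False)
    # Regularize before inverting in case the covariance is singular.
    cov_inv = np.linalg.inv(cov + eps * np.eye(cov.shape[0]))
    diff = test_acts - mu
    return np.sqrt(np.einsum("id,de,ie->i", diff, cov_inv, diff))

# Toy check: far-from-training samples get much larger distances.
rng = np.random.default_rng(0)
train = rng.normal(size=(500, 8))
in_dist = rng.normal(size=(5, 8))
far_out = rng.normal(loc=10.0, size=(5, 8))
assert mahalanobis_distance(train, far_out).mean() > \
       mahalanobis_distance(train, in_dist).mean()
```

Large distances signal inputs unlike anything seen during training, which is the "reliable signal for bad model performance" the quote refers to.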
“…Softmax-based classifiers are known to generate overconfident posterior distributions when presented with out-of-distribution detection (ODD) data. To address this issue, a generative classifier, specifically Gaussian discriminant analysis, and a Mahalanobis distance-based framework were utilized to obtain confidence scores, as described by Lee et al (45). To evaluate the performance of this approach, 164 KSα sequences were designated ODD data, and 164 KSβ sequences were designated in-distribution (ID) data.…”
Section: Detection of T2PKs with Potentially New Skeletons (confidence: 99%)
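The Lee et al. framework the quote refers to treats the feature space as a generative classifier: class-conditional Gaussians with a shared (tied) covariance, with the confidence score taken as the negative Mahalanobis distance to the nearest class mean. The sketch below shows that idea under those assumptions; the function names and toy data are ours, not from the cited work.

```python
import numpy as np

def fit_gda(feats, labels):
    """Fit class-conditional Gaussians with a tied covariance
    (Gaussian discriminant analysis over feature vectors)."""
    classes = np.unique(labels)
    means = {c: feats[labels == c].mean(axis=0) for c in classes}
    # Pool the class-centered residuals to estimate one shared covariance.
    centered = np.vstack([feats[labels == c] - means[c] for c in classes])
    cov = np.cov(centered, rowvar=False) + 1e-6 * np.eye(feats.shape[1])
    return means, np.linalg.inv(cov)

def confidence_score(x, means, cov_inv):
    """Negative squared Mahalanobis distance to the closest class mean;
    low scores flag likely out-of-distribution inputs."""
    return -min((x - mu) @ cov_inv @ (x - mu) for mu in means.values())

# Toy check: an in-distribution sample scores higher than a far-away one.
rng = np.random.default_rng(1)
feats = np.vstack([rng.normal(loc=0.0, size=(200, 4)),
                   rng.normal(loc=5.0, size=(200, 4))])
labels = np.array([0] * 200 + [1] * 200)
means, cov_inv = fit_gda(feats, labels)
id_score = confidence_score(rng.normal(loc=0.0, size=4), means, cov_inv)
ood_score = confidence_score(rng.normal(loc=20.0, size=4), means, cov_inv)
assert id_score > ood_score
```

Thresholding this score separates in-distribution inputs from out-of-distribution ones, which is how the cited work distinguishes the designated ID and OOD sequence sets.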
“…However, such models may demonstrate overconfidence in discerning novel KSβ sequences in the open world(27). To address this concern, an ODD framework based on the Mahalanobis distance was implemented for multiclass novelty detection(45). Notably, certain samples (from 2597 KSβ sequences) are proximal to the labeled data (from 164 KSβ sequences) because such labeled T2PKs with entirely novel carbon skeletons have only been discovered in recent years, such as formicamycin(36) and dendrubin(46).…”
(confidence: 99%)