2018
DOI: 10.1007/978-3-030-03769-7_23
Evaluating Perception Systems for Autonomous Vehicles Using Quality Temporal Logic

Cited by 38 publications (41 citation statements)
References 11 publications
“…A formal language for specifying requirements on the performance of object detection in the absence of reference data is proposed in Refs. [120,121].…”
Section: Concrete Approaches of Specifying Perception (mentioning)
confidence: 99%
“…While there is work on evaluating performance of perception with temporal logic, those formal specifications are defined over image data streams, and must be manually formalized for each scenario / data stream [15], [16]. Often, there is high variability in the performance of perception models in seemingly similar environments, such as variations in sun angle [17].…”
Section: Introduction (mentioning)
confidence: 99%
“…Timed Quality Temporal Logic (TQTL) [5], and Spatio-temporal Quality Logic (STQL) [14] are extensions to MTL that incorporate the semantics for reasoning about data from perception systems specifically. In STQL, which is in itself an extension of TQTL, the syntax defines operators to reason about discrete IDs and classes of objects, along with set operations on the spatial artifacts, like bounding boxes, outputted by perception systems.…”
Section: Introduction (mentioning)
confidence: 99%
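The kind of reasoning the statement above attributes to TQTL/STQL can be illustrated with a minimal sketch. The frame layout, the `Detection` fields, and the `car_persists` helper below are hypothetical simplifications, not the actual TQTL or STQL semantics, which quantify over object IDs and evaluate quality functions over perception data streams.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    obj_id: int   # tracker-assigned object ID
    cls: str      # predicted class label
    score: float  # detection confidence in [0, 1]

# A perception data stream is modeled here as a list of frames,
# where each frame is a list of detections.
Frame = list

def car_persists(stream: list[Frame], horizon: int = 3,
                 hi: float = 0.9, lo: float = 0.7) -> bool:
    """Simplified TQTL-style check: whenever an object is detected as a
    car with score >= hi, the same object ID must reappear with
    score >= lo in each of the next `horizon` frames."""
    for t, frame in enumerate(stream):
        for det in frame:
            if det.cls == "car" and det.score >= hi:
                for future in stream[t + 1 : t + 1 + horizon]:
                    if not any(d.obj_id == det.obj_id and d.score >= lo
                               for d in future):
                        return False  # persistence requirement violated
    return True

# Example: object 7 is detected strongly at frame 0 but disappears
# at frame 2, so the property is falsified.
stream = [
    [Detection(7, "car", 0.95)],
    [Detection(7, "car", 0.80)],
    [Detection(3, "pedestrian", 0.60)],
]
print(car_persists(stream, horizon=2))  # False
```

This mirrors, in a very reduced form, the idea of writing requirements over detection IDs, classes, and confidences rather than over ground-truth labels; the full logics additionally support spatial operators over bounding boxes.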
“…1. We show how TQTL [5] and STQL [14] can be used to express correctness properties for perception algorithms. 2.…”
Section: Introduction (mentioning)
confidence: 99%