2009
DOI: 10.1109/tac.2009.2015563

Influence of Sampling Period on a Class of Optimal Fault-Detection Performance

Abstract: In this note, the influence of the sampling period on a widely accepted class of optimal fault-detection performance indices is studied. The study is motivated by the important role the sampling period plays in embedded networked control systems. It is shown that the optimal fault-detection performance index degrades if the sampling period is increased by an integer multiple. The main tool used in the analysis is the lifting technique, which bridges systems with different sampling periods.
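
The lifting technique named in the abstract converts a discrete-time system sampled at period h into an equivalent system at period Nh by stacking N consecutive inputs and outputs. The sketch below is a minimal illustration of that standard construction, not the authors' implementation; the function name and interface are assumptions for this example.

```python
# Minimal sketch of lifting a discrete-time LTI system (A, B, C, D)
# from sampling period h to period N*h. Illustrative only.
import numpy as np

def lift_lti(A, B, C, D, N):
    """Return (A_l, B_l, C_l, D_l) such that, with stacked input
    U_k = [u_{kN}; ...; u_{kN+N-1}] and stacked output Y_k,
    x_{(k+1)N} = A_l x_{kN} + B_l U_k  and  Y_k = C_l x_{kN} + D_l U_k.
    """
    n, m = B.shape
    p = C.shape[0]
    A_l = np.linalg.matrix_power(A, N)
    # B_l = [A^{N-1} B, A^{N-2} B, ..., B]
    B_l = np.hstack([np.linalg.matrix_power(A, N - 1 - j) @ B for j in range(N)])
    # C_l = [C; C A; ...; C A^{N-1}]
    C_l = np.vstack([C @ np.linalg.matrix_power(A, i) for i in range(N)])
    # D_l is block lower triangular: block (i, j) is D on the diagonal,
    # C A^{i-j-1} B below it, and zero above it.
    D_l = np.zeros((N * p, N * m))
    for i in range(N):
        for j in range(i + 1):
            block = D if i == j else C @ np.linalg.matrix_power(A, i - j - 1) @ B
            D_l[i * p:(i + 1) * p, j * m:(j + 1) * m] = block
    return A_l, B_l, C_l, D_l

if __name__ == "__main__":
    # Sanity check: stepping the original system N times must match
    # one step of the lifted system.
    rng = np.random.default_rng(0)
    A, B = rng.standard_normal((3, 3)) * 0.3, rng.standard_normal((3, 1))
    C, D = rng.standard_normal((2, 3)), rng.standard_normal((2, 1))
    N = 4
    A_l, B_l, C_l, D_l = lift_lti(A, B, C, D, N)
    x = rng.standard_normal(3)
    U = rng.standard_normal((N, 1))
    xs, ys = x.copy(), []
    for k in range(N):
        ys.append(C @ xs + D @ U[k])
        xs = A @ xs + B @ U[k]
    assert np.allclose(xs, A_l @ x + B_l @ U.reshape(-1))
    assert np.allclose(np.concatenate(ys), C_l @ x + D_l @ U.reshape(-1))
```

Because lifting preserves input-output behavior exactly, a system at period h and its lift at period Nh can be compared on common ground, which is what lets the note relate optimal fault-detection performance across sampling periods.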

Cited by 14 publications
References 26 publications