In this note, the influence of the sampling period on a widely accepted class of optimal fault-detection performance indices is studied. The study is motivated by the important role the sampling period plays in embedded networked control systems. It is shown that the optimal fault-detection performance index degrades when the sampling period is multiplied by an integer factor. The main analysis tool is the lifting technique, which relates systems operating at different sampling periods.
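For concreteness, a minimal sketch of such a lifting, assuming the standard $N$-fold lifting of a discrete-time LTI system $x_{k+1} = A x_k + B u_k$, $y_k = C x_k + D u_k$ with sampling period $h$ (the symbols $\xi_j$, $\underline{u}_j$, $\underline{y}_j$, $N$, and $h$ are illustrative and not necessarily the notation of this note): the lifted system evolves with period $Nh$ and reads

\[
\begin{aligned}
\xi_{j+1} &= A^{N}\,\xi_j
  + \begin{bmatrix} A^{N-1}B & A^{N-2}B & \cdots & B \end{bmatrix} \underline{u}_j,\\[2pt]
\underline{y}_j &= \begin{bmatrix} C \\ CA \\ \vdots \\ CA^{N-1} \end{bmatrix} \xi_j
  + \begin{bmatrix}
      D & 0 & \cdots & 0 \\
      CB & D & \cdots & 0 \\
      \vdots & & \ddots & \vdots \\
      CA^{N-2}B & \cdots & CB & D
    \end{bmatrix} \underline{u}_j,
\end{aligned}
\]

where $\xi_j = x_{jN}$, $\underline{u}_j = [\,u_{jN}^{\mathsf T} \ \cdots \ u_{jN+N-1}^{\mathsf T}\,]^{\mathsf T}$, and $\underline{y}_j$ stacks the outputs $y_{jN},\dots,y_{jN+N-1}$ analogously. The lifted system with period $Nh$ is thus expressed exactly in terms of the original system with period $h$, which is what allows fault-detection performance indices at the two sampling rates to be compared.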