This article presents the use of ARM's fast interrupt request (FIQ) to achieve better jitter performance in real-time drivers without applying real-time extension patches to the native Linux kernel. Writing an FIQ interrupt handler is challenging due to the lack of Linux kernel support and the need to avoid page fault exceptions during its execution. We investigate and evaluate a mechanism that employs static mapping for peripherals and changes to the Linux kernel code to allow the FIQ interrupt handler to be written in the C language. Furthermore, the FIQ performance was evaluated by comparing it with a timer interrupt request on Linux PREEMPT-RT in full CONFIG_PREEMPT_RT mode. Both were applied to a Linux driver for data acquisition in a pipeline inspection gauge system. Results show that the FIQ approach reduced the interrupt jitter by 97.49% and, as a result, allowed an increase in the data acquisition frequency from 1024 Hz to 2048 Hz, showing that the FIQ approach can be considered for real-time applications without resorting to real-time extensions.
KEYWORDS: embedded software, FIQ interrupt, jitter, Linux kernel, pipeline inspection gauge, real-time driver
INTRODUCTION

Real-time applications are widely employed in several industrial sectors, such as robotics, 1 aerospace, 2 and automotive, 3 using real-time operating systems (RTOSs). The timing correctness of such systems is traditionally guaranteed by worst-case execution time (WCET) analysis. The purpose of this analysis is to determine whether the system meets its real-time constraints, that is, a short task response time, a short interrupt execution time, and so on. WCET analysis is usually mandatory for hard real-time systems; 4 however, other real-time applications can be less demanding regarding these constraints, and their performance can be evaluated according to their latency and jitter. 5 Jitter, caused by both hardware effects (caches and pipelines) and software effects (disabled interrupts and process synchronization mechanisms), is typically on the order of microseconds, which can be long enough to affect a real-time application. For instance, in real-time control systems, feedback control loops are implemented as periodic tasks, and the impact of jitter on such applications is typically analyzed to ensure the control system's stability. [6][7][8] Latency is defined as the time elapsed between the occurrence of an external event and the correct reaction to that event.

This is an open access article under the terms of the Creative Commons Attribution License, which permits use, distribution and reproduction in any medium, provided the original work is properly cited.