Low-level helicopter operations are typical of military missions, for example forward air medical evacuation. Up to now, such missions have been carried out either in manual flight or, on a few platforms, with rather conservative automation. In the latter case, the pilot's only means of intervention is to take over the helicopter manually. Future platforms can be expected to provide more dynamic low-level automation capabilities. At the same time, the crew will very likely have to fulfill additional tasks, such as managing unmanned systems. This will fully decouple the pilot from the flight control task for periods of time and reduce the ability to take over the helicopter quickly under threat conditions. Automation functions therefore need to be able to avoid threats and alter the planned path on short notice, which further reduces the pilot's comprehensive understanding of the system and the ability to intervene. This paper presents a multimodal cueing concept for human-machine shared control during automatic trajectory-following low-level operations, developed within the project "US-German Advanced Technologies for Rotorcraft Project Agreement". The system enables the helicopter to follow planned low-level paths and provides the pilot with tactile, auditory, and visual cues. The trajectory-following function is complemented with a collision avoidance method to create a "carefree" automation. Intermediate results for the multimodal cueing and interaction concept are presented, gathered from validation sessions and workshops with expert pilots in DLR's Air Vehicle Simulator (AVES).