In this paper, an approach for creating natural interactions is presented that facilitates the authoring of interactive live audiovisual performances. The approach is supported by flexible yet simple software tools and combines spatial mapping, gesture following, and fuzzy logic in a way that enables complex interactive mechanisms to be configured and authored by demonstration. Owing to the connectivity properties of the tools, the approach can easily be incorporated into almost any workflow for creating interactive installations or live performance applications, using sensor and tracking data and combining it with other real-time parameters to drive the behavior of elements with which performers can interact. The approach was used to create a gestural interface for a conductor, enabling him to control a virtual pianist during a live music performance by moving his hands in a natural manner. The interface was configured by having the conductor demonstrate the desired gestures and by feeding the resulting tracking data to the tools. The advantages and shortcomings of the approach are presented and discussed.
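To make the combination of gesture following and fuzzy logic concrete, the following is a minimal, self-contained sketch (plain Python, no external libraries) of how a demonstrated gesture template, a naive follower, and a fuzzy confidence rule could together drive one performance parameter. It is illustrative only: the function names, the triangular membership bounds, and the tempo mapping are assumptions for this sketch, not the actual tools or interface described in the paper.

```python
"""Sketch: mapping-by-demonstration with gesture following + fuzzy logic.
Illustrative assumptions throughout; not the paper's implementation."""

import math


def follow(template, observed):
    """Return normalized progress (0..1) and distance of the observed
    trajectory against a demonstrated template. This is a naive
    nearest-point follower; real gesture followers typically use
    HMMs or dynamic time warping."""
    best_i, best_d = 0, float("inf")
    last = observed[-1]  # most recent tracked hand position
    for i, p in enumerate(template):
        d = math.dist(p, last)
        if d < best_d:
            best_i, best_d = i, d
    progress = best_i / (len(template) - 1)
    return progress, best_d


def triangular(x, a, b, c):
    """Triangular fuzzy membership function over [a, c] peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)


def fuzzy_confidence(distance):
    """Fuzzify the follower's distance into 'close' and 'far' sets,
    then defuzzify with a weighted average (rule: close -> 1, far -> 0).
    Set bounds are arbitrary illustrative values."""
    close = triangular(distance, -0.2, 0.0, 0.3)
    far = triangular(distance, 0.1, 0.5, 10.0)
    total = close + far
    return close / total if total else 0.0


# Demonstration phase: record one gesture (here, a synthetic arc).
template = [(math.cos(t / 10), math.sin(t / 10)) for t in range(30)]

# Performance phase: a noisy partial re-performance of that gesture.
observed = [(x + 0.02, y - 0.01) for x, y in template[:15]]

progress, dist = follow(template, observed)
conf = fuzzy_confidence(dist)
tempo = 60 + 60 * progress * conf  # drive a parameter, e.g. tempo in BPM
print(f"progress={progress:.2f} confidence={conf:.2f} tempo={tempo:.1f}")
```

In this sketch the fuzzy layer acts as a soft gate: the follower's output only influences the controlled parameter in proportion to how well the live motion matches the demonstrated template, which is one plausible way such components could be combined.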