This paper describes an empirical user study comparing the programming efficiency of our proposed domain-specific language with that of a mainstream event-based language for modifying multimodal interactions. Combining observations, interviews, and standardized questionnaires, we measured completion rates, completion times, code-testing effort, and the perceived difficulty of the programming tasks, as well as the perceived usability and learnability of the tool supporting our proposed language. Based on this experience, we propose guidelines for designing comparative user studies of programming languages. The paper also discusses the considerations we took into account when designing a multimodal interaction description language intended to be well regarded by its users.