We present a novel quantitative approach to assessing sensory components on e-commerce websites. The Online Sensory Marketing Index (OSMI) provides a valuable measure of the sensory output exhibited by text, images, and other media. The OSMI enables website creators and marketers to communicate about sensory marketing elements and related components, so websites can be designed for better sensory appeal and to mitigate weaknesses. Our index allows for the creation of sensory templates for various industries. Drawing on a field study of 16 websites in the tech, automotive, fashion, and food industries, we present easy-to-comprehend sensory measures for websites' acoustic and visual elements. Additionally, we introduce a score that quantifies haptic, olfactory, and gustatory components to assess the online sensory consumer experience. We conclude by proposing industry-specific weightings for offline and online sensory parameters and examine quantitative parameters for online sensory overload and deprivation. Our assessment offers a convenient determination of a website's OSMI.
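To make the industry-specific weighting concrete, the following minimal sketch assumes the OSMI aggregates per-sense sub-scores (each on a 0-1 scale) into a single 0-100 score via a weighted sum. The weight values, industry labels, and function names are illustrative assumptions, not the published OSMI formula.

```python
# Hypothetical OSMI aggregation: per-sense sub-scores combined with
# industry-specific weights. All numbers here are assumptions.

INDUSTRY_WEIGHTS = {
    # assumed weights: a food retailer plausibly weights olfactory and
    # gustatory cues higher than a tech retailer would
    "food": {"visual": 0.30, "acoustic": 0.10, "haptic": 0.10,
             "olfactory": 0.25, "gustatory": 0.25},
    "tech": {"visual": 0.45, "acoustic": 0.25, "haptic": 0.20,
             "olfactory": 0.05, "gustatory": 0.05},
}

def osmi_score(sub_scores: dict, industry: str) -> float:
    """Weighted sum of per-sense sub-scores (0-1), scaled to 0-100."""
    weights = INDUSTRY_WEIGHTS[industry]
    return 100 * sum(w * sub_scores.get(sense, 0.0)
                     for sense, w in weights.items())

if __name__ == "__main__":
    # sub-scores as a manual rater (or an automated classifier) might produce
    website = {"visual": 0.8, "acoustic": 0.2, "haptic": 0.4,
               "olfactory": 0.6, "gustatory": 0.7}
    print(f"OSMI (food): {osmi_score(website, 'food'):.1f}")
```

Under this reading, sensory overload and deprivation could be flagged as sub-scores falling far above or below an industry template, though the abstract does not specify thresholds.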
Multisensory consumer engagement on e-commerce websites is technically limited to visual, acoustic, and written elements. Consumers communicate, buy, and share products and services via digital environments in which sensory information is limited. To improve consumers' online sensory experience, media types and their content need to be quantitatively assessed and adapted. This project aims to develop a quantitative model, the Online Sensory Marketing Index (OSMI), which assesses the multisensory communication quality of e-commerce websites. The OSMI will be supported by an automatic procedure based on artificial intelligence. The content of texts, images, and videos is evaluated with natural language processing (NLP), natural language generation (NLG), and automated machine learning (AutoML) procedures. Multiple e-commerce websites from various industries are examined.
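As one small illustration of what an automated text assessment step might look like, the sketch below scores a page's copy for sensory vocabulary per modality. The lexicon, token normalization, and hit-rate metric are assumptions for illustration, not the project's actual NLP model.

```python
# Illustrative sketch: count sensory vocabulary per modality in page copy
# and normalize to hits per 100 tokens. Lexicon entries are assumptions.
import re
from collections import Counter

SENSORY_LEXICON = {
    "visual":    {"bright", "sleek", "glossy", "colorful", "vivid"},
    "acoustic":  {"crisp", "quiet", "loud", "crackling", "humming"},
    "haptic":    {"soft", "smooth", "textured", "lightweight", "sturdy"},
    "olfactory": {"fragrant", "fresh", "aromatic", "scented"},
    "gustatory": {"sweet", "savory", "tangy", "rich", "delicious"},
}

def sensory_profile(text: str) -> dict:
    """Return per-modality hit rates (lexicon matches per 100 tokens)."""
    tokens = re.findall(r"[a-z']+", text.lower())
    counts = Counter(tokens)
    n = max(len(tokens), 1)
    return {sense: 100 * sum(counts[w] for w in words) / n
            for sense, words in SENSORY_LEXICON.items()}

print(sensory_profile("A sleek, lightweight laptop with a bright, vivid display."))
```

A lexicon pass like this is deliberately simple; the abstract's NLP/NLG/AutoML pipeline would presumably replace it with learned models.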
We present how artificial intelligence (AI)-based technologies create new opportunities to capture and assess sensory marketing elements. Building on the Online Sensory Marketing Index (OSMI), a sensory assessment framework designed to evaluate e-commerce websites manually, our goal is to offer an alternative procedure that assesses sensory elements such as text and images automatically. This approach aims to provide marketing managers with valuable insights and potential sensory marketing improvements. To accomplish this, we initially reviewed 469 related peer-reviewed scientific publications, complementing manual reading with a validated AI methodology. We identify relevant topics and check whether they exhibit a comprehensible distribution over recent years. We recognize and discuss similar approaches from the machine learning and big data environments. For the principal analysis, we apply state-of-the-art methods from the natural language processing domain, such as the word embedding techniques GloVe and Word2Vec, and leverage transformers such as BERT. To validate the performance of our newly developed AI approach, we compare its results with manually collected parameters from previous studies and observe similar findings in both procedures. Our results reveal a functional and scalable AI approach for determining the OSMI for industries, companies, or even individual (sub-)websites. In addition, the new AI selection and assessment procedures are extremely fast, with only a small loss in performance compared to a manual evaluation. They represent an efficient way to evaluate sensory marketing efforts.
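One building block the abstract names, word embeddings, can be sketched as follows: expanding a small seed lexicon of sensory words with GloVe nearest neighbors via gensim. The model choice, seed words, and neighbor count are assumptions for illustration; the paper's full pipeline (including BERT-based steps) is not reproduced here.

```python
# Sketch: expand a seed lexicon of sensory terms with GloVe nearest
# neighbors using gensim's pretrained-model downloader.
import gensim.downloader as api

model = api.load("glove-wiki-gigaword-50")  # downloads ~66 MB on first use

SEEDS = {
    "olfactory": ["fragrant", "scented"],
    "haptic":    ["soft", "textured"],
}

for sense, seeds in SEEDS.items():
    # nearest neighbors of the seed words serve as candidate lexicon terms
    candidates = model.most_similar(positive=seeds, topn=5)
    print(sense, [(word, round(sim, 2)) for word, sim in candidates])
```

Neighbor expansion of this kind is a common way to scale a hand-built lexicon before validating it against manually collected parameters, as the abstract describes.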