Billions of people live with visual and/or hearing impairments. Regrettably, their access to systems lags behind, leaving them socially excluded. The need for universal access to next-generation systems and for users’ inclusion is paramount. We posit that a smart society should respond to this crucial need. Following ability-based design principles, we introduce a simulated social robot that adapts to users’ sensory abilities. Its functioning was assessed via a Rock–Paper–Scissors game in an Intelligent Environment (IE), using three modes: one in which the user can both see and hear, one in which the user can only see, and one in which the user can only hear. With this game, two user studies were conducted using the UMUX-LITE usability score, an expectation rating, and the gap between experience and expectation, complemented by two open questions. A repeated-measures Multivariate ANalysis Of VAriance (MANOVA) on the data from study 1 revealed an overall difference between the three modes, F(6, 6) = 6.823, ηp² = .872, p = .017. Users expected applications to be harder to use with a disability, especially a visual impairment. All modes were considered accessible, with the experience exceeding expectations for the hearing-impairment mode. At the same time, substantial variance was observed across participants, and the results from the open questions suggested improvements. To reduce this variance and increase system stability, study 2 was conducted with an enhanced design. A repeated-measures MANOVA on the data from study 2 confirmed study 1’s findings, F(6, 6) = 12.801, ηp² = .928, p = .003. Moreover, experiences exceeded expectations in all modes, and the variance among participants was substantially reduced. We conclude that IE applications managed by a social robot can be adapted to users’ sensory abilities, improving a smart society’s accessibility and, hence, reducing social exclusion.