This paper is devoted to new methods of organizing interactive museum exhibits by means of additive technologies, programmable microelectronics and multimedia. Traditionally, a museum exhibition is distanced from the visitor, appearing merely as a visual image. The emerging trend, however, is to bring the visitor inside the exhibition by providing different forms of interaction. Modern museums constantly search for new ways of presenting cultural and natural heritage, taking into account that exhibitions should be at once scientifically accurate, attractive, memorable and, ideally, accessible to a wide audience, including people with disabilities. In this regard, tangible user interfaces based on the Internet of Things and combined with scientific visualization form a very promising set of technologies that brings cyber-physical museum exhibits to life. These exhibits fuse physical museum items with virtual multimedia content, providing the visitor with a unique interactive experience. The key feature of cyber-physical exhibits is their tangibility, whereby they become accessible to visually impaired people and deliver much more information to regular visitors. There have been many successful attempts to build cyber-physical exhibits within the museum space. However, a methodological basis for this is still lacking, as are high-level tools to seamlessly integrate the related technologies into existing museum infrastructure. In this paper we propose using the ontology-driven adaptive multiplatform scientific visualization system SciVi as a software basis for cyber-physical museum exhibits. This system has previously been used successfully to steer hardware and software solvers in different application domains, to monitor lightweight robotic systems, and to support custom hardware human-machine interfaces. Consequently, it contains the mechanisms necessary to adapt to third-party digital infrastructure (including that of a museum) and to build all the required middleware and visualizers within it. We tested our approach by developing two cyber-physical exhibits: a tangible bonobo skull in the State Darwin Museum (Moscow) and tangible titanophone skulls in the Museum of Permian Antiquities (Perm). The bonobo exhibit is a custom joystick in the form of the corresponding skull, used to steer a Sketchfab-rendered 3D model of the bonobo's head. The titanophone exhibit allows visitors to explore the age-related variability of the titanophone synapsid in an interactive way.
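The following TypeScript sketch illustrates one possible way a tangible controller of this kind could drive a Sketchfab-embedded model. It is a minimal illustration, not the exhibit's actual implementation: the WebSocket endpoint (ws://localhost:8765/skull), the yaw/pitch message format, and the model UID are hypothetical placeholders, while the Sketchfab Viewer API calls (init, start, setCameraLookAt) are part of the public viewer API.

```typescript
// Minimal sketch: re-aiming a Sketchfab viewer camera from orientation data
// sent by a tangible controller. Assumes the Sketchfab viewer script is
// already loaded on the page, exposing the global `Sketchfab` constructor.
declare const Sketchfab: any;

const MODEL_UID = 'YOUR_MODEL_UID'; // placeholder, not the real exhibit model
const iframe = document.getElementById('viewer') as HTMLIFrameElement;
const client = new Sketchfab(iframe);

client.init(MODEL_UID, {
  autostart: 1,
  success: (api: any) => {
    api.start();
    // Hypothetical bridge: the tangible skull joystick publishes its
    // orientation (yaw/pitch in radians) as JSON over a local WebSocket.
    const socket = new WebSocket('ws://localhost:8765/skull');
    socket.onmessage = (event: MessageEvent) => {
      const { yaw, pitch } = JSON.parse(event.data);
      const r = 2.0; // camera distance from the model origin, chosen arbitrarily
      const eye: [number, number, number] = [
        r * Math.cos(pitch) * Math.sin(yaw),
        r * Math.cos(pitch) * Math.cos(yaw),
        r * Math.sin(pitch),
      ];
      // Move the camera around a fixed target so the virtual model appears
      // to rotate in sync with the physical skull.
      api.setCameraLookAt(eye, [0, 0, 0], 0.1);
    };
  },
  error: () => console.error('Sketchfab viewer failed to initialize'),
});
```

In a production exhibit the bridge between the microcontroller and the browser would be provided by the SciVi middleware rather than the ad hoc WebSocket shown here; the sketch only conveys the overall data flow from tangible input to rendered output.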