An essential component of the ubiquitous computing vision is the ability to detect which objects a user is interacting with during his or her activities. In this paper, we explore a solution to this problem based on wireless motion and orientation sensors (accelerometer and compass) worn by the user and attached to objects. We evaluate the performance under realistic conditions, characterized by limited hardware resources, measurement noise due to motion artifacts, and unreliable wireless communication. We describe the complete solution, from theoretical design, through simulation and tuning, to full implementation and testing on wireless sensor nodes. The implementation on the sensor nodes is lightweight, with low communication-bandwidth and processing requirements. Compared to existing work, our approach achieves better performance (higher detection accuracy and faster response times) while being much more computationally efficient. The potential of the concept is further illustrated by means of an interactive multi-user game. We also provide a thorough discussion of the advantages, limitations, and trade-offs of the proposed solution.
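The abstract does not spell out the detection mechanism itself, so the following is only a minimal sketch of the general idea it alludes to: deciding that a user is handling an object when the motion signals of a body-worn sensor and an object-attached sensor move together. Everything here is an assumption for illustration, not the paper's algorithm: the use of Pearson correlation over acceleration-magnitude windows, the window size `WINDOW`, the threshold `0.8f`, and the function names `pearson` and `interaction_detected` are all hypothetical.

```c
#include <math.h>
#include <stdbool.h>

#define WINDOW 32  /* samples per comparison window (hypothetical value) */

/* Pearson correlation coefficient between two equal-length windows. */
static float pearson(const float *x, const float *y, int n)
{
    float sx = 0.0f, sy = 0.0f, sxx = 0.0f, syy = 0.0f, sxy = 0.0f;
    for (int i = 0; i < n; i++) {
        sx  += x[i];      sy  += y[i];
        sxx += x[i]*x[i]; syy += y[i]*y[i];
        sxy += x[i]*y[i];
    }
    float cov = sxy - sx * sy / n;
    float vx  = sxx - sx * sx / n;
    float vy  = syy - sy * sy / n;
    if (vx <= 0.0f || vy <= 0.0f)
        return 0.0f;  /* one of the signals is constant; no correlation */
    return cov / sqrtf(vx * vy);
}

/* Flag an interaction when the acceleration magnitudes measured at the
 * wrist and at the object are strongly correlated (threshold assumed). */
bool interaction_detected(const float *wrist_mag, const float *object_mag)
{
    return pearson(wrist_mag, object_mag, WINDOW) > 0.8f;
}
```

A per-window computation like this needs only a handful of multiply-accumulates and one square root, which is consistent with the abstract's claim of low processing and communication demands on resource-constrained sensor nodes.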