Abstract. In a series of papers, we have formalized a Bayesian perception approach for robotics based on recent progress in understanding animal perception. A main principle is to accumulate evidence for multiple perceptual alternatives until a preset belief threshold is reached, formally related to Bayesian sequential analysis methods for optimal decision making. Here we describe how this approach extends naturally to active perception, by moving the sensor under an active control strategy based on the beliefs accumulated during the decision-making process. This approach can be seen as a method for solving problems of Simultaneous Object Localization and IDentification (SOLID), or 'where' and 'what'. Considering an example in robot touch, we find that active perception gives an efficient and accurate solution to the SOLID problem, whereas passive perception is inaccurate and non-robust when the object location is uncertain. Thus, this general approach enables robust and accurate robot perception in unstructured environments.
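The loop described above (accumulate evidence over joint 'what'/'where' hypotheses until a belief threshold, while actively repositioning the sensor toward the most probable object location) can be sketched as follows. This is a minimal illustrative simulation, not the paper's actual model: the observation model, noise level, number of classes/locations, and the move-to-MAP-location control rule are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical SOLID setup: 2 object classes ('what') x 5 locations ('where').
# All parameters below are illustrative assumptions.
n_classes, n_locs = 2, 5
true_class, true_loc = 1, 3

# Assumed observation model: sensing at position s yields a noisy scalar whose
# mean depends on the sensor-object offset and on the object class.
def obs_mean(c, loc, sensor):
    return (c + 1) * np.exp(-0.5 * (loc - sensor) ** 2)

sigma = 0.3  # assumed Gaussian sensor noise

def likelihood(z, c, loc, sensor):
    return np.exp(-0.5 * ((z - obs_mean(c, loc, sensor)) / sigma) ** 2)

# Uniform prior over joint (class, location) hypotheses.
belief = np.full((n_classes, n_locs), 1.0 / (n_classes * n_locs))
threshold = 0.99  # preset belief threshold for the decision
sensor = 0        # initial sensor position

for t in range(200):
    # Take an observation at the current sensor position.
    z = rng.normal(obs_mean(true_class, true_loc, sensor), sigma)
    # Bayesian evidence accumulation over all joint hypotheses.
    belief *= likelihood(z, np.arange(n_classes)[:, None],
                         np.arange(n_locs)[None, :], sensor)
    belief /= belief.sum()
    # Decide once one joint hypothesis crosses the belief threshold.
    if belief.max() >= threshold:
        break
    # Active control (assumed rule): move the sensor to the location with
    # the highest marginal belief, improving subsequent discriminability.
    sensor = int(belief.sum(axis=0).argmax())

c_hat, l_hat = np.unravel_index(belief.argmax(), belief.shape)
print(f"decided class={c_hat}, location={l_hat} after {t + 1} observations")
```

A passive variant of this sketch would leave `sensor` fixed; when the object location is uncertain, the fixed sensor may sit where the classes are poorly separated, which is the failure mode the abstract attributes to passive perception.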