In this paper we propose a new approach to real-time view-based pose recognition and interpolation. Pose recognition is particularly useful for identifying camera views in databases, video sequences, video streams, and live recordings. All of these applications require a fast pose recognition process, in many cases at video real-time rates. It should further be possible to extend the database with new material, i.e., to update the recognition system online.

The method that we propose is based on P-channels, a special kind of information representation that combines advantages of histograms and local linear models. Our approach is motivated by its similarity to information representation in biological systems, but its main advantage is its robustness against common distortions such as clutter and occlusion. The recognition algorithm consists of three steps:

1. Low-level image features for color and local orientation are extracted at each point of the image.
2. These features are encoded into P-channels by combining similar features within local image regions.
3. The query P-channels are compared to a set of prototype P-channels in a database using a least-squares approach.

The algorithm is applied in two scene-registration experiments with fish-eye camera data: one for pose interpolation from synthetic images and one for finding the nearest view in a set of real images. The method compares favorably to SIFT-based methods, in particular concerning interpolation. The method can be used for initializing pose-tracking systems, either when starting the tracking or when the tracking has failed and the system needs to re-initialize. Due to its real-time performance, the method can also be embedded directly into the tracking system, allowing a sensor-fusion unit to choose dynamically between frame-by-frame tracking and pose recognition.

This work has been supported by the CENIIT project CAIRIS (http://www.cvl.isy.liu.se/Research/Object/CAIRIS), EC Grants IST-2003-004176 COSPAL and IST-2002-002013 MATRIS. This paper does not represent the opinion of the European Community, and the European Community is not responsible for any use which may be made of its contents.

M. Felsberg · J. Hedborg
Computer Vision Laboratory, Linköping University, Sweden
Tel.: +46-13-282460
Fax: +46-13-138526
E-mail: mfe@isy.liu.se
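The encoding and matching steps of the recognition algorithm can be sketched in simplified form. The following is a minimal 1-D illustration, not the paper's actual implementation: the function names, the channel count, and the reduction to a single scalar feature are assumptions for clarity (the paper encodes color and local orientation over local image regions). Each P-channel stores a histogram count together with the mean offset of the samples falling into it, i.e., a local linear model per bin; matching then selects the prototype minimizing the squared distance to the query.

```python
import numpy as np

def encode_p_channels(values, n_channels=8, lo=0.0, hi=1.0):
    """Encode 1-D feature values into a simplified P-channel vector.

    Each channel holds a normalized histogram count plus the mean
    offset of the samples in that bin (a local linear model), which
    is what distinguishes P-channels from a plain histogram.
    Hypothetical sketch -- not the authors' implementation.
    """
    width = (hi - lo) / n_channels
    counts = np.zeros(n_channels)
    offsets = np.zeros(n_channels)
    # assign each sample to a channel and accumulate count and offset
    idx = np.clip(((values - lo) / width).astype(int), 0, n_channels - 1)
    centers = lo + (idx + 0.5) * width
    np.add.at(counts, idx, 1.0)
    np.add.at(offsets, idx, values - centers)
    # convert accumulated offsets to mean offsets for non-empty bins
    nz = counts > 0
    offsets[nz] /= counts[nz]
    return np.concatenate([counts / len(values), offsets])

def nearest_prototype(query, prototypes):
    """Index of the prototype P-channel vector closest to the query
    in the least-squares (squared Euclidean) sense."""
    d = np.sum((prototypes - query) ** 2, axis=1)
    return int(np.argmin(d))
```

In a view-recognition setting, each database view would be stored as one prototype P-channel vector, and `nearest_prototype` would identify the best-matching stored view for a query frame.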