<p><b>Vision-based systems are commonly used to control robotic systems. The performance of many of these vision systems can be improved by utilising multiple concurrently sensing cameras. This thesis explores an alternative strategy, which uses multiple dissimilar sensors non-concurrently and selects the appropriate sensor and control method for a given situation. This is especially useful in situations where one or more sensors are physically unable to perceive the target. </b></p>
<p>This strategy was demonstrated with a ball-catching task. This required the construction of a mobile sensor platform, both to perform the catch and to support the mobile component of the vision system. </p>
<p>Two separate vision-based control systems were developed: one utilising a stereo camera and trajectory estimation, the other utilising visual servoing with a monocular camera. The vision systems were tested individually and then integrated using the proposed technique. The resulting system selects the appropriate sensor and control method according to the phase of the ball's flight.</p>
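<p>A minimal sketch of the non-concurrent selection logic is given below. It assumes a simple hand-over rule based on ball range and sensor visibility; all class names, method names, and the hand-over distance are illustrative assumptions, not the implementation described in the thesis.</p>
<pre><code># Hypothetical sketch of non-concurrent sensor selection: one sensor/controller
# pair is active per control cycle, chosen by the phase of the ball's flight.
from dataclasses import dataclass


@dataclass
class BallObservation:
    distance_m: float      # estimated range to the ball
    visible_stereo: bool   # ball within the stereo pair's field of view
    visible_mono: bool     # ball within the monocular camera's field of view


class NonConcurrentCatcher:
    """Selects one sensor and control method per control cycle (assumed scheme)."""

    HANDOVER_DISTANCE_M = 1.5  # assumed switch point from stereo to servoing

    def __init__(self, stereo_controller, servo_controller):
        self.stereo = stereo_controller  # trajectory estimation, predicted catch point
        self.servo = servo_controller    # image-based correction near the catcher

    def control_step(self, obs: BallObservation):
        # Early flight: the stereo pair can triangulate the ball, so estimate
        # the trajectory and drive the platform towards the predicted catch point.
        if obs.visible_stereo and obs.distance_m > self.HANDOVER_DISTANCE_M:
            return self.stereo.command(obs)
        # Terminal phase: the ball is too close for (or has left) the stereo view,
        # so hand over to monocular visual servoing for the final correction.
        if obs.visible_mono:
            return self.servo.command(obs)
        # Neither sensor perceives the ball: no new command this cycle.
        return None
</code></pre>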
<p>The integrated system showed an improvement in catching performance over both individual systems, incorporating advantages from each. The catching rate increased from 60% and 43% for the individual vision systems to 75% for the non-concurrent sensing implementation, demonstrating the viability of this control strategy.</p>