The plenoptic camera was originally created to capture the light field, a four-dimensional representation of the radiance along all rays in a scene as a function of position and direction, from which images of the observed object can be synthesized. This approach has several advantages over 3D capture systems based on stereo cameras, since it requires neither frame synchronization nor geometric and color calibration, and it has many applications, from 3DTV to medical imaging. A plenoptic camera uses a microlens array to measure the radiance and direction of all the light rays in a scene: the array is placed behind the principal lens at the plane conjugate to the scene, and the sensor is placed at the focal plane of the microlenses. We have designed a plenoptic objective that incorporates a microlens array and a relay system that reimages the microlens plane. When mounted on a camera, this novel objective creates a virtual microlens plane in front of the camera CCD, allowing it to capture the light field of the scene. In this paper we present experimental results showing that depth information is correctly captured when an external plenoptic objective is used. This objective transforms any camera into a 3D sensor, opening up a wide range of applications from microscopy to astronomy.
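As a minimal illustration of the conjugate placement described above (the notation here is ours, not taken from the paper: main-lens focal length $F$, scene distance $a$, microlens-plane distance $b$, microlens focal length $f$), the thin-lens equation fixes where the microlens array must sit behind the main lens:

\[
\frac{1}{a} + \frac{1}{b} = \frac{1}{F}, \qquad d_{\text{sensor}} = f ,
\]

so that the scene plane at distance $a$ is imaged onto the microlens plane at distance $b$, while the sensor, one microlens focal length $f$ behind the array, records the directional distribution of the rays arriving at each microlens. With purely hypothetical numbers, a main lens of $F = 100$ mm focused on a scene at $a = 2$ m would place the microlens plane at $b = aF/(a-F) \approx 105.3$ mm.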