Since the advent of modern seismic methods in the 1950s, the quality of our subsurface images has increased dramatically. Alongside improvements in engineering, computing and seismic processing techniques, this is largely due to the massive increase in the number of seismic measurements made per unit area. In the 1950s, 2D seismic data consisted of hundreds of seismic recordings per sq. km; in 2011, BP acquired an onshore 3D survey in Jordan with around 10 million seismic measurements per sq. km. The first part of this paper describes how, using the Independent Simultaneous Source (ISS®) technique, these data were acquired efficiently in a short time frame. The second part describes the careful processing and analysis of a large number of data decimations, and discusses whether there is an upper limit to the improvements in image quality seen as data density increases. Image quality is judged not only on the post-stack data, but also on various pre-stack attributes that are increasingly being used for reservoir description and for lithology and fluid prediction. We also discuss the challenges of managing and processing the huge data volumes generated by this style of dense acquisition; modern seismic surveys of this density are starting to contain petabytes of data.
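As a rough illustration of what a decimation test involves, the sketch below subsamples a dense survey geometry by integer factors along source lines and source points, producing a suite of progressively sparser datasets that could each be processed identically and compared. The header names, grid dimensions and decimation factors are hypothetical assumptions for illustration only; the paper's actual decimation scheme is described later.

```python
"""Minimal sketch of geometric decimation of a dense land survey.
Assumes traces are indexed by integer source-line / source-point numbers;
the real geometry and factors used in the Jordan survey are not specified here."""
import numpy as np

def decimate(headers: np.ndarray, line_factor: int, point_factor: int) -> np.ndarray:
    """Keep every `line_factor`-th source line and every `point_factor`-th
    source point; return the indices of the surviving traces."""
    keep = (headers["src_line"] % line_factor == 0) & \
           (headers["src_point"] % point_factor == 0)
    return np.flatnonzero(keep)

# Synthetic geometry: 100 source lines, 1000 source points per line (illustrative).
dtype = np.dtype([("src_line", np.int32), ("src_point", np.int32)])
lines, points = np.meshgrid(np.arange(100), np.arange(1000), indexing="ij")
headers = np.zeros(lines.size, dtype=dtype)
headers["src_line"] = lines.ravel()
headers["src_point"] = points.ravel()

# Build the decimation suite: full density, then 1/4, 1/16, 1/64 of the traces.
for f in (1, 2, 4, 8):
    kept = decimate(headers, f, f)
    print(f"decimation {f}x{f}: {kept.size} of {headers.size} traces "
          f"({100.0 * kept.size / headers.size:.1f}%)")
```

Comparing post-stack and pre-stack attributes across such a suite is one way to test whether image quality saturates as trace density increases.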