We present a novel system for browsing very large image collections by similarity. Images are dynamically placed on a 2D canvas next to their nearest neighbors in a high-dimensional feature space. The layout and the choice of images are generated on-the-fly during user interaction, reflecting the user's navigation tendencies and interests. This intuitive solution for image browsing provides a continuous experience of navigating an infinite 2D grid arranged by similarity. In contrast to common multidimensional embedding methods, our solution does not require the upfront creation of a full global map. Image-map generation is dynamic, fast, and scalable, independent of the number of images in the dataset, and seamlessly supports online updates to the dataset. The technique is thus a viable solution for massive and constantly changing datasets consisting of millions of images. Our evaluation shows that users viewed many more images per minute with DynamicMaps than with a standard relevance-feedback interface, suggesting that it supports a more fluid and natural interaction that enables easier and faster movement through the image space. Most users preferred DynamicMaps, finding it more exploratory, better suited to serendipitous browsing, and more fun to use.
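As a rough illustration of the on-demand layout idea described above, and not the paper's actual algorithm, the sketch below fills only the visible grid cells by greedily placing unused nearest neighbors of already-placed images, so no global map is ever built. The random feature vectors, the brute-force search, and the `fill_viewport` helper are hypothetical stand-ins for real image descriptors, an approximate nearest-neighbor index, and the system's layout engine.

```python
import numpy as np

# Hypothetical stand-in: random vectors in place of real image feature descriptors.
rng = np.random.default_rng(0)
N_IMAGES, DIM = 10_000, 128
features = rng.normal(size=(N_IMAGES, DIM)).astype(np.float32)

def nearest_neighbors(query_id, k, exclude):
    """Brute-force k-NN in feature space (a real system would use an ANN index)."""
    dists = np.linalg.norm(features - features[query_id], axis=1)
    order = np.argsort(dists)
    return [int(i) for i in order if i != query_id and i not in exclude][:k]

def fill_viewport(seed_id, width=5, height=5):
    """Fill only the visible grid cells around a seed image, one cell at a time.

    Each empty cell adjacent to a placed image receives that image's closest
    unused neighbor, so the layout is produced on demand for the viewport and
    its cost does not depend on the total number of images in the dataset.
    """
    grid = {(width // 2, height // 2): seed_id}   # seed image at the centre cell
    used = {seed_id}
    frontier = [(width // 2, height // 2)]
    while frontier:
        x, y = frontier.pop(0)
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nx < width and 0 <= ny < height and (nx, ny) not in grid:
                cand = nearest_neighbors(grid[(x, y)], k=1, exclude=used)
                if not cand:
                    return grid
                grid[(nx, ny)] = cand[0]
                used.add(cand[0])
                frontier.append((nx, ny))
    return grid

# Example: lay out a 5x5 viewport around image 0; panning would simply call
# fill_viewport again with a new seed or reuse already-placed neighbors.
print(fill_viewport(seed_id=0))
```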