Additive manufacturing has become increasingly popular, opening doors for broader collaboration, quick manufacturing turnaround times, and rapid prototyping. New collaboration opportunities are enabled by 3D model databases that help users find, modify, and manufacture parts on request. However, the current process relies heavily on user-defined, text-based labels to describe and identify parts, and such tagging is an expensive and laborious process. To address the limitations of traditional tag-based methods, this work proposes new shape-based search techniques that offer significant usability improvements over the current state of the art. In particular, our approach allows users to query for parts at different stages of the manufacturing process, including approximate models, GCode printer files, and real-world objects. At the core of our technique is a generative adversarial network that flattens 3D shapes into depth-based 2.5D images, which are then cataloged and queried using a frequency-domain representation and a locality-sensitive hashing scheme. We evaluate our methodology on a rich dataset of everyday objects and report high retrieval accuracy on our test set.
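
The catalog-and-query step mentioned above can be sketched roughly as follows. This is a minimal illustration only: the low-frequency signature size, the 16-bit hash width, and the use of random-hyperplane LSH are illustrative assumptions, not the exact parameters or variant used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def frequency_signature(depth_image, keep=8):
    """Low-frequency magnitude signature of a 2.5D depth image.

    Keeps the central `keep` x `keep` block of 2D FFT magnitudes,
    which is robust to small shifts in the rendered part.
    """
    spectrum = np.fft.fftshift(np.fft.fft2(depth_image))
    mags = np.abs(spectrum)
    cy, cx = mags.shape[0] // 2, mags.shape[1] // 2
    h = keep // 2
    sig = mags[cy - h:cy + h, cx - h:cx + h].flatten()
    return sig / (np.linalg.norm(sig) + 1e-12)

# Random-hyperplane LSH (an assumed scheme): nearby signatures tend
# to fall into the same bucket, enabling fast approximate lookup.
PLANES = rng.standard_normal((16, 64))  # 16-bit hash over 64-dim signatures

def lsh_key(signature):
    bits = (PLANES @ signature) > 0
    return bits.astype(np.uint8).tobytes()

# Usage: index a catalog of depth images by hash bucket, then probe
# the query's bucket instead of scanning the whole database.
catalog = {}
for part_id in range(4):
    image = rng.random((64, 64))
    catalog.setdefault(lsh_key(frequency_signature(image)), []).append(part_id)
```

A query image is reduced to the same signature, hashed, and compared only against the parts in its bucket, which is what makes the scheme scale to large model databases.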