With the rise of artificial intelligence and related computational tools in everyday practices of knowledge organisation, production, and distribution, including, for example, archives and history-related applications, we are concerned that these computational methods may 'colonise' and fundamentally change our common approaches to what constitutes studying and knowing a subject matter. We unpack these concerns by examining phenomena such as the lack of completeness and categorisation in biodiversity archives, or new methods of creating artificial fossils as ways of filling gaps in historical datasets and, potentially, in historical narratives. We also look back at how the ontological architectures of computer science emerged and how they have defined the ways in which knowledge is accessed. Through a series of case studies and thought experiments, the paper examines this initial concern and predicts its potential consequences, building on the question of the degree to which machine-learning-based approaches can augment our methods of analysis, not only in history but in cultural behaviour more broadly. In other words, how might computational models of ontology produce an epistemological shift in the quality of knowing by imposing a knowledge system of references, linked nodes, hashtags, and databases that are never entirely complete in representing the subjects they are set to define? We thus ask whether we should hold on to our approaches to comprehending things and their emergence, or instead succumb to the generative, on-demand, a-click-away, always-at-your-fingertips forms of knowing and comprehending.

Keywords: Domain Ontology. Computer Science. Meta-Archaeology. AI.