Theories in AI fall into two broad categories: mechanism theories and content theories. Ontologies are content theories about the sorts of objects, properties of objects, and relations between objects that are possible in a specified domain of knowledge. They provide potential terms for describing our knowledge about the domain.

In this article, we survey the recent development of the field of ontologies in AI. We point to the somewhat different roles ontologies play in information systems, natural-language understanding, and knowledge-based systems. Most research on ontologies focuses on what one might characterize as domain factual knowledge, because knowledge of that type is particularly useful in natural-language understanding. There is another class of ontologies that are important in knowledge-based systems, one that helps in sharing knowledge about reasoning strategies or problem-solving methods. In a follow-up article, we will focus on method ontologies.

Ontology as vocabulary

In philosophy, ontology is the study of the kinds of things that exist. It is often said that ontologies "carve the world at its joints." In AI, the term ontology has largely come to mean one of two related things.

First of all, an ontology is a representation vocabulary, often specialized to some domain or subject matter. More precisely, it is not the vocabulary as such that qualifies as an ontology, but the conceptualizations that the terms in the vocabulary are intended to capture. Thus, translating the terms in an ontology from one language to another, for example from English to French, does not change the ontology conceptually. In engineering design, you might discuss the ontology of an electronic-devices domain, which might include vocabulary that describes conceptual elements (transistors, operational amplifiers, and voltages) and the relations between these elements (operational amplifiers are a type-of electronic device, and transistors are a component-of operational amplifiers). Identifying such vocabulary, and the underlying conceptualizations, generally requires careful analysis of the kinds of objects and relations that can exist in the domain.

In its second sense, the term ontology is sometimes used to refer to a body of knowledge describing some domain, typically a commonsense knowledge domain, using a representation vocabulary. For example, CYC [1] often refers to its knowledge representation of some area of knowledge as its ontology. In other words, the representation vocabulary provides a set of terms with which to describe the facts in some domain, while the body of knowledge using that vocabulary is a collection of facts about a domain. However, this distinction is not as clear as it might first appear. In the electronic-device example, that a transistor is a component-of an operational amplifier, or that the latter is a type-of electronic device, is just as much a fact about