Density functional theory (DFT) is an enticing subject. It appeals to chemists and physicists alike, and it is entrancing for those who like to work on the mathematical-physical aspects of problems, for those who relish computing observable properties from theory, and for those who most enjoy developing correct qualitative descriptions of phenomena. It is this combination of a qualitative model that at the same time furnishes quantitatively reliable estimates that makes DFT particularly attractive for chemists. DFT is an alternative, and a complement, to wave function theory (WFT). Both approaches are variations on the basic theme of electronic structure theory, and both methods originated in the late 1920s. Whereas WFT evolved rapidly and gained general popularity, DFT led a shadowy existence. It was the appearance of the key papers by Hohenberg and Kohn (1964) and by Kohn and Sham (1965), generally perceived as the beginning of modern DFT, that changed the perception and level of acceptance of DFT.

With the evolution of reliable computational technologies for DFT chemistry, and with the advent of the generalized gradient approximation (GGA) during the 1980s, DFT emerged as a powerful tool in computational chemistry, and without exaggeration the 1990s can be called the decade of DFT in electronic structure theory. During this period, despite its incomplete theoretical development, DFT was already competitive with the best WFT methods. Furthermore, computational hardware as well as software has advanced to a state where DFT calculations of 'real molecules' can be performed with high efficiency and without major technical hurdles.

At the end of the first decade of the new millennium, it appeared that DFT might have become a victim of its own success. DFT has turned into an off-the-shelf, ready-to-crunch technology, and it often was, and still is, used as such.
Yet it became clear that the happy days of black-box DFT are over,