The Semantic Web has been emerging over the past few years as a realistic option for a world-wide Information Infrastructure, with its promises of semantic interoperability and serendipitous reuse. In this paper we analyse the essential ingredients of semantic technologies, what makes them suitable as the foundation for the Information Infrastructure, and what the alternatives to semantic technologies would be as such a foundation. We survey the most important achievements in semantic technologies over the past few years, and point to the most important challenges that remain to be solved.
Historical trend towards increasing demands on interoperability

When Thomas Watson, the long-time head of IBM, was asked for his estimate of how many computers would be needed worldwide, his reply is widely claimed to have been: 'about five'. Of course, this presumed reply was given in 1943 (and the quote itself is widely questioned now), but it illustrates the enormous shift in perspective that has taken place since the very early days of computing. Right until the late 1970s, the dominant perspective on computing was that of mainframe computing: large machines that provided centralised means of computing and data storage. In such a centralised perspective, interoperability is not a main concern: data are locked up in a central location, movement of data is rare, and if data are to be integrated, a special-purpose ad hoc transformation procedure is applied to bring them into the required format.

The first revolution to upset this centralised perspective was the advent of the PC in the 1980s (ironically enough, also dominated by IBM). Suddenly there were millions of small computing devices, each capable of storing its own data without recourse to centralised data storage. In this context, interoperability of data became a problem: how to combine the data set stored in (or generated on) one PC with those of another PC, where another user, in a different organisation, would be generating their own data? However, the low degree of connectivity between the different PCs still kept the interoperability problem at bay. It was only the second revolution that really caused the data interoperability problem to bite, namely the advent of the Internet (also arising in the 1980s), culminating in the rapid growth of the Web in the 1990s.
The Internet has solved most wide-area networking problems with its nearly universally supported TCP/IP protocol suite and its DNS (Domain Name System) host-addressing scheme. Suddenly, it became possible to exchange information between any two computers, and between any two users on the planet. In such a setting, special-purpose ad hoc transformation procedures to import data are no longer a feasible alternative, and more principled mechanisms to ensure interoperability are required.
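The two universal mechanisms named above can be illustrated with a minimal sketch: DNS maps a host name to an IP address, and TCP/IP lets data flow between any two such addresses. This is only an illustration of the standard socket API, not anything from the paper itself; "localhost" is used so the sketch runs without external network access.

```python
import socket

# DNS-style name resolution: a host name is mapped to an IP address.
# "localhost" resolves via the local hosts database, so no external
# network access is needed for this sketch.
ip = socket.gethostbyname("localhost")
print(ip)  # typically 127.0.0.1

# A TCP endpoint bound on that address; any reachable peer could
# connect to it using the same universal protocol stack.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind((ip, 0))          # port 0 lets the OS pick a free port
host, port = server.getsockname()
print(host, port)
server.close()
```

The point of the sketch is that neither side needs any special-purpose arrangement: name resolution and transport are provided uniformly by the infrastructure, which is exactly what made ad hoc per-pair data exchange procedures obsolete.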
Interoperability at different abstraction layers

The pro...