When INFROSS began in the autumn of 1967, although a large number of studies had been conducted into the requirements of scientists for information, very little had been done in the field of social science information. There are a number of possible reasons for this. Social scientists, faced with a much smaller total volume of information, were much less information-conscious and less inclined to seek solutions. There were very few specialist libraries in the social sciences, and few librarians were therefore confronted with social scientists' information needs in the way that librarians in scientific libraries were confronted with users and their problems. Finally, until OSTI came along there was little in the way of funds to support this kind of research. This almost total absence of previous research had its disadvantages and advantages. There were very few clues to guide us, and we were therefore working to a certain extent in the dark. On the other hand, we had a clean and open field, uncorrupted by confusing and non-comparable studies. There is something to be said for being one of the first in a field. (For an extended review of relevant work previously carried out, Michael Brittain's book should be consulted.)
The expression ‘half-life’, borrowed from physics, has appeared quite frequently in the literature on documentation since 1960, when Burton and Kebler published their article on the ‘half-life’ of some scientific and technical literatures, although the term had certainly been used previously. Burton and Kebler point out that literature becomes obsolescent rather than disintegrating (as in the term's original, radioactive sense), so that ‘half-life’ means ‘half the active life’; this is commonly understood as the time during which one-half of the currently active literature was published. Numerous studies have been carried out, mainly by the analysis of citations, to establish obsolescence rates for the literature of different subjects. Bourne points out that different studies have given widely different results, so that many of the ‘half-life’ figures reported are not valid beyond the particular sample of literature or users surveyed; certainly they cannot be used as accurate measures for discriminating between different subject fields.
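In practice, the ‘half-life’ of a body of cited literature is estimated as the median age of its references at the time of citation: half the cited items are older than this figure, half newer. The sketch below is a minimal Python illustration of that calculation under this assumption; the function name and the sample reference years are hypothetical, not drawn from Burton and Kebler's data.

```python
from statistics import median

def half_life(citation_years, source_year):
    """Estimate the 'half-life' of a cited literature as the median age
    (in years) of its references: half the cited items are older than
    this value, half newer."""
    ages = [source_year - y for y in citation_years]
    return median(ages)

# Hypothetical sample: publication years of references in a 1970 source.
years = [1968, 1967, 1965, 1965, 1962, 1960, 1955, 1950, 1948, 1940]
print(half_life(years, source_year=1970))  # -> 9.0 years
```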
Most citation analyses are based on references taken from two or three source journals. There are good theoretical reasons for believing that these may not be representative of all references. In the social science citation analyses carried out as part of the DISISS programme, references were collected from 140 journals, including forty-seven drawn at random from a comprehensive list, and also from 148 monographs. Analyses of references drawn from high-ranking and randomly selected journals showed differences in date distribution, forms of material cited and rank order of journals cited. Analyses of references drawn from journals and monographs showed differences, some of them large, in date distributions, forms of material cited, subject self-citation and citations beyond the social sciences, and countries of publication cited. These differences may be peculiar to the social sciences, but any citation analyses that are based on only a limited number and type of sources without specific justification must be regarded with suspicion.
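To make the kind of comparison described above concrete, a date-distribution analysis can be sketched as a simple binning of reference ages by source sample. The Python fragment below is an illustration only: the sample years are invented, and the five-year bins are an assumption rather than the grouping DISISS actually used.

```python
from collections import Counter

def age_distribution(citation_years, source_year, bin_width=5):
    """Group reference ages into bins (e.g. 0-4, 5-9 years) so that the
    date distributions of two citation samples can be compared."""
    bins = Counter((source_year - y) // bin_width for y in citation_years)
    return {f"{b * bin_width}-{b * bin_width + bin_width - 1} yrs": n
            for b, n in sorted(bins.items())}

# Hypothetical reference years from two kinds of source journal.
high_ranking = [1969, 1968, 1968, 1966, 1964, 1960, 1955]
random_sample = [1968, 1965, 1962, 1958, 1952, 1948, 1945]
print(age_distribution(high_ranking, 1970))   # ages cluster in 0-4 yrs
print(age_distribution(random_sample, 1970))  # ages spread more widely
```

A systematic difference between the two distributions, as in this contrived example, is exactly the pattern that casts doubt on analyses drawn from a handful of high-ranking source journals.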