are reliable, efficient, user-friendly,
and maintainable has been, and probably always will be, a grand challenge
in computing research. Moreover, it
typically takes 10 to 20 years for a technology to mature from being a good
research idea to being widely used in
practice.14,15 This fundamental aspect
of computing, combined with the importance of software in modern society, means there is no reason funding
for computing research should not be
at a level comparable to that found in
other scientific disciplines, including
physics and clinical medicine.
These results have further consequences. First, half of the citations
in the computing literature are more
than seven years old. Publications older than seven years may be viewed as
old but still considered relevant by the
authors citing them. Therefore, one should take care when criticizing or ignoring literature just because it is "old"; other criteria must be used to judge quality.
The relatively long cited half-life of
computing literature also indicates that
the time lag between submitting a paper
to a journal and it being published in
that journal should not be a major con-
cern; such work is rarely obsolete before
publication. In any case, the delay may be
significantly shorter in the future, as an
increasing number of journals publish
their articles online shortly after ac-
cepting them for publication.
I thank Chris Wright for help clarifying
basic concepts and stimulating comments; Gilles Brassard and anonymous
referees for valuable comments; and
Alexander Ottesen, Birgitte Refsland,
and Bjørnar Snoksrud for help collecting the reported data.
How the Study Was Done
I used Thomson's JCR Science Edition (6,417 journals in 172 categories) and Social Sciences Edition (1,865 journals in 55 categories) for 2007, with most journals in the (natural) sciences covered in the selection. The coverage in the social sciences was less comprehensive.13 To comprehensively compare the overall computing discipline with other scientific disciplines, I first aggregated the JCR journal categories into 22 disciplines, per Science Watch (http://sciencewatch.com/about/met/fielddef/), and discarded eight of the 227 JCR categories because I could not fit them into the scheme of the aggregated disciplines.
JCR provided citation data at the level of both journals and categories but did not provide half-lives for new journals or journals cited fewer than 100 times. Among the 382 CS journals, 8% and 2%, respectively, lacked cited and citing half-lives. In the calculations of the aggregated results by discipline (see Table 1), I weighted the categories with respect to their number of journals.
For half-lives > 10, JCR used only the value 10 in the calculation of aggregated half-lives. In the aggregation of half-life values from the JCR categories into the disciplines, I used the same approximation. A half-life "> 10" was reported for individual categories in nine of the 22 disciplines; on average, 25% of the categories had the value "> 10." Note that even if these nine disciplines were registered with exact values, it would not affect the CS position relative to the other disciplines in Table 1.
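As a concrete illustration of the weighting and capping just described, the following Python sketch (the function name is illustrative, and JCR's exact aggregation procedure may differ in detail) computes a journal-count-weighted discipline half-life with "> 10" values capped at 10:

```python
def aggregate_half_life(category_half_lives, category_journal_counts, cap=10.0):
    """Journal-count-weighted mean of category half-lives.

    Half-lives reported by JCR as "> 10" are represented here as
    values above 10 and capped at `cap`, mirroring the approximation
    described in the text.
    """
    capped = [min(h, cap) for h in category_half_lives]
    total = sum(category_journal_counts)
    return sum(h * n for h, n in zip(capped, category_journal_counts)) / total

# Example: two categories with 3 and 1 journals; the second is "> 10".
print(aggregate_half_life([6.0, 12.0], [3, 1]))  # 7.0
```

Because the cap only lowers the contribution of long-half-life categories, replacing "> 10" values with their exact figures could only increase a discipline's aggregate, which is why the capping cannot move CS up relative to the other disciplines.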
JCR focused on journals for citation data. However, though a study7 reported that conference proceedings were less cited in the computing literature than books and journals, conferences play an important role in computing research. I therefore investigated the proceedings in the ACM Digital Library (from conferences and workshops in 2007) to make the data comparable with the data from the JCR 2007 edition. I included all scientific papers with at least one reference where the publication year was given; I thus excluded 2.7% of the papers on this ground. A script crawled the Web sites and extracted the references of each article in 307 proceedings. I then analyzed the output using a regular expression to identify the year of publication, enabling me to calculate the citing half-life. The 0.9% of the references lacking a clear year of publication required manual inspection.
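The year-extraction and half-life steps just described can be sketched as follows. This is a simplified, non-interpolated version of the citing half-life (a plain median of reference ages); the regular expression and function names are illustrative assumptions, not the script actually used in the study:

```python
import re

YEAR_RE = re.compile(r"\b(19|20)\d{2}\b")  # plausible publication years

def extract_year(reference_text):
    """Return the first year-like token in a reference string, or None
    if no clear publication year is present (such references would
    need manual inspection)."""
    m = YEAR_RE.search(reference_text)
    return int(m.group(0)) if m else None

def citing_half_life(reference_years, census_year):
    """Simplified citing half-life: the median age of the cited
    references, counting back from the census year (here, 2007)."""
    ages = sorted(census_year - y for y in reference_years if y <= census_year)
    n = len(ages)
    if n % 2:
        return float(ages[n // 2])
    return (ages[n // 2 - 1] + ages[n // 2]) / 2

years = [extract_year(r) for r in [
    "Science 149, 3683 (July 1965), 510-515.",
    "Commun. ACM 46, 1 (Jan. 2003), 71-77.",
]]
print(years)  # [1965, 2003]
```

JCR itself interpolates within the year that crosses the 50% mark, so its reported half-lives are fractional; the plain median above is only a first approximation of that calculation.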
1. Barnett, G.A. and Fink, E.L. Impact of the Internet and scholar age distribution on academic citation age. Journal of the American Society for Information Science and Technology 59, 4 (Feb. 2008),
2. Burrell, Q. Stochastic modelling of the first-citation distribution. Scientometrics 52, 1 (Sept. 2001), 3–12.
3. Cunningham, S.J. and Bocock, D. Obsolescence of computing literature. Scientometrics 34, 2 (Oct. 1995),
4. de Solla Price, D.J. Citation measures of hard science, soft science, technology and nonscience. In Communication Among Scientists and Engineers, C.E. Nelson and D.K. Pollack, Eds. D.C. Heath and Company, Lexington, MA, 1970, 3–22.
5. de Solla Price, D.J. Networks of scientific papers: The pattern of bibliographic references indicates the nature of the scientific research front. Science 149, 3683 (July 1965), 510–515.
6. Glänzel, W. Towards a model for diachronous and synchronous citation analyses. Scientometrics 60, 3 (Dec. 2004), 511–522.
7. Goodrum, A.A., McCain, K.W., Lawrence, S., and Giles, C.L. Scholarly publishing in the Internet age: A citation analysis of computer science literature. Information Processing and Management 37, 5 (Sept. 2001),
8. ISI Web of Knowledge. Journal Citation Reports on the Web 4.2. The Thomson Corporation, 2008; http://
9. Ladwig, J.P. and Sommese, A.J. Using cited half-life to adjust download statistics. College & Research Libraries 66, 6 (Nov. 2005), 527–542.
10. Larivière, V., Archambault, É., and Gingras, Y. Long-term variations in the aging of scientific literature: From exponential growth to steady-state science (1900–2004). Journal of the American Society for Information Science and Technology 59, 2 (Jan. 2008),
11. Lazowska, E.D. and Patterson, D.A. An endless frontier postponed. Science 308, 5723 (May 2005), 757.
12. Misa, T.J. Understanding 'How computing has changed the world.' IEEE Annals of the History of Computing 29, 4 (Oct.–Dec. 2007), 52–63.
13. Moed, H.F. Citation Analysis in Research Evaluation. Springer, Dordrecht, The Netherlands, 2005.
14. Osterweil, L.J., Ghezzi, C., Kramer, J., and Wolf, A.L. Determining the impact of software engineering research on practice. IEEE Computer 41, 3 (Mar.
15. Redwine, Jr., S.T. and Riddle, W.E. Software technology maturation. In Proceedings of the Eighth International Conference on Software Engineering (London, Aug. 28–30). IEEE Computer Society Press, Los Alamitos, CA, 1985, 189–200.
16. Spinellis, D. The decay and failures of Web references. Commun. ACM 46, 1 (Jan. 2003), 71–77.
17. Stinson, R. and Lancaster, F.W. Synchronous versus diachronous methods in the measurement of obsolescence by citation studies. Journal of Information Science 13, 2 (Apr. 1987), 65–74.
18. Száva-Kováts, E. Unfounded attribution of the 'half-life' index-number of literature obsolescence to Burton and Kebler: A literature science study. Journal of the American Society for Information Science and Technology 53, 13 (Nov. 2002), 1098–1105.
19. Weingarten, F. Government funding and computing research priorities. ACM Computing Surveys 27, 1 (Mar. 1995), 49–54.
Dag I.K. Sjøberg (firstname.lastname@example.org) is a professor of software engineering in the Department of Informatics at the University of Oslo, Norway.