It is also reasonable to believe that CS subareas (we call them “CS areas” from here on) that deal mainly with data (such as image processing) are likely to have greater productivity than areas in which evaluation procedures require users (such as human-computer interaction), programmers (such as software engineering), or organizations (such as management information systems). Researcher productivity in these human- and organization-based areas is bounded by the difficulty of carrying out the empirical evaluations those fields require. Though these beliefs are all reasonable, that is all they are; they remain unproved.
Along with expected differences in
productivity, we also often hear that
different CS areas prefer and value
conference and journal publications
in different ways; for example, bioinformatics seems more journal-oriented, while computer architecture seems
more conference-oriented.
If there are indeed significant differences in publishing practices among
the various CS areas, then a single production-based evaluation criterion for
all CS researchers would favor some
areas and disfavor others. A probable
consequence, beyond possible unfairness to the disfavored areas, is that
researchers would tend to avoid those
areas in the future. Barbosa and Souza1
discussed this problem with respect
to a uniform publication evaluation
standard in Brazil and its negative impact on human-computer interaction
among Brazilian researchers.
Beyond publication practices, citation practices might also differ among areas. Areas with fewer researchers will probably accumulate fewer citations to the papers published in them; a uniform citation-based criterion for evaluating the impact of one’s research across different CS areas would thus favor some areas while disfavoring others.
How to evaluate CS researchers has been discussed elsewhere, including by Meyer et al.,9 Patterson et al.,10 and Wainer et al.,14 who emphasize the differences in scientific publication culture between CS and other scientific domains. For example, Meyer et al.9 discussed the importance of conference publication in CS, noting that a conference publication is, in some cases, more prestigious than a journal publication. The same general guideline of attributing importance to conferences is included in the Computing Research Association (CRA) guidelines for faculty promotion in CS.10 Wainer et al.14 showed that a typical CS researcher’s work is less well represented in the standard citation services (such as Scopus and Thomson Reuters) than work in, say, mathematics and physics; thus, when metrics based on these services are used, a CS researcher or university department could be unfairly evaluated, especially when competing against other disciplines. The role of conferences in CS has also been discussed by others; Grudin6 collected many of the relevant articles and discussions, especially those published in Communications.
General Description
Our methodology, as described here, relied on a sampling approach to evaluate the productivity and impact metrics of researchers and papers in different CS areas; we considered productivity as the number of articles published (in English) per year in journals, conferences, and workshops. Other than the distinction between journals and conferences (including workshops), we did not take into account any measures of venue quality (such as impact factor for journals and acceptance rate or scientific-society sponsorship of conferences). The first step was to define the CS areas; the second to define the set of researchers working in each area, along with the set of conferences and journals associated with each area; the third to sample the set of researchers working in each area and collect from their own webpages the number of papers they published from 2006 to 2010; and the fourth to sample, from the set of conferences and journals associated with each area, a set of papers and collect information about their citation counts. We briefly expand on each step; for a more detailed explanation of our methods, see Wainer et al.15
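As a rough illustration of these steps, the following Python sketch shows one way the sampled counts could be aggregated into per-area productivity (papers per researcher per year over 2006 to 2010) and mean citations per sampled paper. It is a minimal sketch for exposition only; the data structures, area names, and numbers are hypothetical placeholders, not the actual pipeline or data behind the study.

# Hypothetical aggregation sketch; not the study's actual pipeline or data.
from dataclasses import dataclass
from statistics import mean

YEARS = 5  # the 2006-2010 window considered in the study

@dataclass
class ResearcherSample:
    area: str
    journal_papers: int      # papers listed on the researcher's webpage
    conference_papers: int   # includes workshop papers

@dataclass
class PaperSample:
    area: str
    citations: int           # citation count collected for the sampled paper

def productivity_per_area(researchers: list[ResearcherSample]) -> dict[str, float]:
    """Mean number of papers published per researcher per year, by area."""
    by_area: dict[str, list[float]] = {}
    for r in researchers:
        total = r.journal_papers + r.conference_papers
        by_area.setdefault(r.area, []).append(total / YEARS)
    return {area: mean(rates) for area, rates in by_area.items()}

def mean_citations_per_area(papers: list[PaperSample]) -> dict[str, float]:
    """Mean citation count of sampled papers, by area."""
    by_area: dict[str, list[int]] = {}
    for p in papers:
        by_area.setdefault(p.area, []).append(p.citations)
    return {area: mean(counts) for area, counts in by_area.items()}

if __name__ == "__main__":
    # Placeholder samples for two illustrative areas.
    researchers = [
        ResearcherSample("image processing", journal_papers=12, conference_papers=18),
        ResearcherSample("human-computer interaction", journal_papers=4, conference_papers=9),
    ]
    papers = [
        PaperSample("image processing", citations=25),
        PaperSample("human-computer interaction", citations=7),
    ]
    print(productivity_per_area(researchers))
    print(mean_citations_per_area(papers))

In this sketch, productivity is simply a researcher’s total paper count divided by the five-year window, averaged over the sampled researchers in an area, and impact is the mean citation count of the sampled papers in that area.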