Computing Education Will Not Be One Size Fits All
programming, and compilers. As the underlying theory evolved, new data structures were designed, algorithmic analysis came of age, language paradigms and new languages were created, and courses were developed on these topics.
The PC era changed the way we teach computing because it became easier to provide equipment for students, we could have closed labs like other sciences, and students could do class work outside of formal academic spaces. Meanwhile, the curriculum continued to evolve. Hardware improvements and new applications supported interest in topics such as computer graphics, human-computer interaction, and artificial intelligence, while courses on software engineering and algorithmic analysis became more common at the undergraduate level. This period also saw an increase in discussions about computer literacy and information literacy, along with a seemingly perennial debate about the role of computer science departments in post-secondary institutions, particularly in liberal arts colleges.
In the current period the demand for computing education has increased significantly, and it is no longer only at the undergraduate and graduate level. The ubiquity of computing across application areas means that many people feel the need to know something about computing. Meanwhile, personal and IoT devices require more widespread knowledge about hardware, embedded computing, and security. These are no longer niche areas. We must also reconsider ethical concerns in light of the deluge of data that is available and the issues that arise when designing and building devices that reside in people’s homes and pockets [9, 13].
But wait a moment! Everything I have written above is a very Western, developed-world view of computing education. In the history laid out above, computing education evolved at the undergraduate and graduate levels, and is now trickling down into elementary and secondary education. But when we consider the rest of the world, with an eye toward the future, there is much more that should shape computing education.
WHAT’S IN A NAME?—PART 3
There’s a very important term missing from the disciplines listed above, namely ICT (Information and Communication Technologies). ICT is a term that has been adopted globally and is used much the way that “IT” (information technology) is used in the U.S. ICT has proven to be a very flexible umbrella term, adjusting over time to include television, fixed-line phones, computers, radios, laptops, mobile phones, satellites, routers, etc. ICT, as a term and in the understanding that users have of it, tends to encompass both hardware and software, all under the guise of usage and information flow [20]. This term avoids the possible narrowness of computing science and the potential machine focus of computer science. On the other hand, there is nothing inherent in the term that makes it clear that computing is involved at all, which can make it difficult to talk about what we are educating people to do and what we are educating them about. It would be helpful if the 21st century saw the various disciplinary communities coalesce around a name that serves as an effective and flexible umbrella over these many perspectives on computing, one that points effectively toward the underlying disciplinary (and, increasingly, interdisciplinary) content.

IT’S A BIG WORLD OUT THERE!
As argued previously, much of computing education to date has been driven by both the historical development of the field and the recent development of personal uses of technology and interdisciplinary applications of computing. During the last 15 years much of the computing industry in the developed world has focused on monetizing the user. This is done through the development of many consumer products that presumably improve our lives (talk to your TV remote! let your thermostat learn your habits so that you never have to set it! let your refrigerator tell you what to buy at the store! let your appliances directly order products!). Monetizing is also done through analysis of user data, both at the individual user level and in aggregate, while we are left with little control over personal data once it heads out to commercial entities (see the work of Deborah Estrin [8]). As technologically interesting as these developments might be, they are fundamentally intended to make more comfortable lives that are already fairly comfortable. Yet for every new application that we presumably cannot live without in the developed world, there is an article or book that describes the negative effects on social interaction [18], the stunted development of empathy and communication skills among today’s children [17], the need for digital detox [5], and the potential uses of the vast quantity of data we are generating [12].

What about the rest of the world? There is clear evidence [15, 20] that a facile, uncritical application of certain technologies in developing-world contexts will be unsuccessful. In some cases, such as the introduction of computers into el-