viewpoints
Kode Vicious
GNL Is Not Linux
What’s in a name?
DOI: 10.1145/2892559
Article development led by queue.acm.org

I keep seeing the terms "Linux" and "GNU/Linux" online when I am reading about open source software. The terms seem to be mixed up or confused a lot, and they generate a lot of angry mail and forum threads. When I use a Linux distro, am I using Linux or GNU? Does it matter?

What's in a Name?

Dear Name,
What, indeed, is in a name? As you have already seen, this quasi-technical topic continues to cause a bit of heat in the software community, particularly in the open source world. You can find the narrative from the GNU side via the link in the postscript at the end of this column, but KV finds that narrative lacking, and so, against my better judgment about pigs and dancing, I will weigh in with a few comments.

If you want the real back story on the GNU folks and the FSF (Free Software Foundation), let me suggest you read Steven Levy's Hackers: Heroes of the Computer Revolution, which is still my favorite book about that period in the history of computing, covering the rise of the minicomputer in the 1960s through the rise of the early microcomputers in the 1970s and early 1980s. Before we get to the modern day and answer your question, however, we have to step back in time to the late 1960s and the advent of the minicomputer.

Once upon a time, as all good stories start, nearly all computer software cost some amount of money and was licensed to various entities for use. That time was the 1950s and 1960s, when, in reality, very few individuals could afford a computer, and, in fact, the idea that anyone would want one was scoffed at by the companies that made them, to their later detriment. Software was developed either by the hardware manufacturer, to make its very expensive machines even moderately usable, or by the government, often in collaboration with universities.

About the time of the advent of the minicomputer—which came along about the same time KV was born, screaming, because he knew he would have to fix them and their brethren someday—two key innovations occurred. Ken Thompson and Dennis Ritchie invented Unix, a now well-known reaction to the development of Multics, and computer hardware took one of its first steps toward affordability. No longer would the whole university have to share and time-slice a single large mainframe; now each department, if that department had $30,000 or so, could share a less powerful machine, but among a much smaller group of people.

Before Unix, operating systems were large, complicated, and mostly nonportable. Because Unix was simple and written in a new, portable assembler—which we now call C—it was possible for much smaller groups of people to write significant software with a lot less effort. There were good tools for writing portable operating systems and systems software, including a compiler, linker, assembler, and debugger—tools we now take for granted. All of these advances were significant and important, but they had one thing holding them back from even broader acceptance: licensing.

The Unix system and its tools were written and owned by AT&T, which, at that time, was the largest monopoly in the U.S. It did not have to license anything, of course, because it was the phone company, and the only phone company, which put it in a unique position, best summed up by Lily Tomlin in an old comedy sketch: "We don't care, we don't have to."

The truth was, many people inside AT&T did care, and they were able to get AT&T to sell "cheap" licenses to institutions such as universities, which had to pay only $1,000 to get the entire source. Companies had to pay quite a bit more, but they were able to spread the cost over their entire computing infrastructure. Having the source meant you could update and modify the operating