To move forward with programming languages we must first break free from the tyranny of ASCII.

By Poul-Henning Kamp

One of the naughty details of my Varnish software is that the configuration is written in a domain-specific language that is converted into C source code, compiled into a shared library, and executed at hardware speed. That obviously makes me a programming language syntax designer, and just as obviously I have started to think more about how we express ourselves in these syntaxes.

Rob Pike recently said some very pointed words about the Java programming language, which, if you think about it, sounded a lot like the pointed words James Gosling had for C++, and remarkably similar to what Bjarne Stroustrup said about good ol' C.

I have always admired Pike. He was already a giant in the field when I started, and his ability to foretell the future has been remarkably consistent.[1] In front of me I have a tough row to hoe, but I will attempt to argue that this time Pike is merely rearranging the deck chairs on the Titanic and that he missed the next big thing by a wide margin.

Pike got fed up with C++ and Java and did what any self-respecting hacker would do: he created his own language, one better than Java, better than C++, better than C, and he called it Go.

But did he go far enough?

The Go language does not in any way look substantially different from any of the other programming languages. Fiddle a couple of glyphs here and there and you have C, C++, Java, Python, Tcl, or whatever.

Programmers are a picky bunch when it comes to syntax, and it is a sobering thought that one of the most rapidly adopted programming languages of all time, Perl, barely had one for the longest time. Ironically, what syntax designers are really fighting about is not so much the proper and best syntax for the expression of ideas in a machine-understandable programming language as it is the proper and most efficient use of the ASCII table real estate.
It's All ASCII to Me…
There used to be a programming language called ALGOL, the lingua franca of computer science back in its heyday. ALGOL was standardized around 1960 and dictated about a dozen mathematical glyphs such as ×, ÷, ¬, and the very readable subscripted 10 symbol for use in what today we call scientific notation. Back then computers were built by hand and had one-digit serial numbers. Having a teletypewriter customized for your programming language was the least of your worries.
A couple of years later came the APL programming language, which included an extended character set containing a lot of math symbols. I am told that APL still survives in certain obscure corners of insurance and economics modeling.
Then ASCII happened around 1963, and ever since, programming languages have been trying to fit into it. (Wikipedia claims that ASCII grew the backslash [\]